Why The Feds Still Haven’t Replaced Old Legacy Systems

Written by Dennis D McDonald

It’s high time the feds replaced their many antiquated legacy computer systems. But there’s a mindset problem …

Large and old organizations typically are slow to change. They invest in systems and processes. And if those legacy systems and processes work, they maintain and upgrade them as things change and new requirements (and competitors) emerge.

Management knows they need to update or even replace such entrenched legacy systems, but when they see the price tag and what it will mean in terms of changes to all the systems and processes that touch the legacy system, they balk. “One more year,” management says.

And as all good systems integration consultants know, “one more year” too often morphs into “one more decade.”

One result: there are still many COBOL-and-green-screen systems running even as more modern web-based systems have put a new face on them.
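To make that pattern concrete, here is a minimal sketch, in Python, of what “putting a new face” on a green-screen back end often amounts to: a thin modern layer that parses the fixed-width records a legacy batch job produces and republishes them as JSON, leaving the legacy system itself untouched. The field names, record layout, and sample record below are invented for illustration; they are not drawn from any specific federal system.

    # A minimal sketch of a modern layer over a hypothetical legacy fixed-width
    # extract. Field names and layout are invented for illustration only.
    import json

    # Hypothetical copybook-style layout: (field name, start position, length)
    LEGACY_LAYOUT = [
        ("case_id",    0, 10),
        ("last_name", 10, 20),
        ("status",    30,  2),
        ("amount",    32,  9),  # digits with an implied two decimal places
    ]

    def parse_legacy_record(line: str) -> dict:
        """Convert one fixed-width legacy record into a plain dictionary."""
        record = {}
        for name, start, length in LEGACY_LAYOUT:
            record[name] = line[start:start + length].strip()
        # Re-interpret the amount field with its implied decimal point.
        record["amount"] = int(record["amount"]) / 100
        return record

    def records_as_json(batch_output: str) -> str:
        """Wrap a whole legacy batch extract as JSON for a modern front end."""
        records = [parse_legacy_record(line)
                   for line in batch_output.splitlines() if line.strip()]
        return json.dumps(records, indent=2)

    if __name__ == "__main__":
        sample = "C2017-0042MCDONALD            AP000123450"
        print(records_as_json(sample))

The point of the sketch is that nothing in it touches the legacy system: the old batch job keeps running exactly as before, and the new layer simply reads its output. That is what makes this approach attractive in the short term and, as discussed below, what keeps the underlying legacy system alive.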

How about the federal government’s legacy systems?

John Kamensky’s review of the federal government’s progress in modernizing its IT resources examines the steps taken since 2008 to transform federal IT.

He describes the modernization efforts now underway, as envisioned in the recently released Federal CIO Council’s State of Federal Information Technology.

That document provides a road map for transforming IT away from a decades-long proliferation of custom applications built on top of legacy systems.

As he demonstrates, the groundwork for transformation has been laid in three areas:

  1. Establishing “digital services” teams to help agencies modernize and upgrade citizen-facing government services.
  2. Upgrading and adapting government procurement processes to streamline and accelerate IT resource acquisition.
  3. Developing and promulgating improved processes and standards for creating digital services (for example, the Digital Services Playbook).

How successful have these efforts been in preparing for continued “digital transformation” and the replacement of outmoded government legacy systems with more modern and efficient resources?

There has been progress but it is slow. Perhaps it is significant that the three system success stories mentioned by Kamensky — a faster VA disability compensation process, a streamlined certification process for small businesses, and an improved college scorecard tool — seem not to replace existing legacy systems but to make better use of data provided by existing systems.

The major efforts reviewed involve people, procurement, and process, not necessarily technology. That’s all very positive. Still, adding more layers on top of existing legacy systems does not necessarily move us closer to the goal of replacing those systems.

This is a typical system integration dilemma, one often faced by organizations that were not fortunate enough to grow up in the “digital natives” era.

The GAO’s 2016 report, Federal Agencies Need to Address Aging Legacy Systems, illustrates the challenge:

Federal legacy IT investments are becoming increasingly obsolete: many use outdated software languages and hardware parts that are unsupported. Agencies reported using several systems that have components that are, in some cases, at least 50 years old. For example, Department of Defense uses 8-inch floppy disks in a legacy system that coordinates the operational functions of the nation’s nuclear forces. In addition, Department of the Treasury uses assembly language code—a computer language initially used in the 1950s and typically tied to the hardware for which it was developed.

Even though OMB and GAO have provided much-needed management and oversight for efforts at federal computer system “transformation,” the devil is in the details. It always is with system integration efforts that touch on a wide variety of technical systems and architectures. If you have ever studied a huge wall chart illustrating the multiple interconnected systems that support major federal programs at agencies such as EPA, VA, or NOAA, you’ll know what I mean.

Also, just keeping legacy systems running in the background to feed more modern user-facing systems is, as the GAO report points out, a risky and expensive proposition. Agile development and procurement methods alone will not solve such problems. What’s needed is a continuous effort — supported at the top — to promote a unified development vision that supports both modernization and the retirement and replacement of legacy systems.

Will the Trump administration sustain such an effort?

That’s hard to say, especially given funding priorities shifting away from non-defense budgets, along with a possible reduction in federally sponsored data collection and data availability.

Maintaining a strategic focus in such a rapidly changing environment may be difficult, given the natural tendency of stressed-out organizations to focus on short-term survival and keeping the lights on at the expense of long-term goals.

Eventually, the old systems will break down. They will die when patching and repairing are no longer viable. Even venerable B-52 Stratofortresses will have to be replaced some day, at which time the current patchwork of logistics, maintenance, and weapons systems upgrades will go away.

The same will inevitably happen to all the federal legacy systems that need replacement.

The question is, will replacement occur rationally, or only after disaster or organizational turmoil?

As for me, I vote for a rational approach, one that ties open data access and intelligent data governance to systems and processes that effectively and efficiently serve the American public.

For aNewDomain, I’m Dennis D. McDonald.

Cover image: Prweb, All Rights Reserved; Inside image: DennisHouse.TV, All Rights Reserved.

An earlier version of this column ran on Dennis D. McDonald’s DDMCD blog. Read it here. -Ed.