But that's not a simple solution for an IT industry facing a shortage of COBOL programmers. Demand for COBOL training has fallen to the point where colleges are no longer even offering it. This despite the fact that there are 180 billion lines of COBOL code in use today, and that COBOL programmers are in such demand they can command big-time money.
For most companies, the COBOL brain trust is currently focused on things like pension payouts, retirement villages and golf communities. In nine years, where will it be?
If you're talking IT, this has to be a bit troubling. And not just for COBOL shops. I'm talking about the many elements that go into what are euphemistically referred to as "legacy" systems.
Unplanned, Layered Progression
Mark Lillycrop of Arcati Research points out an interesting phenomenon that is somewhat peculiar to the IT industry: These so-called legacy systems have not really evolved in the sense of developing and changing over the years. It would be more accurate to say these systems have been built one on top of the other.
Each succeeding technological development, fad or trend has been laid down in layer after layer of code and system architecture, making up the whole of the deployed system.
One is reminded of an ancient city that has been built and rebuilt many times over the ages. Archeologists can dig down through the layers to find all manner of things. On top may be the asphalt or concrete of our age, but not far beneath that you will find cobblestones, and beneath those gravel, and finally dirt. Each layer supports the one above it. You can't pull out the cobblestones without destroying the asphalt on top.
In a sense, you are just as dependent on the gravel and cobblestones below as were the Romans who built the road in the first place.
Forrester analyst Phil Murphy expanded on this in a January 2006 interview in Optimize magazine. The issue is not that a single ubiquitous language will suddenly be without a population of programmers to maintain it; rather, it is that this phenomenon is repeated over and over with multiple generations of languages, architectures and strategies.
We would like to think of our systems as unified, consolidated and efficiently interfaced with one another, perking along while a few systems engineers closely monitor their performance. We visualize the clichéd white-coated computer operators adjusting a knob here, tweaking a bit of code there, squeezing every last MIPS of performance out of the hardware, minimizing the consumption of resources and minimizing response times for our end-users. The reality is quite different.