Keep an eye on legacy apps: COBOL's not dead!

Third-generation programming languages (3GLs) like COBOL or PL/I are seen as outdated languages for "has-been" developers and no longer attract new ones (some predictions even claimed that COBOL would die in the medium term). These newer developers prefer more modern technologies, like J2EE or .NET, and, worryingly, educational organizations provide few learning opportunities for 3GLs.

As a consequence, people who can create and maintain applications implemented in COBOL or PL/I are becoming scarce. Some shops have had to organize quick training sessions just to keep their workforce up to speed. In other words, experts are few, but the need is real! Fortunately, platform and compiler vendors have launched initiatives such as:

  • Micro Focus, which runs a COBOL academic program in universities.
  • IBM and the Banque Postale, which train tomorrow's specialists in large information systems at SUPINFO. The objective is clearly to train COBOL and mainframe specialists to meet the company's demand.

These applications were created many years ago and are now called "legacy apps," although they are still alive (yes, COBOL's not dead!) and continue to run many businesses around the planet. Analyzing legacy code is like an archeological mission. I have often found myself going back in time, discovering successive layers of code implemented by multiple waves of developers (some of whom are already retired!).

We can see the different ways and practices that were used to produce code with the same programming language. Program headers read like history books; some are so long that they hint at the program's size and complexity, and at the likelihood of finding error-prone code, potential bugs, and bad coding practices.

With experienced 3GL developers in short supply, knowledge of these applications erodes in terms of:

  • Technology: COBOL has specific behavior regarding data management. File operations must be performed correctly, and databases cannot be accessed without respecting established best practices.
  • Architecture: Portfolios of applications are often vast, and specific layers can be defined to manage data access or program execution through complex technical environments. Moreover, nowadays, applications are often connected together and it is necessary to have an understanding of the overall system.
  • Business rules: What is really implemented in the programs? How does it work? In some shops, the specification is the source code itself, and it is necessary to reverse-engineer the programs to understand how information is calculated.
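To illustrate the first point, here is a minimal, hypothetical COBOL sketch of defensive file handling: declaring a FILE STATUS field and checking it after each I/O operation instead of assuming success. The file and field names are invented for illustration.

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. READCUST.
      * Hypothetical example: read a sequential customer file
      * defensively, checking FILE STATUS after I/O operations.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT CUSTOMER-FILE ASSIGN TO "CUSTFILE"
               ORGANIZATION IS SEQUENTIAL
               FILE STATUS IS WS-FILE-STATUS.
       DATA DIVISION.
       FILE SECTION.
       FD  CUSTOMER-FILE.
       01  CUSTOMER-RECORD       PIC X(80).
       WORKING-STORAGE SECTION.
       01  WS-FILE-STATUS        PIC XX.
           88  FILE-OK           VALUE "00".
           88  END-OF-FILE       VALUE "10".
       PROCEDURE DIVISION.
       MAIN-LOGIC.
           OPEN INPUT CUSTOMER-FILE
           IF NOT FILE-OK
      *        Do not proceed on a failed OPEN: report and stop.
               DISPLAY "OPEN FAILED, STATUS: " WS-FILE-STATUS
               STOP RUN
           END-IF
           PERFORM UNTIL END-OF-FILE
               READ CUSTOMER-FILE
                   AT END CONTINUE
                   NOT AT END DISPLAY CUSTOMER-RECORD
               END-READ
           END-PERFORM
           CLOSE CUSTOMER-FILE
           STOP RUN.
```

Many legacy programs skip these status checks entirely, which is exactly the kind of error-prone pattern that quality measurement can surface.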

Even if these applications do not change much, quality measurement is particularly important to keep these central applications from drifting out of control. It can help developers produce robust and efficient code more easily, and enable application owners to check whether updates delivered by outsourcers respect coding guidelines.

Also, these applications generally carry a sizeable technical debt, accumulated over their many years of service. Even if it is not possible to pay down the whole debt, it is important to keep it under control and avoid letting it grow. Maybe in 100 years they will be replaced by new technologies.

Legacy apps are no longer on the main stage, but they still generate a huge amount of work for development teams. So keep an eye on the code!
