The Insurance Industry Challenge: Improve Software Risk Management

Insurance organizations have reached a tipping point. These historic institutions, some with hundreds of years of service behind them, are being forced to transform by changing consumer demands and by nimble, technology-centric startups bringing innovative products to market. No strangers to regulatory and privacy concerns, insurance carriers have overcome many roadblocks over their long history of doing business. Now they must tackle their legacy IT systems and improve software risk management to deliver the value today’s market is after.

As recently reported by CAST Research Labs, insurance organizations face four main challenges in their IT transformation:

  • Insurers still maintain large numbers of COBOL applications.
  • North American insurers, in particular, have bigger, more complex applications to maintain.
  • Outsourcing applications – which can be attractive from a budgeting perspective – tends to result in lower quality applications.
  • A high percentage of insurance applications still rely on legacy technology written by people who have since retired from the workforce.

Summarized below, the 2016 CRASH Report on Insurance evaluated global trends in the structural quality of business applications in the insurance industry. CAST Research Labs measures application structural quality against five primary health factors: Security, Performance Efficiency, Robustness, Transferability and Changeability.

[Infographic: 2016 CRASH Report on Insurance]

The CRASH Report on Insurance was developed with contributions from our partner, CGI. The CRASH findings below also reflect the insights of CGI’s insurance industry experts.

Insurers Maintain Large Numbers of COBOL Applications

Insurance companies continue to utilize core insurance applications that were originally built in the 1970s and 1980s. These systems are never really retired; instead, they are “run off”. Because insurance policies (contracts) on the life insurance side of the industry can run for terms of more than 20 years, the systems are preserved to process older policies – hence the high number of COBOL applications.

Additionally, the definition of the data stored in these systems often changes from release to release, and the older system is still required to interpret the older data. As insurers modernize and re-platform applications for greater agility, it is useful to assess software risk as these reworked applications move to production. Dealing with legacy code in older systems that have been updated in a piecemeal manner can present security and privacy risks, not to mention an increased possibility of system outage.
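To make the data-definition problem concrete, here is a minimal sketch (in Python rather than COBOL, with hypothetical field names, offsets and version flag) of how the same record stream can only be interpreted correctly when the reader knows which release wrote each record:

```python
# Minimal sketch: interpreting fixed-width policy records whose layout
# changed between releases. Field names, offsets and the leading version
# byte are hypothetical; real systems define these in COBOL copybooks.

FIELD_LAYOUTS = {
    # Release "A": policy number (10), term in years (2), premium in cents (8)
    "A": [("policy_no", 0, 10), ("term_years", 10, 12), ("premium_cents", 12, 20)],
    # Release "B": premium widened to 10 digits, rider code appended
    "B": [("policy_no", 0, 10), ("term_years", 10, 12), ("premium_cents", 12, 22),
          ("rider_code", 22, 25)],
}

def parse_record(raw: str) -> dict:
    """Interpret a record using the layout of the release that wrote it."""
    version = raw[0]  # first byte identifies the release (an assumption here)
    return {name: raw[1 + start:1 + end].strip()
            for name, start, end in FIELD_LAYOUTS[version]}

if __name__ == "__main__":
    old_record = "A" + "0000012345" + "20" + "00012500"
    new_record = "B" + "0000067890" + "25" + "0000150000" + "R01"
    print(parse_record(old_record))
    print(parse_record(new_record))
```

A policy written twenty years ago still has to be read with the layout that was current at the time, which is exactly why the older system, or at least its knowledge of the old layouts, cannot simply be switched off.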

Overall Quality and Security Are Lacking Among North American Insurers

Insurers, particularly those in the US and the rest of North America, are among the least secure when compared to other businesses in the financial services sector. For example, our CRASH Report found 6.8 million violations of code unit level quality rules in the insurance sample, of which 28% were rated “high severity.” The security problem is arguably worsened by the US state-based regulatory system, which subjects insurers to 50 sets of differing rules that must be addressed within the legacy systems, often with deadlines that approach far too swiftly for software engineers to update the technology in a secure way.

As my colleague Dr. Bill Curtis stated in a recent press article, “these systems are huge and not well-documented, so it becomes harder to make changes to them. With required regulatory changes, it becomes harder and takes longer to create solutions than it does in the EU, where there is more consistency in regulations.”

Outsourced Applications Tend to Be of Lower Quality

Given the length of time most insurance applications have been running in-house, the code base has grown substantially, and it is frequently customized with ‘bolt-ons’ to address the need for new functionality and regulatory change. There has been minimal investment in optimizing or streamlining this code before it is outsourced.

As IT departments look for cost efficiencies, the size of the code base can become a challenge and precipitate the move to outsourcing. Mainframe costs are significant, and more MIPS mean substantially higher costs for hardware and operating systems. But as applications are outsourced, the need for transparency and measurement does not dissipate.

It is important for the client and vendor to set goals and priorities against nonfunctional requirements that might drive business risk and long-term cost. The best practice is to establish metrics against corporate or industry quality and software risk standards, like those published by CISQ.
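In practice, such targets can be checked as a simple quality gate on each outsourced delivery. The sketch below is a minimal Python illustration: the metric names echo the CRASH health factors, but the thresholds and measured figures are hypothetical placeholders, not published CISQ limits.

```python
# Minimal sketch of a quality gate for an outsourced delivery: measured
# results are compared against thresholds agreed between client and vendor.
# Metric names, thresholds and measurements are hypothetical placeholders.

AGREED_THRESHOLDS = {
    "security_violations_per_kloc": 2.0,
    "robustness_violations_per_kloc": 5.0,
    "changeability_violations_per_kloc": 10.0,
}

def quality_gate(measured: dict, thresholds: dict) -> list:
    """Return the metrics that breach the agreed thresholds."""
    return [name for name, limit in thresholds.items()
            if measured.get(name, float("inf")) > limit]

if __name__ == "__main__":
    delivery = {
        "security_violations_per_kloc": 3.4,
        "robustness_violations_per_kloc": 4.1,
        "changeability_violations_per_kloc": 9.8,
    }
    breaches = quality_gate(delivery, AGREED_THRESHOLDS)
    print("Accept delivery" if not breaches else "Reject delivery: " + ", ".join(breaches))
```

Agreeing on the gate up front keeps the conversation about nonfunctional quality objective, rather than leaving it to be negotiated after each release.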

Legacy Technology and Attrition Challenges

Insurance carriers are aware that their legacy environments are more costly to operate and maintain than modern systems. However, much of the data stored in these systems is difficult to analyze and migrate to newer environments, owing to the attrition of staff who knew the legacy systems and the data structures of the software’s many releases.

Migration is costly. As an alternative, carriers opt to buy time and run the systems off rather than retire them. System-level analysis and software measurement can play an important role in helping insurers understand the true cost of these legacy systems as well as the benefit of transitioning to modern IT environments.
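As a rough illustration of the kind of comparison such measurement supports, the sketch below weighs the cumulative cost of running a legacy system off against a one-time migration followed by lower annual running costs. All figures are hypothetical placeholders; a real analysis would draw its inputs from system-level measurement of the actual portfolio.

```python
# Minimal sketch of the run-off vs. migrate comparison. Every figure here
# is a hypothetical placeholder, not data from the CRASH Report.

def cumulative_cost(annual_cost: float, years: int, upfront: float = 0.0) -> float:
    """Total cost over the horizon: any one-time spend plus annual running costs."""
    return upfront + annual_cost * years

if __name__ == "__main__":
    horizon_years = 10  # how long the remaining book of business must be serviced
    run_off = cumulative_cost(annual_cost=2_000_000, years=horizon_years)
    migrate = cumulative_cost(annual_cost=800_000, years=horizon_years, upfront=6_000_000)
    print(f"Run the legacy system off: ${run_off:,.0f}")
    print(f"Migrate, then operate:     ${migrate:,.0f}")
```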

To download the full 2016 CRASH Report on Insurance, click here.

