Over the past decade, advances in static analysis tools from both the commercial and open source communities have dramatically improved the detection of violations of good coding practices. Detecting these issues holds the promise of better software quality.
Yet many of these static analysis tools cannot detect the critical violations that span multilayer architectures, transactions, and multi-technology systems. These are the violations that lead to 90% of a system's reliability, security, and efficiency issues in production.
(Figure 1 illustrates these rules at the Unit and Technology/System Levels.)
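To make the distinction concrete, consider a hypothetical Java sketch (all class and method names are invented for illustration). Each class below looks unremarkable when analyzed in isolation at the unit level, yet together they form a SQL injection path that only a system-level, cross-layer analysis would connect.

// Hypothetical sketch: neither class, analyzed alone, looks obviously wrong,
// but together they let unvalidated input reach a dynamically built query.
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

class AccountDao {
    private final Connection connection;

    AccountDao(Connection connection) {
        this.connection = connection;
    }

    // Viewed alone, this method just runs the query it is handed;
    // it silently assumes the caller has validated the input.
    ResultSet findByOwner(String ownerName) throws Exception {
        Statement stmt = connection.createStatement();
        return stmt.executeQuery(
            "SELECT * FROM accounts WHERE owner = '" + ownerName + "'");
    }
}

class AccountController {
    private final AccountDao dao;

    AccountController(AccountDao dao) {
        this.dao = dao;
    }

    // Viewed alone, this method just forwards a request parameter;
    // it silently assumes the data layer uses parameterized queries.
    ResultSet handleRequest(String requestParam) throws Exception {
        return dao.findByOwner(requestParam);
    }
}

The point is not this specific flaw but the pattern: the defect lives in the interaction between layers, not in any single file, which is exactly what unit-level checkers miss.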
For the last half-decade, a debate has raged over which project management method reigns supreme: Agile or Waterfall. To determine which held the advantage, some looked at the management techniques and the fluidity with which projects were completed, while others judged the debate by pointing to the structural quality of the applications being developed.
Digital transformation is a project many business executives have recently taken on, especially those in banking and financial services. These organizations are competing to digitally transform front-end systems that are connected to brittle legacy systems. Failing to identify the structural vulnerabilities in these combined applications produces security and reliability issues that negate the value of digital transformation.
Reading the news these days, one would think that software security is something layered on top of existing software systems. The truth, however, is that security needs to be woven into the very fabric of every system, and this begins with eliminating vulnerabilities by measuring software quality as the system is built.
During the CAST Software Quality Fall Users Group, Dr. Carol Woody, a senior member of the technical staff at Carnegie Mellon University's Software Engineering Institute (SEI) whose research focuses on cybersecurity engineering, discussed the importance of software quality as a basis for security.
Last month in this space I wrote about the importance of optimizing the cost-effectiveness of captives (i.e., Global In-House Centers) by setting metrics and enhancing process transparency to manage them better. For these management methods to work, though, an organization needs to employ automated function points to gain insight into current costs and delivered value, which can then be used to improve the output received from current or future providers.
They say, “If something works, don’t fix it.” This old adage may be why some organizations hold onto legacy systems longer than they should, but it is also why these same organizations struggle with software complexity. In fact, according to the GAO, Uncle Sam spends 80 percent of its $86.4 billion IT budget on legacy systems.
Some organizations choose not to struggle, though. Take, for example, Pennsylvania-based underwriter NSM Insurance Group. SearchCIO.com recently reported that NSM last year purchased a company that still ran a COBOL-based back-office system from the 1990s.
Barbara Beech, an expert in IT development for telecommunications companies, recently spoke with CAST in a video chat about her experience using software analysis and measurement, as well as automated function points, to gain visibility into IT vendor deliverables.
To gain that visibility, Beech points to the CAST Automated Function Points (AFP) capability, an automated function point counting method based on rules defined by the International Function Point Users Group (IFPUG). CAST automates the manual counting process by using structural information retrieved through source code analysis, database structure, and transactions.
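CAST's counting engine itself is not shown here, but the arithmetic any IFPUG-based counter ultimately applies is simple to sketch. The following Java example uses hypothetical counts and invented names; it computes unadjusted function points from the five IFPUG component types using the standard IFPUG weights.

// Minimal sketch of IFPUG unadjusted function point arithmetic.
// Counts and names are illustrative; this is not CAST's implementation.
import java.util.EnumMap;
import java.util.Map;

public class FunctionPointSketch {

    // The five IFPUG component types: inputs, outputs, inquiries, and files.
    enum Component { EI, EO, EQ, ILF, EIF }

    enum Complexity { LOW, AVERAGE, HIGH }

    // Standard IFPUG weights per component type and complexity rating.
    static int weight(Component c, Complexity x) {
        switch (c) {
            case EI:  return x == Complexity.LOW ? 3 : x == Complexity.AVERAGE ? 4 : 6;
            case EO:  return x == Complexity.LOW ? 4 : x == Complexity.AVERAGE ? 5 : 7;
            case EQ:  return x == Complexity.LOW ? 3 : x == Complexity.AVERAGE ? 4 : 6;
            case ILF: return x == Complexity.LOW ? 7 : x == Complexity.AVERAGE ? 10 : 15;
            default:  return x == Complexity.LOW ? 5 : x == Complexity.AVERAGE ? 7 : 10; // EIF
        }
    }

    public static void main(String[] args) {
        // Hypothetical counts an analyzer might derive from source code,
        // database structure, and transactions: {low, average, high} per type.
        Map<Component, int[]> counts = new EnumMap<>(Component.class);
        counts.put(Component.EI,  new int[]{4, 2, 1});
        counts.put(Component.EO,  new int[]{3, 1, 0});
        counts.put(Component.EQ,  new int[]{2, 2, 0});
        counts.put(Component.ILF, new int[]{1, 2, 1});
        counts.put(Component.EIF, new int[]{1, 0, 0});

        int unadjusted = 0;
        for (Map.Entry<Component, int[]> e : counts.entrySet()) {
            int[] byComplexity = e.getValue();
            unadjusted += byComplexity[0] * weight(e.getKey(), Complexity.LOW)
                        + byComplexity[1] * weight(e.getKey(), Complexity.AVERAGE)
                        + byComplexity[2] * weight(e.getKey(), Complexity.HIGH);
        }
        System.out.println("Unadjusted function points: " + unadjusted);
    }
}

Run as-is, these hypothetical counts yield 104 unadjusted function points. In practice, the hard part, and what an automated approach like CAST's addresses, is deriving the counts themselves consistently from the code, database structure, and transactions rather than from manual inspection.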