For years, refactoring has been a common practice for improving the quality, efficiency, and maintainability of an application. However, a recent IT World article discusses how CIOs may not be getting a valuable return on the time and effort they invest in the refactoring process. While many believe refactoring reduces the risk of future headaches, new findings from a study by Sri Lankan researchers suggest that refactoring does not significantly improve code quality.
IT leaders from throughout the federal government discussed how software measurement can positively impact their development processes at CAST’s recent Cyber Risk Measurement Workshop in Arlington, VA – just outside Washington, D.C. The event brought together more than 40 IT leaders from several government agencies, including the Department of Defense and the Department of State, as well as system integrators and other related organizations. The group shared their experiences of how their respective organizations are delivering value to end users and taxpayers.
Measuring and managing software quality is not just about compliance with government mandates; it rests on the proposition that strong software quality, security, and sustainability are paramount. Compliance, however, remains essential. Attendees voiced three primary points about software compliance:
Government mandates point to the fact that software must have a measurement component
Industry standards, such as those from the Consortium for IT Software Quality (CISQ) and the Object Management Group (OMG), are available and should be leveraged
Technology solutions exist to help public sector firms address these mandates
Application portfolio analysis was at the center of discussion as Forrester Research Vice President and Principal Analyst Margo Visitacion presented how Agile development is affecting the application development process and IT’s portfolio planning. Ms. Visitacion explained that in the “Age of the Customer,” customers want more for less and expect companies to change fluidly based on their needs and demands. As companies shift their attention to customer experience rather than production figures, they see higher revenue and longer-lasting relationships.
So how do organizations remain agile in responding to customer needs? They employ an Agile portfolio management process that collects metrics while aligning with the budgeting process, understanding that requirements will change. Using this strategy, companies gain clear visibility into their portfolios and can measure risk, cost, and complexity against objective benchmarks. The data collected during development enables them to defend their current positioning and communicate more effectively with the business.
Here are some recent thought-provoking questions, along with supporting answers, that we received during the Forrester webinar:
Software risks to the business, specifically Application Resiliency, headlined a recent executive roundtable hosted by CAST and sponsored by IBM Italy, ZeroUno and the Boston Consulting Group. European IT executives from the financial services industry assembled to debate the importance of mitigating software risks to their businesses.
In the current tech scene, it has become common practice to refer to programmers as engineers. It seems that if you aren’t part of the sales or marketing team, you are now entitled to the engineer designation. What has been forgotten over the 50 years spent trying to turn software development into a legitimate engineering practice, however, is that we still haven’t achieved that aspiration. Traditional engineers must go through stringent regulation, certification, and apprenticeships to earn the title, which creates an implicit responsibility for reliability and public safety. Software development hasn’t reached this point yet – software quality and standards are not universally valued.
So why is the tech industry using the engineering title to describe its technical workers?
In business, measurement is key. It’s not a new concept, of course, but it’s one that information technology has made possible to a greater degree than ever before. Function point analysis is one of those areas where, like initiatives such as Six Sigma, the ability to measure can help ensure ultimate success.
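To make function point analysis concrete, here is a minimal sketch of an unadjusted function point (UFP) count using the commonly published IFPUG complexity weights; the component counts in the example are illustrative assumptions, not figures from any study cited here.

```python
# Minimal sketch of an unadjusted function point (UFP) count,
# using the commonly published IFPUG complexity weights.
# The example component counts below are illustrative only.

IFPUG_WEIGHTS = {
    # component: (low, average, high) complexity weights
    "external_input":     (3, 4, 6),
    "external_output":    (4, 5, 7),
    "external_inquiry":   (3, 4, 6),
    "internal_file":      (7, 10, 15),
    "external_interface": (5, 7, 10),
}

def unadjusted_function_points(counts):
    """counts maps component name -> (n_low, n_avg, n_high) tallies."""
    total = 0
    for component, (n_low, n_avg, n_high) in counts.items():
        w_low, w_avg, w_high = IFPUG_WEIGHTS[component]
        total += n_low * w_low + n_avg * w_avg + n_high * w_high
    return total

# Hypothetical small application, tallied by complexity level.
example = {
    "external_input":     (5, 3, 1),  # 5 low, 3 average, 1 high
    "external_output":    (4, 2, 0),
    "external_inquiry":   (3, 1, 0),
    "internal_file":      (2, 1, 0),
    "external_interface": (1, 0, 0),
}
print(unadjusted_function_points(example))  # 101
```

The UFP total would then typically be scaled by a value adjustment factor to reflect system-wide characteristics, but the core idea is exactly this kind of objective, repeatable tally of delivered functionality.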
Over the past decade, advancements in static analysis tools from both commercial and open source communities have dramatically improved the detection of developer violations of good coding practices. The ability to detect these issues in coding practices provides the promise of better software quality.
Yet many of these static analysis tools cannot detect the critical violations that span multilayer architectures, transactions, and multi-technology systems. These are the violations that lead to 90% of a system's reliability, security and efficiency issues in production.
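A small sketch can illustrate why unit-level analysis misses these violations. In the hypothetical example below, each layer is clean when inspected in isolation, yet composing them produces the classic N+1 query pattern, an efficiency defect that only becomes visible to an analysis that follows the transaction across the business and data-access layers:

```python
# Illustrative sketch (hypothetical classes, not from any real system):
# each class below passes unit-level checks on its own, but together
# they form an N+1 query pattern across architectural layers.

class FakeDB:
    """Stub database that counts round trips, to make the pattern visible."""
    def __init__(self):
        self.query_count = 0

    def query(self, sql, *params):
        self.query_count += 1
        return []  # no rows needed for the illustration

class CustomerDAO:
    """Data-access layer: one query per call -- fine at the unit level."""
    def __init__(self, db):
        self.db = db

    def orders_for(self, customer_id):
        return self.db.query("SELECT * FROM orders WHERE customer_id = ?",
                             customer_id)

class ReportService:
    """Business layer: also clean in isolation..."""
    def __init__(self, dao):
        self.dao = dao

    def order_counts(self, customer_ids):
        # ...but this loop issues one query per customer -- a defect only
        # a transaction-spanning, cross-layer analysis would connect.
        return {cid: len(self.dao.orders_for(cid)) for cid in customer_ids}

db = FakeDB()
service = ReportService(CustomerDAO(db))
service.order_counts(range(100))
print(db.query_count)  # 100 round trips for a single report
```

No single unit contains the defect; it emerges from the interaction between layers, which is precisely the class of system-level violation the paragraph above describes.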
(Figure 1 illustrates these rules at the Unit and Technology/System Levels.)