In business, measurement is key. It’s not a new concept, of course, but information technology has enabled it to be applied to a greater degree than ever before. Function point analysis is one of those areas where, as with initiatives such as Six Sigma, the ability to measure can help ensure ultimate success.
The use of function points in application analysis and measurement was a key topic at last week’s seminar in Washington, DC, attended by officials and experts from federal, state and local governments as well as the private sector. Attendees discussed how to measure an application using function points and how to improve it by creating metrics that reveal a clearly defined trend over time.
The automation of processes, including quantifying function points during the software development process, delivers numerous benefits to companies:
- It’s consistent, reliable and objective: you can apply the same rules and assumptions from version to version, and in the case of function points, provide consistent measurement without limiting the size of the application being measured.
- It’s cost-effective: unlimited measurement snapshots can be done as needed at almost no added cost.
- It’s standards-based: it meets the latest standards from the Consortium for IT Software Quality (CISQ).
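To make the "trend over time" idea concrete, here is a minimal sketch of how repeated function point snapshots might be turned into a growth trend. The `Snapshot` type, the version labels, and the function point counts are all illustrative assumptions, not data from the seminar:

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    """One automated measurement of an application (hypothetical structure)."""
    version: str
    function_points: int  # counted automatically at release time

def fp_growth_trend(snapshots: list[Snapshot]) -> list[float]:
    """Percent change in functional size between consecutive snapshots."""
    return [
        100.0 * (b.function_points - a.function_points) / a.function_points
        for a, b in zip(snapshots, snapshots[1:])
    ]

# Illustrative history: three releases of one application
history = [Snapshot("v1", 1200), Snapshot("v2", 1320), Snapshot("v3", 1386)]
print(fp_growth_trend(history))  # → [10.0, 5.0]
```

Because automated snapshots cost almost nothing to repeat, a series like this can be produced at every release, which is what makes the trend line meaningful rather than a one-off estimate.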
Those in attendance heard how one major telecommunications company made function point analysis work by taking specific steps to implement it. The company had more than 3,000 applications, using multiple technologies. It developed an initiative, using function points, to focus on expanded productivity.
Here’s an aspect that cannot be overlooked: the company deployed its application analysis initiative only after achieving C-level buy-in, by demonstrating the business benefits the project would produce. Without that kind of involvement, asset-intensive projects like these are inevitably doomed to fail.
With it, the mission proceeded. The company realized it simply couldn’t analyze all 3,000 applications; some were far more important than others. So it defined the metrics and measures it wanted. Using those definitions, it eliminated some of its applications from consideration, created a CIO scorecard, and onboarded the rest into CAST’s Application Intelligence Platform (AIP) to obtain the biggest bang for its buck. Perhaps most importantly, it created a stringent timeline for the submission and validation of data.
It was a significant project. AIP was set up for ongoing analysis and measurement of 200 of the company’s largest and most important applications, reviewing 28.7 million function points across 560 million lines of code, with an average (an average!) of one million lines per application. But it all worked. The company found that its enhancement cost per function point dropped, and its function point performance per full-time employee increased. We don’t know to what extent the measurement program contributed to that result – but the key is they could actually measure it.
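The two outcome metrics named above can be computed directly once function points are being counted. The sketch below shows the arithmetic; the cost, head count, and function point figures are hypothetical placeholders, not the telecom company’s actual numbers:

```python
def enhancement_cost_per_fp(total_enhancement_cost: float, fps_delivered: int) -> float:
    """Spend per unit of functionality delivered; falling over time is the goal."""
    return total_enhancement_cost / fps_delivered

def fp_per_fte(fps_delivered: int, fte_count: float) -> float:
    """Functionality delivered per full-time employee; rising over time is the goal."""
    return fps_delivered / fte_count

# Illustrative figures for one release cycle
cost = 500_000.0   # hypothetical enhancement spend
fps = 1_000        # hypothetical function points delivered
ftes = 8.0         # hypothetical team size

print(enhancement_cost_per_fp(cost, fps))  # → 500.0
print(fp_per_fte(fps, ftes))               # → 125.0
```

The value is not in any single number but in tracking both ratios release over release, which is exactly what the measurement program made possible.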
The takeaways from the project were relatively straightforward:
- Onboarding is the toughest step in the process.
- Automation comes over time.
- Accuracy is a must.
- Expect resistance to change…sell it over time.
- Focus on the areas of greatest impact.
Seminar attendees were told that if they could do all that, the use of function point analysis would give them a better handle on the amount of software produced. With that in hand, they can determine how productive their organizations are, and then use the data to find efficiencies, improve productivity, manage vendor relationships, and ultimately gain better control over the cost and speed of application work.
For more insight into how function point analysis supports better measurement and improves estimation and vendor relationships, listen to Barbara Beech, a longtime vendor management professional who shared her insights with us last week: http://www.castsoftware.com/customers/interviews/telecom-company