Crash Course on the CRASH Report, Part 2: New Insights

I confess: I’m an “Urban Myths” junkie. That doesn’t mean I believe every urban myth that lands in my email inbox; quite the contrary, I’m a born skeptic. I snicker at these widespread beliefs and openly wonder how anyone could think that Bill Gates would send them a check for forwarding an email, or that Mr. Rogers was a Marine sniper, or that some currently popular entertainer was born a different gender.

This is why I have Snopes.com on Internet speed dial and a season pass for “MythBusters” on my DVR. I guess I’m not only a born skeptic; I also appreciate knowing when myths and legends are busted, or when they turn out to hold an ounce of truth after all.

My interest in confirming and busting ideas is one of the things I liked about last month’s CAST Report on Application Software Health (aka CRASH). The study was hailed by many as an enlightening, real-world, evidence-based look at the quality of the application software businesses use today. As Charlie Babcock of InformationWeek noted, “Because it is based on real-world apps, the Cast Report on Application Software Health is likely to find a permanent place in the current DevOps debate.”

My first blog post breaking down the report looked at the findings the 2012 edition confirmed from a similar report CAST released a year earlier. Among those confirmed findings: Java apps carry more technical debt, .NET apps score lower on security, and COBOL apps are both more robust and more secure than apps built on nearly any other platform.

Today, though, I’m going to dig into my favorite part of the report: the new insights, some of which fly in the face of what many of us take to be the industry’s conventional wisdom.

Busted! In-House vs. Outsourcing

That loud boom you heard when you picked up the CRASH report was perhaps this year’s biggest surprise: evidence that whether an application is developed in-house or outsourced makes little to no difference to its quality.

Let that sink in for a moment: Total Quality Index (TQI) scores for applications developed in-house were no better than those for outsourced applications. Nearly everybody in the industry believes that when you outsource development, you lose some measure of control and therefore run the risk of quality falling short of what could have been built in-house. The numbers in the CRASH report show this is not the case; quality-wise, the two approaches are on a practically level playing field.

To the CRASH report’s credit, however, it offers a possible explanation for why the TQI scores for in-house and outsourced applications are so close. It notes:

“One possible explanation for these findings is that many of the outsourced applications were initially developed in-house before being outsourced for maintenance. Consequently, it is not unexpected that their structural quality characteristics are similar to those whose maintenance remained in-house.”

Once we realize it’s often the same application under different stewardship, it makes sense that the quality would stay roughly the same; like so many other things in life, you get out what you put in.

The same was discovered for off-shoring versus near-shoring. While some would have you believe that keeping things close to home improves control and therefore quality, CRASH shows little to no difference between the two; in other words, teams in China and India deliver quality on par with teams in Mexico.

Custom Doesn’t Always Mean Quality

A couple of other busted beliefs among the new findings in the CRASH report came at the expense of “custom development” methods. The first reminds me of Winston Churchill’s line that “democracy is the worst form of government except for all those other forms that have been tried from time to time.”

For all the times I’ve taken Agile development to task in this blog for needing to “slow down” and perform structural analysis of its parts and interfaces to ensure optimal application quality, I still believe it is a better way of doing things than many others. It came as little surprise, therefore, when the CRASH study revealed that TQI scores for established methods, mainly Agile and waterfall, were higher than those for custom development methods.

The report did show one differentiating trend between Agile and waterfall, however:

“It appears from these data that applications developed with agile methods are nearly as effective as waterfall at managing the structural quality affecting business risk (Robustness, Performance, and Security), but less so at managing the structural quality factors affecting cost (Transferability and Changeability).”

Custom development methods also took a hit when it came to the quality of applications released more frequently: most of the applications with more frequent releases turned out to have been developed using custom methods. The report pulls no punches here:

“…the sharp decline [in TQI scores] for projects with more than six releases per year may be due in part to less effective development methods.”

Scratching of Chins

Two other new findings in the report left me with a Basil Rathbone-as-Sherlock-Holmes urge to scratch my chin.

The first was that, when broken down by industry, the sector with the lowest security scores was IT consulting. Huh? Maybe that wasn’t my chin I was scratching, but my head. I would have thought consulting companies needed to take the greatest care to keep their applications secure. Fortunately, CAST’s Dr. Bill Curtis offered a very reasonable explanation: most IT consultancies are at the mercy of the software their clients hand them, so they tend to inherit security issues rather than create or foster them.

The other chin-scratcher was that maintainability, the ability to change or adapt an application, was actually better for applications with the greatest number of users. Once again, you would think the opposite would be true: “the bigger they are, the harder they fall.” Evidently, though, when an application gains a large following, it has to adapt and cater to its users’ needs, and the more users there are, the more adaptable the application needs to be.

So we’ve confirmed and we’ve busted; I guess the only thing left is to leave you gasping for air by breaking down how much companies waste fixing issues that should have been caught during the application build process. Stay tuned for my look at what CRASH had to say about technical debt!

