The rate at which security issues have plagued businesses lately is staggering. Every week we hear of multiple vulnerabilities, millions of personal data records being exposed and corporations watching profits dwindle as reparation costs for these breaches extend into millions and even billions of dollars.
What’s worse than hearing about these things in the media is that the public perception of the problem apparently falls short of how bad the reality actually is.
George Hulme recently reported in CSO magazine that Veracode, a software services provider, released some pretty staggering findings based on security analyses it performed on more than 4,800 applications submitted to the firm. The findings, published in Veracode’s “State of Software Security Report,” showed that 58 percent of the applications submitted to the firm were of “unacceptable security quality.”
Now, you would think that companies that customize their “off-the-shelf” software might artificially inflate the number, but Hulme reports a rather shocking statistic from the report:
“The report found that 66 percent of applications developed by the software industry had unacceptable security quality, and a surprising 72 percent of security software met the same poor ranking.”
It’s kind of scary to think that security software is insecure. Clichés about “the fox guarding the henhouse” and “snake oil” would come immediately to mind if I were not at least relatively certain that security vendors really do mean well.
The apologetic innocence of each software vendor in the wake of discovering a breach might make one think their mantra should come from the lips of Jessica Rabbit of “Who Framed Roger Rabbit” fame, “I’m not bad, I’m just drawn that way.”
Nevertheless, security is a key health factor of software, and the failure of companies across the globe to ensure the complete security of their applications is a key contributor to spiraling technical debt – currently $500 billion globally, according to Gartner, and more than $1 million per company on average, according to CAST’s Annual Worldwide Application Software Quality Study.
When it comes to technical debt, security vendors appear to be suffering from the same issues as everyone else out there – there is a great deal of risk that exists within their application software that should have been identified before it was deployed. It’s not their intent to roll out vulnerable software, but just how should security vendors find the one line of code out of every 4,000 that could lead to failure?
Automating Security Quality
As well-intentioned as security vendors may be, it’s their job to get security right, just as it’s the job of Sony to keep its users’ financial data confidential and the job of GlaxoSmithKline to keep confidential the prescription medications taken by its customers. Regardless of how many mea culpas they offer and how sincere they may be, something needs to be done to shore up the vulnerabilities that leave application software open to breach.
What needs shoring up is the structural quality of the software.
Vendors should expand their use of static analysis to measure the structural quality of applications, applying automated analysis and measurement to vet all the health factors of software – security included. Only automated analysis and measurement can dig deep into application software and assess it against thousands of industry standards and norms to identify the elements that pose significant risk of failure to users or expose the software to possible breach by hackers.
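To make the idea concrete, here is a deliberately minimal sketch of what one such automated check might look like – a toy static-analysis pass, not any vendor’s actual tool. The function names (`find_risky_calls`, `call_name`) and the `RISKY_CALLS` list are illustrative assumptions; real analyzers check thousands of rules, not four.

```python
import ast

# Illustrative rule set: calls commonly flagged as risky in Python code.
# (An assumption for this sketch, not an industry-standard list.)
RISKY_CALLS = {"eval", "exec", "pickle.loads", "os.system"}

def call_name(node):
    """Return a dotted name for a Call node's function, if recoverable."""
    func = node.func
    if isinstance(func, ast.Name):
        return func.id
    if isinstance(func, ast.Attribute) and isinstance(func.value, ast.Name):
        return f"{func.value.id}.{func.attr}"
    return None

def find_risky_calls(source):
    """Walk the syntax tree and return (line, name) pairs for risky calls."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            name = call_name(node)
            if name in RISKY_CALLS:
                findings.append((node.lineno, name))
    return findings

# Hypothetical application code being analyzed.
sample = """
import os
user_input = input()
eval(user_input)
os.system(user_input)
print(len(user_input))
"""

for line, name in find_risky_calls(sample):
    print(f"line {line}: risky call to {name}()")
```

The point of the sketch is the shape of the approach: the code is inspected structurally, before it ever runs, so a dangerous pattern is surfaced as a line number a developer can act on rather than a breach a customer discovers.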
Muhammad Ali once said, “You can’t hit what you can’t see”; that applies to software, too. Only once issues are visible can application developers fix the problems and prevent security breaches from happening. And if companies cannot uncover these areas of vulnerability, they will continue to be left feeling insecure about their security.