If you’ve read the news lately, you’ve seen headline after headline (some even on our blog) about computer glitches, technical failures, software risk, and hacks. The health of applications is under closer scrutiny than ever before – because whether a software outage stems from internal or external causes, the security and stability of your applications are paramount.
The Economist wrote an interesting article on cybersecurity in the age of networked and connected devices, and how firms are unsure how to react to this new technological environment. The article focuses on the vulnerabilities of products that are in some way connected to a remote network, ranging from the new chip-implanted talking Barbie doll to any modern car. The consequences can be anything from a hacker remotely disabling a car’s brakes to making a little girl’s Barbie doll utter an obscenity. Ultimately, the stakes are high.
But for many firms, security and software quality assessments are of little importance beyond what affects the user experience. Large tech firms, by contrast, are already building visibility into their technical assets, seriously considering the consequences of buggy code, poorly built architectures, and the lack of safeguards in their code.
So why are so many others slow on the uptake?
It’s the nature of new technology, and always has been. When railroads first began to gain steam as a method of transport, boiler explosions and derailments killed people for years before the industry began to take safety seriously. So why are we letting history repeat itself?
There’s hesitation among companies to let “white-hat” hackers come in and find vulnerabilities for them (look at Volkswagen’s appeal to an English court to block the publication of work by a researcher at the University of Birmingham who uncovered a serious problem with the remote key fobs that lock VW’s cars). Although some firms have welcomed this practice – United Airlines recently rewarded two hackers with a million miles for finding security breaches – it is not the only way to find vulnerabilities or defects in software. There are software analytics solutions and static-code analysis tools available for those who want to keep things in-house.
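To make the in-house option concrete, here is an illustrative sketch of what static analysis means in principle: inspecting code for defects without ever running it. The example below uses only Python’s standard-library `ast` module to flag bare `except:` clauses, a common defect that silently swallows errors; the `find_bare_excepts` helper is hypothetical, and real static-analysis tools cover far more than this.

```python
import ast

def find_bare_excepts(source: str) -> list:
    """Return the line numbers of bare `except:` clauses in the source."""
    tree = ast.parse(source)
    return [
        node.lineno
        for node in ast.walk(tree)
        # An ExceptHandler with no exception type is a bare `except:`
        if isinstance(node, ast.ExceptHandler) and node.type is None
    ]

sample = """\
try:
    risky_operation()
except:
    pass
"""
print(find_bare_excepts(sample))  # → [3]
```

The same walk-the-syntax-tree pattern underlies most static checkers: parse once, then match suspicious structural patterns against the tree.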
Clearly, whatever the method of identification, the status quo is untenable. A response to the failures of technology rests on firms’ willingness to address the problem head on before mishap after mishap begins to take its toll on their reputations.
Right now, software failure just seems to be expected, a normal part and cost of doing business. It shouldn’t be. In any business, acting quickly to solve your challenges can improve your standing and maintain your competitive advantage. Why aren’t we doing more in this direction?
Watch this webinar to see how you can manage your software risk: