Panel Discussion at the 2016 Software Risk Summit

Software risk has historically been overlooked as a security concern by business leaders, and companies have paid a high price as a result. Remember the JPMorgan hack of 2014? That breach cost the bank more than $6 billion. RBS had paid £231 million for its IT failures as of two years ago. The Target breach? The retailer posted a write-down of $152 million. Or, more recently, hackers taking over the controls of a Jeep, and a similar incident in which Toyota-Lexus had to fix a software bug that disabled cars’ GPS and climate-control systems? Failures like these cost manufacturers valuable consumer confidence and can seriously damage sales.
So I was thrilled to learn that the topic of the first annual Software Risk Summit in New York was exactly that: software risk. I had the pleasure of moderating the panel discussion with esteemed guests from BNY Mellon, the Software Engineering Institute at Carnegie Mellon, the Boston Consulting Group and CAST. Beforehand, I was able to sit in on the keynote by Rana Foroohar.
High-capacity network bandwidth has become more widely available, and we have quickly tapped every last bit of it. More devices are built with Wi-Fi capabilities, the cost of mobile devices is falling, and smartphones are in the hands of more people than ever before. In fact, Apple may already have saturated the market and is seeing sharply lower sales forecasts for the iPhone.
We are moving into an era in which virtually any device will connect to the Internet. Phones, fitness trackers, dishwashers, televisions, espresso machines, home security systems, cars. The list goes on. Analyst firm Gartner estimates that more than 20 billion connectable devices will exist worldwide by 2020. Welcome to the Internet of Things (IoT): a giant network of connectable things.
In April, Google experienced a fairly significant cloud outage, yet it was hardly news at all, even though it was likely the most widespread outage to hit a major public cloud to date. The lack of coverage is strange, considering the industry’s watchful eyes such as Brian Krebs and others. The more recent Salesforce service outage seems to have received more attention. But even though Google got a “pass” this time, the glitch brings renewed attention to the fact that tech players large and small continue to wrestle with software robustness issues.
Google Compute Engine was down for a full 18 minutes around the 7 o’clock hour Pacific Time on April 11, disconnecting users in all regions. The root cause of this Google cloud outage was a network failure. Network outages appear to be an ongoing challenge for Google, and this one was the biggest yet.
Today, CAST is meeting hundreds of Enterprise Architecture aficionados, gurus, practitioners and professionals at the Gartner EA Summit in National Harbor. A glance at the agenda makes it evident that EA has become omnipresent, touching either directly or indirectly virtually every hot IT challenge, such as Digital Transformation, Cloud Readiness, the Internet of Things, Cyber Security and Innovation, the topics that keep many executives up at night.
The intent of this post is to share one view of the EA journey and provide some personal insight into software risk management and what I think the upcoming challenges in our favorite discipline will be.
On March 15, CISQ hosted the Cyber Resilience Summit in Washington, D.C., bringing together nearly 200 IT innovators, standards experts, U.S. Federal Government leaders and attendees from private industry. The CISQ quality measures have been instrumental in guiding software development and IT organization leaders concerned with the overall security, IT risk management and performance of their technology. It was invigorating to be amongst like-minded professionals who see the value in standardizing performance measurement.
Software Risk Management in Digital Transformation was the focus of the 4th edition of the Information Technology Forum, hosted by the International Institute of Research (IIR). Massimo Crubellati, CAST Italy Country Manager, discussed how Digital Transformation processes are changing the ICT scenario and why software risk management and prevention are mandatory.
Massimo shared our recipe for evolving Digital Governance: include a dedicated ICT risk chapter in the design of the governance structure of the digital transformation. Its most important element is determining which methods and which key performance indicators to use to measure the operational risk inherent in the application portfolio. Measurement needs to be continuous and structural, and it must include assessing the inherent weaknesses of application assets by analyzing the correlations between the layers that compose them. The result is not only effective prevention of direct damage, ensuring service resilience, but also a reduction in maintenance and application management costs.
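To make the idea of a layer-aware operational-risk KPI more concrete, here is a minimal sketch. The layer names, weights and scoring formula are purely illustrative assumptions of mine, not CAST's actual measurement model; the point is only to show how per-layer weaknesses and cross-layer correlations could be combined into a single, continuously recomputable indicator per application.

```python
# Hypothetical sketch of a portfolio-level operational-risk KPI.
# Layer names, weights and the scoring formula are illustrative only.

# How critical a weakness in each architectural layer is assumed to be.
LAYER_WEIGHTS = {"ui": 1.0, "business_logic": 2.0, "data_access": 3.0}

def app_risk_score(weaknesses, cross_layer_flaws):
    """Combine per-layer weakness counts with flaws that span layers.

    Cross-layer flaws (e.g. unvalidated UI input reaching the database)
    are weighted highest, reflecting the correlation analysis between
    layers described in the post.
    """
    layered = sum(LAYER_WEIGHTS[layer] * count
                  for layer, count in weaknesses.items())
    return layered + 5.0 * cross_layer_flaws

def portfolio_risk(apps):
    """Continuous, structural measurement: recompute the KPI for every
    application in the portfolio on each analysis cycle."""
    return {name: app_risk_score(w, x) for name, (w, x) in apps.items()}

# Example portfolio: (per-layer weakness counts, cross-layer flaw count).
portfolio = {
    "billing": ({"ui": 4, "business_logic": 2, "data_access": 1}, 2),
    "crm":     ({"ui": 1, "business_logic": 0, "data_access": 0}, 0),
}
print(portfolio_risk(portfolio))  # → {'billing': 21.0, 'crm': 1.0}
```

Tracking such a score over time, rather than as a one-off audit, is what lets weaknesses be caught before they turn into direct damage or mounting maintenance costs.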
IT leaders from throughout the federal government discussed how software measurement can positively impact their development processes at CAST’s recent Cyber Risk Measurement Workshop in Arlington, VA, just outside Washington, D.C. The event brought together more than 40 IT leaders from several government agencies, including the Department of Defense and Department of State, as well as system integrators and other related organizations. The group shared their experiences in how their respective organizations are driving value to end users and taxpayers.
Measuring and managing software quality is not just about complying with government mandates; it rests on the proposition that strong software quality, security and sustainability are paramount. Still, compliance remains essential. Attendees voiced three primary points about software compliance: