Last week, CAST attended the Gartner EA Summit at National Harbor. It was two jam-packed days of sessions and workshops on Enterprise Architecture, but what stood out most was the value of this unique discipline as a catalyst for Digital Transformation.
EA and Digital Transformation were the core focus of many presentations, including Mike J. Walker’s session “Leverage EA to Understand the Value and Impacts of Digital Disruption.” Mike stressed that this ever-evolving discipline is becoming a vital component of corporate strategy, delivering high-performing and sustainable business outcomes.
In April, Google experienced a fairly significant cloud outage, yet it was barely covered as news at all. That is strange, because it was likely the most widespread outage to hit a major public cloud to date, and the industry has plenty of watchful eyes, Brian Krebs among them. The even more recent Salesforce service outage seems to have received more attention. But even though Google appears to have gotten a “pass” this time, the glitch brings renewed attention to the fact that tech players large and small continue to struggle with software robustness.
Google Compute Engine was down for a full 18 minutes around the 7 o’clock hour Pacific Time on April 11, disconnecting all users in all regions. The root cause was a network failure. Network outages appear to be an ongoing challenge for Google, and this one was the biggest yet.
Today, CAST is meeting hundreds of Enterprise Architecture aficionados, gurus, practitioners and professionals at the Gartner EA Summit in National Harbor. A glance at the agenda makes it evident that EA has become omnipresent, touching directly or indirectly on virtually every pressing IT challenge: Digital Transformation, Cloud Readiness, the Internet of Things, Cyber Security and Innovation, the very topics keeping many executives up at night.
The intent of this post is to share “one” view of the EA journey and provide some personal insight into software risk management and what I think will be the upcoming challenges in our favorite discipline.
IT leaders from across the federal government discussed how software measurement can positively impact their development processes at CAST’s recent Cyber Risk Measurement Workshop in Arlington, VA, just outside Washington, D.C. The event brought together more than 40 IT leaders from several government agencies, including the Department of Defense and the Department of State, along with system integrators and other related organizations. The group shared how their respective organizations are driving value for end users and taxpayers.
Measuring and managing software quality is not just a matter of complying with government mandates; strong software quality, security and sustainability are valuable in their own right. Compliance, however, remains essential. Attendees voiced three primary points around software compliance:
Application portfolio analysis was at the center of discussion as Forrester Research Vice President and Principal Analyst Margo Visitacion presented how Agile development is affecting the application development process and IT’s portfolio planning. Ms. Visitacion explained that in the “age of the customer,” customers want more for less and expect companies to adapt fluidly to their needs and demands. As companies shift their attention from production figures to the customer experience, the result is higher revenue and longer-lasting relationships.
So how do organizations stay responsive to customer needs? They employ an Agile portfolio management process that collects metrics while aligning with the budgeting process, with the understanding that requirements will change. Using this strategy, companies gain clear visibility into their portfolios and can measure risk, cost and complexity against objective benchmarks. The data collected during development enables them to defend current positioning and communicate more effectively with the business.
Here are some thought-provoking questions, along with supporting answers, that we received during the Forrester webinar:
For many IT-intensive enterprises, the ballooning cost of maintaining software applications may be the biggest elephant in the room. Software maintenance typically accounts for up to 75% of each application’s total cost of ownership. With so much investment and energy dedicated to keeping the lights on, finding a way to better allocate IT resources — even just by a marginal amount — can have a significant impact on the enterprise’s capacity to innovate.
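To see why even a marginal reallocation matters, consider a back-of-the-envelope calculation. The 75% maintenance share comes from above; the 10% cut is a hypothetical figure for illustration:

```python
# Hypothetical budget in arbitrary units; maintenance assumed at 75% of TCO.
total_budget = 100.0
maintenance = 0.75 * total_budget          # 75.0 spent keeping the lights on
innovation = total_budget - maintenance    # 25.0 left for new work

# A modest 10% reduction in maintenance spend...
savings = 0.10 * maintenance               # 7.5 units freed up
new_innovation = innovation + savings

# ...grows the innovation budget by 30%.
increase = savings / innovation
print(f"Innovation capacity up {increase:.0%}")  # Innovation capacity up 30%
```

A 10% trim on the dominant cost line translates into a 30% boost to the much smaller innovation line, which is why small maintenance gains matter so much.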
CAST’s research into this area has uncovered some provocative findings. As we’ve discussed previously on the On Quality blog, the cost of maintaining a software application is directly proportional to its size and complexity. IT organizations can take several steps using static code quality analysis to reduce size and complexity, and thus diminish their software maintenance costs.
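As a rough illustration of the kind of measurement static analysis provides (a minimal sketch, not CAST’s actual tooling), the following counts decision points per function to approximate cyclomatic complexity, one common proxy for the complexity that drives maintenance cost:

```python
import ast

# AST node types that each add one decision point to a function.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source: str) -> dict:
    """Approximate cyclomatic complexity per function: 1 + decision points."""
    tree = ast.parse(source)
    results = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            decisions = sum(isinstance(child, DECISION_NODES)
                            for child in ast.walk(node))
            results[node.name] = 1 + decisions
    return results

sample = """
def simple(x):
    return x + 1

def branchy(x):
    if x > 0:
        for i in range(x):
            if i % 2:
                x += i
    return x
"""
print(cyclomatic_complexity(sample))  # {'simple': 1, 'branchy': 4}
```

Tracking a metric like this across a portfolio lets teams spot the handful of oversized, overly complex components where refactoring will pay back the most in reduced maintenance.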
In a merger, integrating company names is hard enough — now imagine having to integrate massive application portfolios.
As the Justice Department and the FCC evaluate the proposed merger between corporate behemoths Time Warner Cable and Comcast, I wonder whether the C-suites at both companies are investing as much time evaluating the health and security of one another’s application portfolios. Historically, technical due diligence has lagged far behind financial due diligence.