If the cloud never had the potential to deliver ROI, nobody would use it. And indeed there are businesses that are very successful with cloud, some even rebuilding their business around cloud computing. These companies leverage cloud as a true force multiplier to build innovative solutions and to gain agility and scalability. However, many others cannot find business value in cloud computing. Most disturbing, they are failing to find value while spending about the same amount as those who succeed. We must therefore conclude that bad decisions are being made. Cloud computing has been relevant for about 15 years, and we understand that it's what you do and your company culture that make you truly successful with cloud, not what you spend. So why are we still seeing winners and losers? ... First, bad architectures need to be fixed before they can operate properly. You can have a disciplined and highly automated operations team and technology stack, but if the solution is poorly designed, the result will be less than stellar, no matter what.
Innovation has always been essential to long-term value creation and resilience because it creates countercyclical and noncyclical revenue streams. Paradoxically, making big innovation bets may now be safer than investing in incremental changes. Our long-standing research shows that innovation success rests on the mastery of eight essential practices. Five of these practices are particularly important today: resetting the aspiration based on the viability of current businesses, choosing the right portfolio of initiatives, discovering ways to differentiate value propositions and move into adjacencies, evolving business models, and extending efforts to include external partners. ... In times of disruption or deep uncertainty, companies have to carefully balance short-term innovations aimed at cost reduction against potential breakthrough bets. As customers' demands change, overindexing on small product tweaks (which may address only temporary needs) is unlikely to boost long-term performance. However, "renovations" to designs and processes can produce savings that help fund longer-term investments in innovations that may create routes to profitable growth.
In general, healthcare IT has accrued technical debt for more than 25 years. Everywhere you look, whether it's at the doctor's office, hospital, or an urgent care facility, you see disparate and often dated IT systems. It's not as rare as you'd think to see Windows XP–based computers at the check-in desk and throughout the facility. Many of the most common pieces of equipment and attached computer systems run outdated operating systems and unpatched, archaic software, with little security in place. I promise you it's not for lack of trying by the IT and cybersecurity teams. So much outdated software exists largely because the vendors that support these systems focus on the healthcare aspect rather than upkeep and security. In other instances, some devices were never intended to be connected to a network, leaving them vulnerable to remote attacks because they were never hardened against network-based attackers. Finally, there is certainly some "if it ain't broke, don't fix it" mentality. Walking around, you'll find computer systems under people's desks that have served a single purpose for a very long time.
It is time to stop searching for non-existent profiles, expecting the CISO to be credible one day in front of the board, the next in front of hackers, the day after in front of developers, and so on across the depth and breadth of the enterprise and its supply chain. Those profiles no longer exist, given the transversal complexity cyber security has developed over the past two decades. The role of the CISO has to be one of a leader: structuring, organising, delegating and orchestrating work across their team, across the firm, and across the multiple third parties involved in delivering or supporting the business. In essence, knowing what to do is reasonably well established, and cyber security good practice, at large, still protects from most threats and still ensures a degree of compliance with most regulations. But by focusing excessively on purely technical approaches to cyber security challenges, large organisations have failed to protect themselves effectively and efficiently, in spite of massive investments in that space over the last two decades.
MACH is an acronym for Microservices, API-first, Cloud-native, and Headless. It's a modern approach for building and deploying software applications that can help organizations be more agile, scalable, and flexible. In a MACH architecture, software applications are built as a collection of independent, self-contained microservices that communicate with each other through APIs (Application Programming Interfaces). The front-end and back-end components are decoupled, and the entire solution is designed to be deployed in the cloud. ... There are several benefits to using a MACH architecture for building and deploying software applications. Agile development: MACH architectures allow different parts of an application to be developed and deployed independently, which can make it easier to make changes and updates without disrupting the entire system. This can help organizations be more agile and responsive to changing business needs. Scalability: MACH architectures are designed to be deployed in a cloud computing environment, which can provide the scalability and flexibility needed to support rapid growth or spikes in demand.
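To make the "headless" and "API-first" ideas concrete, here is a minimal sketch in Python, assuming a hypothetical product-catalog service. The `ProductService` class, its SKUs, and field names are all illustrative, not from any real MACH framework: the point is only that the service owns its own data and exposes it purely as JSON over an API contract, while any front end (web, mobile, kiosk) consumes that JSON and handles presentation separately.

```python
import json

class ProductService:
    """A self-contained "headless" microservice sketch: it owns its data
    and exposes it only as JSON through an API boundary, never as HTML."""

    def __init__(self):
        # In a real deployment this would be the service's own datastore.
        self._products = {
            "sku-1": {"name": "Desk lamp", "price_cents": 2499},
            "sku-2": {"name": "Notebook", "price_cents": 599},
        }

    def get_product(self, sku: str) -> str:
        """API-first: the contract is a JSON document, not a rendered page."""
        product = self._products.get(sku)
        if product is None:
            return json.dumps({"error": "not_found", "sku": sku})
        return json.dumps({"sku": sku, **product})

# A separate front end consumes the API and decides presentation itself.
service = ProductService()
payload = json.loads(service.get_product("sku-1"))
print(f"{payload['name']}: ${payload['price_cents'] / 100:.2f}")
```

Because the front end only depends on the JSON contract, either side can be redeployed or scaled independently, which is the agility benefit described above.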
Many organizations stumble and fail because they lack complete visibility into all data assets in clouds and beyond. To take visibility to a higher level, it's vital to have a catalog of all managed and shadow assets, along with their owners, their locations, and the security and governance measures enabled for each. Without a central repository and a single view, there's no way to know what data exists, how it's stored, where it's used, and how it's shared. Essentially, an organization winds up flying blind. Yet the advantages of robust discovery and visibility don't stop there. With this information it's possible to adapt and expand security profiles as needs and conditions change. ... Sharing data in the cloud involves complexity and risk. That's a given. To maximize the opportunity, including harnessing the full functionality of cloud-native tools, an organization must know who is accessing data and how they are using it. Therefore, a robust identity management framework is crucial. Administrators and others must be able to analyze roles and permission settings in data assets that reside in clouds and across multi-cloud frameworks.
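The catalog described above can be sketched as a simple data structure: each asset is registered with its owner, location, and enabled controls, giving the single queryable view the passage calls for. The field names and control labels here are illustrative assumptions, not any vendor's schema.

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    name: str
    owner: str
    location: str          # e.g. "aws:s3:us-east-1" (illustrative)
    managed: bool          # False = shadow asset found by discovery scans
    controls: list = field(default_factory=list)

class AssetCatalog:
    """Central repository: one view over managed and shadow data assets."""

    def __init__(self):
        self._assets = {}

    def register(self, asset: DataAsset):
        self._assets[asset.name] = asset

    def shadow_assets(self):
        """List anything discovered outside the managed inventory."""
        return [a for a in self._assets.values() if not a.managed]

    def missing_control(self, control: str):
        """Find assets where a required security measure is not enabled."""
        return [a for a in self._assets.values() if control not in a.controls]

catalog = AssetCatalog()
catalog.register(DataAsset("orders-db", "data-eng", "aws:rds:us-east-1",
                           True, ["encryption-at-rest", "audit-logging"]))
catalog.register(DataAsset("legacy-export", "unknown", "gcp:gcs:eu-west1",
                           False))

print([a.name for a in catalog.shadow_assets()])
print([a.name for a in catalog.missing_control("encryption-at-rest")])
```

Queries like `missing_control` are what let security profiles be adapted as conditions change: the gap analysis falls out of the catalog rather than requiring a manual audit.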
Automating a bad process can make things worse, as it can magnify or exacerbate underlying issues, especially if humans are taken out of the loop. In some cases, a process is automated simply because the technology is there, even if automation isn't required. For example, if a process occurs very rarely, or there's a great deal of variation in it, then the cost of setting up the automation, teaching it to handle every use case, and training employees to use it may be more expensive and time-consuming than the old manual approach. And putting the entire decision in the hands of data scientists, who may be far removed from the actual work, or of end users who might not know how automation works, can easily send a company down a dead end, says James Matcher, intelligent automation leader at Ernst & Young. That recently happened at a company he worked with, a retail store chain with locations around the US. The retailer approached people on the front lines, the employees and managers working on the shop floors, for suggestions about manual processes that should be automated.
A CTO needs to take on the role of the 'bridge builder' between the strictly technical components of a transformation strategy and how they can apply to people and process in the specific context of an organisation. Digital transformation is a team activity. Each role needs to bring its full insights and experience to the process for the CTO to manage. The CTO has specific technological insight and therefore needs to be directly involved in helping the entire organisation identify where technical systems are simply obsolete and not fit for purpose. So, as well as being a bridge builder, CTOs naturally lead the charge when dealing with a technology-led approach. They must be able to explain where the value is in the application of technological change in context; too often we see visions that are de-contextualised from the reality on the ground. This kind of technological planning does not allow for realistic strategic planning. With visions of the ambitious but feasible in sight, it is then the whole leadership team's task to decide what course they are going to map out and to work together on the digital transformation journey.
CircleCI has taken proactive steps to mitigate risk for its customers, but simply revoking secrets from the platform is not enough, according to Jaime Blasco, co-founder and CTO of cybersecurity company Nudge Security. "It's still important to assume that every connected application and secret has been compromised. Organizations should verify the steps that these vendors have taken and also take steps to rotate secrets within any other connected application," he explains. Customers can leverage commercially available or open-source tools, aside from the one offered by CircleCI, to discover their secrets. "One option is to use Trufflehog, an open-source tool that scans for secrets across multiple platforms, including CircleCI, GitHub, GitLab, and AWS S3," says Blasco. CircleCI is assuming responsibility and taking steps to protect its customers, Assaf Morag, lead data analyst at cloud native security company Aqua Security, notes. But it is important for customers to respond proactively to the security incident as well.
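As a toy illustration of what secret scanners like Trufflehog do, the sketch below matches a couple of well-known credential patterns in text. The two regexes (an AWS access key ID shape, and a naive hard-coded `token = "..."` assignment) are simplified assumptions for demonstration, not Trufflehog's actual rule set; real scanners combine hundreds of such rules with entropy analysis and verification against the issuing service.

```python
import re

# Illustrative detection rules; real scanners use far larger rule sets.
SECRET_PATTERNS = {
    # AWS access key IDs: "AKIA" followed by 16 uppercase letters/digits.
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    # Naive catch-all for hard-coded token assignments in source or config.
    "generic_token": re.compile(r"(?i)\btoken\s*[:=]\s*['\"][^'\"]{12,}['\"]"),
}

def scan_text(text: str):
    """Return (rule_name, matched_string) pairs for every suspected secret."""
    findings = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append((name, match.group(0)))
    return findings

sample = 'aws_key = "AKIAIOSFODNN7EXAMPLE"\ntoken = "abcd1234efgh5678"\n'
for rule, value in scan_text(sample):
    print(rule, "->", value)
```

Running such a scan only tells you what leaked; as Blasco notes, the essential follow-up is rotating every discovered credential in the issuing system, since revocation on one platform does not invalidate copies elsewhere.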
Every business probably has some opportunity to use AI more than it does today. The first thing to look at is the availability of data. Do you have performance data that can be organized in a systematic way? Companies with deep data on their portfolios, down to business line, SKU, inventory, and raw ingredients, have the biggest opportunities to use machines to gain granular insights that humans could not. Companies whose strategies rely on a few big decisions with limited data would get less from AI. Second, consider volatility: companies facing a lot of vulnerability to external events would benefit less than those with controlled and systematic portfolios, although they could deploy AI to better predict those external events and identify what they can and cannot control. Third, the velocity of decisions matters. Most companies develop strategies every three to five years, which then become annual budgets. If you think about strategy in that way, the role of AI is relatively limited, other than potentially accelerating analyses that are inputs into the strategy.
Quote for the day:
"Effective questioning brings insight, which fuels curiosity, which cultivates wisdom." -- Chip Bell