Because cyber insurance deals with risk that has been transferred, it is subtly but powerfully different from understanding your own risk. In many cases, insurance companies that can curate low-risk pools and a favorable loss ratio can significantly improve profits. That’s not the only way they make money, but it is one way. Now enter the resurgence of cyber business intelligence. While concepts like cyber threat intelligence and risk assessments focus on preventing loss, cyber business intelligence aligns with concepts already used elsewhere in a business environment: “What knowledge and trends can I follow so that, by following them, I can be more profitable?” This is a different mindset, one anchored in the idea that “you’ve got to spend money to make money.” It drives a culture and enthusiasm that can foster better innovation, better results, and faster progress. There’s another key word there: business. This information is not only relevant to technical experts; it is equally relevant to business leaders and key decision makers.
Not all AI needs are created equal. In low-stakes situations, such as image recognition for noncritical uses, it is probably not necessary to understand how the programs work. However, in situations with important outcomes, including medical, hiring, or car-safety decisions, it is critical to understand how the code operates and continues to develop. It’s important to know where human input and intervention are needed. Additionally, because AI code is written mainly by educated men, according to (fittingly) the Alan Turing Institute, there is a natural bias toward reflecting the experiences and worldviews of those coders. Ideally, projects whose end goal implicates vital interests need to focus on “explainability” and clear points where the coder can intervene and either take control or adjust the program to ensure ethical and desirable end performance. Further, those developing the programs, and those reviewing them, need to ensure the source inputs aren’t biased toward certain populations.
A high-performing team consists of engaged, happy, and motivated people; truly getting the best out of your team means getting the best from each individual. So what does that mean for you? It means quickly getting up to speed on each team member’s background, experience, portfolio, strengths, growth areas, and goals. How do they want to be recognized? What style of feedback do they prefer? How do they learn best? What goals do they have? The more nuance you learn about each person, the more successful you will be in leading them. By setting up 1:1 meetings, you’ll be able to learn about each person on your team, coach them, and discuss their progress toward goals. ... Instead of rolling in and making changes, spend this time learning about the processes your team is already using. What are the team’s goals? How do they work together and separately? How does your team integrate with other teams, or not, and is that an issue? Who are the customers and partners?
Considering this net-positive shift in budgets, it’s no surprise that 74% of enterprise leaders have adopted or are planning to adopt quantum computing. Interestingly, nearly 30% of respondents that have adopted or plan to adopt quantum computing expect to see a competitive advantage from it within the next 12 months. This represents more than a sevenfold increase over 2021 (4%) and highlights the growing commitment to near-term quantum computing initiatives as the technology continues to mature. “We’re getting a unique glimpse into the quantum adoption mindset of global enterprise executives, which mirrors what we’re seeing in our customer base,” said Christopher Savoie, CEO of Zapata Computing. “These findings become more interesting when compared to the data we saw last year. Over the past 12 months, we’ve seen significant new developments in technology, particularly generative AI, and near-term advantages from quantum-inspired technologies that are fueling the momentum for quantum computing planning and adoption.”
The importance of cyber-risk governance is no longer limited to CISOs; conversations about how organizations can ensure data resiliency, adaptability, and security are deepening at the C-suite level. As we approach 2023, business leaders will need to assess their data infrastructure with a five-point focus: scalability, flexibility, agility, security, and cost. Data protection and management will become a top-tier priority for business leaders. A significant share of IT budgets will be invested in technologies to prevent, detect, and recover from cyberattacks, which are now a matter of not if, but when. A study by PwC found that 62% of respondents expect their security budget to increase by as much as 10% in 2023. As cloud investments continue to soar in 2023, the threat landscape will shift in parallel and grow more sophisticated. Per the recent Commvault-IDC survey, over 28% of Indian enterprises said that by 2023 they will have multiple private and/or public cloud environments and will migrate workloads and data between them. Thus, protection and data recoverability will be essential components of organizations’ enterprise security toolboxes.
Compared to other networking certifications, like the CCNA, which is more about how to operate the technology, Cato’s SASE and SSE certifications are high-level overviews. “Our certification is more about what SASE and SSE mean, what are the implications, and what it means to different IT teams,” says Webber-Zvik. “You see presentations, whiteboards, reading materials, and at the end of each section, there is a quiz. When you complete all the sets and pass all the tests, you get the certification.” The majority of the material covered is not Cato-specific, he says. However, the certification does use Cato’s implementation of SASE and SSE in its examples. Take, for instance, single-pass processing. According to Gartner, this is a key characteristic of SASE, and it means that networking and security are integrated. “We explain it according to Gartner’s definition,” Webber-Zvik says. “We also provide an example of Cato’s implementation and use that to articulate what single-pass processing can look like when it’s outside Gartner theory and in real life.” There is no charge for Cato’s certification training and exam, but that might change, he says.
There are several potential solutions. If the use case allows it, the best option is to make tasks asynchronous. When you are calling multiple services, the request inevitably takes too long, and it is often better to set the right expectations by promising to deliver the results when they are ready rather than forcing the end user to wait. When service calls have no side effects (such as search), there is a second option: latency hedging, where we start a second, identical call once the wait time exceeds the 80th-percentile latency and respond with whichever call returns first. This helps control the long tail. The third option is to do as much work as possible in parallel: instead of waiting for each response before issuing the next request, start as many service calls concurrently as possible. This is not always possible, because some service calls may depend on the results of earlier ones. Note also that code that calls multiple services in parallel, then collects and combines the results, is considerably more complex than code that calls them one after the other.
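The latency-hedging idea above can be sketched in a few lines of Python with `asyncio`. This is a minimal illustration, not a production implementation: `search_backend` is a hypothetical stand-in for a side-effect-free service call, and the `hedge_after` threshold stands in for the observed 80th-percentile latency.

```python
import asyncio
import random

# Hypothetical stand-in for a side-effect-free service call (e.g. search).
async def search_backend(query: str) -> str:
    await asyncio.sleep(random.uniform(0.01, 0.2))  # simulated network latency
    return f"results for {query!r}"

async def hedged_call(query: str, hedge_after: float = 0.05) -> str:
    """Fire a second, identical request if the first is slower than
    `hedge_after` seconds (e.g. the p80 latency), and return whichever
    response arrives first."""
    first = asyncio.create_task(search_backend(query))
    try:
        # shield() keeps the first call running even if the timeout fires.
        return await asyncio.wait_for(asyncio.shield(first), hedge_after)
    except asyncio.TimeoutError:
        second = asyncio.create_task(search_backend(query))
        done, pending = await asyncio.wait(
            {first, second}, return_when=asyncio.FIRST_COMPLETED
        )
        for task in pending:
            task.cancel()  # safe only because the call has no side effects
        return done.pop().result()

print(asyncio.run(hedged_call("sase")))
```

Cancelling the losing request is what makes hedging cheap, and it is also why the technique is restricted to calls without side effects: issuing the same write twice would not be safe.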
Faced with an increasing number of threats both internal and external, CIOs have had to prioritise areas such as cyber security in recent years just to keep their businesses protected. In doing so, they’ve also been charged with embracing the latest technological developments such as artificial intelligence, big data analytics, and the plethora of connected devices that comprise the burgeoning Internet of Things; technologies that will foster greater innovation and provide their businesses with a more competitive edge. Increasingly, however, it won’t necessarily be an organisation’s IT department that drives the adoption of emerging technologies. More often, other areas of the business will now be in a better position to identify the innovative technology that will deliver greater customer value, and the specific use cases in which it can be implemented. 77 per cent of CIOs surveyed by Gartner claimed that IT staff are primarily providing innovation and collaboration capabilities, compared with 18 per cent stating that non-IT personnel are providing these tools.
Whether it’s AI assistance, VR immersion, or web3 decentralization, 2023 will continue to push organizations to adopt cutting-edge technology. It’s hard to guess which of these ideas will flourish and which will flounder, but either way, a reliable foundation will be necessary. Adopting even the most successful new ideas at scale will bring new obstacles and new types of incidents. These growing pains of new technologies will require new approaches, and as organizations experience them, they’ll turn to SRE to keep their customers happy while they adjust. Incident retrospectives can help teams handle new sources of incidents quickly, while a reliability mindset keeps customer happiness the number-one priority. Reliability is the subjective experience of users measured against their expectations of the service. While this is a helpful way to align priorities with customer needs, 2023 will bring an even more holistic definition of reliability: organizations will start thinking about the reliability of their systems not just in terms of their users’ experiences, but as a complete package covering everything from development ideation onward.
The unauthorized disclosure of data isn’t always the result of malicious actors. Often, data is accidentally overshared or lost by employees. Keep your employees informed with cyber security education. Employees who go through regular phishing tests may be less likely to engage with malicious actors over email or text messaging. ... An inventory of software, hardware and data assets is essential. Having control over the assets with access to your corporate environment starts with an inventory. Inventories can be a part of the overall vulnerability management program to keep all assets up to date, including operating systems and software. Furthermore, a data inventory or catalogue identifies sensitive data, which allows appropriate security controls like encryption, access restrictions and monitoring to be placed on the most important data. ... Reducing your overall data footprint can be an effective way of reducing risk.
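The inventory-plus-controls idea above can be sketched as a small data structure. This is purely illustrative: the `Asset` record, its fields, and the control names are hypothetical placeholders for what a real CMDB or asset-management tool would track.

```python
from dataclasses import dataclass, field

# Hypothetical minimal asset record for an inventory of software,
# hardware, and data assets.
@dataclass
class Asset:
    name: str
    kind: str                 # "software" | "hardware" | "data"
    version: str = ""
    sensitivity: str = "low"  # "low" | "confidential" | "restricted"
    controls: list = field(default_factory=list)

inventory = [
    Asset("payroll-db", "data", sensitivity="restricted"),
    Asset("nginx", "software", version="1.24.0"),
    Asset("laptop-0042", "hardware"),
]

# Sensitive data assets identified by the inventory get the stronger
# controls the passage mentions: encryption, access restrictions, monitoring.
for asset in inventory:
    if asset.kind == "data" and asset.sensitivity != "low":
        asset.controls += ["encryption", "access-restriction", "monitoring"]

print([a.name for a in inventory if "encryption" in a.controls])
```

Even a toy structure like this shows why the inventory comes first: you cannot attach controls to sensitive data you have not catalogued, and version fields feed directly into vulnerability management.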
Quote for the day:
"Smart leaders develop people who develop others; don't waste your time on those who won't help themselves." -- John C. Maxwell