10 cloud mistakes that can sink your business
It’s a common misconception that cloud migration always leads to immediate cost
savings. “In reality, cloud migration is expensive, and not having a full and
complete picture of all costs can sink a business,” warns Aref Matin, CTO at
publishing firm John Wiley & Sons. Cloud migration often does lead to cost
savings, but careful, detailed planning is essential, and even then hidden
costs will surface and multiply as the migration progresses. “You
must ensure at the start of the project that you have a full, holistic cloud
budget,” Matin advises. Cloud costs appear in various forms. Sometimes they’re
in plain sight, such as the cost of walking away from an existing data facility.
Yet many expenses aren’t so obvious. ... A major challenge facing many larger
enterprises is leveraging data spread across disparate systems. “Ensuring that
data is accessible and secure across multiple environments, on-premises as well
as on applications running in the cloud, is an increasing headache,” says
Darlene Williams, CIO of software development firm Rocket Software.
Developed countries lag emerging markets in cybersecurity readiness
The drastic difference in cybersecurity preparedness between developed and
developing nations is likely because organizations in emerging markets started
adopting digital technology more recently than their peers in developed
markets. “That means many of these companies do not have legacy systems holding
them back, making it relatively easier to deploy and integrate security
solutions across their entire IT infrastructure,” the report said, adding that
technology debt — the estimated cost or assumed impact of updating systems —
continues to be a major driver of the readiness gap. The Cisco
Cybersecurity Readiness Index categorizes companies into four stages of
readiness: beginner, formative, progressive, and mature. ... Identity
management was
recognized as the most critical area of concern. Close to three in five
respondents, or 58% of organizations, were in either the formative or
beginner category for identity management. However, 95% were at least at some
stage of deploying an appropriate ID management application, the report said.
Observability will transform cloud security
Is this different from what you’re doing today for cloud security? Cloud
security observability may not change the types or the amount of data you’re
monitoring. Observability is about making better sense of that data. It’s much
the same with cloud operations observability, which is more common. The
monitoring data from the systems under management is mostly the same. What’s
changed are the insights that can now be derived from that data, including
detecting patterns and predicting future issues based on these patterns, even
warning of problems that could emerge a year out. ... Cloud security
observability looks at a combination of dozens of data streams from a
hundred endpoints and finds patterns that could indicate an attack is likely
to occur in the near or far future. If this seems like we are removing humans
from the process of making calls based on observed, raw, and quickly
calculated data, you’re right. We can respond to tactical security issues,
such as a specific server under attack, with alerts that indicate the
appropriate response, such as blocking the attacking IP address.
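To make that concrete, here is a minimal sketch in Python of the kind of
fleet-wide pattern detection described above, assuming a simple
sliding-window heuristic; the event shape, window size, threshold, and
block_ip hook are all illustrative assumptions, not any product's actual
behavior.

    # Toy detector: count failed-auth events per source IP across all
    # monitored endpoints in a sliding window; a burst crossing the
    # threshold triggers the tactical response (blocking the IP).
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60
    FAILURE_THRESHOLD = 50  # assumed: 50 failed auths/minute, fleet-wide

    events_by_ip = defaultdict(deque)  # source IP -> failure timestamps

    def block_ip(ip):
        # Hypothetical response hook; in practice this would call a
        # firewall or cloud security-group API.
        print(f"ALERT: blocking {ip} after failed-auth burst")

    def ingest(event):
        # event: {"ip": ..., "type": ..., "timestamp": ...}, any endpoint
        if event["type"] != "auth_failure":
            return
        window = events_by_ip[event["ip"]]
        window.append(event["timestamp"])
        while window and window[0] < event["timestamp"] - WINDOW_SECONDS:
            window.popleft()  # drop observations outside the window
        if len(window) >= FAILURE_THRESHOLD:
            block_ip(event["ip"])
            window.clear()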
Operational Resilience: More than Disaster Recovery
Disaster recovery is fairly narrow in its definition and typically viewed
over a short timeframe. Operational resilience is much broader, including
aspects like the sort of governance you’ve put in place; how you manage
operational risk; your business continuity plans; and cyber, information, and
third-party supplier risk management. In other words, disaster recovery plans
are chiefly concerned with recovery. Operational resilience looks at the bigger
picture: your entire ecosystem and what can be done to keep your business
operational during disruptive events. ... Part of the issue is that cyber is
still seen as special. The discussion always seems to conclude with the
assumption that the security team or IT department is managing a particular
risk, so no one else needs to worry about it. There is a need to demystify
cybersecurity. It’s only with the proper business understanding and risk
ownership that you can put proper resilience mechanisms in place.
Nvidia builds quantum-classical computing system with Israel’s Quantum Machines
The DGX Quantum deploys Nvidia’s Grace Hopper superchip and its technology
platform for hybrid quantum-classical computers, coupling graphics
processing units (GPUs) and quantum processing units (QPUs) in one system. It
is supported by Quantum Machines’ flagship OPX universal quantum control
system designed to meet the demanding requirements of quantum control
protocols, including precision, timing, complexity, and ultra-low latency,
according to the Israeli startup. The combination allows “researchers to build
extraordinarily powerful applications that combine quantum computing with
state-of-the-art classical computing, enabling calibration, control, quantum
error correction and hybrid algorithms,” Nvidia said in a statement. Tech
giants like Google, Microsoft, IBM, and Intel are all racing to make quantum
computing more accessible and build additional systems, while countries like
China, the US, Germany, India, and Japan are also pouring millions into
developing their own quantum abilities.
Leveraging Data Governance to Manage Diversity, Equity, and Inclusion (DEI) Data Risk
In organizations with a healthy data culture, the counterpart to compliance is
data democratization. Democratization is the ability to make data accessible
to the right people at the right time in compliance with all relevant legal,
regulatory, and contractual obligations. Leaders delegate responsibility to
stewards for driving data culture by democratizing data so that high-quality
data is available to the enterprise in a compliant manner. Such democratized
data enables frontline action by placing data into the hands of people who are
solving business problems. Stewards democratize data by eliminating silos and
moving past the inertia that develops around sensitive data sources. An
essential aspect of democratization, therefore, is compliance. Stewards will
not be able to democratize data without a clear ability to assess and manage
risk associated with sensitive data. That said, it is critical that DEI
advocates limit democratization of DEI data, especially at the outset of their
project or program.
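As a rough illustration of how a steward might encode such limits, the
Python sketch below gates access on a sensitivity classification; the
labels, purposes, and aggregation rule are hypothetical stand-ins for an
organization's real policies.

    # Toy compliance gate: data is democratized only when the request
    # satisfies the rules attached to its sensitivity class. All labels
    # and rules here are hypothetical examples.
    POLICIES = {
        "public": {"allowed_purposes": {"analytics", "reporting"}},
        "dei_sensitive": {
            "allowed_purposes": {"dei_program"},  # limited democratization
            "requires_aggregation": True,         # no row-level access
        },
    }

    def grant_access(classification, purpose, aggregated):
        policy = POLICIES.get(classification)
        if policy is None:
            return False  # unassessed data stays restricted
        if purpose not in policy["allowed_purposes"]:
            return False
        if policy.get("requires_aggregation") and not aggregated:
            return False  # row-level DEI records stay with the steward
        return True

    # e.g. grant_access("dei_sensitive", "analytics", aggregated=True)
    # returns False: DEI data is shared only for the DEI program itself.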
The Future of Data Science Lies in Automation
Much data science work is done through machine learning (ML). Proper
employment of ML can ease the predictive work that is most often the end goal
for data science projects, at least in the business world. AutoML has been
making the rounds as the next step in data science. Part of machine learning,
outside of getting all the data ready for modeling, is picking the correct
algorithm and fine-tuning (hyper)parameters. After data accuracy and veracity,
the algorithm and parameters have the highest influence on predictive power.
Although in many cases there is no perfect solution, there’s plenty of wiggle
room for optimization. Additionally, there’s always some theoretical
near-optimal solution that can be arrived at mostly through calculation and
decision making. Yet, arriving at these theoretical optimizations is
exceedingly difficult. In most cases, the decisions will be heuristic, with
errors weeded out through experimentation. Even with extensive industry
experience and professionalism, there is just too much room for error.
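The tuning step that AutoML automates can be sketched with scikit-learn's
grid search; the random-forest model, parameter grid, and synthetic dataset
below are arbitrary illustrative choices.

    # Manually searching hyperparameters, the tedious step that AutoML
    # tools automate across many algorithms at once.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GridSearchCV

    X, y = make_classification(n_samples=1000, n_features=20,
                               random_state=0)

    param_grid = {
        "n_estimators": [100, 300],
        "max_depth": [None, 10, 30],
        "min_samples_leaf": [1, 5],
    }

    search = GridSearchCV(
        RandomForestClassifier(random_state=0),
        param_grid,
        cv=5,              # cross-validation guards against lucky splits
        scoring="accuracy",
    )
    search.fit(X, y)
    print(search.best_params_, round(search.best_score_, 3))

Even this tiny grid evaluates 12 parameter combinations five times each;
AutoML systems extend the same search across whole families of algorithms,
which is exactly the heuristic, error-prone work described above.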
What NetOps Teams Should Know Before Starting Automation Journeys
Like all people, NetOps professionals enjoy the results of a job well done.
So, while the vision of their automation journey may be big, it’s important to
start with a small, short-term project that can be completed quickly. There
are a couple of benefits to this approach: quick automation wins give NetOps
teams confidence for future projects, and such projects generate data and
feedback that teams can convert into learnings and insights for the next one.
This approach can also be applied to bigger, more
complex automation projects. Instead of taking on the entire scale of the
project at once, NetOps teams can break it down into smaller components. ...
The advantages of this approach are the same as with the quick-win scenario:
there is a better likelihood of success and more immediate feedback and data
to guide the NetOps teams through this entire process. Finally, as talented as
most NetOps teams are, they are not likely to have all of the automation
expertise in-house at any given time.
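For a sense of scale, a quick win might be no more than a config-backup
script. The sketch below assumes Python with the netmiko library; every
device type, hostname, and credential is a placeholder.

    # A small "quick win": back up running configs from a short device
    # list. Requires netmiko; all device details are placeholders.
    from netmiko import ConnectHandler

    DEVICES = [
        {"device_type": "cisco_ios", "host": "192.0.2.10",
         "username": "netops", "password": "changeme"},
        {"device_type": "cisco_ios", "host": "192.0.2.11",
         "username": "netops", "password": "changeme"},
    ]

    for device in DEVICES:
        conn = ConnectHandler(**device)
        config = conn.send_command("show running-config")
        conn.disconnect()
        with open(f"{device['host']}.cfg", "w") as f:
            f.write(config)
        print(f"backed up {device['host']}")

A project like this is finished in an afternoon, yet it already yields the
feedback, such as timeouts, flaky credentials, and device quirks, that
informs the next, larger project.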
Reducing the Cognitive Load Associated with Observability
Data points need to be filtered and transformed in order to generate the
proper signals. Nobody wants to be staring at a dashboard or tailing logs
24/7, so we rely on alerting systems. When an alert goes off, it is intended
for human intervention, which means transforming the raw signal into an
actionable event with contextual data: criticality of the alert, environments,
descriptions, notes, links, etc. It must be enough information to direct
attention to the problem, but not so much that the signal drowns in noise.
Above all else, a page alert should require a human response; what could
justify interrupting an engineer’s flow if the alert is not actionable? When
an alert triggers, analysis begins. While we eagerly wait for anomaly
detection and automated analysis to fully remove the human factor from this
equation, we can use a few tricks to help our brains quickly identify what’s
wrong. ... Thresholds are required for alert signals to trigger. When it
comes to visualization, people who investigate and detect anomalies need to
consider these thresholds too. Is this value in the data too low, or
unexpectedly high?
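A minimal sketch, in Python, of transforming a raw signal into an actionable
event; the field names, threshold, and runbook link are illustrative
assumptions rather than any specific tool's schema.

    # A page-worthy alert carries just enough context to act on.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Alert:
        name: str
        criticality: str   # e.g. "page" (human now) vs. "ticket" (later)
        environment: str
        description: str
        runbook_link: str  # context that directs attention, not noise
        notes: dict = field(default_factory=dict)

    ERROR_RATE_THRESHOLD = 0.05  # assumed threshold, purely illustrative

    def evaluate(error_rate: float) -> Optional[Alert]:
        if error_rate <= ERROR_RATE_THRESHOLD:
            return None  # below threshold: no human gets interrupted
        return Alert(
            name="checkout-error-rate-high",
            criticality="page",
            environment="production",
            description=f"error rate {error_rate:.1%} exceeds "
                        f"{ERROR_RATE_THRESHOLD:.0%} threshold",
            runbook_link="https://wiki.example/runbooks/checkout",  # hypothetical
        )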
The Urgent Need for AI in GRC and Security Operations: Are You Ready to Face the Future?
Another area where AI tools are transforming the IT industry is security
operations. Businesses face an ever-increasing number of cyberthreats, and it
can be challenging to stay ahead of these threats. AI tools can help by
automating many security operations, such as threat detection and incident
response. They can also help with risk assessment by analyzing large amounts
of data and identifying potential vulnerabilities. The benefits of AI tools in
the IT industry are clear. By automating processes and improving
decision-making, businesses can save time and money while reducing the risk of
errors. AI tools can also help businesses to be more agile and responsive to
changes in the market. However, the use of AI tools in the IT industry also
presents some challenges. One of the key challenges is the need for
specialized technical expertise. While AI tools can be user-friendly,
businesses still need that expertise in-house to use the tools effectively.
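One way to picture AI-assisted threat detection is an unsupervised model
flagging outliers in activity data for analyst review; the sketch below uses
scikit-learn's IsolationForest on synthetic features, purely as an
illustration.

    # Toy AI-assisted detection: an isolation forest flags anomalous
    # activity records. Features and contamination rate are assumptions.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    # columns: logins/hour, MB sent out, distinct ports (synthetic data)
    normal = rng.normal(loc=[5.0, 20.0, 3.0], scale=[2.0, 8.0, 1.0],
                        size=(500, 3))
    suspect = np.array([[40.0, 900.0, 60.0]])  # an obvious outlier
    activity = np.vstack([normal, suspect])

    model = IsolationForest(contamination=0.01, random_state=0)
    flags = model.fit_predict(activity)        # -1 marks anomalies
    print(f"{(flags == -1).sum()} records flagged for analyst review")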
Quote for the day:
"People seldom improve when they have
no other model but themselves." -- Oliver Goldsmith