Dedicated Cloud: What It’s For and How It’s Different From Public Cloud
While dedicated cloud services give you a level of architectural control you will not get from public clouds, using them comes with trade-offs, the biggest being the infrastructure engineering expertise required. But if your team has concluded that a public cloud isn’t a good fit, you probably know that already and have at least some of that expertise on hand. ... Ultimately, dedicated cloud is about keeping control and giving yourself options. You can quickly deploy different combinations of resources, interconnect dedicated infrastructure with public cloud services, and keep fine-tuning and refining as you go. You get full control of your data and your architecture, with the freedom to change your mind. The trade-off is that you must be ready to roll up your sleeves and manage operating systems, deploy storage servers, tune traffic routing and do whatever else it takes to get your architecture just right. But again, if you already know that you need more knobs than a typical public cloud provider lets you turn, you are probably ready anyway.
Building a More Sustainable Data Center: Challenges and Opportunities in the AI Era
Sustainability is not just a compliance exercise aimed at reducing negative
impact on the environment; it can also bring financial benefits to an organization.
According to Gartner’s Unlock the Business Benefits of Sustainable IT
Infrastructure report, “[Infrastructure and operations’] contribution to
sustainability strategies tends to focus on environmental impact, but
sustainability also can have a significant positive impact on non-environmental
factors, such as brand, innovation, resilience and attracting talent.” As a
result, boards should embrace the financial opportunities of companies’
Environmental, Social, and Governance (ESG) compliance rather than treat
it as just another unavoidable compliance expense without a discernible
return on investment (ROI). ... To improve data center resilience, Gartner
recommends that organizations expand use of renewable energy using a long-term
power purchase agreement to contain costs, generate their own power where
feasible, and reuse and redeploy equipment as much as possible to maximize the
value of the resource.
Data Business Evaluation
Why data businesses? Because they can be phenomenal businesses with
extremely high gross margins — as good or better than software-as-a-service
(SaaS). Often data businesses can be the best businesses within the
industries that they serve. ... Data aggregation can be a valuable way to
assemble a data asset as well, but the value typically hinges on the
difficulty of assembling the data…if it is too easy to do, others will do it
as well and create price competition. Often the value comes in aggregating a
long tail of data that is costly to do more than once either for the
suppliers or a competitive aggregator. ... The most stable data businesses
tend to employ a subscription business model in which customers subscribe to
a data set for an extended period of time. Subscription models are clearly
better when the subscriptions are long-term or, at least, auto-renewing. Not
surprisingly, the best data businesses are generally syndicated subscription
models. On the other end, custom data businesses that produce data for
clients in a one-off or project-based manner generally struggle to attain
high margins and predictability, but can be solid businesses if the data
manufacturing processes are optimized.
Leveraging AI for water management
AI is reshaping the landscape of water management by providing predictive
insights, optimising operations, and enabling real-time decision-making. One
of AI’s key contributions is its ability to forecast water usage patterns.
AI models can accurately predict water demand by analysing historical data
and considering variables like weather conditions, population trends, and
industrial activities. This helps water utilities allocate resources more
effectively, minimising waste while ensuring consistent supply to
communities. Water utilities can also integrate AI systems to monitor and
optimise their supply networks. ... One of the most critical applications of
AI is in water quality monitoring. Traditional methods of detecting water
contaminants are labour-intensive and involve periodic testing, which can
result in delayed responses to contamination events. AI, on the other hand,
can process continuous data streams from IoT-enabled sensors installed in
water distribution systems. These sensors monitor variables like pH levels,
temperature, and turbidity, detecting changes in water quality in real time.
AI algorithms analyse the data, triggering immediate alerts when
contaminants or irregularities are detected.
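The sensor-driven alerting described above reduces to checking each incoming reading against safe operating ranges. The threshold values, field names, and sensor IDs below are illustrative assumptions, not regulatory limits:

```python
# Minimal sketch of real-time water-quality alerting from sensor readings.
# Thresholds and sensor fields are illustrative, not regulatory values.

SAFE_RANGES = {
    "ph": (6.5, 8.5),         # typical drinking-water pH window
    "turbidity_ntu": (0.0, 1.0),
    "temp_c": (0.0, 25.0),
}

def check_reading(reading: dict) -> list[str]:
    """Return alert messages for any out-of-range variables."""
    alerts = []
    for key, (lo, hi) in SAFE_RANGES.items():
        value = reading.get(key)
        if value is not None and not (lo <= value <= hi):
            alerts.append(f"{key}={value} outside safe range [{lo}, {hi}]")
    return alerts

# Simulated stream from two consecutive readings of one sensor node
stream = [
    {"sensor": "node-12", "ph": 7.1, "turbidity_ntu": 0.4, "temp_c": 18.0},
    {"sensor": "node-12", "ph": 9.2, "turbidity_ntu": 2.3, "temp_c": 19.0},
]
for reading in stream:
    for alert in check_reading(reading):
        print(f"ALERT {reading['sensor']}: {alert}")
```

A deployed system would layer learned anomaly detection on top of fixed thresholds, but the alerting loop over a continuous sensor stream looks much like this.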
History of Cybersecurity: Key Changes Since the 1990s and Lessons for Today
Most cyber attackers hadn’t considered using the internet to pursue
financial gain or cause serious harm to organizations. To be sure, financial
crimes based on computer hacking took place in the '90s and early 2000s. But
they didn't dominate the news in an endless stream of cautionary tales, and
most people thought the 1995 movie Hackers was a realistic depiction of how
hacking worked. ... By the mid-2000s, however, internet-based attacks became
more harmful and frequent. This was the era when threat actors realized they
could build massive botnets and then use them to distribute spam or send
scam emails. These attacks could cause real financial harm, but they
weren't exactly original types of criminal activity; they merely carried out
traditional criminal activity, like scams, over a new medium: the internet.
... The 2010s were also a time of massive technological change. The advent
of cloud computing, widespread adoption of mobile devices, and rollout of
Internet of Things (IoT) hardware meant businesses could no longer define
clear network perimeters or ensure that sensitive data always remained in
their data centers.
Gateways to havoc: Overprivileged dormant service accounts
Dormant accounts go unnoticed, leaving organizations unaware of their access
privileges, the systems they connect to, how to access them, and even why
they exist. Their elevated privileges, lax security measures, and
invisibility make dormant service accounts prime targets for
infiltration. By compromising such an account, attackers can gain
significant access to systems and sensitive data, often without raising
immediate suspicion for extended periods of time. During that time, cyber
criminals can elevate privileges, exfiltrate data, disrupt operations, and
install malware and backdoors, causing total mayhem completely undetected
until it’s too late. The weaknesses that plague dormant accounts make them
open doors into an organization’s systems. If compromised, an overprivileged
dormant account can expose sensitive data such as customer PII, PHI,
intellectual property, and financial records, leading to costly and damaging
data breaches. Even without being breached, dormant accounts are significant
liabilities, potentially causing operational disruptions and regulatory
compliance violations.
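The detection side of this problem can be sketched in a few lines: flag any privileged service account that has not been used within some window. The account fields, names, and the 90-day threshold below are assumptions for illustration, not a standard:

```python
# Illustrative sketch: flag overprivileged service accounts with no recent
# activity. Field names and the 90-day threshold are assumptions.
from datetime import datetime, timedelta

DORMANCY_THRESHOLD = timedelta(days=90)

def find_dormant_privileged(accounts: list[dict], now: datetime) -> list[str]:
    """Return names of privileged accounts unused past the threshold."""
    flagged = []
    for acct in accounts:
        idle = now - acct["last_used"]
        if acct["privileged"] and idle > DORMANCY_THRESHOLD:
            flagged.append(acct["name"])
    return flagged

now = datetime(2024, 6, 1)
accounts = [
    {"name": "svc-backup", "privileged": True,
     "last_used": datetime(2023, 11, 5)},   # idle ~7 months -> flagged
    {"name": "svc-deploy", "privileged": True,
     "last_used": datetime(2024, 5, 20)},   # recently used -> kept
    {"name": "svc-report", "privileged": False,
     "last_used": datetime(2023, 1, 1)},    # dormant but unprivileged
]
print(find_dormant_privileged(accounts, now))  # ['svc-backup']
```

In practice the inventory would come from a directory service or cloud IAM API rather than a hand-built list, but the triage logic is the same.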
Overcoming AI hallucinations with RAG and knowledge graphs
One challenge that has come up in deploying RAG into production environments
is that it struggles with searches across many documents that contain
similar or identical information. When these files are chunked and turned
into vector embeddings, each one has its data available for searching.
When those files contain very similar chunks, finding the right data to
match a request becomes harder. RAG can also struggle when the answer to a
query is spread across a number of documents that cross-reference each
other, because RAG is not aware of the relationships between them. ... Rather
than storing data in rows and columns for traditional searches, or as
embeddings for vector search, a knowledge graph represents data points as
nodes and edges. A node will be a distinct fact or characteristic, and edges
will connect all the nodes that have relevant relationships to that fact. In
the example of a product catalog, the nodes may be the individual products
while the edges link products that share a characteristic,
such as size or color.
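The product-catalog example can be sketched as a tiny in-memory graph, with products as nodes and edges labeled by the characteristic two products share. The product names and attributes below are made up for illustration:

```python
# Minimal sketch of a product-catalog knowledge graph: products are nodes;
# an edge links two products that share a characteristic (the edge label).
from itertools import combinations

products = {
    "tee-small-red":  {"size": "S", "color": "red"},
    "tee-small-blue": {"size": "S", "color": "blue"},
    "tee-large-red":  {"size": "L", "color": "red"},
}

# Build labeled edges: (node_a, node_b, shared_attribute)
edges = []
for (a, attrs_a), (b, attrs_b) in combinations(products.items(), 2):
    for attr in attrs_a:
        if attrs_a[attr] == attrs_b.get(attr):
            edges.append((a, b, attr))

def related(node: str, via: str) -> set[str]:
    """All products connected to `node` by an edge labeled `via`."""
    return {b if a == node else a
            for a, b, label in edges
            if label == via and node in (a, b)}

print(related("tee-small-red", "color"))  # {'tee-large-red'}
print(related("tee-small-red", "size"))   # {'tee-small-blue'}
```

A production knowledge graph would live in a graph database rather than Python dictionaries, but traversing labeled edges, rather than comparing embeddings, is exactly what lets it answer the relationship queries that trip up plain RAG.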
Preparing for the next big cyber threat
In addressing emerging threats, CISOs will have to incorporate controls to
counter adversarial AI tactics and foster synergies with data and AI
governance teams. Controls to ensure quantum-resistant cryptography in the
symmetric space to future-proof encrypted data and transmissions will also
be put in place if they are not already. Many organizations — including
banks — are already enforcing quantum-resistant cryptography, for
instance by mandating the Advanced Encryption Standard (AES-256)
algorithm, since data encrypted with a 256-bit symmetric key is considered
resistant to cracking by quantum computers. Zero trust as a mindset and approach will be very
important, especially in addressing insecure design components of OT
environments used in Industry 4.0. Therefore, one of the key areas of
strengthening protection would also be identity and access management (IAM).
... As part of strong cyber resilience, we need sound incident response
(IR) playbooks to raise the drawbridge effectively; we need plan Bs and
plan Cs, business continuity plans, as well as tabletop exercises and red
team drills that involve our supply chain vendors.
And finally, response to the ever-evolving threat landscape will entail
greater adaptability and agility.
The Impact of AI on The Ethernet Switch Market
Enterprises investing in new infrastructure to support AI will have to
choose which technology is best for their particular needs. InfiniBand and
Ethernet will likely continue to coexist for the foreseeable future. It’s
highly likely that Ethernet will remain dominant in most network
environments while InfiniBand will retain its foothold in high-performance
computing and specialized AI workloads. ... While InfiniBand has several
very strong advantages, advances in Ethernet are quickly closing the gap,
making its ubiquity likely to continue. Enterprises have several other
reasons to stick with Ethernet, such as lower cost, existing in-house
talent, broad integration with existing infrastructure, and compatibility
with legacy applications.
... The Ultra Ethernet Consortium is proactively working to extend
Ethernet's life to ensure it remains useful and cost-effective for both
current and future technologies. The aim is primarily to reduce the need for
drastic shifts to alternative solutions that may constitute heavy lifts and
costs in adapting existing networks.
Making the Complex Simple: Authorization for the Modern Enterprise
Modernizing legacy authorization systems is essential for organizations to
enhance security and support their growth and innovation. Modernizing and
automating operations allows organizations to overcome the limitations of
legacy systems, enhance the protection of sensitive information and stay
competitive in today’s digital landscape. Simplifying access control and
automating workflows to modernize and optimize operations greatly increases
productivity and lowers administrative burdens. Organizations can direct
important resources toward more strategic endeavors by automating repetitive
operations, which increases output and promotes an agile corporate
environment. This change improves operational efficiency and puts businesses
in a better position to adapt to changing market demands. Enhancing security
is another critical benefit of modernizing authorization systems.
Centralized management coupled with advanced role-based access control
(RBAC) strengthens an organization’s security posture by preventing
unauthorized access. Centralized systems allow for efficient user
permissions management, ensuring that only authorized individuals can access
sensitive information.
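The centralized RBAC check described here reduces to a small lookup: roles map to permissions, users map to roles, and one function answers every access question. The role and permission names below are illustrative:

```python
# Bare-bones sketch of centralized role-based access control (RBAC).
# Role and permission names are made up for illustration.

ROLE_PERMISSIONS = {
    "analyst": {"report:read"},
    "admin":   {"report:read", "report:write", "user:manage"},
}

USER_ROLES = {
    "alice": {"admin"},
    "bob":   {"analyst"},
}

def is_authorized(user: str, permission: str) -> bool:
    """Grant access only if one of the user's roles carries the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_authorized("alice", "user:manage"))  # True
print(is_authorized("bob", "report:write"))   # False
```

The centralization is the point: changing one row in `ROLE_PERMISSIONS` updates every user holding that role, instead of touching per-application permission lists.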
Quote for the day:
"Motivation will almost always beat
mere talent." -- Norman Ralph Augustine