How To Secure Microservices in a Multicloud Architecture
In a microservices architecture, each service operates independently, allowing
updates, maintenance and modifications without disrupting others. This isolation
should extend across infrastructure layers, including databases, ensuring no
service can access another’s data. Full isolation prevents attackers from moving
laterally within the system. ... Sensitive data, such as passwords or personal
information, should never be stored or exposed in plain text. Otherwise, users and
automated systems can easily access it, leaving it vulnerable to threats.
Businesses should always remove or mask this information before storing
it in any records. Practices like TLS/HTTPS or encrypting logs are not enough on
their own, since the former only secures data in transit while the latter only
secures data at rest. Hence, the best approach is to stop storing sensitive
information altogether.
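As a concrete illustration of the "remove or mask before storing" advice, here is a minimal sketch, assuming an illustrative hard-coded list of sensitive field names (a real system would drive this from a data-classification policy rather than a fixed set):

```python
import copy

# Illustrative field list; not from the article.
SENSITIVE_KEYS = {"password", "ssn", "credit_card", "email"}

def mask_sensitive(record: dict) -> dict:
    """Return a copy of `record` with sensitive values replaced, so the
    plaintext never reaches logs or databases."""
    masked = copy.deepcopy(record)
    for key, value in masked.items():
        if key.lower() in SENSITIVE_KEYS:
            masked[key] = "***REDACTED***"
        elif isinstance(value, dict):
            masked[key] = mask_sensitive(value)
    return masked

# Example: what gets stored no longer contains the plaintext password.
event = {"user": "alice", "password": "hunter2", "action": "login"}
print(mask_sensitive(event))
# {'user': 'alice', 'password': '***REDACTED***', 'action': 'login'}
```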
... Zero trust security works on the idea that no user or device should be
trusted by default, whether inside or outside the network. By using the zero
trust model, businesses can make sure every user and device is constantly
authenticated and authorized, no matter where they are. In microservices, this
means checking every interaction between services, enforcing strict access
controls and logging all actions.
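As a rough illustration of what "checking every interaction" can look like in code, here is a minimal sketch, assuming a shared-secret HMAC token scheme and a hard-coded allow-list; real deployments would typically use mTLS or signed JWTs and a policy engine, but the shape is the same: authenticate the caller, authorize the action, and log every call.

```python
import hmac
import hashlib
import logging

logging.basicConfig(level=logging.INFO)
SERVICE_KEYS = {"orders-service": b"orders-secret"}          # assumed key store
ACL = {("orders-service", "payments-service"): {"charge"}}   # allowed actions

def verify_request(caller: str, target: str, action: str,
                   payload: bytes, signature: str) -> bool:
    """Authenticate, authorize, and log every service-to-service call;
    nothing is trusted by default."""
    key = SERVICE_KEYS.get(caller)
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest() if key else ""
    authenticated = key is not None and hmac.compare_digest(expected, signature)
    authorized = action in ACL.get((caller, target), set())
    logging.info("call %s->%s action=%s auth=%s allowed=%s",
                 caller, target, action, authenticated, authorized)
    return authenticated and authorized

# Example: a signed, ACL-permitted call passes; anything else is rejected.
body = b'{"amount": 42}'
sig = hmac.new(b"orders-secret", body, hashlib.sha256).hexdigest()
print(verify_request("orders-service", "payments-service", "charge", body, sig))
```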
The road to Industry 5.0 is your data and AI
When Industry 5.0 emerges, we can expect to see the convergence of all that
work and collected data. The next industrial revolution will be steeped in
bridging the physical and the digital realms. Effectively this goes back to
the human-versus-machine argument, but with the aim of optimizing both human and machine to
enhance their capabilities. AI and cloud computing will reach a harmony where
workers can produce their best results, which can be replicated in processes
throughout the supply chain. Industrial AI already powers our lives behind the scenes.
Industrial AI capabilities will power decision-making and, despite speculation, won't be a
force for contention. ... From the regulatory complexities
of data collection and storage to varying levels of AI adoption within
businesses, a successful transition into Industry 5.0 requires expert support.
Costs of AI investments can snowball, so you must be strategic and target
improvements to specific areas of your business. Generic, off-the-shelf AI tools
trained on irrelevant data won’t help here. To remain competitive at a global
scale, companies need to invest in this technology and work with proven
partners.
Why we’re teaching LLMs to forget things
Selective forgetting, something that humans are all too good at, turns out
to be exceptionally difficult to recreate in machine learning models. That’s
especially true for a class of AI models known as foundation models that may
have picked up personal, copyrighted, or toxic information buried in their
training data. ... “True unlearning tries to remove all vestiges of the
unwanted information, so that when the model gets a problematic question, it
simply doesn’t have the answer,” she added. “A model that has ‘unlearned’
insulting behavior no longer knows how to be toxic.” Ideally, unlearning
also comes with a mathematical guarantee that the unwanted data’s influence
on the model has been erased. Achieving that gold standard, however,
typically involves retraining the model, which for LLMs can be prohibitively
expensive. One option for unlearning without guarantees is to fine-tune the
model on the unwanted data using an optimization technique known as gradient
ascent to forget connections between data points. “Using gradient ascent to
update the model’s weights is like running the model’s training in reverse,”
said Swanand Ravindra Kadhe, a senior research scientist at IBM Research
focused on unlearning.
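A toy sketch of that idea, assuming a stand-in PyTorch model and made-up "forget" data rather than IBM's actual setup, negates the loss on the unwanted examples so each optimizer step moves the weights uphill on that loss, i.e. gradient ascent:

```python
import torch
import torch.nn as nn

# Toy stand-ins for a trained model and the data to be "forgotten".
model = nn.Linear(10, 2)                  # pretend this is the trained network
forget_x = torch.randn(32, 10)            # unwanted examples
forget_y = torch.randint(0, 2, (32,))     # their labels
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

for _ in range(5):                        # a few unlearning steps
    optimizer.zero_grad()
    loss = criterion(model(forget_x), forget_y)
    (-loss).backward()                    # negate the loss: gradient ascent,
    optimizer.step()                      # i.e. running training "in reverse"
```

As the article notes, this approach comes with no mathematical guarantee that the unwanted data's influence is gone; it only pushes the model away from fitting those examples.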
Will IPv6 ever replace IPv4?
The year is 2024 though, and the internet still runs on IPv4. So where did
it all go wrong? IPv6 has been in migration hell for decades, with every
kind of possible initiative to improve IPv6 adoption falling flat, from an
official World IPv6 Day in 2011 and the World IPv6 'launch' in 2012 to
several US Federal government action plans in 2005, 2010, and 2020
(including mandating IPv6 readiness for government networks - a deadline
initially set at 2012 and since extended to 2025). There have been numerous
incentives for schools and businesses, promotional campaigns from registries
and ISPs, conferences, and education campaigns. ... Another serious problem
facing IPv6 adoption is NAT. NAT is a technology that was designed in
1994 to reduce the number of global IPv4 addresses needed. It allows devices
on a private network to share a single IP address, and is present in almost
all home routers (and has been for decades). NAT is the reason why your
computer has an 'internal' IP address, and needs port forwarding to be
accessible directly from the internet (firewall aside). NAT has allowed us
to continue to grow the number of devices online well past the exhaustion
point of IPv4 to a whopping 30 billion devices.
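The arithmetic behind that pressure is easy to check: IPv4's 32-bit space provides roughly 4.3 billion addresses, far short of the roughly 30 billion devices the article cites, while IPv6's 128-bit space is effectively inexhaustible. A quick back-of-the-envelope calculation:

```python
ipv4_space = 2 ** 32             # ~4.3 billion addresses
ipv6_space = 2 ** 128            # ~3.4 x 10^38 addresses
devices_online = 30_000_000_000  # ~30 billion devices, per the article

print(f"IPv4 addresses: {ipv4_space:,}")
print(f"Devices per IPv4 address: {devices_online / ipv4_space:.1f}")  # ~7
print(f"IPv6 addresses: {ipv6_space:.3e}")
```

Roughly seven devices per public IPv4 address is only workable because NAT lets many devices hide behind one address.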
How the increasing demand for cyber insurance is changing the role of the CISO
Despite CISOs overseeing cybersecurity and the controls meant to blunt cyber
risk, they have not historically been the executives who decide whether
their organization buys cyber insurance. Instead, CFOs or chief risk
officers typically make the call and determine what levels of protection to
buy. However, CISOs are taking on larger roles — as they should — in those
discussions and the decision-making process because they’re well-positioned
to understand the threat landscape, the types of threats that could impact
them, and how each one could impact the organization, says Paul Caron, Head
of Cybersecurity, Americas at S-RM, a global corporate intelligence and
cyber security consultancy. Generally speaking, CISOs are also best
positioned to share the organization’s cybersecurity strategy and details of
its security controls with insurance brokers or carriers, Caron says. “CISOs
are the ones who can best tell their story.” And CISOs are best positioned
to review the resources a selected insurance company could bring to bear in
responding to an incident, and to judge whether those resources would be the
right choices.
Many C-suite execs have lost confidence in IT, including CIOs
Many C-suite executives want the IT team to both keep the systems running
and drive strategic innovation, he says, a challenging balancing act.
“Organizations perceive IT as struggling to meet these demands, particularly
in deploying new technologies like AI, which have raised expectations among
business leaders,” he says. “Challenges in managing legacy systems and
ongoing talent shortages further exacerbate this issue.” In many cases, the
traditional IT team has been separated from the R&D team, with the IT
teams tasked with keeping the lights on, some tech leaders say. With IT and
business strategies getting more intertwined, and the hard truths involved
in that, the value traditionally driven by IT has shifted to product
engineering and business units, says Martin Mao, CEO and co-founder of
Chronosphere, a cloud observability platform. “The value is not seen in
keeping the wheels on the bus,” he says. “IT is stuck in a negative spiral
of cost cutting and defense mode versus innovation. There is a huge talent
drain occurring from IT to the product engineering side of the house.” IT
teams are often burdened with maintaining legacy systems while
simultaneously asked to support new technologies such as AI, infrastructure
as code, containerization, and cloud services, adds Kenny Van Alstyne.
5 ways data scientists can prepare now for genAI transformation
Data scientists have traditionally developed dashboards as quick and easy
ways to learn about new data sets or to help business users answer questions
about their data. While data visualization and analytics platforms have
added natural language querying and machine learning algorithms over the
last several years, data scientists should anticipate a new wave of
genAI-driven innovations. ... “With generative AI, the reliance on
traditional dashboards diminishes as users can remove the noise of the
analytics and get to actionable insights conversationally. Freed from ad-hoc
dashboard-generation, data analysts and data scientists will concentrate on
documenting organizational knowledge into semantic layers and conducting
strategic analytics, creating a virtuous cycle.” Another prediction comes
from Jerod Johnson, senior technology evangelist at CData, saying, “As genAI
platforms become integrated into visualization tools, they enable more
dynamic and interactive representations of data, allowing for real-time
synthesis and scenario analysis. Over the next few years, data scientists
can expect these tools to evolve to make visualizations more intuitive and
insightful, even answering unasked questions for innovative discoveries.”
Can Responsible AI Live In The Cloud?
There are both benefits to be realised and pitfalls to be avoided when
migrating AI to the cloud. Cloud providers offer high-spec, affordable
infrastructure, often with better security arrangements than on-premises
systems can provide – not to mention the capability to handle routine
patching and updates. But there are a number of other factors to be mindful
of, including:
– Sovereignty: In many cases, it doesn’t matter where models are being trained, data transfer fees permitting. Compute in one area may be significantly cheaper than in another, but if you’re moving data to another country it’s important to understand how data will be handled there, including any differences in governmental or security process.
– Sustainability: It’s also important to know how sustainable and power-efficient your AI cloud partner is, particularly if you’re transferring data to another country. Some countries have very good renewable energy mixes – but others don’t, and some datacentres are intrinsically more efficient than others. Do remember that your AI cloud provider will form part of your scope 3 emissions, so it pays to do your due diligence, particularly since AI can be very power hungry.
– Suitability: The kind of data that your AI system is processing will have an impact on the kind of environment that it needs.
The role of self-sovereign identity in enterprises
By allowing users to selectively share identity attributes, SSI mitigates
the risk of overexposure of personal data. This is particularly important in
industries like healthcare, financial services, and government, where
stringent regulations such as GDPR, HIPAA, and CCPA dictate how personal
information is managed. Passwords and traditional authentication methods
have long been weak links in enterprise security, and a source of user
friction. SSI can eliminate the need for passwords by enabling secure,
passwordless authentication via verifiable credentials. This reduces the
friction for users while maintaining high security standards. SSI can also
improve customer satisfaction by simplifying secure access to services. For
enterprises, SSI can also drive efficiency. By decentralizing identity
verification, businesses can reduce their reliance on third-party identity
providers, cutting costs and minimizing the delays associated with identity
proofing processes. SSI’s interoperability across platforms and services
ensures that enterprises can implement a single identity solution that works
across a wide variety of use cases, from onboarding employees to
authenticating customers and partners.
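To make "selectively share identity attributes" concrete, here is a deliberately simplified sketch of the salted-hash selective-disclosure pattern used by some verifiable-credential formats; the issuer's signature over the digests is omitted and all names are illustrative. The issuer commits to every attribute, and the holder later reveals only what a verifier actually needs.

```python
import hashlib
import secrets

def commit(attrs: dict) -> tuple[dict, dict]:
    """Issuer side: salt and hash each attribute. A real credential would
    also carry the issuer's signature over these digests (omitted here)."""
    salts = {k: secrets.token_hex(16) for k in attrs}
    digests = {k: hashlib.sha256((salts[k] + str(v)).encode()).hexdigest()
               for k, v in attrs.items()}
    return digests, salts

def present(attrs: dict, salts: dict, reveal: set) -> dict:
    """Holder side: disclose only the requested attributes plus their salts."""
    return {k: (attrs[k], salts[k]) for k in reveal}

def verify(digests: dict, disclosed: dict) -> bool:
    """Verifier side: check each disclosed value against the issuer's digest."""
    return all(hashlib.sha256((salt + str(v)).encode()).hexdigest() == digests[k]
               for k, (v, salt) in disclosed.items())

attrs = {"name": "Alice", "dob": "1990-01-01", "ssn": "000-00-0000"}
digests, salts = commit(attrs)
proof = present(attrs, salts, reveal={"dob"})   # share date of birth only
print(verify(digests, proof))                   # True; the ssn is never disclosed
```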
Reachability and Risk: Prioritizing Protection in a Complex Security Landscape
Despite the benefits, most organizations lack the tools and processes to
analyze reachability across their infrastructure. Most are limited to a few
common approaches with known downsides. External vulnerability scanners
provide limited visibility into internal networks. Penetration testing
typically focuses on external attack surfaces. And manual analysis is
incredibly time-consuming and error-prone. Achieving comprehensive
reachability analysis is challenging, especially for large environments with
tens of thousands of assets, as it’s difficult to compute all the states
that a system might reach during operation. ... To address these challenges,
organizations should leverage network digital twin technology. A
sophisticated network digital twin collects L2-L7 state and configuration
data across all network devices (load balancers, routers, firewalls and
switches). This data is then used to create an accurate topology (on-prem
and multi-cloud), calculate all possible paths within the network, analyze
detailed behavioral information and make network configuration and behavior
searchable and verifiable. Creating an accurate digital replica of an
organization’s network infrastructure allows for automated analysis of
potential attack paths and reachability between assets.
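To give a rough sense of the path analysis such a digital twin enables, the sketch below models a hypothetical topology as a directed graph with networkx and asks which assets can reach which; the zones and permitted-traffic edges are purely illustrative, not a real product's data model.

```python
import networkx as nx

# Hypothetical topology derived from device configs: an edge means traffic
# is permitted from one asset/zone to the next.
twin = nx.DiGraph()
twin.add_edges_from([
    ("internet", "edge-firewall"),
    ("edge-firewall", "web-tier"),
    ("web-tier", "app-tier"),
    ("app-tier", "db-tier"),
    ("mgmt-jumphost", "db-tier"),
])

# Reachability question: can an internet-facing attacker reach the database?
print(nx.has_path(twin, "internet", "db-tier"))                 # True
for path in nx.all_simple_paths(twin, "internet", "db-tier"):   # enumerate paths
    print(" -> ".join(path))
```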
Quote for the day:
"Leadership is practices not so much
in words as in attitude and in actions." -- Harold Geneen