New framework aims to keep AI safe in US critical infrastructure
According to a release issued by DHS, “this first-of-its-kind resource was
developed by and for entities at each layer of the AI supply chain: cloud and
compute providers, AI developers, and critical infrastructure owners and
operators — as well as the civil society and public sector entities that
protect and advocate for consumers.” ... Naveen Chhabra, principal analyst
with Forrester, said, “while average enterprises may not directly benefit from
it, this is going to be an important framework for those that are investing in
AI models.” ... Asked why he thinks DHS felt the need to create the
framework, Chhabra said that developments in the AI industry are “unique, in
the sense that the industry is going back to the government and asking for
intervention in ensuring that we, collectively, develop safe and secure AI.”
... David Brauchler, technical director at cybersecurity vendor NCC sees the
guidelines as a beginning, pointing out that frameworks like this are just a
starting point for organizations, providing them with big picture guidelines,
not roadmaps. He described the DHS initiative in an email as “representing
another step in the ongoing evolution of AI governance and security that we’ve
seen develop over the past two years. It doesn’t revolutionize the discussion,
but it aligns many of the concerns associated with AI/ML systems with their
relevant stakeholders.”
Building an Augmented-Connected Workforce
An augmented workforce can work faster and more efficiently thanks to seamless
access to real-time diagnostics and analytics, as well as live remote
assistance, observes Peter Zornio, CTO at Emerson, an automation technology
vendor serving critical industries. "An augmented-connected workforce
institutionalizes best practices across the enterprise and sustains the value
it delivers to operational and business performance regardless of workforce
size or travel restrictions," he says in an email interview. An
augmented-connected workforce can also help fill some of the gaps many
manufacturers currently face, Gaus says. "There are many jobs unfilled because
workers aren't attracted to manufacturing, or lack the technological skills
needed to fill them," he explains. ... For enterprises that have already
invested in advanced digital technologies, the journey toward an
augmented-connected workforce is already underway. The next step is ensuring a
holistic approach when looking at tangible ways to achieve such a workforce.
"Look at the tools your organization is already using -- AI, AR, VR, and so on
-- and think about how you can scale them or connect them with your human
talent," Gaus says. Yet advanced technologies alone aren't enough to guarantee
long-term success.
DORA and why resilience (once again) matters to the board
DORA, though, might be overlooked because of its finance-specific focus. The
act has not attracted the same attention as NIS2, which sets out cybersecurity
standards for 15 critical sectors in the EU economy. And NIS2 came into force
in October; CIOs and hard-pressed compliance teams could be forgiven for not
focusing on another piece of legislation that is due in the New Year. But
ignoring DORA altogether would be short-sighted. Firstly, as Rodrigo Marcos,
chair of the EU Council at cybersecurity body CREST, points out, DORA is a law,
not a framework or best practice guidelines. Failing to comply could lead to
penalties. But DORA also covers third-party risks, which include digital
supply chains. The legislation extends to any third party supplying a
financial services firm, if the service they supply is critical. This will
include IT and communications suppliers, including cloud and software vendors.
... CIOs are also putting more emphasis on resilience and recovery. In
some ways, we have come full circle. Disaster recovery and business continuity
were once mainstays of IT operations planning but slipped down the list with the
shift to the cloud. Cyberattacks, and especially ransomware, have pushed both
resilience and recovery right back up the agenda.
Data Is Not the New Oil: It’s More Like Uranium
Comparing data to uranium is an apt analogy. Uranium is radioactive and must be
handled carefully to avoid radiation exposure, which carries serious health and
safety consequences. Problems with deployed uranium, in reactors for instance,
can lead to radioactive fallout that is expensive to contain and has long-term
health consequences for those affected. The theft of uranium poses significant
risks with global repercussions. Data exhibits similar characteristics. It must
be stored safely, and victims of data theft face long-term consequences such as
identity theft and financial harm. An organization that suffers a cyberattack
must deal with regulatory oversight and fines. In some cases, losing sensitive
data can trigger significant global consequences. ... Maintaining a data chain
of custody is paramount. Some companies allow every employee access to all
records, which expands the attack surface: a single compromised employee can
lead to a data breach, and even one compromised employee computer can open the
door to a more extensive hack.
Consider the case of the nonprofit healthcare network Ascension, which
operates 140 hospitals and 40 senior care facilities.
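To illustrate the chain-of-custody point, here is a minimal sketch, not Ascension's or any vendor's system, that enforces least-privilege access with a hypothetical role-to-record-type map and records every access attempt in an audit log:

```python
# Illustrative sketch of least-privilege access plus a chain-of-custody audit
# trail. Role names, record types, and the in-memory audit sink are assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone

ROLE_PERMISSIONS = {
    "billing":   {"invoice"},
    "clinician": {"patient_record"},
    "analyst":   {"aggregate_report"},
}

@dataclass
class AccessEvent:
    user: str
    role: str
    record_type: str
    record_id: str
    timestamp: str
    allowed: bool

AUDIT_LOG: list[AccessEvent] = []   # stand-in for an append-only audit store

def request_record(user: str, role: str, record_type: str, record_id: str) -> bool:
    """Allow access only if the role is permitted for this record type, and log it."""
    allowed = record_type in ROLE_PERMISSIONS.get(role, set())
    AUDIT_LOG.append(AccessEvent(user, role, record_type, record_id,
                                 datetime.now(timezone.utc).isoformat(), allowed))
    return allowed

# Usage: a billing user cannot read patient records, and every attempt is logged.
print(request_record("alice", "billing", "patient_record", "PR-1001"))   # False
print(request_record("bob", "clinician", "patient_record", "PR-1001"))   # True
print(len(AUDIT_LOG))                                                    # 2
```

The point of the sketch is that no single compromised account can reach every record, and the audit log preserves who touched what, and when.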
Palo Alto Reports Firewalls Exploited Using an Unknown Flaw
Palo Alto said the flaw is being remotely exploited, carries a "critical" severity
rating of 9.3 out of 10 on the CVSS scale, and should be mitigated with the
"highest" urgency. One challenge for
users: no patch is yet available to fix the vulnerability. Also, no CVE code
has been allocated for tracking it. "As we investigate the threat activity, we
are preparing to release fixes and threat prevention signatures as early as
possible," Palo Alto said. "At this time, securing access to the management
interface is the best recommended action." The company said it doesn't believe
its Prisma Access or Cloud NGFW products are at risk from these attacks. Cybersecurity
researchers confirm that real-world details surrounding the attacks and flaws
remain scant. "Rapid7 threat intelligence teams have also been monitoring
rumors of a possible zero-day vulnerability, but until now, those rumors have
been unsubstantiated," the cybersecurity firm said in a Friday blog post. Palo
Alto first warned customers on Nov. 8 that it was investigating reports of a
zero-day vulnerability in the management interface for some types of firewalls
and urged them to lock down the interfaces.
Award-winning palm biometrics study promises low-cost authentication
“By harnessing high-resolution mmWave signals to extract detailed palm
characteristics,” he continued, “mmPalm presents a ubiquitous, convenient and
cost-efficient option to meet the growing needs for secure access in a smart,
interconnected world.” The mmPalm method employs mmWave technology, which is
widely used in 5G networks, to capture a person’s palm characteristics by
sending and analyzing reflected signals and thereby creating a unique palm
print for each user. Beyond this, mmPalm also addresses challenges that commonly
arise in authentication technology, such as variations in distance and hand
orientation. The system uses a type of AI called a conditional generative
adversarial network (cGAN) to learn different palm orientations and distances
and to generate virtual profiles that fill in the gaps. In addition, the system
adapts to different environments using a transfer learning framework so that mmPalm is
suited to various settings. The system also builds virtual antennas to
increase the spatial resolution of a commercial mmWave device. Tested with 30
participants over six months, mmPalm displayed a 99 percent accuracy rate and
was resistant to impersonation, spoofing and other potential breaches.
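Purely as an illustration of the conditioning idea described above, here is a minimal sketch of a conditional GAN that generates synthetic palm-reflection profiles from a noise vector plus a [distance, orientation] condition; the profile length, layer sizes, and training loop are assumptions, not the mmPalm authors' implementation:

```python
# Minimal conditional GAN sketch: generate synthetic "palm profiles" conditioned
# on normalized distance and orientation values. All dimensions are assumptions.
import torch
import torch.nn as nn

PROFILE_DIM = 256   # assumed length of a flattened mmWave palm profile
COND_DIM = 2        # condition vector: [distance, orientation], both normalized
NOISE_DIM = 64

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + COND_DIM, 128), nn.ReLU(),
            nn.Linear(128, 256), nn.ReLU(),
            nn.Linear(256, PROFILE_DIM), nn.Tanh(),
        )
    def forward(self, z, cond):
        return self.net(torch.cat([z, cond], dim=1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(PROFILE_DIM + COND_DIM, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 128), nn.LeakyReLU(0.2),
            nn.Linear(128, 1), nn.Sigmoid(),
        )
    def forward(self, x, cond):
        return self.net(torch.cat([x, cond], dim=1))

def train_step(gen, disc, opt_g, opt_d, real_profiles, real_conds, loss_fn):
    """One adversarial update against a batch of measured profiles."""
    batch = real_profiles.size(0)
    z = torch.randn(batch, NOISE_DIM)
    fake = gen(z, real_conds)

    # Discriminator: label real profiles 1 and generated profiles 0.
    opt_d.zero_grad()
    d_loss = loss_fn(disc(real_profiles, real_conds), torch.ones(batch, 1)) + \
             loss_fn(disc(fake.detach(), real_conds), torch.zeros(batch, 1))
    d_loss.backward()
    opt_d.step()

    # Generator: try to make generated profiles look real.
    opt_g.zero_grad()
    g_loss = loss_fn(disc(fake, real_conds), torch.ones(batch, 1))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()

# Usage with placeholder data standing in for measured profiles.
gen, disc = Generator(), Discriminator()
opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
real = torch.randn(8, PROFILE_DIM).tanh()   # placeholder measured profiles
conds = torch.rand(8, COND_DIM)             # placeholder [distance, orientation]
print(train_step(gen, disc, opt_g, opt_d, real, conds, nn.BCELoss()))
```

Once trained, the generator can be queried with unseen distance and orientation values to fill gaps between measured positions, which is the role the article attributes to the cGAN.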
Scaling From Simple to Complex Cache: Challenges and Solutions
To scale a cache effectively, you need to distribute data across multiple nodes
through techniques like sharding or partitioning. This improves storage
efficiency and ensures that each node only stores a portion of the data. ... A
simple cache can often handle node failures through manual intervention or basic
failover mechanisms. A larger, more complex cache requires robust
fault-tolerance mechanisms. This includes data replication across multiple
nodes, so if one node fails, others can take over seamlessly. Fault tolerance must
also account for more catastrophic failures, which may lead to significant downtime
while the data is reloaded into memory from the persistent store, a process known
as warming up the cache. ... As the cache gets larger, pure caching solutions struggle to
provide linear performance in terms of latency while also allowing for the
control of infrastructure costs. Many caching products were written to be fast
at small scale. Pushing them beyond what they were designed for exposes
inefficiencies in underlying internal processes. Latency issues may arise as more
and more data is cached; as a consequence, cache lookup times can increase because
the cache devotes more resources to managing the increased scale
rather than serving traffic.
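To make the sharding idea concrete, here is a minimal sketch, assuming a hash-based key-to-node mapping (consistent hashing with virtual nodes; the node names and virtual-node count are arbitrary), of how a client can route each key so that every node stores only a portion of the data and adding or removing a node moves only a small slice of keys:

```python
# Minimal consistent-hashing sketch for sharding cache keys across nodes.
# Not any specific product's implementation; node names are placeholders.
import bisect
import hashlib

class ConsistentHashRing:
    def __init__(self, nodes, vnodes=100):
        self._ring = []                      # sorted list of (hash, node)
        for node in nodes:
            self.add_node(node, vnodes)

    @staticmethod
    def _hash(key: str) -> int:
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def add_node(self, node: str, vnodes: int = 100) -> None:
        # Virtual nodes smooth the key distribution across physical nodes.
        for i in range(vnodes):
            bisect.insort(self._ring, (self._hash(f"{node}#{i}"), node))

    def remove_node(self, node: str) -> None:
        self._ring = [(h, n) for h, n in self._ring if n != node]

    def node_for(self, key: str) -> str:
        # The key belongs to the first ring position at or after its hash.
        h = self._hash(key)
        idx = bisect.bisect(self._ring, (h, "")) % len(self._ring)
        return self._ring[idx][1]

# Usage: route each cache key to the node that owns its hash range.
ring = ConsistentHashRing(["cache-1", "cache-2", "cache-3"])
print(ring.node_for("user:42:profile"))
```

The design choice worth noting is the virtual nodes: without them, removing one physical node would shift a large, uneven share of keys onto its neighbor, which is exactly the kind of rebalancing cost that hurts at scale.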
Understanding the Modern Web and the Privacy Riddle
The main question is why users willingly surrender their data without questioning
how it is used. This could be attributed to the effect of the virtual
panopticon, where users believe they are cooperating with agencies (government
or private) that claim to respect their privacy in exchange for services. The
Universal ID project (Aadhar project) in India, for instance, began as a means
to provide identity to the poor in order to deliver social services, but has
gradually expanded its scope, leading to significant function creep. Originally
intended for de-duplication and preventing ‘leakages,’ it later became essential
for enabling private businesses, fostering a cashless economy, and tracking
digital footprints. ... In the modern web, users occupy multiple roles—as
service providers, users, and visitors—while adopting multiple personas. This
shift requires greater information disclosure, as users benefit from the web’s
capabilities and treat their own data as currency. The unraveling of privacy has
become the new norm, where withholding information is no longer an option due to
the stigmatization of secrecy. Over the past few years, there has been a
significant shift in how consumers and websites view privacy. Users have
developed a heightened sensitivity to the use of their personal information and
now recognize their basic right to internet privacy.
Databases Are a Top Target for Cybercriminals: How to Combat Them
Most ransomware, including Mailto, Sodinokibi (REvil), and Ragnar Locker, can
encrypt pages within a database and destroy them. This means the slow, unnoticed
encryption of everything from sensitive customer records to critical network
resources, including Active Directory, DNS, and Exchange, as well as lifesaving
patient health information. Because databases can continue to run even with
corrupted pages, it can take longer to realize that they have been attacked. Most
often, the wreckage of the attack is discovered only when the database is taken
down for routine maintenance, and by that time, thousands of records could
be gone. Databases are an attractive target for cybercriminals because they
offer a wealth of information that can be used or sold on the dark web,
potentially leading to further breaches and attacks. Industries such as
healthcare, finance, logistics, education, and transportation are particularly
vulnerable. The information contained in these databases is highly
valuable, as it can be exploited for spamming, phishing, financial fraud, and
tax fraud. Additionally, cybercriminals can sell this data for significant sums
of money on dark web auctions or marketplaces.
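Since corrupted pages can go unnoticed until the database is taken down for maintenance, one practical countermeasure is a scheduled integrity check. Below is a minimal sketch using SQLite purely as a stand-in; the file path, interval, and alert hook are assumptions, and production databases would use their own equivalent checks:

```python
# Sketch of a scheduled page-integrity check so silent corruption (for example,
# ransomware encrypting pages in place) surfaces before routine maintenance.
import sqlite3
import time

DB_PATH = "app.db"              # hypothetical database file
CHECK_INTERVAL_SECONDS = 3600   # assumed hourly check

def pages_are_intact(path: str) -> bool:
    conn = sqlite3.connect(path)
    try:
        # PRAGMA integrity_check walks every page and returns "ok" when clean.
        result = conn.execute("PRAGMA integrity_check;").fetchone()[0]
        return result == "ok"
    finally:
        conn.close()

def alert(message: str) -> None:
    # Stand-in for a paging or ticketing integration.
    print(f"ALERT: {message}")

if __name__ == "__main__":
    while True:
        if not pages_are_intact(DB_PATH):
            alert(f"Integrity check failed for {DB_PATH}; investigate for corruption.")
        time.sleep(CHECK_INTERVAL_SECONDS)
```

The same idea applies to any engine: run the vendor's consistency checker on a schedule and alert on failure, rather than waiting for a maintenance window to discover the damage.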
The Impact of Cloud Transformation on IT Infrastructure
With digital transformation accelerating across industries, the IT ecosystem
comprises traditional and cloud-native applications. This mixed environment
demands a flexible, multi-cloud strategy to accommodate diverse application
requirements and operational models. The ability to move workloads between
public and private clouds has become essential, allowing companies to
dynamically balance performance and cost considerations. We are committed to
delivering cloud solutions supporting seamless workload migration and
interoperability, empowering businesses to leverage the best of public and
private clouds. ... With today’s service offerings and various tools, migrating
between on-premises and cloud environments has become straightforward, enabling
continuous optimization rather than one-time changes. Cloud-native applications,
particularly those built on containers and microservices, are inherently optimized
for public and private cloud setups, allowing for dynamic scaling and efficient
resource use. To fully optimize, companies should adopt cloud-native principles,
including automation, continuous integration, and orchestration, which
streamline performance and resource efficiency. Robust tools like identity and
access management (IAM), encryption, and automated security updates address
security and reliability, ensuring compliance and data protection.
Quote for the day:
"The elevator to success is out of
order. You’ll have to use the stairs…. One step at a time.” --
Rande Wilson