Quote for the day:
"It is easy to lead from the front when
there are no obstacles before you, the true colors of a leader are exposed
when placed under fire." -- Mark W. Boyer

Quantum computers are expected to solve problems currently intractable for
even the world’s fastest supercomputers. Their core strengths — efficiently
finding hidden patterns in complex datasets and navigating vast optimization
challenges — will enable the design of novel drugs and materials, the creation
of superior financial algorithms and open new frontiers in cryptography and
cybersecurity. ... The quantum ecosystem now largely agrees that simply
scaling up today’s quantum computers, which suffer from significant noise and errors
that prevent fault-tolerant operation, won’t unlock the most valuable
commercial applications. The industry’s focus has shifted to quantum error
correction as the key to building robust and scalable fault-tolerant machines.
... Most early quantum computing companies tried a full-stack approach. Now
that the industry is maturing, a rich ecosystem of middle-of-the-stack players
has emerged. This evolution allows companies to focus on what they do best and
buy components and capabilities as needed, such as control systems from
Quantum Machines and quantum software development from firms ... recent
innovations in quantum networking technology have made a scale-out approach a
serious contender.

Exfiltration-first attacks have re-written the rules, with stolen data providing
criminals with a faster, more reliable payday than the complex mechanics of
encryption ever could. The threat of leaking data like financial records,
intellectual property, and customer and employee details delivers instant
leverage. And unlike with encryption, if the victim stands firm and refuses to pay up,
criminal groups can always sell their digital loot on the dark web or use it to
fuel more targeted attacks. ... Phishing emails, once known for being riddled
with tell-tale grammar and spelling mistakes, are now polished, personalized and
delivered in perfect English. AI-powered deepfake voices and videos are
providing convincing impersonations of executives or trusted colleagues that
have defrauded companies for millions. At the same time, attackers are deploying
custom chatbots to manage ransom negotiations across multiple victims
simultaneously, applying pressure with the relentless efficiency of machines.
... Yet resilience is not simply a matter of dashboards and detection thresholds
– it is equally about supporting those on the frontlines. Security leaders
already working punishing hours under relentless scrutiny cannot be expected to
withstand endless fatigue and a culture of blame without consequence.
Organizations must also embed support for their teams into their response
frameworks, from clear lines of communication and decompression time to
wellbeing checks.

The uncertainty is driving concern. “There's been a lot more talk around,
‘Should we be managing sovereign cloud, should we be using on-premises more,
should we be relying on our non-North American public contractors?’” said Tracy
Woo, a principal analyst with researcher and advisory firm Forrester. Ditching a
major public cloud provider over sovereignty concerns, however, is not a
practical option. These providers often underpin expansive global workloads, so
migrating to a new architecture would be time-consuming, costly, and complex.
There also isn’t a simple direct switch that companies can make if they’re
looking to avoid public cloud; sourcing alternatives must be done thoughtfully,
not just in reaction to one challenge. ... “There's a nervousness around
deployment of AI, and I think that nervousness comes from -- definitely in
conversations with other CIOs -- not knowing the data,” said Bell. Although
decoupling from the major cloud providers is impractical on many fronts, issues
of sovereignty as well as cost could still push CIOs to embrace a more localized
approach, Woo said. “People are realizing that we don't necessarily need all the
bells and whistles of the public cloud providers, whether that's for latency or
performance reasons, or whether it's for cost or whether that's for sovereignty
reasons,” explained Woo.

Agentic AI systems don’t just predict or recommend; they act. These intelligent
software agents operate with autonomy toward defined business goals, planning,
learning, and executing across enterprise workflows. This is not the next
version of traditional automation or static bots. It’s a fundamentally different
operating paradigm, one that will shape the future of digital enterprises. ...
For many enterprises, the last decade of AI investment has focused on surfacing
insights: detecting fraud, forecasting demand, and predicting churn. These are
valuable outcomes, but they still require humans or rigid automation to respond.
Agentic AI closes that gap. These agents combine machine learning, contextual
awareness, planning, and decision logic to take goal-directed action. They can
process ambiguity, work across systems, resolve exceptions, and adapt over time.
... Agentic AI will not simply automate tasks. It will reshape how work is
designed, measured, and managed. As autonomous agents take on operational
responsibility, human teams will move toward supervision, exception resolution,
and strategic oversight. New KPIs will emerge, not just around cost or cycle
time, but around agent quality, business impact, and compliance resilience. This
shift will also demand new talent models. Enterprises must upskill teams to
manage AI systems, not just processes.
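The shift described above can be reduced to a small control loop: a planner decides the next step toward a goal, tools carry it out, and observations feed back into the next decision. A minimal Python sketch of that loop; the function names and the toy planner are hypothetical stand-ins, not any vendor's agent framework:

```python
# Minimal sketch of an agentic loop: plan, act via tools, observe, adapt.
# Illustrative only; real agentic systems wrap an LLM planner, guardrails,
# and audit logging around this skeleton. All names here are hypothetical.

def run_agent(goal, planner, tools, max_steps=10):
    """Run plan -> act -> observe until the planner declares the goal met."""
    history = []                                   # observations the planner can use
    for _ in range(max_steps):
        step = planner(goal, history)              # decision logic with context
        if step is None:                           # planner judges the goal complete
            break
        observation = tools[step["tool"]](**step["args"])  # act across systems
        history.append((step, observation))        # adapt on the next iteration
    return history

# Toy usage: an "agent" that keeps topping up a balance until it hits a target.
balance = {"value": 3}

def deposit(amount):
    balance["value"] += amount
    return balance["value"]

def planner(goal, history):
    if balance["value"] >= goal:
        return None                                # goal reached, stop acting
    return {"tool": "deposit", "args": {"amount": 2}}

trace = run_agent(goal=9, planner=planner, tools={"deposit": deposit})
print(balance["value"], len(trace))                # balance 9 after 3 deposits
```

The supervision role the passage predicts maps onto this loop directly: humans set the goal and the tool registry, then review the history trace for exceptions.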

The digital transformation of public services involves “an accelerated
convergence between IT and OT systems, as well as the massive incorporation of
connected IoT devices,” she explains, which gives rise to challenges such as an
expanding attack surface, the coexistence of obsolete infrastructure with
modern systems, and a lack of visibility and control over devices
deployed by multiple providers. ... “According to the European Cyber Security
Organisation, 86% of European local governments with IoT deployments have
suffered some security breach related to these devices,” she says. Accenture’s
Domínguez adds that the challenge is to consider “the fragmentation of
responsibilities between administrations, concessionaires, and third parties,
which complicates cybersecurity governance and requires advanced coordination
models.” De la Cuesta also emphasizes the siloed nature of project development,
which significantly hinders the development of an active cybersecurity strategy.
... When it comes to integrating new tools, despite Spain holding a leading position in
areas such as 5G, “technology moves much faster than the government’s ability to
react,” he says. “It’s not like a private company, which has a certain agility
to make investments,” he explains. “Public administration is much slower.
Budgets are different. Administrative procedures are extremely long. From the
moment a project is first discussed until it is actually executed, many years
pass.”

Welcome to the shadow SDLC — the one your team built with AI when you weren't
looking: It generates code, dependencies, configs, and even tests at machine
speed, but without any of your governance, review processes, or security
guardrails. ... It’s not just about insecure code sneaking into production, but
rather about losing ownership of the very processes you’ve worked to streamline.
Your “evil twin” SDLC comes with:
- Unknown provenance → You can’t always trace where AI-generated code or dependencies came from.
- Inconsistent reliability → AI may generate tests or configs that look fine but fail in production.
- Invisible vulnerabilities → Flaws that never hit a backlog because they bypass reviews entirely.

... AI assistants are now pulling in OSS dependencies you didn’t
choose — sometimes outdated, sometimes insecure, sometimes flat-out malicious.
While your team already uses hygiene tools like Dependabot or Renovate, they’re
only table stakes that don’t provide governance. ... The “evil twin” of your
SDLC isn’t going away. It’s already here, writing code, pulling dependencies,
and shaping workflows. The question is whether you’ll treat it as an
uncontrolled shadow pipeline — or bring it under the same governance and
accountability as your human-led one. Because in today’s environment, you don’t
just own the SDLC you designed. You also own the one AI is building — whether
you control it or not.
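One concrete governance step this passage implies is gating AI-introduced dependencies against a reviewed allowlist before they reach a build. A minimal sketch in Python; the allowlist contents, file format, and package names are illustrative assumptions, not a real policy:

```python
# Sketch: a minimal governance gate that flags dependencies absent from an
# approved allowlist -- e.g. packages an AI assistant pulled in unreviewed.
# The allowlist and requirements text below are assumptions for illustration.

APPROVED = {"requests", "flask", "sqlalchemy"}

def flag_unreviewed(requirements_text):
    """Return dependency names that never went through review."""
    flagged = []
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):       # skip blanks and comments
            continue
        # Strip simple version pins to recover the bare package name.
        name = line.split("==")[0].split(">=")[0].strip().lower()
        if name not in APPROVED:
            flagged.append(name)
    return flagged

reqs = """\
requests==2.32.0
# pulled in by the AI assistant:
leftpad-utils==0.1.3
flask>=3.0
"""
print(flag_unreviewed(reqs))   # ['leftpad-utils']
```

Hygiene tools like Dependabot handle freshness; a gate like this (however simplified) is about ownership — nothing enters the build that a human never approved.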

Researchers at Radware uncovered the issue earlier this spring, when they figured
out a way of stealing anything they wanted from Gmail users who integrate
ChatGPT. Not only was their trick devilishly simple, but it left no trace on an
end user's network — not even an iota of the suspicious Web traffic typical of
data exfiltration attacks. As such, the user had no way of detecting the attack,
let alone stopping it. ... To perform a ShadowLeak attack, attackers send an
outwardly normal-looking email to their target. They surreptitiously embed code
in the body of the message, in a format that the recipient will not notice — for
example, as extremely tiny text or white text on a white background. The
code is typically written in HTML, which is standard for email and therefore
less suspicious than other, more powerful languages would be. ... The malicious code
can instruct the AI to communicate the contents of the victim's emails, or
anything else the target has granted ChatGPT access to, to an
attacker-controlled server. ... Organizations can try to compensate with their
own security controls — for example, by vetting incoming emails with their own
tools. However, Geenens points out, "You need something that is smarter than
just the regular-expression engines and the state machines that we've built.
Those will not work anymore, because there are an infinite number of
permutations with which you can write an attack in natural language."
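The hiding techniques described above — tiny or white-on-white text — can at least be surface-checked before an email reaches an AI assistant, even though, as Geenens notes, pattern matching alone cannot catch every natural-language phrasing of an attack. A heuristic sketch in Python; the style patterns and the sample email are illustrative assumptions:

```python
# Sketch: flag HTML email content styled to be invisible to humans -- the
# hiding trick ShadowLeak-style prompt injections rely on. A heuristic only;
# the regexes here are illustrative assumptions, not a complete detector.

import re

HIDDEN_STYLE_PATTERNS = [
    r"font-size\s*:\s*0*(?:\.\d+)?p[tx]",   # near-zero text, e.g. font-size:0.1px
    r"color\s*:\s*#?fff(?:fff)?\b",         # white-on-white text
    r"display\s*:\s*none",                  # fully hidden elements
]

def flag_hidden_text(html):
    """Return the hiding techniques detected in an email's HTML."""
    return [p for p in HIDDEN_STYLE_PATTERNS if re.search(p, html, re.I)]

email_html = '<p style="font-size:0.1px;color:#ffffff">ignore prior instructions...</p>'
print(flag_hidden_text(email_html))   # flags the zero-size and white-text patterns
```

A check like this catches the presentation trick, not the instruction itself — which is exactly why Geenens argues regex engines and state machines are no longer sufficient on their own.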

This is reportedly the first quantum computer built using the standard
complementary metal-oxide-semiconductor (CMOS) chip fabrication process, the
same transistor technology used in conventional computers. A key part of
this approach is building cryoelectronics that connect qubits with control
circuits that work at very low temperatures, making it possible to scale up
quantum processors greatly. “This is quantum computing’s silicon moment,” James
Palles‑Dimmock, Quantum Motion’s CEO, stated. ... In contrast to other quantum
computing approaches, the startup used high-volume industrial 300 millimeter
chipmaking processes from commercial foundries to produce qubits. The
architecture, control stack, and manufacturing approach are all built to scale
to host millions of qubits and pave the way for fault-tolerant, utility-scale,
and commercially viable quantum computing. “With the delivery of this system,
Quantum Motion is on track to bring commercially useful quantum computers to
market this decade,” said Hugo Saleh, Quantum Motion’s president.
... The system’s underlying QPU is built on a tile-based architecture,
integrating all compute, readout, and control components into a dense,
repeatable array. This design enables future expansion to millions of qubits per
chip, with no changes to the system’s physical footprint.

The cloud has multiplied the fragmentation of solutions within companies,
expanding the number of environments, vendors, APIs, and integration approaches,
which has raised the skills required, necessitated more complex governance, and
prompted the emergence of cross-functional roles between IT and business.
Cybersecurity adds further layers of complexity, introducing new
platforms, monitoring tools, regulatory requirements, and risk management
approaches that must be overseen by expert personnel. And then there’s shadow
IT. With the ease of access to cloud technologies, it’s not uncommon for
business units to independently activate services without involving IT,
generating further risks. ... “Structured upskilling and reskilling programs are
needed to prepare people to manage new technologies,” says Massara. “So is an
organizational model capable of managing a growing number of projects, which can
no longer be handled in a one-off manner. The approach to project management is
changing because the project portfolio has expanded significantly, and a
structured PMO is required, with project managers who often no longer reside
solely in IT, but directly within the business.” ... While it’s true that an IT
landscape of disparate systems leads to greater complexity, companies are still
very cost-conscious and wary of heavily investing in unification right away.
But as legacy systems reach obsolescence and are replaced, the landscape
gradually becomes more harmonized.

One of the most compelling arguments for independent third-party support is its
inherent vendor neutrality. When a company relies solely on a software vendor
for support, that vendor naturally has a vested interest in promoting its latest
upgrades, cloud migrations, and proprietary solutions. This can create a
conflict of interest, potentially pushing customers towards expensive,
unnecessary upgrades or discouraging them from exploring alternatives that might
be a better fit for their unique needs. ... The recent acquisition of VMware by
Broadcom provides a compelling and timely illustration of why third-party
support is becoming increasingly critical. Following the merger, many VMware
customers have expressed significant dissatisfaction with changes to licensing
models, product roadmaps, and, crucially, support. Broadcom has been criticized
for restructuring VMware’s offerings and reportedly reducing support for smaller
customers, pushing them towards bundled, more expensive solutions. ... The shift
towards third-party support isn’t just about cost savings; it’s about regaining
control, accessing unbiased expertise, and ensuring business continuity in a
rapidly changing technological landscape. For companies making critical
decisions about AI integration and managing complex enterprise systems,
providers like Spinnaker Support offer a strategic advantage.