Quote for the day:
“If you don’t have a competitive advantage, don’t compete.” -- Jack Welch
Intuit learned to build AI agents for finance the hard way: Trust lost in buckets, earned back in spoonfuls
Intuit's technical strategy centers on a fundamental design decision. For
financial queries and business intelligence, the system queries actual data,
rather than generating responses through large language models (LLMs). Also
critically important: That data isn't all in one place. Intuit's technical
implementation allows QuickBooks to ingest data from multiple distinct sources:
native Intuit data, OAuth-connected third-party systems like Square for payments,
and user-uploaded files such as spreadsheets containing vendor pricing lists or
marketing campaign data. This creates a unified data layer that AI agents can
query reliably. ... Beyond the technical architecture, Intuit has made
explainability a core user experience across its AI agents. This goes beyond
simply providing correct answers: It means showing users the reasoning behind
automated decisions. When Intuit's accounting agent categorizes a transaction,
it doesn't just display the result; it shows the reasoning. This isn't marketing
copy about explainable AI; it's actual UI displaying data points and
logic. ... In domains where accuracy is critical, consider whether you need
content generation or data query translation. Intuit's decision to treat AI as
an orchestration and natural language interface layer dramatically reduces
hallucination risk and avoids using AI as a generative system.
Step aside, SOC. It’s time to ROC
The typical SOC playbook is designed to contain or remediate issues after the
fact by applying a patch or restoring a backup, but it doesn’t anticipate or
prevent the next hit. That structure leaves executives without the proper
context or language they need to make financially sound decisions about their
risk exposure. ... At its core, the Resilience Risk Operations Center (ROC) is a
proactive intelligence hub. Think of it as a fusion center in which cyber,
business and financial risk come together to form one clear picture. While the
idea of a ROC isn’t entirely new — versions of it have existed across government
and private sectors — the latest iterations emphasize collaboration between
technical and financial teams to anticipate, rather than react to, threats. ...
Of course, building the ROC wasn’t all smooth sailing. Just like military
adversaries, cyber criminals are constantly evolving and improving. Scarier yet,
just a single keystroke by a criminal actor can set off a chain reaction of
significant disruptions. That makes trying to anticipate their next move feel
like playing chess against an opponent who is changing the rules mid-game. There
was also the challenge of breaking down the existing silos between cyber, risk
and financial teams. ... The ROC concept represents the first real step in that
journey towards cyber resilience. It’s not a single product or platform, but a
strategic shift toward integrated, financially informed cyber
defense.
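The "one clear picture" a ROC aims for is, at its simplest, cyber likelihood joined to financial impact. The toy sketch below illustrates that fusion with an expected-annual-loss ranking; the scenario names and dollar figures are invented for illustration and are not Resilience's actual model:

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    annual_likelihood: float    # estimated probability of occurring in a year
    downtime_hours: float       # expected outage duration if it occurs
    hourly_revenue_loss: float  # business impact per hour of downtime
    recovery_cost: float        # incident response, restoration, legal

    def expected_annual_loss(self) -> float:
        impact = self.downtime_hours * self.hourly_revenue_loss + self.recovery_cost
        return self.annual_likelihood * impact

scenarios = [
    Scenario("ransomware on ERP", 0.10, 72, 25_000, 400_000),
    Scenario("payment-page skimmer", 0.25, 8, 40_000, 150_000),
]

# Rank scenarios by expected annual loss, so cyber and financial teams
# argue about one prioritized list instead of two separate dashboards.
for s in sorted(scenarios, key=Scenario.expected_annual_loss, reverse=True):
    print(f"{s.name}: ${s.expected_annual_loss():,.0f}/year")
```

However crude, a shared number like this gives executives the financial language the excerpt says the classic SOC playbook leaves out.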
Data Migration in Software Modernization: Balancing Automation and Developers’ Expertise
The Quiet Rise of AI’s Real Enablers
“Models need so much more data and in multiple formats,” shared George
Westerman, Senior Lecturer and Principal Research Scientist, MIT Sloan School of
Management. “Where it used to be making sense of structured data, which was
relatively straightforward, now it’s: ‘What do we do with all this unstructured
data? How do we tag it? How do we organize it? How do we store it?’ That’s a
bigger challenge.” ... As engineers get pulled deeper into AI work, their
visibility is rising. So is their influence on critical decisions. The report
reveals that data engineers are now helping shape tooling choices,
infrastructure plans, and even high-level business strategy. Two-thirds of the
leaders say their engineers are involved in selecting vendors and tools. More
than half say they help evaluate AI use cases and guide how different business
units apply AI models. That represents a shift from execution to influence.
These engineers are no longer just implementing someone else’s ideas. They are
helping define the roadmap. It also signals something bigger. AI success is not
just about algorithms. It is about coordination. ... So the role and visibility
of data engineers are clearly changing. But are we seeing real gains in
productivity? The report suggests yes. More than 70 percent of tech leaders said
AI tools are already making their teams more productive. The workload might be
heavier, but it’s also more focused. Engineers are spending less time fixing
brittle pipelines and more time shaping long-term infrastructure.
The silent killer of CPG digital transformation: Data & knowledge decay
Data without standards is chaos. R&D might record sugar levels as “Brix,” QA as
“Bx,” and marketing might reduce them to a “sweetness score.” When departments speak
different data languages, integration becomes impossible. ... When each function
hoards its own version of the truth, leadership decisions are built on
fragments. At one CPG I observed, R&D reported a product as cost-neutral to
reformulate, while supply chain flagged a 12% increase. Both were “right” based
on their datasets — but the company had no harmonized golden record. ... Senior
formulators and engineers often retire or are poached, taking decades of
know-how with them. APQC warns that unmanaged knowledge loss directly threatens
innovation capacity and recommends systematic capture methods. I’ve seen this
play out: a CPG lost its lead emulsification expert to a competitor. Within six
months, their innovation pipeline slowed dramatically, while their competitor
accelerated. The knowledge wasn’t just valuable — it was strategic. ...
Intuition still drives most big CPG decisions. While human judgment is critical,
relying on gut feel alone is dangerous in the age of AI-powered formulation and
predictive analytics. ... Define enterprise-wide data standards: Create master
schemas for formulations, processes and claims. Mandate structured inputs.
Henkel’s success demonstrates that without shared standards, even the best tools
underperform.
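A master schema of the kind prescribed above can start as something very small: a canonical attribute registry that maps every department's label ("Brix," "Bx," "sweetness score") to one enterprise-wide name and unit. The attribute names and units below are hypothetical examples, not Henkel's actual schema:

```python
# Canonical attribute registry: each department-specific label resolves to
# one enterprise-wide attribute name and unit, so three labels stop being
# three incompatible versions of the truth. Aliases are stored lowercase.
CANONICAL = {
    "sugar_content": {"unit": "degrees Brix", "aliases": {"brix", "bx", "sweetness score"}},
    "viscosity":     {"unit": "mPa*s",        "aliases": {"visc", "cp"}},  # 1 cP == 1 mPa*s
}

def canonicalize(label: str) -> str:
    """Resolve a department-specific label to its canonical attribute name."""
    key = label.strip().lower()
    for name, spec in CANONICAL.items():
        if key == name or key in spec["aliases"]:
            return name
    raise ValueError(f"Unmapped attribute {label!r}: extend the master schema")

print(canonicalize("Brix"))             # -> sugar_content
print(canonicalize("sweetness score"))  # -> sugar_content
```

Rejecting unmapped labels, rather than guessing, is the point: it forces the "mandate structured inputs" discipline at ingestion time instead of during a reconciliation crisis.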
From Chef to CISO: An Empathy-First Approach to Cybersecurity Leadership
Rather than focusing solely on technical credentials or a formal cybersecurity education, Lyons prioritizes curiosity and hunger for learning as the most critical qualities in potential hires. His approach emphasizes empathy as a cornerstone of security culture, encouraging his team to view security incidents not as failures to be punished, but as opportunities to coach and educate colleagues. ... We're very technically savvy, and still you have a weak moment or you get distracted because you're a busy person. Just coming at it and approaching it with a very thoughtful, culture-oriented response is very important for me. It's probably the top characteristic of my team. I'm super fortunate that I have people from end to end in age and from end to end in background who are all part of the team. But one of the core principles they all follow is empathy and trying to grow culture, because culture scales. ... Anyone who's looking at adopting new technologies in the cybersecurity world should first understand that the attackers have access to just about everything that you have. So they're going to come fast and they're going to come hard at you, and they can make a lot more mistakes than you can. So you have to focus and ensure that you're getting right every day what they can have the opportunity to get wrong.
It takes an AWS outage to prioritize diversification
AWS’s latest outage, caused by a data center malfunction in Northern Virginia,
didn’t just disrupt its direct customers; it served as a stark reminder of how
deeply our digital world relies on a select few cloud giants. A single system
hiccup in one region reverberated worldwide, stopping critical services for
millions of users. ... The AWS outage is part of a broader pattern of
instability common to centralized systems. ... The AWS outage has reignited a
longstanding argument for organizational diversification in the cloud sector.
Diversification enhances resilience. It decentralizes an enterprise’s exposure
to risks, ensuring that a single provider’s outage doesn’t completely paralyze
operations. However, taking this step will require initiative—and courage—from
IT leaders who’ve grown comfortable with the reliability and scale offered by
dominant providers. This effort toward diversification isn’t just about using a
multicloud strategy (although a combined approach with multiple hyperscalers is
an important aspect). Companies should also consider alternative platforms and
solutions that add unique value to their IT portfolios. Sovereign clouds,
specialized services from companies like NeoCloud, managed service providers,
and colocation (colo) facilities offer viable options. Here’s why they’re worth
exploring. ... The biggest challenge might be psychological rather than
technical. Many companies have internalized the idea that the hyperscalers are
the only real options for cloud infrastructure.
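A first, minimal step toward the diversification argued for above is to put a provider-agnostic interface in front of any service you consume, so that a secondary provider can absorb writes when the primary region fails. The sketch below shows the failover pattern only; the in-memory classes are stand-ins for real provider SDKs, and none of the names come from the article:

```python
from typing import Protocol

class BlobStore(Protocol):
    """Minimal provider-agnostic interface for object storage."""
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...

class InMemoryStore:
    """Stand-in for a real provider SDK (a hyperscaler, a sovereign cloud, a colo...)."""
    def __init__(self, fail: bool = False):
        self.fail = fail
        self.blobs: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        if self.fail:
            raise ConnectionError("provider unavailable")
        self.blobs[key] = data
    def get(self, key: str) -> bytes:
        if self.fail:
            raise ConnectionError("provider unavailable")
        return self.blobs[key]

class FailoverStore:
    """Try the primary provider first; fall back to the secondary on an outage."""
    def __init__(self, primary: BlobStore, secondary: BlobStore):
        self.providers = [primary, secondary]
    def put(self, key: str, data: bytes) -> None:
        for provider in self.providers:
            try:
                return provider.put(key, data)
            except ConnectionError:
                continue  # provider is down; try the next one
        raise ConnectionError("all providers unavailable")

# Primary is down (simulating a regional outage); the write lands on the secondary.
store = FailoverStore(InMemoryStore(fail=True), InMemoryStore())
store.put("invoice-42", b"...")
```

The interface, not the failover loop, is the hard-won part: once application code depends only on `BlobStore`, swapping in a sovereign cloud or a colo-hosted service becomes a configuration change rather than a rewrite.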
What brain privacy will look like in the age of neurotech
What Meta has just introduced, and what Apple has now made native as part of its accessibility protocols, is the ability to pick up your intentions through neural signals and sensors that AI decodes, letting you navigate all of that technology. So I think the first generation of most of these devices will be optional. That is, you can get the smartwatch without the neural band, you can get the AirPods without the EEG [electroencephalogram] sensors in them. But just as you can't get an Apple Watch now without a heart rate sensor, by the second and third generation of these devices I think your only option will be to get the devices that have the neural sensors in them. ... There are a couple of ways to think about hacking. One is getting access to what you're thinking, and another is changing what you're thinking. One of the now-classic examples in the field is how researchers were able to, while somebody was using a neural headset to play a video game, embed prompts that the conscious mind wouldn't see in order to figure out the person's bank PIN code and mailing address. In much the same way that a person's mind could be probed for how they respond to Communist messaging, it could be probed for recognition of a four-digit code or some combination of numbers and letters, to try to get at a person's password without them even realizing that's what's happening.
Beyond Alerts and Algorithms: Redefining Cyber Resilience in the Age of AI-Driven Threats
In an average enterprise Security Operations Center (SOC), analysts face tens of
thousands of alerts daily. Even the most advanced SIEM or EDR platforms struggle
with false positives, forcing teams to spend the bulk of their time sifting
through noise instead of investigating real threats. The result is a silent
crisis: SOC fatigue. Skilled analysts burn out, genuine threats slip through,
and the mean time to respond (MTTR) increases dangerously. But the real issue
isn’t just too many alerts — it’s the lack of context. Most tools operate in
isolation. An endpoint alert means little without correlation to user behavior,
network traffic, or threat intelligence. Without this contextual layer,
detection lacks depth and intent remains invisible. ... Resilience, however,
isn’t achieved once — it’s engineered continuously. Techniques like Continuous
Automated Red Teaming (CART) and Breach & Attack Simulation (BAS) allow
enterprises to test, validate, and evolve their defenses in real time. AI won’t
replace human judgment — it enhances it. The SOC of the future will be
machine-accelerated yet human-guided, capable of adapting dynamically to
evolving threats. ... Today’s CISOs are more than security leaders — they’re
business enablers. They sit at the intersection of risk, technology, and trust.
Boards now expect them not just to protect data, but to safeguard reputation and
ensure continuity.
Quantum Circuits brings dual-rail qubits to Nvidia’s CUDA-Q development platform
Quantum Circuits’ dual-rail chip combines two different quantum computing
approaches — superconducting resonators and transmon qubits. The
qubit itself is a photon, and there’s a superconducting circuit that controls
the photon. “It matches the reliability benchmarks of ions and neutral atoms
with the speed of the superconducting platform,” says Petrenko. There’s another
bit of quantum magic built into the platform, he says — error awareness. “No
other quantum computer tells you in real time if it encounters an error, but
ours does,” he says. That means that there’s potential to correct errors before
scaling up, rather than scaling up first and then trying to do error correction
later. In the near term, the high reliability and built-in error awareness
make it an extremely powerful tool for developing new algorithms, says
Petrenko. “You can start kind of opening up a new door and tackling new
problems. We’ve leveraged that already for showing new things for machine
learning.” It’s a different approach from the ones other quantum computer makers
are taking, confirms TechInsights’ Sanders. According to Sanders, this dual-rail
method combines the best of both types of qubits, lengthening coherence time,
plus integrating error correction. Right now, Seeker is only available via
Quantum Circuits’ own cloud platform and only has eight qubits.