Daily Tech Digest - October 17, 2025


Quote for the day:

"Listen with curiosity, speak with honesty act with integrity." -- Roy T Bennett



AI Agents Transform Enterprise Application Development

There's now discussion about the agent development life cycle and the need to supervise or manage AI agent developers - calling for agent governance and infrastructure changes. New products, services and partnerships announced in the past few weeks support this trend. ... Enterprises have been cautious about entrusting public models and agents with intellectual property, but the partnership with Anthropic could make models more trustworthy. "Enterprises are looking for AI they can actually trust with their code, their data and their day-to-day operations," said Mike Krieger, chief product officer at Anthropic. ... Embedding agentic AI within the fabric of enterprise architecture enables organizations to unlock transformative agility, reduce cognitive load and accelerate innovation - without compromising trust, compliance or control - says an IBM report titled "Architecting secure enterprise AI agents with MCP." Developers adopted globally recognized models such as Capability Maturity Model Integration, or CMMI, and CMMI-DEV as paths to improving software development and maintenance processes. ... Enterprises must be prepared to implement radical process and infrastructure changes to successfully adopt AI agents in software delivery. AI agents must be managed by a central governance framework that enables complete visibility into agents, agent performance monitoring and security.


There’s no such thing as quantum incident response – and that changes everything

CISOs are pushing to have quantum security risks added to the corporate risk register. It belongs there. But the problem to be solved is not a quick fix, despite what some snake oil salesmen might be pushing. There is no simple configuration checkbox on AWS, Azure or GCP where you "turn on" post-quantum cryptography (PQC) and then you're good to go. ... Without significant engagement from developers, QA teams and product owners, the quantum decryption risk will remain in play. You cannot transfer this risk by adding more cyber insurance coverage. The cyber insurance industry itself is facing existential doubt about whether cybersecurity can reasonably be insured against, given the systemic impacts of supply chain attacks that cascade across entire industries. ... The moment when a cryptographically relevant quantum computer comes into existence won't arrive with fanfare or bombast. Hence the idea of the silent boom. But by then, it will be too late for incident response. What you should do Monday morning: start that data classification exercise. Figure out what needs protecting for the long term versus what has a shorter shelf life. In the world of DNS, Time To Live (TTL) declares how long a resolver can cache a response. Think of a "PQC TTL" for your sensitive data, because not everything needs 30-year protection.
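
The "PQC TTL" idea can be made concrete with a simple classification pass. Below is a minimal sketch, assuming a hypothetical asset inventory and a planning horizon you pick yourself: anything whose required confidentiality lifetime outlasts the assumed arrival of a cryptographically relevant quantum computer is a "harvest now, decrypt later" target and goes to the top of the migration list.

```python
from dataclasses import dataclass

# Assumed planning horizon (years until a cryptographically relevant
# quantum computer might exist). This number is an assumption, not a fact.
CRQC_HORIZON_YEARS = 10

@dataclass
class DataAsset:
    name: str
    protection_years: int   # the "PQC TTL": how long confidentiality must hold
    pqc_protected: bool     # already re-encrypted with a PQC scheme?

def needs_pqc_migration(asset: DataAsset) -> bool:
    """Flag assets whose confidentiality lifetime outlasts the assumed
    CRQC horizon - data an adversary could record today and decrypt later."""
    return asset.protection_years > CRQC_HORIZON_YEARS and not asset.pqc_protected

# Hypothetical inventory entries for illustration only.
inventory = [
    DataAsset("marketing_clickstream", protection_years=1, pqc_protected=False),
    DataAsset("patient_records", protection_years=30, pqc_protected=False),
    DataAsset("board_minutes", protection_years=15, pqc_protected=True),
]

for asset in inventory:
    if needs_pqc_migration(asset):
        print(f"Prioritize PQC migration: {asset.name}")
```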


Hackers Use Blockchain to Hide Malware in Plain Sight

At least two hacking groups are using public blockchains to conceal and control malware in ways that make their operations nearly impossible to dismantle, research from Google's Threat Intelligence Group shows. ... The technique, known as EtherHiding, embeds malicious instructions in blockchain smart contracts rather than on traditional servers. Since the blockchain is decentralized and immutable, attackers gain what the researchers call a "bulletproof" infrastructure. The development signals an "escalation in the threat landscape," said Robert Wallace, consulting leader at Mandiant, which is part of Google Cloud. Hackers have found a method that is "resistant to law enforcement takedowns" and can be "easily modified for new campaigns." ... Over time, the group expanded its architecture from a single smart contract to a three-tier system mimicking a software "proxy pattern." This allows rapid updates without touching the compromised sites. One contract acts as a router, another fingerprints the victim's system and a third holds encrypted payload data and decryption keys. A single blockchain transaction, costing as little as a dollar in network fees, can change lure URLs or encryption keys across thousands of infected sites. The researchers said the threat actor used social engineering tricks like fake Cloudflare verification or Chrome update prompts to persuade victims to run malicious commands.
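
What makes one cheap transaction able to retarget thousands of infected sites is that the per-site loader only ever performs read-only contract calls, which cost nothing, touch no attacker-owned server, and leave no on-chain record. Here is a minimal sketch of the defender's view of that retrieval step, assuming web3.py (v6+) and entirely hypothetical endpoint, address and method names - this is how a researcher would inspect what bytes a suspect contract currently serves.

```python
from web3 import Web3  # pip install web3

# Hypothetical values - substitute a real RPC endpoint and the contract
# address under investigation.
RPC_URL = "https://example-rpc.invalid"
CONTRACT_ADDRESS = "0x0000000000000000000000000000000000000000"

w3 = Web3(Web3.HTTPProvider(RPC_URL))

# Function selector: first 4 bytes of keccak256 of the method signature.
# "getPayload()" is illustrative, not taken from any real campaign.
call_data = Web3.to_hex(Web3.keccak(text="getPayload()")[:4])

# eth_call is read-only: no transaction is mined, no gas is spent, and
# nothing is recorded on-chain - exactly why this retrieval leaves no trace.
raw = w3.eth.call({"to": CONTRACT_ADDRESS, "data": call_data})
print(f"Contract currently returns {len(raw)} bytes")
```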


Everyone’s adopting AI, few are managing the risk

Across industries, many organizations are caught in what AuditBoard calls the “middle maturity trap.” Teams are active, frameworks are updated, and risks are logged, but progress fades after early success. When boards include risk oversight as a standing agenda item and align on shared performance goals, activity becomes consistent and forward-looking. When governance and ownership are unclear, adoption slows and collaboration fades. ... Many enterprises are adopting or updating risk frameworks, but implementation depth varies. The typical organization maps its controls to several frameworks, while leading firms embed thousands of requirements into daily operations. The report warns that “surface compliance” is common. Breadth without depth leaves gaps that only appear during audits or disruptions. Mature programs treat frameworks as living systems that evolve with business and regulatory change. ... The findings show that many organizations are investing heavily in risk management and AI, but maturity depends less on technology and more on integration. Advanced organizations use governance to connect teams and turn data into foresight. AuditBoard’s research suggests that as AI becomes more embedded in enterprise systems, risk leaders will need to move beyond activity and focus on consistency. Those that do will be better positioned to anticipate change and turn risk management into a strategic advantage.


A mini-CrowdStrike moment? Windows 11 update cripples dev environments

The October 2025 cumulative update (KB5066835) addressed security issues in Windows operating systems (OSes), but also appears to have blocked Windows' ability to communicate with itself. Localhost allows apps and services to communicate internally without using internet or external network access. Developers use the function to develop, test, and debug websites and apps locally on a Windows machine before releasing them to the public. ... When localhost stops working, entire application development environments can be impacted or "even grind to a halt," causing internal processes and services to fail and stop communicating, he pointed out. This means developers are unable to test or run web applications locally. The issue is really about "denial of service," where tools and processes dependent on internal loopback services break, he noted. Developers can't debug locally, and automated testing processes can fail. At the same time, IT departments are left to troubleshoot, field an influx of service tickets, roll back patches, and look for workarounds. "This bug is definitely disruptive enough to cause delays, lost productivity, and frustration across teams," said Avakian. ... This type of issue underscores the importance of quality control and thorough testing by third-party suppliers and vendors before releasing updates to commercial markets, he said. Not doing so can have significant downstream impacts and "erode trust" in the update process while making teams more cautious about patching.
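
A quick way to tell whether a machine is affected is to exercise the loopback path directly, independent of any browser or web framework. A minimal sketch using only the Python standard library: bind a listener on 127.0.0.1, connect to it, and report whether one byte makes the round trip.

```python
import socket

def loopback_ok(timeout: float = 2.0) -> bool:
    """Bind an ephemeral port on 127.0.0.1, connect to it, and push a few
    bytes through - an end-to-end check of the local loopback path."""
    try:
        with socket.create_server(("127.0.0.1", 0)) as server:
            server.settimeout(timeout)
            port = server.getsockname()[1]
            with socket.create_connection(("127.0.0.1", port), timeout=timeout) as client:
                conn, _ = server.accept()
                with conn:
                    client.sendall(b"ping")
                    return conn.recv(4) == b"ping"
    except OSError:
        return False

print("localhost OK" if loopback_ok() else "localhost broken - suspect the patch")
```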


How Banks of Every Size Can Put AI to Work, and Take Back Control

For smaller banks and credit unions, the AI conversation begins with math. They want the same digital responsiveness as larger competitors but can't afford the infrastructure or staffing that traditionally make that possible. The promise of AI, especially low-code and automated implementation, changes that equation. What once required teams of engineers and months of coding can now be deployed out of the box, configured and pushed live in a day. That shift finally brings digital innovation within reach for smaller institutions that had long been priced out of it. But even when self-service tools are available, many institutions still rely on outside help for routine changes or maintenance. For these players, the first question is whether they're willing or able to take product development work in-house, even with "AI inside"; the next question is whether they can find partners that can meet them on their own terms. ... For mid-sized players, the AI opportunity centers on reclaiming control. These institutions typically have strong internal teams and clear strategic ideas, yet they remain bound by vendor SLAs that slow innovation. The gap between what they can envision and what they can deliver is wide. AI-driven orchestration tools, especially those that let internal teams configure and launch digital products directly, can help close that gap. By removing layers of technical dependency, mid-sized institutions can move from periodic rollouts to something closer to iterative improvement.


Why your AI is failing — and how a smarter data architecture can fix it

Traditional enterprises operate four separate, incompatible technology stacks, each optimized for a different computing era, not for AI reasoning capabilities. ... When you try to deploy AI across these fragmented stacks, chaos follows. The same business data gets replicated across systems with different formats and validation rules. Semantic relationships between business entities get lost during integration. Context critical for intelligent decision-making gets stripped away to optimize for system performance. AI systems receive technically clean datasets that are semantically impoverished and stripped of context. ... As organizations begin shaping their enterprise general intelligence (EGI) architecture, critical operational intelligence remains trapped in disconnected silos. Engineering designs live in PLM systems, isolated from the ERP bill of materials. Quality metrics sit locked in MES platforms with no linkage to supplier performance data. Process parameters exist independently of equipment maintenance records. ... Enterprises solving the data architecture challenge gain sustainable competitive advantages. AI deployment timelines are measured in weeks rather than months. Decision accuracy reaches enterprise-grade reliability. Intelligence scales across all business domains. Innovation accelerates as AI creates new capabilities rather than just automating existing processes.
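
The silo problem described here is concrete: the same part exists in PLM, ERP, and MES under separate keys, so the relationships an AI system needs never survive integration. A minimal sketch, with invented record layouts, of the kind of entity linkage a semantic layer preserves - joining engineering, BOM, and quality views of one part under a shared identifier.

```python
# Invented, simplified records standing in for PLM, ERP, and MES extracts.
plm = {"P-100": {"design_rev": "C", "material": "Al-6061"}}
erp = {"P-100": {"bom_qty": 4, "supplier": "Acme"}}
mes = {"P-100": {"defect_rate": 0.012, "last_maintenance": "2025-09-30"}}

def unified_view(part_id: str) -> dict:
    """Join the three silo views under one entity key, preserving the
    cross-system relationships that naive pipeline integration drops."""
    return {
        "part_id": part_id,
        "engineering": plm.get(part_id, {}),
        "bom": erp.get(part_id, {}),
        "quality": mes.get(part_id, {}),
    }

# An AI system now sees design revision, supplier, and defect rate as
# attributes of the same entity rather than three disconnected rows.
print(unified_view("P-100"))
```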


Under the hood of AI agents: A technical guide to the next frontier of gen AI

With agents, authorization works in two directions. First, of course, users require authorization to run the agents they've created. But because the agent is acting on the user's behalf, it will usually require its own authorization to access networked resources. There are a few different ways to approach the problem of authorization. One is with an access delegation protocol like OAuth, which essentially plumbs the authorization process through the agentic system. ... Agents also need to remember their prior interactions with their clients. If last week I told the restaurant booking agent what type of food I like, I don't want to have to tell it again this week. The same goes for my price tolerance, the sort of ambiance I'm looking for, and so on. Long-term memory allows the agent to look up what it needs to know about prior conversations with the user. Agents don't typically create long-term memories themselves, however. Instead, after a session is complete, the whole conversation passes to a separate AI model, which creates new long-term memories or updates existing ones. ... Agents are a new kind of software system, and they require new ways to think about observing, monitoring and auditing their behavior. Some of the questions we ask will look familiar: whether the agents are running fast enough, how much they're costing, how many tool calls they're making and whether users are happy.
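
The memory flow described above - a completed session handed to a separate model that writes or updates long-term memories - is straightforward to sketch. A minimal illustration with a hypothetical summarize_to_memories() standing in for the separate model call and a plain dict standing in for the store; a real system would use an LLM API and a vector or key-value database.

```python
# Hypothetical long-term memory store: user_id -> list of memory strings.
memory_store: dict[str, list[str]] = {}

def summarize_to_memories(transcript: list[str]) -> list[str]:
    """Stand-in for the separate AI model that distills a finished
    conversation into durable facts about the user. Here: a trivial rule
    that keeps user turns expressing a preference."""
    return [turn.removeprefix("user: ") for turn in transcript
            if turn.startswith("user: ") and "i like" in turn.lower()]

def end_session(user_id: str, transcript: list[str]) -> None:
    """After the session completes, pass the whole conversation to the
    memory model and merge the result into existing long-term memories."""
    new = summarize_to_memories(transcript)
    existing = memory_store.setdefault(user_id, [])
    existing.extend(m for m in new if m not in existing)  # naive dedup/update

transcript = ["user: I like Szechuan food", "agent: Noted!",
              "user: book somewhere under $40 a head"]
end_session("u123", transcript)
print(memory_store["u123"])  # next week's session can recall this preference
```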


Data Is the New Advantage – If You Can Hold On To It

Proprietary data has emerged as one of the most valuable assets for enterprises – and increasingly, the expectation is that data must be stored indefinitely, ready to fuel future models, insights, and innovations as the technology continues to evolve. ... Globally, data architects, managers, and protectors are in uncharted territory. The arrival of generative AI has proven just how unpredictable and fast-moving technological leaps can be – and if there's one thing the past few years have taught us, it's that we can't know what comes next. The only way to prepare is to ensure proprietary data is not just stored but preserved indefinitely. Tomorrow's breakthroughs – whether in AI, analytics, or some other yet-unimagined technology – will depend on the depth and quality of the data you have today, and on how well your chosen storage technologies serve your data usage and workflow needs. ... The lesson is clear: don't get left behind, because your competitors are learning these lessons as well. The enterprises that thrive in this next era of digital innovation will be those that recognize the enduring value of their data. That means keeping it all and planning to keep it forever. By embracing hybrid storage strategies that combine the strengths of tape, cloud, and on-premises systems, organizations can rise to the challenge of exponential growth, protect themselves from evolving threats, and ensure they are ready for whatever comes next. In the age of AI, your competitive advantage won't just come from your technology stack.


Why women are leading the next chapter of data centers

Working her way up through finance and operations into large-scale digital infrastructure, Xiao has built a career that reflects a steady ascent across disciplines, including senior roles as president of Chindata Group and CFO at Shanghai Wangsu. These roles sharpened her ability to translate high-level strategy into expansion, particularly in the demanding data center sector. ... Today, she shapes BDC's commercial playbook, which includes setting capital priorities, driving cost-efficient delivery models, and embedding resilience and sustainability into every development decision. In mission-critical industries like data centers, repeatability is a challenge. Every market has unique variables – land, power, water, regulatory frameworks, contractor ecosystems, and community engagement. ... For the next wave of talent, building credibility in the data center industry requires more than technical expertise. Engaging in forums, networks, and industry resources not only earns recognition and respect but also broadens knowledge and sharpens perspective. ... Peer networks within hyperscaler and operator communities, Xiao notes, are invaluable for exchanging insights and challenging assumptions. "Industry conferences, cross-company working groups, government-industry task forces, and ecosystem media engagements all matter. And for bench strength, I value partnerships with local technology innovators and digital twin or AI firms that help us run safer, greener facilities," Xiao explains.
