Quote for the day:
"Life is not about finding yourself. Life is about creating yourself." -- Lolly Daskal
AI Is Making Cybercrime Quieter and Quicker
The rise of AI-enabled cybercrime is no longer theoretical. Nearly 72% of
organisations in India said they have encountered AI-powered cyber threats in
the past year. These threats are scaling fast, with a 2X increase reported by
70% of organisations and a 3X increase by 12%. This new class of AI-powered
threats is harder to detect and often exploits weaknesses in human behaviour,
misconfigurations, and identity systems. In India, the top AI-driven threats
reported include AI-assisted credential stuffing and brute-force attacks,
deepfake impersonation in business email compromise (BEC), AI-powered
polymorphic malware, automated reconnaissance of attack surfaces, and
AI-generated phishing emails. ... The most disruptive threats are no longer the
most obvious. Topping the list are unpatched and zero-day exploits, followed
closely by insider threats, cloud misconfigurations, software supply chain
attacks, and human error. These threats are particularly damaging because they
often go undetected by traditional defences, exploiting internal weaknesses and
visibility gaps. As a result, these quieter, more complex risks are now viewed
as more dangerous than well-known threats like ransomware or phishing.
Traditional threats such as phishing and malware are still growing at a rate of
~10%, but this is comparatively modest, likely due to mature defences like
endpoint protection and awareness training.
The Evolution and Future of the Relationship Between Business and IT
IT professionals increasingly serve as translators — converting executive goals
into technical requirements, and turning technical realities into actionable
business decisions. This fusion of roles has also led to the rise of
cross-functional “fusion teams,” where IT and business units co-own projects
from ideation through execution. ... Artificial Intelligence is already
influencing how decisions are made and systems are managed. From intelligent
automation to predictive analytics, AI is redefining productivity. According to
a PwC report, AI is expected to contribute over $15 trillion to the global
economy by 2030 — and IT organizations will play a pivotal role in enabling this
transformation. At the same time, the lines between IT and the business will
continue to blur. Platforms like low-code development tools, AI copilots, and
intelligent data fabrics will empower business users to create solutions without
traditional IT support — requiring IT teams to pivot further into governance,
enablement, and strategy. Security, compliance, and data privacy will become
even more important as businesses operate across fragmented and federated
environments. ... The business-IT relationship has evolved from one rooted in
infrastructure ownership to one centered on service integration, strategic
alignment, and value delivery. IT is no longer just the department that runs
servers or writes code — it’s the nervous system that connects capabilities,
ensures reliability, and enables growth.
Can regulators trust black-box algorithms to enforce financial fairness?
Regulators, in their attempt to maintain oversight and comparability, often opt
for rules-based regulation, said DiRollo. These are prescriptive, detailed
requirements intended to eliminate ambiguity. However, this approach
unintentionally creates a disproportionate burden on smaller institutions, he
continued. DiRollo said, “Each bank must effectively build its own data
architecture to interpret and implement regulatory requirements. For instance,
calculating Risk-Weighted Assets (RWAs) requires banks to collate data across a
myriad of systems, map this data into a bespoke regulatory model, apply overlays
and assumptions to reflect the intent of the rule and interpret evolving
guidance and submit reports accordingly.” ... The second concern is regulatory
arbitrage: larger institutions with more sophisticated modelling
capabilities can structure their portfolios or data in ways that reduce
regulatory burdens without a corresponding reduction in actual risk. “The
implication is stark: the fairness that regulators seek to enforce is undermined
by the very framework designed to ensure it,” said DiRollo. While institutions
pour effort into interpreting rules and submitting reports, the focus drifts
from identifying and managing real risks. In practice, compliance becomes a
proxy for safety – a dangerous assumption, in the words of DiRollo.
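To make the RWA example concrete: under a standardised approach, a bank's total
RWA is roughly the sum of each exposure multiplied by a regulatory risk weight,
with conservative overlays applied on top. The Python sketch below is a minimal
illustration of that aggregation only; the source systems, risk weights, and
overlay factor are assumptions for illustration, not any bank's or regulator's
actual model.

# Minimal, illustrative sketch of the RWA aggregation described above:
# collate exposures from several systems, map each to a regulatory risk
# weight, apply a conservative overlay, and sum. Figures are hypothetical.
from dataclasses import dataclass

@dataclass
class Exposure:
    source_system: str   # e.g. "core_banking", "treasury"
    asset_class: str     # maps to a regulatory risk weight
    amount: float        # exposure amount in currency units

# Hypothetical standardised risk weights per asset class.
RISK_WEIGHTS = {
    "sovereign": 0.0,
    "residential_mortgage": 0.35,
    "retail": 0.75,
    "corporate": 1.0,
}

def total_rwa(exposures, overlay_factor=1.05):
    """Sum exposure * risk weight, then apply an overlay reflecting rule intent."""
    base = sum(e.amount * RISK_WEIGHTS[e.asset_class] for e in exposures)
    return base * overlay_factor

exposures = [
    Exposure("core_banking", "residential_mortgage", 2_000_000),
    Exposure("treasury", "sovereign", 5_000_000),
    Exposure("lending_platform", "corporate", 1_000_000),
]
print(f"Total RWA: {total_rwa(exposures):,.0f}")  # (700,000 + 1,000,000) * 1.05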
The legal questions to ask when your systems go dark
Legal should assume the worst and lean into their natural legal pessimism.
There’s very little time to react, and it’s better to overreact than underreact
(or not react at all). The legal context around cyber incidents is broad, but
assume the worst-case scenario, such as a massive data breach. If that turns out
to be wrong, even better! ... Even if your organization has a detailed incident
response plan, chances are no one’s ever read it and there will be people
claiming “that’s not my job.” Don’t get caught up in that. Be the one who brings
together management, IT, PR, and legal at the same table, and coordinate efforts
from the legal perspective. ... If that means “my DPO will check the ROPA” –
congrats! But if your processes are still a work in progress, you’re likely
about to run a rapid, ad hoc data inventory: involving all departments,
identifying data types, locations, and access controls. Yes, it will all be
happening while systems are down and everyone’s panicking. But hey – serenity
now, emotional damage later. You literally went to law school for this. ... You,
as in-house or external legal support, really have to understand the
organization and how its tech workflows actually function. I dream of a world
where lawyers finally stop saying “we’ll just do the legal stuff,” because
“legal stuff” remains abstract and therefore ineffective if you don’t put it in
the context of a particular organization.
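The rapid, ad hoc data inventory mentioned above can be as simple as a shared,
structured record of what data lives where and who can access it. The sketch
below is only an illustration; the fields, departments, and systems named are
hypothetical, not a prescribed ROPA format.

# Hypothetical skeleton for a rapid, ad hoc data inventory gathered during an
# incident: each department reports data types, locations, and access controls.
from dataclasses import dataclass

@dataclass
class InventoryEntry:
    department: str
    data_types: list[str]   # e.g. ["employee PII", "payment data"]
    location: str           # system or storage location
    access_controls: str    # who can access it, and how

inventory = [
    InventoryEntry("HR", ["employee PII"], "hr_saas_platform",
                   "HR staff via SSO, MFA enforced"),
    InventoryEntry("Sales", ["customer PII", "deal records"], "crm_database",
                   "sales team; read-only for analysts"),
]

# Quick triage: entries that mention personal data get priority legal review.
for entry in inventory:
    if any("PII" in t or "personal" in t.lower() for t in entry.data_types):
        print(f"Priority review: {entry.department} -> {entry.location}")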
New Quantum Algorithm Factors Numbers With One Qubit
Ultimately, the new approach works because of how it encodes information.
Classical computers use bits, which can take one of two values. Qubits, the
quantum equivalent, can take on multiple values, because of the vagaries of
quantum mechanics. But even qubits, once measured, can take on only one of two
values, a 0 or a 1. But that’s not the only way to encode data in quantum
devices, say Robert König and Lukas Brenner of the Technical
University of Munich. Their work focuses on ways to encode information with
continuous variables, meaning they can take on any values in a given range,
instead of just certain ones. ... In the past, researchers have tried to improve
on Shor’s algorithm for factoring by simulating a qubit using a continuous
system, with its expanded set of possible values. But even if your system
computes with continuous qubits, it will still need a lot of them to factor
numbers, and it won’t necessarily go any faster. “We were wondering whether
there’s a better way of using continuous variable systems,” König said. They
decided to go back to basics. The secret to Shor’s algorithm is that it uses the
number it’s factoring to generate what researchers call a periodic function,
which has repeating values at regular intervals. Then it uses a mathematical
tool called a quantum Fourier transform to identify the value of that period —
how long it takes for the function to repeat.
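The periodic function in question is f(x) = a^x mod N for a base a coprime to
N; once its period r is known, N's factors typically follow from gcd(a^(r/2) ±
1, N). The sketch below finds the period by classical brute force on a tiny
example, purely to show the structure; the quantum Fourier transform is what
lets Shor's algorithm find r efficiently for large numbers.

# Classical illustration of the period-finding structure behind Shor's
# algorithm; a quantum computer would find the period r with a quantum
# Fourier transform instead of this brute-force search.
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a**r % N == 1, found by brute force."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_period(N, a=2):
    assert gcd(a, N) == 1, "pick a base coprime to N"
    r = find_period(a, N)            # period of f(x) = a^x mod N
    if r % 2 == 1:
        return None                  # odd period: retry with another base
    y = pow(a, r // 2, N)
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    return (p, q) if 1 < p < N else None

print(factor_via_period(15))         # period of 2^x mod 15 is 4 -> (3, 5)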
What Are Large Action Models?
LAMs are LLMs trained on specific actions and enhanced with real connectivity
to external data and systems. This makes the agents they power more robust
than basic LLMs, which are limited to reasoning, retrieval and text
generation. Whereas LLMs are more general-purpose, trained on a large data
corpus, LAMs are more task-oriented. “LAMs fine-tune an LLM to specifically be
good at recommending actions to complete a goal,” Jason Fournier, vice
president of AI initiatives at the education platform Imagine Learning, told
The New Stack. ... LAMs trained on internal actions could streamline
industry-specific workflows as well. Imagine Learning, for instance, has
developed a curriculum-informed AI framework to support teachers and students
with AI-powered lesson planning. Fournier sees promise in automating
administrative tasks like student registration, synthesizing data for
educators and enhancing the learning experience. Or, Willson said, consider
marketing: “You could tell an agentic AI platform with LAM technology, ‘Launch
our new product campaign for the ACME software across all our channels with
our standard messaging framework.'” Capabilities like this could save time,
ensure brand consistency, and free teams to focus on high-level strategy.
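As a rough illustration of “recommending actions to complete a goal,” the
sketch below pairs a small catalogue of allowed actions with a stub that stands
in for a fine-tuned model. The action names, parameters, and the
recommend_actions stub are hypothetical; they are not any vendor's actual API.

# Hypothetical sketch of a LAM-style agent: a catalogue of concrete actions the
# model may recommend, and a loop that executes the recommended plan in order.
from typing import Callable

ACTIONS: dict[str, Callable[[dict], str]] = {
    "draft_copy":     lambda p: f"drafted copy with {p['messaging_framework']}",
    "schedule_posts": lambda p: f"scheduled posts on {', '.join(p['channels'])}",
    "notify_team":    lambda p: f"notified {p['team']} for review",
}

def recommend_actions(goal: str) -> list[str]:
    """Placeholder for a fine-tuned LAM mapping a goal to an action plan."""
    return ["draft_copy", "schedule_posts", "notify_team"]

def run_agent(goal: str, params: dict) -> None:
    for name in recommend_actions(goal):
        print(f"{name}: {ACTIONS[name](params)}")  # connectivity to external systems

run_agent(
    "Launch the new product campaign across all channels",
    {"messaging_framework": "standard messaging framework",
     "channels": ["email", "social", "web"],
     "team": "marketing"},
)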
Five mistakes companies make when retiring IT equipment: And how to avoid them
Outdated or unused IT assets often sit idle in storage closets, server rooms,
or even employee homes for extended periods. This delay in decommissioning can
create a host of problems. Unsecured, unused devices are prime targets for
data breaches, theft, or accidental loss. Additionally, without a timely and
consistent retirement process, organizations lose visibility into asset
status, which can create confusion, non-compliance, or unnecessary costs. The
best way to address this is by implementing in-house destruction solutions as
an integrated part of the IT lifecycle. Rather than relying on external
vendors or waiting until large volumes of devices pile up, organizations can
equip themselves with high-security data destruction machinery – such as hard
drive shredders, degaussers, crushers, or disintegrators – designed to render
data irretrievable on demand. This allows for immediate, on-site sanitization
and physical destruction as soon as devices are decommissioned. Not only does
this improve data control and reduce risk exposure, but it also simplifies
chain-of-custody tracking by eliminating unnecessary handoffs. With in-house
destruction capabilities, organizations can securely retire equipment at the
pace their operations demand – no waiting, no outsourcing, and no
compromise.
Event Sourcing Unpacked: The What, Why, and How
Using Traffic Mirroring to Debug and Test Microservices in Production-Like Environments
Don’t be a victim of high cloud costs
The simplest reason for the rising expenses associated with cloud services is
that major cloud service providers consistently increase their prices. Although
competition among these providers helps keep prices stable to some extent,
businesses now face inflation, the introduction of new premium services, and the
complex nature of pricing models, which are often shrouded in mystery. All these
factors complicate cost management. Meanwhile, many businesses have inefficient
usage patterns. The typical approach to adoption involves migrating existing
systems to the cloud without modifying or improving their functions for cloud
environments. This “lift and shift” shortcut often leads to inefficient resource
allocation and unnecessary expenses. ... First, before embracing cloud
technology for its advantages, companies should develop a well-defined plan that
outlines the rationale, objectives, and approach to using cloud services.
Identify which tasks are suitable for cloud deployment and which are not, and
assess whether a public, private, or hybrid cloud setup aligns with your
business and budget objectives. Second, before transferring data, ensure that
you optimize your tasks to improve efficiency and performance. Resist the
urge to move existing systems to the cloud in their current state. ... Third,
effectively managing cloud expenses relies on implementing strong governance
practices.
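As one concrete form such governance can take, a recurring check can flag
resources that are missing cost-allocation tags or that exceed an agreed budget
threshold. The sketch below is illustrative only; the resource records, tag
names, and threshold are hypothetical and would normally come from a provider's
billing export rather than a hard-coded list.

# Hypothetical governance check: flag cloud resources missing cost-allocation
# tags or whose monthly cost exceeds an agreed budget threshold.
resources = [
    {"id": "vm-001", "monthly_cost": 420.0,
     "tags": {"cost_center": "CC-104", "environment": "prod"}},
    {"id": "vm-002", "monthly_cost": 1350.0, "tags": {}},
]

REQUIRED_TAGS = {"cost_center", "environment"}
BUDGET_PER_RESOURCE = 1000.0  # illustrative monthly threshold

for r in resources:
    missing = REQUIRED_TAGS - r["tags"].keys()
    if missing:
        print(f"{r['id']}: missing tags {sorted(missing)}")
    if r["monthly_cost"] > BUDGET_PER_RESOURCE:
        print(f"{r['id']}: cost {r['monthly_cost']:.2f} exceeds budget")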