Quote for the day:
"People will not change their minds but they will make new decisions based upon new information." -- Orrin Woodward
The smart way to tackle data storage challenges

Data intelligence makes data stored on the X10000 ready for AI applications to
use as soon as it is ingested. The company has a demo of this, where the
X10000 ingests customer support documents and enables users to instantly ask it
relevant natural language questions via a locally hosted version of the DeepSeek
LLM. This kind of application wouldn’t be possible with low-speed legacy object
storage, says the company. The X10000’s all-NVMe storage architecture helps to
support low-latency access to this indexed and vectorized data, avoiding
front-end caching bottlenecks. Advances like these provide up to 6x faster
performance than the X10000’s leading object storage competitors, according to
HPE’s benchmark testing. ... The containerized architecture opens up options for
inline and out-of-band software services, such as automated provisioning and
life cycle management of storage resources. It is also easier to localize a
workload’s data and compute resources, minimizing data movement by enabling
workloads to process data in place rather than moving it to other compute nodes.
This is an important performance factor in low-latency applications like AI
training and inference. Another aspect of container-based workloads is that all
workloads can interact with the same object storage layer.
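A minimal sketch of the ingest-then-ask pattern described above: documents are vectorized as they are ingested, a query retrieves the closest matches, and a locally hosted model answers from that context. The embedding function, in-memory index, and LLM endpoint below are illustrative assumptions, not HPE's or DeepSeek's actual interfaces.

import numpy as np
import requests

def embed(text: str) -> np.ndarray:
    """Placeholder embedding; a real deployment would call an embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)

class VectorIndex:
    """Toy in-memory stand-in for the indexed, vectorized object store."""
    def __init__(self):
        self.docs, self.vecs = [], []

    def ingest(self, doc: str):
        # Documents are vectorized at ingest time, so they are query-ready immediately.
        self.docs.append(doc)
        self.vecs.append(embed(doc))

    def search(self, query: str, k: int = 3):
        q = embed(query)
        scores = np.array(self.vecs) @ q        # cosine similarity (vectors are unit length)
        return [self.docs[i] for i in np.argsort(scores)[::-1][:k]]

def ask(index: VectorIndex, question: str) -> str:
    # Assemble retrieved context and send it to a hypothetical locally hosted LLM endpoint.
    context = "\n".join(index.search(question))
    resp = requests.post("http://localhost:8000/generate",
                         json={"prompt": f"Context:\n{context}\n\nQuestion: {question}"})
    return resp.json().get("text", "")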
Talent gap complicates cost-conscious cloud planning

The top strategy so far is what one enterprise calls the “Cloud Team.” You
assemble all your people with cloud skills, and your own best software
architect, and have the team examine current and proposed cloud applications,
looking for a high-level approach that meets business goals. In this process,
the team tries to avoid implementation specifics, focusing instead on the
notion that a hybrid application has an agile cloud side and a
governance-and-sovereignty data center side, and what has to be done is push
functionality into the right place. ... For enterprises that tried the Cloud
Team, there’s also a deeper lesson. In fact, there are two. Remember the old
“the cloud changes everything” claim? Well, it does, but not the way we
thought, or at least not as simply and directly as we thought. The economic
revolution of the cloud is selective, a set of benefits that has to be
carefully fit to business problems in order to deliver the promised gains.
Application development overall has to change, to emphasize a
strategic-then-tactical flow that top-down design always called for but didn’t
always deliver. That’s the first lesson. The second is that the kinds of
applications that the cloud changes the most are applications we can’t move
there, because they never got implemented anywhere else.
Your smart home may not be as secure as you think

Most smart devices rely on Wi-Fi to communicate. If these devices connect to
an unsecured or poorly protected Wi-Fi network, they can become an easy
target. Unencrypted networks are especially vulnerable, and hackers can
intercept sensitive data, such as passwords or personal information, being
transmitted from the devices. ... Many smart devices collect personal
data—sometimes more than users realize. Some devices, like voice assistants or
security cameras, are constantly listening or recording, which can lead to
privacy violations if not properly secured. In some cases, manufacturers don’t
encrypt or secure the data they collect, making it easier for malicious actors
to exploit it. ... Smart home devices often connect to third-party
platforms or other devices. These integrations can create security holes if
the third-party services don’t have strong protections in place. A breach in
one service could give attackers access to an entire smart home ecosystem. To
mitigate this risk, it’s important to review the security practices of any
third-party service before integrating it with your IoT devices. ... If your
devices support it, always enable 2FA and link your accounts to a reliable
authentication app or your mobile number. You can use 2FA with smart home hubs
and cloud-based apps that control IoT devices.
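To illustrate why an authenticator app adds a genuine second factor, here is a short sketch of how a time-based one-time password (TOTP, RFC 6238) is derived from a shared secret using only the Python standard library. The secret below is a made-up example; real secrets come from the device or account enrollment step.

import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval               # time-based moving counter
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))   # six-digit code that changes every 30 seconds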
Beyond compensation—crafting an experience that retains talent

Looking ahead, the companies that succeed in attracting and retaining top
talent will be those that embrace innovation in their Total Rewards
strategies. AI-driven personalization is already changing the
game—organizations are using AI-powered platforms to tailor benefits to
individual employee needs, offering a menu of options such as additional PTO,
learning stipends, or wellness perks. Similarly, equity-based compensation
models are evolving, with some businesses exploring cryptocurrency-based
rewards and fractional ownership opportunities. Sustainability is also
becoming a key factor in Total Rewards. Companies that incorporate
sustainability-linked incentives, such as carbon footprint reduction rewards
or volunteer days, are seeing higher engagement and satisfaction levels. ...
Total Rewards is no longer just about compensation—it’s about creating an
ecosystem that supports employees in every aspect of their work and life.
Companies that adopt the VALUE framework—Variable pay, Aligned well-being
benefits, Learning and growth opportunities, Ultimate flexibility, and
Engagement-driven recognition—will not only attract top talent but also foster
long-term loyalty and satisfaction.
Bridging the Gap Between the CISO & the Board of Directors

Many executives, including board members, may not fully understand the CISO's
role. This isn't just a communications gap; it's also an opportunity to build
relationships across departments. When CISOs connect security priorities to
broader business goals, they show how cybersecurity is a business enabler
rather than just an operational cost. ... Often, those in technical roles lack
the ability to speak anything other than the language of tech, making it
harder to communicate with board members who don't hold tech or cybersecurity
expertise. I remember presenting to our board early in my CISO role and,
once I was done, seeing some blank stares. The issue wasn't that they didn't
care about what I was saying; we just weren't speaking the same language. ...
There are many areas in which communication between a board and CISO is
important — but there may be none more important than compliance. Data
breaches today are not just technical failures. They carry significant legal,
financial, and reputational consequences. In this environment, regulatory
compliance isn't just a box to check; it's a critical business risk that CISOs
must manage, particularly as boards become more aware of the business impact
of control failures in cybersecurity.
What does a comprehensive backup strategy look like?

Though backups are rarely needed, they form the foundation of disaster
recovery. Milovan follows the classic 3-2-1 rule: three data copies, on two
different media types, with one off-site copy. He insists on maintaining
multiple copies “just in case.” In addition, NAS users need to update their OS
regularly, Synology’s Alexandra Bejan says. “Outdated operating systems are
particularly vulnerable there.” Bejan emphasizes the benefits of
implementing the textbook best practices Ichthus employs. ... One may imagine
that smaller enterprises make for easier targets due to their limited IT resources.
However, nothing could be further from the truth. Bejan: “We have observed
that the larger the enterprise, the more difficult it is to implement a
comprehensive data protection strategy.” She says the primary reason for this
lies in the previously fragmented investments in backup infrastructure, where
different solutions were procured for various workloads. “These legacy
solutions struggle to effectively manage the rapidly growing number of
workloads and the increasing data size. At the same time, they require
significant human resources for training, with steep learning curves, making
self-learning difficult. When personnel are reassigned, considerable time is
needed to relearn the system.”
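As a simple illustration of the 3-2-1 rule mentioned above, the sketch below checks a backup plan for three copies, two media types, and one off-site copy. The target-list format is an assumption for illustration only, not Synology's tooling.

from dataclasses import dataclass

@dataclass
class BackupTarget:
    name: str
    media: str       # e.g. "nas", "external-hdd", "cloud"
    offsite: bool

def satisfies_3_2_1(primary_media: str, targets: list) -> bool:
    copies = 1 + len(targets)                            # primary copy plus backups
    media = {primary_media} | {t.media for t in targets}
    offsite = any(t.offsite for t in targets)
    return copies >= 3 and len(media) >= 2 and offsite

plan = [BackupTarget("local NAS snapshot", "nas", offsite=False),
        BackupTarget("cloud archive", "cloud", offsite=True)]
print(satisfies_3_2_1("workstation-ssd", plan))          # True: 3 copies, 2+ media, 1 off-site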
Malicious actors increasingly put privileged identity access to work across attack chains

Many of these credentials are extracted from computers using so-called
infostealer malware, malicious programs that scour the operating system and
installed applications for saved usernames and passwords, browser session
tokens, SSH and VPN certificates, API keys, and more. The advantage of using
stolen credentials for initial access is that they require less skill compared
to exploiting vulnerabilities in publicly facing applications or tricking users
into installing malware from email links or attachments — although these initial
access methods remain popular as well. ... “Skilled actors have created tooling
that is freely available on the open web, easy to deploy, and designed to
specifically target cloud environments,” the Talos researchers found. “Some
examples include ROADtools and AAAInternals, publicly available frameworks
designed to enumerate Microsoft Entra ID environments. These tools can collect
data on users, groups, applications, service principals, and devices, and
execute commands.” These are often coupled with techniques designed to exploit
the lack of MFA or incorrectly configured MFA. For example, push spray attacks,
also known as MFA bombing or MFA fatigue, flood the user with MFA push
notifications on their phone until they get annoyed and approve the login,
assuming the system is simply malfunctioning.
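The sketch below shows one way a defender might flag the MFA-fatigue pattern described above: too many push challenges for the same account in a short window. The thresholds and event format are assumptions for illustration.

from collections import defaultdict, deque
import time

class PushFloodDetector:
    def __init__(self, max_pushes: int = 5, window_s: int = 300):
        self.max_pushes = max_pushes
        self.window_s = window_s
        self.events = defaultdict(deque)   # user -> timestamps of recent push challenges

    def record_push(self, user: str, ts=None) -> bool:
        """Record a push challenge; return True if the account looks like it is being bombed."""
        now = ts if ts is not None else time.time()
        q = self.events[user]
        q.append(now)
        while q and now - q[0] > self.window_s:          # drop events outside the window
            q.popleft()
        return len(q) > self.max_pushes

detector = PushFloodDetector()
for i in range(7):
    flooded = detector.record_push("alice@example.com", ts=1000.0 + i * 10)
print(flooded)   # True: 7 pushes within 5 minutes exceeds the threshold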
Role of Blockchain in Enhancing Cybersecurity

At its core, a blockchain is a distributed ledger in which each data block is
cryptographically linked to its predecessor, forming a tamper-evident chain.
Without network authorization, modifying or removing data from a blockchain
becomes exceedingly difficult. This ensures that conventional data records stay
consistent and accurate over time. The architectural structure of blockchain
plays a critical role in protecting data integrity. Every single transaction is
time-stamped and merged into a block, which is then confirmed and sealed through
consensus. This process provides an undeniable record of all activities,
simplifying audits and boosting confidence in system reliability. Similarly,
blockchain ensures that every financial transaction is correctly documented and
easily accessible. This innovation helps prevent record manipulation,
double-spending, and other forms of fraud. By combining cryptographic safeguards
with a decentralized architecture, it offers a strong foundation for information
security. It also significantly reduces risks related to data breaches, hacking,
and unauthorized access in the digital realm. Furthermore, blockchain
strengthens cybersecurity by addressing concerns about unauthorized access and
the rising threat of cyberattacks.
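A minimal sketch of the hash-chaining idea described above: each block commits to its predecessor, so altering any sealed record breaks verification downstream. Real blockchains add consensus, signatures, and Merkle trees on top of this basic structure.

import hashlib, json, time

def block_hash(block: dict) -> str:
    payload = json.dumps({k: block[k] for k in ("index", "timestamp", "data", "prev_hash")},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain: list, data: str) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "timestamp": time.time(), "data": data, "prev_hash": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):           # record was altered after sealing
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:   # chain link broken
            return False
    return True

chain = []
add_block(chain, "alice pays bob 10")
add_block(chain, "bob pays carol 4")
print(verify(chain))                        # True
chain[0]["data"] = "alice pays bob 1000"    # tamper with a sealed record
print(verify(chain))                        # False: stored hash no longer matches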
Thriving in the Second Wave of Big Data Modernization

When businesses want to use big data to power AI solutions – as opposed to the
more traditional types of analytics workloads that predominated during the first
wave of big data modernization – the problems stemming from poor data management
snowball. They transform from mere annoyances or hindrances into showstoppers.
... But in the age of AI, this process would likely instead entail giving the
employee access to a generative AI tool that can interpret a question formulated
using natural language and generate a response based on the organizational data
that the AI was trained on. In this case, data quality or security issues
could become very problematic. ... Unfortunately, there is no magic bullet that
can cure the types of issues I’ve laid out above. A large part of the solution
involves continuing to do the hard work of improving data quality, erecting
effective access controls and making data infrastructure even more scalable. As
they do these things, however, businesses must pay careful attention to the
unique requirements of AI use cases. For example, when they create security
controls, they must do so in ways that are recognizable to AI tools, such that
the tools will know which types of data should be accessible to which users.
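One way to make access controls "recognizable" to an AI retrieval layer, as discussed above, is to attach entitlement metadata to each document and filter retrieval results against the requesting user's groups before anything reaches the model. The metadata schema and group names below are assumptions for illustration.

from dataclasses import dataclass, field

@dataclass
class Document:
    text: str
    allowed_groups: set = field(default_factory=set)

def retrieve_for_user(query: str, user_groups: set, corpus: list) -> list:
    # A real system would run vector search first; here we only show the permission filter.
    matches = [d for d in corpus if query.lower() in d.text.lower()]
    return [d.text for d in matches if d.allowed_groups & user_groups]

corpus = [Document("Q3 revenue forecast: ...", {"finance"}),
          Document("Employee handbook: PTO policy ...", {"all-staff", "finance"})]
print(retrieve_for_user("PTO", {"all-staff"}, corpus))      # handbook only
print(retrieve_for_user("revenue", {"all-staff"}, corpus))  # [] -- no finance entitlement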
The DevOps Bottleneck: Why IaC Orchestration is the Missing Piece

At the end of the day, instead of eliminating operational burdens, many
organizations just shifted them. DevOps, SREs, CloudOps—whatever you call
them—these teams still end up being the gatekeepers. They own the application
deployment pipelines, infrastructure lifecycle management, and security
policies. And like any team, they seek independence and control—not out of
malice, but out of necessity. Think about it: If your job is to keep production
stable, are you really going to let every dev push infrastructure changes
willy-nilly? Of course not. The result? Silos of unique responsibility and
sacred internal knowledge. The very teams that were meant to empower developers
become blockers instead. ... IaC orchestration isn’t about replacing your
existing tools; it’s about making them work at scale. Think about how GitHub
changed software development. Version control wasn’t new—but GitHub made it
easier to collaborate, review code, and manage contributions without stepping on
each other’s work. That’s exactly what orchestration does for IaC. It allows
large teams to manage complex infrastructure without turning into a bottleneck.
It enforces guardrails while enabling self-service for developers.
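A minimal sketch of the guardrails-plus-self-service idea described above: developers submit an infrastructure change, and an orchestration layer checks the parsed plan against policy before it can be applied. The plan structure and policy rules here are simplified assumptions, not any specific IaC tool's format.

ALLOWED_REGIONS = {"us-east-1", "eu-west-1"}
FORBIDDEN_RESOURCES = {"aws_iam_user"}          # example policy: no long-lived IAM users

def check_plan(plan: list) -> list:
    """Return a list of policy violations; an empty list means the change may proceed."""
    violations = []
    for change in plan:
        if change["type"] in FORBIDDEN_RESOURCES:
            violations.append(f"{change['address']}: resource type {change['type']} is not allowed")
        region = change.get("region")
        if region and region not in ALLOWED_REGIONS:
            violations.append(f"{change['address']}: region {region} is outside policy")
    return violations

plan = [{"address": "aws_s3_bucket.logs", "type": "aws_s3_bucket", "region": "us-east-1"},
        {"address": "aws_iam_user.ci", "type": "aws_iam_user", "region": "us-east-1"}]
for v in check_plan(plan):
    print("BLOCKED:", v)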