Quote for the day:
"Leadership is a matter of having people look at you and gain confidence, seeing how you react. If you're in control, they're in control." -- Tom Landry
5 reasons the enterprise data center will never die
Cloud repatriation — enterprises pulling applications back from the cloud to the
data center — remains a popular option for a variety of reasons. According to a
June 2024 IDC survey, about 80% of 2,250 IT decision-maker respondents “expected
to see some level of repatriation of compute and storage resources in the next
12 months.” IDC adds that the six-month period between September 2023 and March
2024 saw increased levels of repatriation plans “across both compute and storage
resources for AI lifecycle, business apps, infrastructure, and database
workloads.” ... According to Forrester’s 2023 Infrastructure Cloud Survey, 79%
of roughly 1,300 enterprise cloud decision-makers said their firms are
implementing internal private clouds, which will use virtualization and private
cloud management. Nearly a third (31%) of respondents said they are building
internal private clouds using hybrid cloud management solutions such as
software-defined storage and API-consistent hardware to make the private cloud
more like the public cloud, Forrester adds. ... “Edge is a crucial technology
infrastructure that extends and innovates on the capabilities found in core
datacenters, whether enterprise- or service-provider-oriented,” says IDC. The
rise of edge computing shatters the binary “cloud-or-not-cloud” way of thinking
about data centers and ushers in an “everything everywhere all at once”
distributed model.
How to Understand and Manage Cloud Costs with a Data-Driven Strategy
Understanding your cloud spend starts with getting serious about data. If your
cloud usage grew organically across teams over time, you're probably staring at
a bill that feels more like a puzzle than a clear financial picture. You know
you're paying too much, and you have an idea of where the spending is happening
across compute, storage, and networking, but you are not sure which teams are
overspending, which applications are being overprovisioned, and so on.
Multicloud environments add yet another layer of complexity to data visibility.
... With a holistic view of your data established, the next step is augmenting
tools to gain a deeper understanding of your spending and application
performance. To achieve this, consider employing a surgical approach by
implementing specialized cost management and performance monitoring tools that
target specific areas of your IT infrastructure. For example, granular financial
analytics can help you identify and eliminate unnecessary expenses with
precision. Real-time visibility tools provide immediate insights into cost
anomalies and performance issues, allowing for prompt corrective actions.
Governance features ensure that spending aligns with budgetary constraints and
compliance requirements, while integration capabilities with existing systems
facilitate seamless data consolidation and analysis across different
platforms.
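The attribution problem described above, knowing which team a given dollar of spend belongs to, can be sketched as a simple aggregation over billing line items. The record layout here (a `cost` field plus a `tags` dict with a `team` key) is an illustrative assumption, not any provider's actual billing schema:

```python
from collections import defaultdict

def spend_by_team(line_items):
    """Aggregate billing line items by their 'team' cost-allocation tag.

    Untagged spend is bucketed under 'UNTAGGED' so it can be chased down;
    unattributed cost is usually why the bill reads like a puzzle.
    """
    totals = defaultdict(float)
    for item in line_items:
        team = item.get("tags", {}).get("team", "UNTAGGED")
        totals[team] += item["cost"]
    return dict(totals)
```

Surfacing the `UNTAGGED` bucket explicitly is often the first win: it shows how much of the bill cannot yet be traced to an owner.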
Top cybersecurity priorities for CFOs
CFOs need to be aware of the rising threats of cyber extortion, says Charles
Soranno, a managing director at global consulting firm Protiviti. “Cyber
extortion is a form of cybercrime where attackers compromise an organization’s
systems, data or networks and demand a ransom to return to normal and prevent
further damage,” he says. Beyond a ransomware attack, where data is encrypted
and held hostage until the ransom is paid, cyber extortion can involve other
evolving threats and tactics, Soranno says. “CFOs are increasingly concerned
about how these cyber extortion schemes impact lost revenue, regulatory fines
[and] potential payments to bad actors,” he says. ... “In collaboration with
other organizational leaders, CFOs must assess the risks posed by these external
partners to identify vulnerabilities and implement a proactive mitigation and
response plan to safeguard from potential threats and issues.” While a deep
knowledge of the entire supply chain’s cybersecurity posture might seem like a
luxury for some organizations, the increasing interconnectedness of partner
relationships is making third-party cybersecurity risk profiles more of a
necessity, Krull says. “The reliance on third-party vendors and cloud services
has grown exponentially, increasing the potential for supply chain attacks,”
says Dan Lohrmann, field CISO at digital services provider Presidio.
GDPR authorities accused of ‘inactivity’
The idea that the GDPR has brought about a shift towards a serious approach to
data protection has largely proven to be wishful thinking, according to a
statement from noyb. “European data protection authorities have all the
necessary means to adequately sanction GDPR violations and issue fines that
would prevent similar violations in the future,” Schrems says. “Instead, they
frequently drag out the negotiations for years — only to decide against the
complainant’s interests all too often.” ... “Somehow it’s only data protection
authorities that can’t be motivated to actually enforce the law they’re
entrusted with,” criticizes Schrems. “In every other area, breaches of the law
regularly result in monetary fines and sanctions.” Data protection authorities
often act in the interests of companies rather than the data subjects, the
activist suspects. Fines in particular are what motivate companies to comply with the law, the association reports, citing its own survey. Two-thirds of
respondents stated that decisions by the data protection authority that affect
their own company and involve a fine lead to greater compliance. Six out of
ten respondents also admitted that even fines imposed on other organizations
have an impact on their own company.
The three tech tools that will take the heat off HR teams in 2025
As for the employee review process, a content services platform enables HR
employees to customise processes, routing approvals to the right managers,
department heads, and people ops. This means that employee review processes
can be expedited thanks to customisable forms, with easier goal setting,
identification of upskilling opportunities, and career progression. When
paperwork and contracts are uniform, customisable, and easily located,
employers are equipped to support their talent to progress as quickly as
possible – nurturing more fulfilled employees who want to stick around. ...
Naturally, a lot of HR work is form-heavy, with anything from employee
onboarding and promotions to progress reviews and remote working requests
requiring HR input. However, with a content services platform, HR
professionals can route and approve forms rapidly, using digital forms that allow employees to enter information quickly and accurately. Going one step further, HR leaders can leverage automated
workflows to route forms to approvers as soon as an employee completes them –
cutting out the HR intermediary. ... Armed with a single source of truth, HR
professionals can take advantage of automated workflows, enabling efficient
notifications and streamlining HR compliance processes.
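The automated routing described above can be sketched as a small dispatch table. The form types and approver roles here are hypothetical examples, not features of any particular content services platform:

```python
# Hypothetical mapping of form types to approver roles; a real platform
# would let each organization configure this.
APPROVERS = {
    "promotion": "department_head",
    "remote_work": "line_manager",
    "progress_review": "people_ops",
}

def route_form(form):
    """Send a completed form straight to its approver, skipping the HR inbox."""
    assigned = APPROVERS.get(form["type"], "hr_ops")  # unknown types fall back to HR ops
    return {"form_id": form["id"], "assigned_to": assigned}
```

The fallback queue matters: routing rules never cover every form type, and silently dropping the unknowns would recreate the bottleneck the workflow was meant to remove.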
AI Could Turn Against You — Unless You Fix Your Data Trust Issues
Without unified standards for data formats, definitions, and validations,
organizations struggle to establish centralized control. Legacy systems, often
ill-equipped to handle modern data volumes, further exacerbate the problem.
These systems were designed for periodic updates rather than the continuous,
real-time streams demanded by AI, leading to inefficiencies and scalability
limitations. To address these challenges, organizations must implement
centralized governance, quality, and observability within a single framework.
This enables them to leverage data lineage and track their data as it moves
through systems to ensure transparency and identify issues in real time. Real-time quality checks also let them regularly validate data integrity to support consistent, reliable AI models. ... For
organizations to maximize the potential of AI, they must embed data trust into
their daily operations. This involves using automated systems like data
observability to validate data integrity throughout its lifecycle, integrated
governance to maintain reliability, and continuous validation within
evolving data ecosystems. By addressing data quality challenges and investing
in unified platforms, organizations can transform data trust into a strategic
advantage.
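A quality check of the kind described above can start as simply as a gate that reports rows with missing required fields. This is a minimal sketch of the idea, not a stand-in for a full data observability platform:

```python
def validate_records(records, required_fields):
    """Minimal data-quality gate: report rows whose required fields are missing or null.

    Returns (row_index, missing_fields) pairs so lineage tooling can trace
    the bad rows back to the source system that produced them.
    """
    bad = []
    for i, row in enumerate(records):
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            bad.append((i, missing))
    return bad
```

Run continuously on incoming streams, even a gate this simple catches the kind of silent integrity drift that otherwise surfaces only as a misbehaving model.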
Backdoor in Chinese-made healthcare monitoring device leaks patient data
“By reviewing the firmware code, the team determined that the functionality is
very unlikely to be an alternative update mechanism, exhibiting highly unusual
characteristics that do not support the implementation of a traditional update
feature,” CISA said in its analysis report. “For example, the function provides
neither an integrity checking mechanism nor version tracking of updates. When
the function is executed, files on the device are forcibly overwritten,
preventing the end customer — such as a hospital — from maintaining awareness of
what software is running on the device.” In addition to this hidden remote code
execution behavior, CISA also found that once the CMS8000 completes its startup
routine, it also connects to that same IP address over port 515, which is
normally associated with the Line Printer Daemon (LPD), and starts transmitting
patient information without the device owner’s knowledge. “The research team
created a simulated network, created a fake patient profile, and connected a
blood pressure cuff, SpO2 monitor, and ECG monitor peripherals to the patient
monitor,” the agency said. “Upon startup, the patient monitor successfully
connected to the simulated IP address and immediately began streaming patient
data to the address.”
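From a defender's standpoint, the behavior CISA describes, a device opening unexpected outbound connections, is exactly what egress monitoring can catch. A minimal sketch, assuming connections are observed as (IP, port) pairs; the addresses below are documentation placeholders, not the device's actual hard-coded IP:

```python
def flag_unexpected_egress(connections, allowlist):
    """Flag outbound connections that are not on the device's expected list.

    connections and allowlist hold (destination_ip, destination_port) pairs.
    A patient monitor quietly streaming to an unknown address on port 515
    (normally the Line Printer Daemon) would show up here immediately.
    """
    return [conn for conn in connections if conn not in allowlist]
```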
3 Considerations for Mutual TLS (mTLS) in Cloud Security
Traditional security approaches often rely on IP whitelisting as a primary
method of access control. While this technique can provide a basic level of
security, IP whitelists operate on a fundamentally flawed assumption: that IP
addresses alone can accurately represent trusted entities. In reality, this
approach fails to effectively model real-world attack scenarios. IP whitelisting
provides no mechanism for verifying the integrity or authenticity of the
connecting service. It merely grants access based on network location, ignoring
crucial aspects of identity and behavior. In contrast, mTLS addresses these
shortcomings by focusing on cryptographic identity rather than
network location. ... In the realm of mTLS, identity is paramount. It's not just
about encrypting data in transit; it's about ensuring that both parties in a
communication are exactly who they claim to be. This concept of identity in mTLS
warrants careful consideration. In a traditional network, identity might be tied
to an IP address or a shared secret. But in the modern world of cloud-native
applications, these concepts fall short. mTLS shifts the mindset by basing
identity on cryptographic certificates. Each service possesses its own unique
certificate, which serves as its identity card.
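In Python's standard `ssl` module, the shift from network location to cryptographic identity comes down to requiring a client certificate at the handshake. This is a minimal sketch of a server-side context; the certificate and CA paths are assumptions supplied by the caller:

```python
import ssl

def mtls_server_context(ca_file=None, certfile=None, keyfile=None):
    """Build a server-side SSLContext that requires a client certificate.

    With CERT_REQUIRED, the handshake fails unless the client presents a
    certificate signed by a CA in ca_file: identity comes from cryptography,
    not from the client's network address.
    """
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.verify_mode = ssl.CERT_REQUIRED          # mutual TLS: client must authenticate too
    if certfile:
        ctx.load_cert_chain(certfile, keyfile)   # the server's own identity
    if ca_file:
        ctx.load_verify_locations(ca_file)       # CA that issues client certificates
    return ctx
```

With this context, a connection from a "trusted" IP still fails unless the client proves its identity with a certificate the configured CA has signed, which is exactly the property an IP whitelist cannot provide.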
Artificial Intelligence Versus the Data Engineer
It’s worth noting that there is a misconception that AI can prepare data for AI. In reality, while AI can accelerate the process, data engineers are still needed to get that data in shape before it reaches the AI processes and models and we see the cool end results. At the same time, there are AI tools that can certainly accelerate and scale the data engineering work, so AI is both causing and solving the challenge in some respects. So, how does AI change the
role of the data engineer? Firstly, the role of the data engineer has always
been tricky to define. We sit atop a large pile of technology, most of which we
didn’t choose or build, and an even larger pile of data we didn’t create, and we
have to make sense of the world. Ostensibly, we are trying to get to something
scientific. ... That art comes in the form of the intuition required to sift
through the data, understand the technology, and rediscover all the little
real-world nuances and history that over time have turned some lovely clean data
into a messy representation of the real world. The real skill great data
engineers have is therefore not the SQL ability but how they apply it to the
data in front of them to sniff out the anomalies, the quality issues, the
missing bits and those historical mishaps that must be navigated to get to some
semblance of accuracy.
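The anomaly-sniffing instinct described above can be partly mechanized. A crude sketch using Tukey's 1.5 * IQR fences; real datasets demand the judgment no fixed rule can supply, which is rather the article's point:

```python
def sniff_outliers(values):
    """Flag values outside the Tukey fences (1.5 * IQR beyond the quartiles).

    A crude but dependable first pass at surfacing the anomalies a data
    engineer would want to eyeball before the data reaches a model.
    """
    s = sorted(values)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]      # rough quartiles, fine for a sketch
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < lo or v > hi]
```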
How engineering teams can thrive in 2025
Adopting a "fail forward" mentality is crucial as teams experiment with AI and
other emerging technologies. Engineering teams are embracing controlled
experimentation and rapid iteration, learning from failures and building
knowledge. ... Top engineering teams will combine emerging technologies with new
ways of working. They’re not just adopting AI—they’re rethinking how software is
developed and maintained as a result of it. Teams will need to stay agile to
lead the way. Collaboration within the business and access to a
multidisciplinary talent base is the recipe for success. Engineering teams
should proactively scenario plan to manage uncertainty by adopting agile
frameworks like the "5Ws" (Who, What, When, Where, and Why). This approach
allows organizations to tailor tech adoption strategies and marry regulatory
compliance with innovation. Engineering teams should also actively address AI
bias and ensure fair and responsible AI deployment. Many enterprises are hiring
responsible AI specialists and ethicists as regulatory standards are now in
force, including the EU AI Act, which impacts organizations with users in the
European Union. As AI improves, the expertise and technical skills that proved
valuable before need to be continually reevaluated. Organizations that
successfully adopt AI and emerging tech will thrive.