Three Critical Factors for a Successful Digital Transformation Strategy

AI reshaping the management of remote workforce

The DPDP act: Navigating digital compliance under India’s new regulatory landscape

10 Things To Avoid in Domain-Driven Design (DDD)
To prevent potential issues, it is your responsibility to keep the domain model simple and accurate to the domain it reflects. Focus your modeling effort on the parts of the domain that offer strategic importance, and streamline or exclude the less critical elements. Remember, Domain-Driven Design (DDD) is primarily concerned with strategic design, not with burdening the domain model with unnecessary intricacies. ... It's crucial to use DDD to analyze and concentrate on the domain's most vital and influential parts. Identify the aspects that deliver the highest value to the business and ensure that your modeling efforts align closely with the business's overarching priorities and strategic objectives. Actively collaborating with key business stakeholders is essential to understand what holds the greatest value for them and to prioritize those areas in your modeling. This approach keeps the model focused on the business's critical needs and contributes to the successful realization of its strategic goals.
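
A minimal sketch of what that focus can look like in code, assuming a hypothetical order-pricing core domain (the Order, OrderLine and Money names are illustrative, not taken from the article): the pricing rules the business values are modeled explicitly, while generic concerns such as auditing or notifications are deliberately left out of the model.

```python
# Hypothetical core-domain sketch: model only the concepts the business
# cares about (pricing), and keep generic concerns out of the model.
from dataclasses import dataclass
from decimal import Decimal


@dataclass(frozen=True)
class Money:
    """Value object: compared by amount and currency, never by identity."""
    amount: Decimal
    currency: str

    def add(self, other: "Money") -> "Money":
        if self.currency != other.currency:
            raise ValueError("cannot add different currencies")
        return Money(self.amount + other.amount, self.currency)


@dataclass
class OrderLine:
    """Captures only what the pricing rules need: item, quantity, price."""
    sku: str
    quantity: int
    unit_price: Money

    def line_total(self) -> Money:
        return Money(self.unit_price.amount * self.quantity,
                     self.unit_price.currency)


class Order:
    """Entity holding the core pricing rule; shipping, auditing and
    notifications deliberately live outside this model."""

    def __init__(self, order_id: str, lines: list[OrderLine]):
        if not lines:
            raise ValueError("an order needs at least one line")
        self.order_id = order_id
        self.lines = lines

    def total(self) -> Money:
        total = self.lines[0].line_total()
        for line in self.lines[1:]:
            total = total.add(line.line_total())
        return total


# Example usage with made-up values:
order = Order("A-1001", [OrderLine("SKU-1", 2, Money(Decimal("19.99"), "USD"))])
print(order.total())  # Money(amount=Decimal('39.98'), currency='USD')
```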

How to Build a Data Governance Program in 90 Days
With a new data-friendly CIO at the helm, Hidalgo was able to assemble the right team for the job and, at the same time, create an environment of maximum engagement with data culture. She assembled discussion teams and even a data book club that read and reviewed the latest data governance literature. In turn, that team built its own data governance website as a platform not just for sharing ideas but also for spreading the momentum. “We kept the juices flowing, kept the excitement,” Hidalgo recalled. “And then with our data governance office and steering committee, we engaged with all departments – we have people from HR, compliance, legal, product, everywhere – to make sure that everyone is represented.” ... After choosing a technology platform in May, Hidalgo began the most arduous part of the process: preparing for a “jumpstart” campaign that would kick off in July. Hidalgo and her team began to catalog existing data one subset at a time – 20 KPIs or so – and to complete its business glossary terms. Most importantly, Hidalgo had all along been building bridges between Shaw’s IT team, data governance crew, and business leadership, to the degree that when the jumpstart was completed – on time – the entire business saw the immense value-add of the data governance program that had been built.

Why Cloud Migrations Fail
One stumbling block on the cloud journey is misunderstanding or confusion
around the shared responsibility model. This framework delineates the
security obligations of cloud service providers, or CSPs, and customers. The
model necessitates a clear understanding of end-user obligations and
highlights the need for collaboration and diligence. Broad assumptions about
the level of security oversight provided by the CSP can lead to
security/data breaches that the U.S. National Security Agency (NSA) notes
“likely occur more frequently than reported.” It’s also worth noting that
82% of breaches in 2023 involved cloud data. The confusion is often
magnified in cases of a cloud “lift-and-shift,” a method where
business-as-usual operations, architectures and practices are simply pushed
into the cloud without adaptation to their new environment. In these cases,
organizations may be slow to implement proper procedures, monitoring and
personnel to match the security limitations of their new cloud environment.
While the level of embedded security can differ depending on the selected
cloud model, the customer must often enact strict security and identity and
access management (IAM) controls to secure their environment.
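
As a hypothetical illustration of the customer's side of that split, the sketch below uses Python with boto3 (an assumption; the article names no tools) to enforce two customer-owned controls on an S3 bucket: blocking public access and requiring default server-side encryption. The bucket name is made up, and real environments would layer IAM policies, logging and monitoring on top of this.

```python
# Minimal sketch of customer-side controls under the shared responsibility
# model: the CSP secures the storage service itself, but the customer is
# responsible for how the bucket is configured and who can reach it.
import boto3

BUCKET = "example-customer-bucket"  # hypothetical bucket name

s3 = boto3.client("s3")

# Customer-side control 1: block all forms of public access.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Customer-side control 2: require server-side encryption by default.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)
```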

HP Chief Architect Recalibrates Expectations Of Practical Quantum Computing’s Arrival From Generations To Within A Decade
Hewlett Packard Labs is now adopting a holistic co-design approach,
partnering with other organizations developing various qubits and quantum
software. The aim is to simulate quantum systems to solve real-world
problems in solid-state physics, exotic condensed matter physics, quantum
chemistry, and industrial applications. “What is it like to actually deliver
the optimization we’ve been promised with quantum for quite some time, and
achieve that on an industrial scale?” Bresniker posed. “That’s really what
we’ve been devoting ourselves to—beginning to answer those questions of
where and when quantum can make a real impact.” One of the initial
challenges the team tackled was modeling benzyne, an exotic chemical derived
from the benzene ring. “When we initially tackled this problem with our
co-design partners, the solution required 100 million qubits for 5,000
years—that’s a lot of time and qubits,” Bresniker told Frontier Enterprise.
Considering current quantum capabilities are in the tens or hundreds of
qubits, this was an impractical solution. By employing error correction
codes and simulation methodologies, the team significantly reduced the
computational requirements.

New AI reporting regulations
At its core, the new proposal requires developers and cloud service
providers to fulfill reporting requirements aimed at ensuring the safety and
cybersecurity resilience of AI technologies. This necessitates the
disclosure of detailed information about AI models and the platforms on
which they operate. One of the proposal’s key components is cybersecurity.
Enterprises must now demonstrate robust security protocols and engage in
what’s known as “red-teaming”—simulated attacks designed to identify and
address vulnerabilities. Red-teaming is rooted in longstanding cybersecurity practice, but it introduces new layers of complexity and cost for cloud users. Given the burden it places on enterprises, I suspect the requirement may be challenged in the courts. The regulation does sharpen the focus on security testing and compliance; the objective is to ensure that AI systems can withstand cyberthreats and protect data. However, this is not cheap. Achieving it requires investment in advanced security tools and expertise, typically stretching budgets and resources. My “back of the napkin” calculation puts the added cost at about 10% of the system’s total cost.

Varied Cognitive Training Boosts Learning and Memory
The researchers observed that varied practice, not repetition, primed older adults to learn a new working memory task. Their findings, which appear in the journal Intelligence, propose diverse cognitive training as a promising whetstone for maintaining mental sharpness as we age. “People often think that the best way to get better at something is to simply practice it over and over again, but robust skill learning is actually supported by variation in practice,” said lead investigator Elizabeth A. L. Stine-Morrow ... The researchers narrowed their focus to working memory, or the cognitive ability to hold one thing in mind while doing something else. “We chose working memory because it is a core ability needed to engage with reality and construct knowledge,” Stine-Morrow said. “It underpins language comprehension, reasoning, problem-solving and many sorts of everyday cognition.” Because working memory often declines with aging, Stine-Morrow and her colleagues recruited 90 Champaign-Urbana locals aged 60-87. At the beginning and end of the study, researchers assessed the participants’ working memory by measuring each person’s reading span: their capacity to remember information while reading something unrelated.

AI - peril or promise?
The interplay between AI data centers and resource usage necessitates innovative approaches to mitigate environmental impacts. Advances in cooling technology, such as liquid immersion cooling, offer potential solutions, and utilizing recycled or non-potable water for cooling can alleviate the pressure on freshwater resources. AI itself can also be leveraged to enhance the efficiency of data centers: AI algorithms can optimize energy use by predicting cooling needs, managing workloads more efficiently, and reducing idle times for servers. Predictive maintenance powered by AI can also prevent equipment failures, thereby reducing the need for excessive cooling. This is good news as the sector continues to use AI to gain greater efficiencies, cost savings, and service improvements, and the expected impact of AI on the operational side of data centers is very positive. Over 65 percent of survey respondents reported that their organizations are regularly using generative AI, nearly double the share from the 2023 survey, and around 90 percent of respondents expect their data centers to be more efficient as a direct result of AI applications.
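
As a purely illustrative sketch under assumed data and thresholds (nothing here comes from the article), the snippet below shows the shape of such a control loop: a deliberately simple stand-in for a learned model forecasts near-term utilization, and the cooling setpoint is relaxed when the forecast is low.

```python
# Toy illustration of AI-assisted cooling control with made-up numbers:
# forecast the next interval's utilization, then pick a setpoint.

def predict_cooling_load(util_history: list[float]) -> float:
    """Naive linear-weighted forecast: recent samples count more.
    A stand-in for the learned models a real optimizer would use."""
    weights = range(1, len(util_history) + 1)
    weighted = sum(w * u for w, u in zip(weights, util_history))
    return weighted / sum(weights)


def choose_setpoint(predicted_util: float) -> float:
    """Map predicted utilization (0.0-1.0) to a supply-air setpoint in °C."""
    if predicted_util < 0.3:
        return 27.0  # low load: run warmer and save cooling energy
    if predicted_util < 0.7:
        return 24.0
    return 22.0      # high load: cool harder to protect hardware


# Example: the last six 10-minute utilization samples for one rack.
recent_utilization = [0.42, 0.38, 0.35, 0.31, 0.28, 0.25]
predicted = predict_cooling_load(recent_utilization)
print(f"predicted utilization {predicted:.2f} -> "
      f"setpoint {choose_setpoint(predicted)} °C")
```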

Quote for the day:
"Your greatest area of leadership often comes out of your greatest area of pain and weakness." -- Wayde Goodall