Daily Tech Digest - September 14, 2024

Three Critical Factors for a Successful Digital Transformation Strategy

Just as important as the front-end experience are the back-end operations that keep and build the customer relationship. Value-added digital services that deliver back-end operational excellence can improve the customer experience through better customer service, improved security and more. Emerging tech like artificial intelligence can substantially improve how companies get a clearer view into their operations and customer base. Take data flow and management, for example. Many executives report they are swimming in information, yet around half admit they struggle analyzing it, according to research by Paynearme. While data is important, the insights derived from that data are key to the conclusions executives must draw. Maintaining a digital record of customer information, transaction history, spend behaviors and other metrics and applying AI to analyze and inform decisions can help companies provide better service and protect their end users. They can streamline customer service, for instance, by immediately sourcing relevant information and delivering a resolution in near-real time, or by automating the analysis of spend behavior and location data to shut down potential fraudsters.


AI reshaping the management of remote workforce

In a remote work setting, one of the biggest challenges for organizations remains in streamlining of operations. For a scattered team, the implementation of AI emerges as a revolutionary tool in automating shift and rostering using historical pattern analytics. Historical data on staff availability, productivity, and work patterns enable organizations to optimise schedules and strike a perfect balance between operational needs and employee preferences. Subsequently, this reduces conflicts and enhances overall work efficiency. Apart from this, AI analyses staff work duration and shifts that further enable organizations to predict staffing needs and optimise resource allocation. This enhances capacity modelling to ensure the right team member is available to handle tasks during peak times, preventing overstaffing or understaffing issues. ... With expanding use cases, AI-powered facial recognition technology has become a critical part of identity verification and promoting security in remote work settings. Organisations need to ensure security and confidentiality at all stages of their work. In tandem, AI-powered facial recognition ensures that only authorized personnel have access to the company’s sensitive systems and data. 


The DPDP act: Navigating digital compliance under India’s new regulatory landscape

Adapting to the DPDPA will require tailored approaches, as different sectors face unique challenges based on their data handling practices, customer bases, and geographical scope. However, some fundamental strategies can help businesses effectively navigate this new regulatory landscape. First, conducting a comprehensive data audit is essential. Businesses need to understand what data they collect, where it is stored, and who has access to it. Mapping out data flows allows organizations to identify risks and address them proactively, laying the groundwork for robust compliance. Appointing a Data Protection Officer (DPO) is another critical step. The DPO will be responsible for overseeing compliance efforts, serving as the primary point of contact for regulatory bodies, and handling data subject requests. While it’s not yet established whether it’s mandatory or not, it is safe to say that this role is vital for embedding a culture of data privacy within the organisation. Technology can also play a significant role in ensuring compliance. Tools such as Unified Endpoint Management (UEM) solutions, encryption technologies, and data loss prevention (DLP) systems can help businesses monitor data flows, detect anomalies, and prevent unauthorized access. 


10 Things To Avoid in Domain-Driven Design (DDD)

To prevent potential issues, it is your responsibility to maintain a domain model that is uncomplicated and accurately reflects the domain. This diligent approach is important to focus on modeling the components of the domain that offer strategic importance and to streamline or exclude less critical elements. Remember, Domain-Driven Design (DDD) is primarily concerned with strategic design and not with needlessly complexifying the domain model with unnecessary intricacies. ... It's crucial to leverage Domain-Driven Design (DDD) to deeply analyze and concentrate on the domain's most vital and influential parts. Identify the aspects that deliver the highest value to the business and ensure that your modeling efforts are closely aligned with the business's overarching priorities and strategic objectives. Actively collaborating with key business stakeholders is essential to gain a comprehensive understanding of what holds the greatest value to them and subsequently prioritize these areas in your modeling endeavors. This approach will optimally reflect the business's critical needs and contribute to the successful realization of strategic goals.


How to Build a Data Governance Program in 90 Days

With a new data-friendly CIO at the helm, Hidalgo was able to assemble the right team for the job and, at the same time, create an environment of maximum engagement with data culture. She assembled discussion teams and even a data book club that read and reviewed the latest data governance literature. In turn, that team assembled its own data governance website as a platform not just for sharing ideas but also to spread the momentum. “We kept the juices flowing, kept the excitement,” Hidalgo recalled. “And then with our data governance office and steering committee, we engaged with all departments, we have people from HR, compliance, legal product, everywhere – to make sure that everyone is represented.” ... After choosing a technology platform in May, Hidalgo began the most arduous part of the process: preparation for a “jumpstart” campaign that would kick off in July. Hidalgo and her team began to catalog existing data one subset of data at a time – 20 KPIs or so – and complete its business glossary terms. Most importantly, Hidalgo had all along been building bridges between Shaw’s IT team, data governance crew, and business leadership to the degree that when the jumpstart was completed – on time – the entire business saw the immense value-add of the data governance that had been built.


Varied Cognitive Training Boosts Learning and Memory

The researchers observed that varied practice, not repetition, primed older adults to learn a new working memory task. Their findings, which appear in the journal Intelligence, propose diverse cognitive training as a promising whetstone for maintaining mental sharpness as we age. “People often think that the best way to get better at something is to simply practice it over and over again, but robust skill learning is actually supported by variation in practice,” said lead investigator Elizabeth A. L. Stine-Morrow ... The researchers narrowed their focus to working memory, or the cognitive ability to hold one thing in mind while doing something else. “We chose working memory because it is a core ability needed to engage with reality and construct knowledge,” Stine-Morrow said. “It underpins language comprehension, reasoning, problem-solving and many sorts of everyday cognition.” Because working memory often declines with aging, Stine-Morrow and her colleagues recruited 90 Champaign-Urbana locals aged 60-87. At the beginning and end of the study, researchers assessed the participants’ working memory by measuring each person’s reading span: their capacity to remember information while reading something unrelated.


Why Cloud Migrations Fail

One stumbling block on the cloud journey is misunderstanding or confusion around the shared responsibility model. This framework delineates the security obligations of cloud service providers, or CSPs, and customers. The model necessitates a clear understanding of end-user obligations and highlights the need for collaboration and diligence. Broad assumptions about the level of security oversight provided by the CSP can lead to security/data breaches that the U.S. National Security Agency (NSA) notes “likely occur more frequently than reported.” It’s also worth noting that 82% of breaches in 2023 involved cloud data. The confusion is often magnified in cases of a cloud “lift-and-shift,” a method where business-as-usual operations, architectures and practices are simply pushed into the cloud without adaptation to their new environment. In these cases, organizations may be slow to implement proper procedures, monitoring and personnel to match the security limitations of their new cloud environment. While the level of embedded security can differ depending on the selected cloud model, the customer must often enact strict security and identity and access management (IAM) controls to secure their environment.


AI - peril or promise?

The interplay between AI data centers and resource usage necessitates innovative approaches to mitigate environmental impacts. Advances in cooling technology, such as liquid immersion cooling and the use of recycled water, offer potential solutions. Furthermore, utilizing recycled or non-potable water for cooling can alleviate the pressure on freshwater resources. Moreover, AI itself can be leveraged to enhance the efficiency of data centers. AI algorithms can optimize energy use by predicting cooling needs, managing workloads more efficiently, and reducing idle times for servers. Predictive maintenance powered by AI can also prevent equipment failures, thereby reducing the need for excessive cooling. This is good news as the sector continues to use AI to benefit from greater efficiencies, cost savings, driving improvements in services with the expected impact of AI on the operational side for data centres expected to be very positive. Over 65 percent of our survey respondents reported that their organizations are regularly using generative AI, nearly double the percentage from their 2023 survey and around 90 percent of respondents expect their data centers to be more efficient as a direct result of AI applications.


HP Chief Architect Recalibrates Expectations Of Practical Quantum Computing’s Arrival From Generations To Within A Decade

Hewlett Packard Labs is now adopting a holistic co-design approach, partnering with other organizations developing various qubits and quantum software. The aim is to simulate quantum systems to solve real-world problems in solid-state physics, exotic condensed matter physics, quantum chemistry, and industrial applications. “What is it like to actually deliver the optimization we’ve been promised with quantum for quite some time, and achieve that on an industrial scale?” Bresniker posed. “That’s really what we’ve been devoting ourselves to—beginning to answer those questions of where and when quantum can make a real impact.” One of the initial challenges the team tackled was modeling benzine, an exotic chemical derived from the benzene ring. “When we initially tackled this problem with our co-design partners, the solution required 100 million qubits for 5,000 years—that’s a lot of time and qubits,” Bresniker told Frontier Enterprise. Considering current quantum capabilities are in the tens or hundreds of qubits, this was an impractical solution. By employing error correction codes and simulation methodologies, the team significantly reduced the computational requirements.


New AI reporting regulations

At its core, the new proposal requires developers and cloud service providers to fulfill reporting requirements aimed at ensuring the safety and cybersecurity resilience of AI technologies. This necessitates the disclosure of detailed information about AI models and the platforms on which they operate. One of the proposal’s key components is cybersecurity. Enterprises must now demonstrate robust security protocols and engage in what’s known as “red-teaming”—simulated attacks designed to identify and address vulnerabilities. This practice is rooted in longstanding cybersecurity practices, but it does introduce new layers of complexity and cost for cloud users. Based on the negative impact of red-teaming on enterprises, I suspect it may be challenged in the courts. The regulation does increase focus on security testing and compliance. The objective is to ensure that AI systems can withstand cyberthreats and protect data. However, this is not cheap. Achieving this result requires investments in advanced security tools and expertise, typically stretching budgets and resources. My “back of the napkin” calculations figure about 10% of the system’s total cost.



Quote for the day:

"Your greatest area of leadership often comes out of your greatest area of pain and weakness." -- Wayde Goodall

No comments:

Post a Comment