Daily Tech Digest - January 12, 2024

Navigating Tomorrow: Becoming an Enterprise of the Future

Preparing for what lies ahead goes far beyond implementing the right technologies; it is about developing a culture that embraces change with empathy. Cultivating a mindset across the organisation that values innovation, continuous learning, and agility ensures that every employee charges forward with confidence. In times of economic uncertainty and technological advancement, it is crucial that we practice empathy. Naturally, there is some fear that technologies like AI will replace human workers. As such, leaders must help employees understand that technology is here to augment their roles and empower them to spend more time on other valuable tasks. The key to embracing any new technology and providing access at scale is to get everyone in the team on board. Whether greeted with excitement or anxiety, leaders must champion this culture of change by encouraging employees to seek new ways of working while ensuring they remain engaged and valued. Data-driven decision-making will undoubtedly continue to be the cornerstone of future business endeavors. 


The Importance of Enterprise Architecture in the Modern Business Landscape

The field of Enterprise Architecture is constantly evolving, driven by emerging trends and innovations. One of the significant trends is the adoption of cloud computing and hybrid IT environments. Cloud-based solutions offer scalability, flexibility, and cost-efficiency, making them increasingly popular among businesses. Enterprise Architecture helps organizations leverage these technologies by designing architectures that integrate cloud services and on-premises infrastructure, ensuring seamless operations and efficient resource utilization. Another emerging trend is the incorporation of artificial intelligence (AI) and machine learning (ML) in Enterprise Architecture practices. AI and ML technologies enable businesses to automate processes, analyze vast amounts of data, and gain valuable insights. By integrating AI and ML into their Enterprise Architecture frameworks, organizations can enhance decision-making, optimize business processes, and improve overall efficiency. Furthermore, the rise of digital transformation has had a significant impact on Enterprise Architecture. 


Top 8 challenges IT leaders will face in 2024

To guide an organization through uncertainty, IT leaders must help ensure everyone in the company is on the same page, Srivastava says. Instead of playing catch-up, he suggests a proactive approach with clear communication as a guiding principle. “It starts with establishing a clear set of agreed-upon initiatives and outcomes for the organization,” he says. “We have to make sure everyone understands what they are doing, why they are doing it, and — most importantly — how success will be measured.” ... Security is a challenge that perennially makes the list of top CIO worries, but Grant McCormick, CIO of cybersecurity company Exabeam, notes a rising need for increased collaboration between IT and security teams to address the issue. “The role of the CIO has recently seen a massive convergence with cybersecurity,” says McCormick. “Regardless of whether or not security reports into the CIO, or another leader within the company, it is in everyone’s best interest to be conscious of the organization’s security posture and to enable IT and cybersecurity to work in a highly synchronized manner.”


Economic Uncertainty Doesn’t Mean Compromising Cybersecurity

This futuristic technology isn’t just something to tap into to enrich individual experiences; it can also help solve some of society’s most pressing challenges and, most of all, keep people safe. For cryptocurrencies, where fraud is estimated to be four times more prevalent than in regular fiat payments, technology providers are devising new solutions to stay ahead. These solutions can help customers make informed decisions that protect their business, as well as the entire payments ecosystem. A simple dashboard can provide visibility into crypto spend, transaction volumes, and an anti-money-laundering risk exposure rating. Through solutions like these, banks and other businesses can earn and, importantly, keep the trust of their customers—on whom their business depends. Trust is fragile. It can be broken in a nanosecond. And as the global financial ecosystem expands, it’s getting harder for organizations to navigate the maze of cyber risks alone. Businesses, merchants, financial institutions and fintechs need trailblazing tools and expert knowledge to understand the risks they’re facing. 


Redefining Data Governance: Bridging The Gap Between Technical And Domain Experts

As the data industry gravitates toward decentralization, specifically federated systems, the absence of a robust framework in data governance, master data and data quality becomes glaringly evident. The prevailing issue in many companies is not the sheer volume of data or a lack of technological options but the erroneous assumption that their data is inherently primed for insights, AI applications and democratization. This misconception overshadows the real challenge: the need for a comprehensive approach to data management that integrates the expertise of domain professionals. The advent of practical AI applications marks a watershed moment in the history of data governance. This technology is not just a tool for automation; it serves as a bridge between the technical and business realms. It provides a platform where business experts can meaningfully contribute to data strategies and decision-making processes. Technical teams initially assumed the mantle of data governance out of necessity due to the requisite skill sets. 


Orchestrating Resilience Building Modern Asynchronous Systems

The first one is state management. Basically, the problem here is that you need to contemplate lots of possible combinations of states and events. For example, the "review received" message could come in while the campaign is in the pending state instead of the relevant waiting state, or an out-of-sequence event could arrive from somewhere, and so on. All of those cases need to be handled, even though they are not the most likely sequence of events and states. ... Handling retries becomes a task almost as complex as implementing the primary logic, sometimes even more so. You can implement your retry mechanisms in different ways, for example by storing a retry counter in the database and incrementing it on each failed attempt until either you succeed or reach the maximum allowed number of retries. Alternatively, you could embed the retry counter in the queue message itself, so you dequeue a message, process it, and, if it fails, re-enqueue the message and increment the retry count. In both cases, this implies significant overhead for developers.
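The message-embedded retry counter described above can be sketched as follows. This is a minimal illustration under stated assumptions: the in-memory queue and the message fields (`retry_count`, a dead-letter list) are hypothetical stand-ins, not any particular framework's API.

```python
import json

MAX_RETRIES = 3

class InMemoryQueue:
    """Stand-in for a real message queue (hypothetical, for illustration)."""
    def __init__(self):
        self.messages = []
        self.dead_letters = []

    def enqueue(self, raw):
        self.messages.append(raw)

    def dead_letter(self, raw):
        self.dead_letters.append(raw)

def handle(raw, queue, process):
    # The retry counter travels inside the message itself: dequeue, try to
    # process, and on failure re-enqueue with the counter incremented until
    # MAX_RETRIES is reached, then route the message to a dead-letter queue.
    message = json.loads(raw)
    try:
        process(message)
    except Exception:
        retries = message.get("retry_count", 0)
        if retries < MAX_RETRIES:
            message["retry_count"] = retries + 1
            queue.enqueue(json.dumps(message))
        else:
            queue.dead_letter(json.dumps(message))

# Drive a message that always fails through the loop.
queue = InMemoryQueue()

def always_fails(message):
    raise RuntimeError("downstream service unavailable")

queue.enqueue(json.dumps({"campaign_id": 42}))
while queue.messages:
    handle(queue.messages.pop(0), queue, always_fails)

print(len(queue.dead_letters))  # prints 1: the message was dead-lettered
```

Even in this toy form, the overhead the author describes is visible: the retry bookkeeping is as long as the dispatch logic it wraps, and the database-counter variant adds a read-modify-write per attempt on top of that.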


Attackers deploy rootkits on misconfigured Apache Hadoop and Flink servers

In the attack chain against Hadoop, the attackers first exploit the misconfiguration to create a new application on the cluster and allocate computing resources to it. In the application container configuration, they put a series of shell commands that use the curl command-line tool to download a binary called “dca” from an attacker-controlled server into the /tmp directory and then execute it. A subsequent request to Hadoop YARN will execute the newly deployed application and therefore the shell commands. Dca is a Linux-native ELF binary that serves as a malware downloader. Its primary purpose is to download and install two rootkits and to drop another binary file called tmp on disk. It also sets a crontab job to execute a script called dca.sh to ensure persistence on the system. The tmp binary that’s bundled into dca itself is a Monero cryptocurrency mining program, while the two rootkits, called initrc.so and pthread.so, are used to hide the dca.sh script and tmp file on disk. The IP address that was used to target Aqua’s Hadoop honeypot was also used to target Flink, Redis, and Spring framework honeypots.


Merck's Cyberattack Settlement: What Does it Mean for Cyber Insurance Coverage?

The Merck and Mondelez cases are likely not going to be the last of their kind. More legal disputes between insurers and insureds, whether regarding war exclusions or other issues, could arise in the future. “I think that the cyber litigation is just getting started,” says Stern. More cases could drive change in the way cyber insurance companies approach risk tied to cyberattacks and what is considered cyberwarfare. When new risks challenge the existing approach to coverage, it drives industry change. “Maybe it takes a second or a third dispute to really achieve a definitive conclusion on that particular matter,” says Kannry. “Then, what can often happen is the insurance industry says, ‘You know what, that type of loss needs to be understood and defined separately.’” Compared to many other insurance products, cyber insurance is relatively new. That means there remains plenty of room for the development of innovative ways to offer cyber insurance coverage. But the road forward likely won’t be without bumps for insurers and insureds.


Organizations Must Be Prudent To Realize Value In Generative AI

Rather than being swayed by the allure of generative AI capabilities, remain steadfast about the core features that can genuinely transform and enhance your operations. This pragmatic approach should be considered a short- to mid-term strategy for any forward-thinking organization. The reality is that features closely coupled with generative AI capabilities are still on the horizon. It will be at least a couple of years before they become commonplace. To navigate this transformative landscape effectively as an analytics professional, you must equip yourself with a deep understanding of generative AI. This proficiency will enable you to distinguish between features loosely coupled with generative AI and features that are natively and seamlessly integrated into the technology stack. Furthermore, keep a vigilant eye on the vendors supplying your critical business software. A vendor's stance and commitment to generative AI can profoundly impact how your organization operates in the future. 


LLM hype fades as enterprises embrace targeted AI models

LLMs were created by research teams exploring the capabilities of AI technology rather than as models designed to solve specific business problems. As a result, their capabilities are broad and shallow — writing fairly generic emails or press releases, for example. For the modern business, they have limited capabilities beyond that, requiring more data to produce results with any depth. While the AI landscape used to be dominated solely by OpenAI, major names in the tech world are beginning to outperform ChatGPT with their own LLMs, including Google’s new Gemini model. However, due to the broad capabilities of these new large language models, the text- and image-based benchmarks used to determine a model’s prowess were just as general. These benchmarks ranged from simple multi-step reasoning to basic arithmetic. If an AI company’s gauge for a successful generative AI platform is how correctly it can complete rudimentary math equations, that has little to no relevance for the work of an enterprise organization.



Quote for the day:

"Before you are a leader, success is all about growing yourself when you become a leader, success is all about growing others." -- Jack Welch

Daily Tech Digest - January 11, 2024

Four Ways the Evolution of AI Is Changing the Corporate Governance Landscape

There is no doubt that AI has been touted as the long-awaited answer to everyone’s productivity and efficiency woes. Tools like ChatGPT can do everything from generating interview questions to writing a song. They can create pictures, deliver data, and solve complex problems. Yet AI is not without its issues, and some believe that the most pressing dangers associated with this technology have not even begun to emerge. AI giants have been very clear that society must pay close attention to AI development. It’s crucial for directors and investors alike to understand that while science fiction movies seem like they belong in a fantasy realm, the reality they depict may not be as far-fetched as it seems. Similarly, scientists cannot take for granted that a bent toward corporate profit won’t motivate boards to push AI developers in that same direction. Instead of making an attempt to battle the behemoth of monetary thirst, it may be a better idea to come up with creative ways to make social goals and AI safety profitable. If developers can’t overcome the opposing viewpoint, why not try to find a way to join them?


The Incident Lifecycle: How a Culture of Resilience Can Help You Accomplish Your Goals

There are three points within the incident lifecycle where we can focus time and energy to improve the learning cycle and gain some bandwidth to improve resilience in the system. It’s not easy, because you’ll generally have to make small adjustments and changes along the way. CTOs won’t generally approve $100,000 for cross-incident analysis (that won’t be a marketable improvement to stakeholders) without evidence that it’s helpful. ... You need perspectives from across the organization. The discussion shouldn’t include only the incident manager and the person who pushed the bad code. I find that folks in marketing, product management, and especially customer support have great insights into the impact of an incident. When you meet, make sure it's an open conversation – the person facilitating should be talking less than anyone else in the room. This way, you will capture how this incident affected different groups. You may learn, for example, that the on-call engineer lacked dashboard access or customer support got slammed with complaints.


Nurturing Leadership Through The Power Of Reading

The most straightforward yet impactful way reading can contribute to self-development is through gaining knowledge. Whether extracting insights from books, articles or research papers, immersing oneself in written content is a foundational pillar of continuous development. This direct approach is not just about gathering information; it's also about internalizing concepts and lessons to create a reservoir of intellectual wealth for informed decision-making and sustained professional evolution. The simple power of reading remains a reliable means of absorbing knowledge—a timeless practice that can help propel individuals toward continuous growth and success. ... Reading also facilitates internal exploration. Self-help and philosophical literature invite introspection, which can nurture profound self-awareness. Atomic Habits by James Clear, for example, provides actionable insights for leaders seeking to enhance their habits and maximize their potential, fostering a deeper understanding of personal strengths and weaknesses. 


CI Is Not CD

A crucial difference I’ve often observed is that CI and CD tools have different audiences. While developers are often active on both sides of CI/CD, CD tools are frequently used by a wider group of people. ... CD tools have a range of subtle features that make it easier to handle deployment scenarios. They have a way to manage environments and infrastructure. This mechanism applies the correct configuration for each deployment and provides a way to handle deployments at scale, such as managing tenant-specific infrastructure or deployments to different locations (such as retail stores, hospitals or cloud regions). Alongside practical deployment features, CD tools also make the state of deployments visible to everyone who needs to know what software versions are where. This removes the need for people to ask for status updates, just as your task board handles work items. If you want to know your bank balance, you don’t want to phone your bank; you want to self-serve the answer instantly. The same is true for your deployments.


Managing CEO expectations is this year’s Priority No. 1

Today’s CEOs are more likely to get their IT visions from stories written by credulous writers authoring for online business media. That’s if we’re lucky. If we aren’t, they’ll want Tony Stark’s ability to conjure up high-tech solutions by gesticulating into a 3D touch interface while arguing with the AI that ran Iron Man’s lab. That leaves it up to you, your company’s hard-working CIO, to temper the CEO’s expectations from what they infer from the Marvel Cinematic Universe to Earth 2024. Because CEOs’ real reality (“real” by definition) is likely to be disappointing compared to the MCU and other semi-fictional realities they see, hear of, or imagine, CIOs can worry a little less about how IT might disappoint them on this score. ... Okay, fair’s fair and fun’s fun. But few CEOs will be completely consumed by these semi-whimsical depictions of information technology’s future. They’ll continue to have practical concerns, too, like where all the money is that cloud computing was supposed to save them. Some disappointments, that is, are both evergreen and rooted in real reality. 


Embracing offensive cybersecurity tactics for defense against dynamic threats

The essence of a coalition approach in offensive cyber operations is straightforward: combining forces to enhance cyber defense capabilities. This approach is critical in today’s world, where cyber threats transcend national borders. By pooling resources, knowledge, and intelligence, a coalition approach facilitates a more comprehensive and effective response to cyber threats. In the financial industry, for example, FS-ISAC supports all of this. Effective implementation involves establishing clear communication channels, defining shared objectives, and ensuring mutual trust among participating entities. ... Looking ahead, the line between offense and defense in cybersecurity is blurring. The future I envision is one where these two are not distinct entities but different aspects of a singular, holistic strategy. Offensive tools will be used not just to attack but to inform, to scout for threats and act before they materialize. This integrated approach is akin to a martial artist’s stance, ready to block and strike simultaneously.


CES 2024: Will the Coolest New AI Gadgets Protect Your Privacy?

As Tschider points out, "COPPA doesn’t have any cybersecurity requirements to actually reinforce its privacy obligations. This issue is only magnified in contemporary AI-enabled IoT because compromising a large number of devices simultaneously only requires pwning the cloud or the AI model driving function of hundreds or thousands of devices. Many products don't have the kind of robust protections they actually need." She adds, "Additionally, it relies primarily on a consent model. Because most consumers don't read privacy notices (and it would take well over a hundred days a year to read every privacy notice presented to you), this model is not really ideal." For Tschider, a superior legal framework for consumer electronics might take bits of inspiration from HIPAA, or New York State's cybersecurity law for financial services. But really, one need only look across the water for an off-the-shelf model of how to do it right. "For cybersecurity, the NIS 2 Directive out of the EU is broadly useful," Tschider says, adding that "there are many good takeaways both from the General Data Protection Regulation and the AI Act in the EU."


Critical Components for Data Fabric Success

In a physical data fabric, users access data, run analytics on it, or use APIs at a consumption layer to deliver the data wherever it is needed. Prior to that, data is modeled, prepared, and curated in the discovery layer, and transformed and/or cleansed as needed in the orchestration layer. In the ingestion layer, data is drawn from one or more data sources (which can be on premises or in the cloud) and stored in the persistence layer, which is usually a data lake or data warehouse. Logical data fabrics integrate data using data virtualization to establish a single, trusted source of data regardless of where the data is physically stored. This enables organizations to integrate, manage, and deliver distributed data to any user in real time regardless of the location, format, and latency of the source data. Unlike a logical data fabric, a physical data fabric requires the ability to physically centralize all the required data from multiple sources before it can deliver the data to consumers. Data also needs to be physically transformed and replicated every time and be adapted to each new use case. 


Boost Your Business With Digital Twin Technology

Digital twins allow businesses to answer questions that can directly impact strategic and operational decisions. “Organizations can move from answering simple questions about asset performance to understanding how these assets -- machines, assembly lines, supply chains -- will operate in the future, and what actions the business can take to meet performance and uptime goals,” Mann explains. Manufacturers are the businesses most likely to gain value from digital twin technology. “Manufacturers look to understand the causes of downtime, model scenarios to improve efficiency, and reduce waste,” says Devin Yaung, senior vice president, group enterprise, IoT products and services, at technology and business solutions provider NTT, in an email interview. Digital twins of individual machines permit instant views into maintenance issues and potential failures. “The growth of connected IoT sensors and devices has allowed all industries to gain insights into assets,” Yaung says. “Because of this explosion of connectivity, we are seeing large adoption not only in manufacturing but also in utilities, mining, hospitals, ports, airports, logistics/transportation, agriculture, and many other industries.”


Hey Gen. Z, you’re looking for tech jobs in all the wrong places

The pace of digital adoption and technological change today is far greater than it's ever been, according to Ger Doyle, senior vice president of US-based IT staffing firm Experis. The rise of AI and genAI is likely to accelerate that trend, “so new graduates, as well as those in the workforce today, need to embrace a concept of life-long learning to stay relevant in the new world,” Doyle said. Pandor agreed: “Candidates should remain consistently curious throughout the job-searching process. Keeping up to date with the latest trends and developments in the digital world by reading technical news enables them to showcase their interest in the ever-changing sector when they do land a job interview. From a more practical perspective, talent can also continue to practice and enhance their technical skills while job hunting so that they are ready to hit the ground running.” Younger job candidates might not be aware of the breadth and diversity of roles available, Pandor said, and they shouldn’t rule out other opportunities early in their careers.



Quote for the day:

“Nobody talks of entrepreneurship as survival, but that’s exactly what it is.” -- Anita Roddick

Daily Tech Digest - January 10, 2024

It's Time to Take a Modern Approach to Password Management

Standards for decentralized identity are being advocated by recognized bodies such as W3C. While regulations and other aspects such as authorization, role, and attribute-based access are still developing, businesses and institutions now have the opportunity to create interoperable designs that can seamlessly integrate with this new model. In this architecture, the most trusted identity providers are likely to play a dominant role as decentralized issuers (DID), which will be crucial for the adoption of VCs. Users are more likely to trust these established brands to certify their digital credentials. However, new vendors, brands, and institutions may emerge to compete in this space and position themselves as market leaders. Furthermore, a witness ledger, which offers traceability and trust of VC transactions, will likely be supported by a technology similar to a blockchain network but more eco-friendly. This will enable digital merchants to verify the credibility of a credential and, ultimately, of their potential customers. 


Putting AI to Work: Systems of Intelligence and Actionable Agency

Pervasive AI will create a new System of Intelligence (SoI) that integrates data, technologies, platforms, and practices for the purposes of finding and understanding patterns, extracting insights, promoting efficiency and creativity, and facilitating decision-making. This will illuminate how organizations actually perform on a functional basis, through real-time data inputs, allowing people greater awareness and meaningful action. Here is why: The system of intelligence is designed to work in a way that is different from traditional data systems or systems of record. Rather than requiring users to know how to extract insights from the data, the system of intelligence is designed to identify, ask questions and provide insights in a way that is easy for all users to understand. Standards and practices of the SoI are still emerging, which gives leaders the rare chance to both learn from and guide the development of a new system of work in the coming year. This is necessary work since imagining that nothing will change with AI is akin to thinking that, at the dawn of television, radio would simply be transposed wholesale, with no particular effect on culture, process, or business models.


Leveraging Blockchain Technology to Counter the Threat of Deepfake Videos

One of the fundamental features of blockchain is its immutability. Once data is added to a blockchain, it becomes virtually impossible to alter or erase. Applying this characteristic to video content could create an immutable record of the original footage, ensuring that any subsequent alterations or manipulations would be immediately apparent. ... Blockchain’s timestamping capabilities can provide a reliable chronology of when content is created, modified, or accessed. Integrating blockchain into the video creation process allows for the creation of a verifiable and transparent timeline for each piece of content. This timestamping ensures that any attempt to manipulate videos would be easily traceable, enabling swift identification of the source of misinformation and aiding in the attribution of responsibility. ... Blockchain operates on a decentralized network of nodes, each maintaining a copy of the ledger. This decentralized nature can be harnessed for video verification, where multiple nodes across the network can independently verify the authenticity of a given video.
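The three properties described above — immutability, timestamping, and independent verification — all rest on content hashing and hash chaining. A minimal sketch of the idea follows, using a plain Python list as a stand-in for a distributed ledger; the record fields and chain structure are illustrative assumptions, not any real blockchain's format.

```python
import hashlib
import json
import time

def fingerprint(video_bytes: bytes) -> str:
    # A cryptographic digest of the footage; any alteration, however
    # small, produces a completely different digest.
    return hashlib.sha256(video_bytes).hexdigest()

def append_record(ledger: list, video_bytes: bytes) -> dict:
    # Each record links to the previous one, so tampering with an earlier
    # entry invalidates every later record hash (the immutability property),
    # and the stored timestamp fixes when the content was registered.
    prev_hash = ledger[-1]["record_hash"] if ledger else "0" * 64
    record = {
        "content_hash": fingerprint(video_bytes),
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    record["record_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(record)
    return record

def verify(ledger: list, video_bytes: bytes) -> bool:
    # Any node holding a copy of the ledger can check a video
    # independently: authentic footage matches a registered digest.
    digest = fingerprint(video_bytes)
    return any(r["content_hash"] == digest for r in ledger)

ledger = []
original = b"...raw video frames..."
append_record(ledger, original)
print(verify(ledger, original))                 # True: matches the record
print(verify(ledger, original + b" tampered"))  # False: digest differs
```

In a real deployment the ledger would be replicated across many nodes, and consensus among them — rather than a single trusted list — is what makes the record hard to rewrite.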


How to Build Team Culture in a Remote-Work World

A positive team culture leads to happier employees. This may result in increased productivity over the long run. Because a positive work environment leads to things like friendships and increased levels of support between coworkers, you're more likely to see lower turnover rates and higher employee retention rates when you emphasize team culture. Positive team cultures also reduce levels of stress and anxiety among employees. With a less stressful environment, your skilled workers are more likely to remain with your company long-term. Additionally, they'll share their positive experiences as an employee. As this word-of-mouth spreads, your positive team culture may eventually result in your business becoming a sought-after place to work. In addition to these hiring and personnel benefits, positive team cultures correlate directly with profitability. You might engage in a chance discussion with a coworker over the water cooler or in the breakroom — leading to opportunities, collaborations and innovations that otherwise might not have happened.


Faster than ever: Wi-Fi 7 standard arrives

For home networks, Wi-Fi 7 enhances the performance of smart home devices, providing a more reliable connection for Internet of Things technologies. The improved bandwidth and speed are perfect for families, like mine, which have multiple devices streaming high-definition content simultaneously. Your overall Wi-Fi performance, whether it's just you or your family and friends, will see a dramatic improvement. In businesses, Wi-Fi 7 can support more devices with minimal interference. This capability makes it ideal for large offices and coworking spaces. The improved speed and stability facilitate seamless video conferencing and efficient cloud-based applications, which are essential for modern companies. All that's the good news. The bad news is that the 6 GHz wireless spectrum uses shorter wavelengths. Short wavelengths are great for fast data transfers at close range, so they're great for connecting to your Wi-Fi 7-enabled HDTV a few feet away from your router. But short wavelengths are poor at connecting at long distances and suffer greater interference from physical obstructions, such as dense walls or floors in a building.


3 Essential Attitudes & Dispositions of Good Corporate Governance

While leadership and governance are two concepts that often work in tandem, every director must understand that they are not the same thing. Leadership refers to a person’s ability to influence the attitudes and actions of others to lead them toward a common goal. Governance, on the other hand, should be about making decisions that lead to increased corporate performance and meeting or exceeding agreed-upon targets. Being a good leader without shifting their mindset toward governance can make directors behave in selfish and territorial ways, putting undue focus on their own desires and beliefs. On the other hand, having the power to govern without the skill of leadership can often lead to passivity and a bent toward bureaucracy, which can easily stop board progress in its tracks. ... Many directors — especially those who are new to the position — struggle with speaking up when they see something happening that needs the board’s attention. They may be worried about receiving private or public backlash or derailing the company’s progress toward meeting targets. 


Why most companies suck at digital transformation

Focus on architecture in the wide, without forgetting architecture in the narrow. Enterprises need to understand the holistic architecture required to support positive DX outcomes and not just focus on individual systems. This is an outcome of a comprehensive strategy, in that we’re utilizing all systems in place, including legacy and other on-premises assets, and establishing how they will work and play well with migrated or net-new systems existing on public clouds. If companies focus only on small systems or architectures, they usually neglect to understand how they will exist within a strategically defined DX ecosystem. This results in decoupled projects that may be impressive on their own but provide little or no value to the larger strategy that is more important than just the parts that make it up. ... The most significant issue is that most don’t even understand what digital transformation is, even those with the term in their titles. Instead, they focus on the tactics, meaning tools and technology, never understanding the plan to make things incrementally better.


Researchers develop technique to prevent software bugs

Baldur took several months to build. The work was done as a collaboration with Google, and built on top of a significant amount of prior research. First, whose team performed its work at Google, used Minerva, an LLM trained on a large corpus of natural-language text, and then fine-tuned it on 118GB of mathematical scientific papers and webpages containing mathematical expressions. Next, she further fine-tuned the LLM on a language, called Isabelle/HOL, in which the mathematical proofs are written. Baldur then generated an entire proof and worked in tandem with the theorem prover to check its work. When the theorem prover caught an error, it fed the proof, as well as information about the error, back into the LLM, so that it can learn from its mistake and generate a new and hopefully error-free proof. This process yields a remarkable increase in accuracy. The tool for automatically generating proofs is called Thor, which can generate proofs 57% of the time. 
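The generate-check-repair loop described above can be sketched as follows. The model and prover interfaces here are hypothetical placeholders, not the actual Baldur, Minerva, or Isabelle/HOL APIs; the toy stand-ins at the bottom exist only to make the control flow runnable.

```python
def prove_with_repair(theorem, generate_proof, check_proof, max_attempts=3):
    """Generate a whole proof, check it with a theorem prover, and feed
    any error message back to the model so it can attempt a corrected
    proof. `generate_proof(theorem, error)` and
    `check_proof(theorem, proof)` are assumed interfaces for the LLM and
    the prover, respectively."""
    error = None
    for _ in range(max_attempts):
        proof = generate_proof(theorem, error)   # LLM proposes an entire proof
        ok, error = check_proof(theorem, proof)  # prover verifies it
        if ok:
            return proof
    return None  # no valid proof found within the attempt budget

# Toy stand-ins: this "model" succeeds only after seeing a prover error.
def toy_generate(theorem, error):
    return "correct proof" if error else "flawed proof"

def toy_check(theorem, proof):
    return (True, None) if proof == "correct proof" else (False, "step 2 fails")

print(prove_with_repair("x + 0 = x", toy_generate, toy_check))
```

The key design point, as the article describes, is that the prover's error message becomes part of the model's next prompt, so each attempt is conditioned on why the previous one failed rather than being an independent retry.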


Reconciling Agile Development With AI Safety

On one hand, the Agile principles of an iterative approach, regular risk management checks, cross-expertise collaboration, solicitation of third-party feedback at every stage, and adaptability to changing priorities or new findings seem well-suited to the seamless incorporation of responsible AI practices. However, we think responsible AI development will require a full revamp of the software development lifecycle, from pre-training assessments of data to post-deployment monitoring for performance and safety. Some practices, such as automated algorithmic checks (like tests for data bias and model performance metrics, all of which are part of Stanford’s HELM set of evaluations) can be utilized anywhere in the development lifecycle. Other techniques may be purely ex post, like algorithmic audits. An Agile approach avoids engineering siloes, allowing for stage-specific practices to be adopted where necessary, while ensuring stage-agnostic practices are adopted at all relevant stages, and allowing these stages to proceed in tandem.


Modern-day manufacturing: A process built on data governance

The average manufacturer generates high volumes and different types of data, including customer information, production orders, and shipment tracking, to name a few. This is further compounded with every supplier, distributor, and third party that’s added to the supply chain. Without a system to validate all this data, a manufacturer can find itself with inaccurate or incomplete data. Poor data quality not only leads to operational inefficiencies and mistakes, it also hinders the organization’s growth by limiting its ability to forecast demands and plan production runs. ... Within complex manufacturing ecosystems, it can be unclear who owns data as it flows across the supply chain. Various teams generate and use different types of data, making ownership and responsibility a challenge to pin down. ... Establishing data ownership involves identifying primary stakeholders who are responsible for ensuring the quality, security, and correct use of data assets. 



Quote for the day:

"We live in a society obsessed with public opinion. But leadership has never been about popularity." -- Marco Rubio

Daily Tech Digest - January 09, 2024

10 ways to destroy developer happiness

Who doesn’t get annoyed by endless meetings? Developers are busy people, and most would rather spend their time coding than talking about it. Meetings that are not focused and efficient are a frequent source of disenchantment. “Meetings that drag on without contributing to progress can be very draining,” says Vlad Gukasov, software development engineer at Amazon. “These often take up valuable time that could be better spent on actual development work.” ... Unnecessary red tape can be incredibly frustrating to developers. “Navigating through layers of bureaucracy can be quite stifling,” Gukasov says. “The complexity of internal procedures can sometimes hinder the smooth progress of software development.” Developers like efficiency, says Remi Desmarais, director of engineering and software development at software company Tempo Software. “They frequently encounter delays, from waiting for clarification on requirements, to code processes like compilation, building and testing to seeking approval from code reviewers, which can hinder their progress,” he says.


Data Center Cooling: Embracing Liquid Cooling for the Era of Sustainable and Efficient Operations

While the adoption of liquid cooling undeniably offers an enticing remedy for the thermal challenges posed by high-density racks, its integration into data center management introduces a new set of considerations and complexities. The deployment of liquid cooling systems necessitates a bespoke infrastructure, comprising specialized components such as pumps, heat exchangers, and filtration systems. These elements work in concert to ensure the seamless circulation and efficient heat dissipation of the liquid coolant throughout the intricate network of electronic components. Beyond the physical requirements, the use of liquid coolants imposes a critical need for stringent safety protocols and specialized training for personnel entrusted with the operation and maintenance of these systems. The introduction of liquid into the data center ecosystem marks a shift that extends beyond hardware considerations, demanding a holistic approach to facility management and personnel training to guarantee the safe and effective functioning of these advanced cooling solutions.


Meet the industrial metaverse: How Sony and Siemens seek to unleash the power of immersive engineering

According to Siemens CEO Roland Busch, "This will empower customers to accelerate innovation, enhance sustainability, and adopt new technologies faster and at scale, leading to a profound transformation of entire industries and our everyday lives. Together with our customers and partners, Siemens is proud to announce new products that will bring the industrial metaverse a step closer to all of us." ... Another interesting phrase Siemens has been using is immersive engineering. The idea is that engineering, design, and content creation can be done inside a 3D environment using a toolkit called the Siemens Xcelerator portfolio. Siemens offers a product called NX Immersive Designer. This tool is designed to "seamlessly connect the real and digital worlds." Essentially, users can create a digital twin, a version of a real-world system modeled and simulated in the virtual world. Digital twins replicate physical entities with accurate virtual models, helping engineers simulate performance testing and make predictions about points of failure. The virtual nature of the twin allows many more variations and usages compared to the cost and risk of building a single physical prototype. 
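The digital-twin idea can be illustrated with a toy model: a virtual motor whose thermal behavior is simulated across many scenarios far more cheaply than building physical prototypes. The thermal formula and the temperature limit below are invented for illustration, not Siemens' actual models.

```python
import math

# Minimal digital-twin sketch: a virtual model of a motor, run across many
# operating scenarios far more cheaply than physical prototypes.
# The thermal formula and the 95 C limit are purely illustrative.

def simulate_motor_temp(load_pct, ambient_c, minutes):
    # Toy first-order thermal model: temperature rises with load and
    # saturates over time.
    rise = 0.6 * load_pct * (1 - math.exp(-minutes / 30))
    return ambient_c + rise

def predict_failures(twin, scenarios, limit_c=95.0):
    # Run the twin over each scenario and flag predicted over-temperature.
    return [s for s in scenarios
            if twin(s["load_pct"], s["ambient_c"], s["minutes"]) > limit_c]

scenarios = [
    {"load_pct": 50, "ambient_c": 25, "minutes": 120},
    {"load_pct": 100, "ambient_c": 40, "minutes": 120},
]
risky = predict_failures(simulate_motor_temp, scenarios)
```

Sweeping hundreds of such scenarios against a model is exactly the cost-and-risk advantage a twin has over a single physical prototype.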


Leadership opportunities must align with an individual's natural strengths: Gallup’s Rohit Kar

In addressing the correlation between effective leadership and employee engagement, our approach to leadership development strategy emphasises the enduring elements that remain constant amid technological advancements and societal shifts. We prioritise understanding the unchanging fundamental aspects of human nature, acknowledging that regardless of external influences, certain core needs persist. At Gallup, when we delve into engagement, we recognise that employees bring their emotions into the workplace. To measure these emotions productively, we simplify the process, ensuring it remains straightforward and outcome-oriented. Our focus lies in enabling managers to engage in meaningful conversations by providing adaptable frameworks and tools. We ensure scalability, speed, and agility in our delivery methods to cater to changing consumption patterns without altering the essence of what we deliver – because the core principles of management don't require alteration. Our unique approach lies in retaining fundamental principles while adapting to the evolving landscape. 


New York Times’ blockbuster suit could decide the fate of genAI

Microsoft and OpenAI say their use of copyrighted material is transformative. They contend the output of the chatbots transforms the original content into something different. The Times suit claims there’s no real transformation, that what Microsoft and OpenAI are doing is outright theft. It claims the companies are not just stealing Times content, but their audience as well, and making billions of dollars from it. People will have no need to read the Times either online or in print, if they can get all the newspaper’s information for free from a chatbot instead, the suit alleges. “There is nothing ‘transformative’ about using The Times’s content without payment to create products that substitute for The Times and steal audiences away from it. Because the outputs of Defendants’ GenAI models compete with and closely mimic the inputs used to train them, copying Times works for that purpose is not fair use.” ... So, who’s right? This is not a difficult call. The answer is simple. The Times is right. Microsoft and OpenAI are wrong. Microsoft and OpenAI are getting a free ride to use copyrighted material that takes a tremendous amount of time and money to create, and are using that material to reap big profits.


After Orange Disruption, Brace for More BGP Route Hijacking

Orange's attacker appeared to have obtained and used a valid password for the telco's administrator account with RIPE, for which two-factor authentication wasn't enabled. Security experts report that the source of the password appears to have been information-stealing malware called Raccoon. After gaining access to the account, the attacker used RIPE's hosted RPKI resource certification service to broadcast a valid, cryptographically signed route origin authorization to direct traffic to an autonomous system number not controlled by Orange, resulting in the traffic never reaching its intended destination. ... The telecommunications giant subsequently told Information Security Media Group in a statement: "The Orange account in the IP network coordination center (RIPE) has suffered an improper access that has affected the browsing of some of our customers." The company said it had immediately responded and resolved the problem later Wednesday and that "appropriate measures have been taken to prevent such an incident from happening again."
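The mechanism abused here, route origin authorization, can be sketched conceptually. The snippet below implements the RFC 6811 valid/invalid/not-found logic over plain-dict ROAs; the prefix and ASN values are illustrative, and real validators consume cryptographically signed objects rather than dictionaries.

```python
import ipaddress

# Conceptual route-origin validation (RFC 6811 states): an announcement is
# "valid" only if some ROA covers the prefix within its max length AND the
# origin ASN matches. ROAs here are plain dicts; real validators consume
# cryptographically signed objects. Prefix and ASN are illustrative.

ROAS = [
    {"prefix": ipaddress.ip_network("193.251.0.0/16"), "max_len": 24, "asn": 3215},
]

def validate(prefix_str, origin_asn):
    prefix = ipaddress.ip_network(prefix_str)
    covered = False
    for roa in ROAS:
        if prefix.subnet_of(roa["prefix"]) and prefix.prefixlen <= roa["max_len"]:
            covered = True                 # some ROA speaks for this prefix
            if origin_asn == roa["asn"]:
                return "valid"
    return "invalid" if covered else "not-found"
```

An announcement from the wrong ASN comes back "invalid", which is why the attacker did not merely hijack a route: they used the compromised RIPE account to publish a signed authorization of their own, making a bogus origin look legitimate.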


Data Governance & Controls: An Increasingly Critical Foundation for AML Compliance Programs

Ensuring effective AML compliance necessitates a steadfast commitment to data accuracy and integrity, achievable only through robust governance. Financial institutions can only rely on the consistency and correctness of their AML compliance solutions by establishing stringent controls ensuring accuracy and integrity. The significance of data accuracy and integrity can easily be seen in relation to false positive and false negative alerts in transaction monitoring. False positives, stemming from inaccurate data triggering unnecessary alerts, can lead to a significant waste of resources and needless increases in program costs. Conversely, false negatives, arising from incomplete or unreliable data, may cause suspicious activity to go unreported, which could lead to potential regulatory action. Financial institutions that build strong data accuracy and integrity standards into their processes minimize these risks, thereby enhancing the overall efficiency and efficacy of their AML compliance program.
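A toy monitoring rule makes the false positive/negative point concrete. Everything below (the threshold, the customers, the records) is invented: a dropped amount hides a genuinely reportable customer, while a duplicated record flags an innocent one.

```python
from collections import defaultdict

# Toy transaction-monitoring rule: alert on any customer whose total
# exceeds a reporting threshold. Threshold and records are invented.

THRESHOLD = 10_000

def alerts(transactions):
    totals = defaultdict(int)
    for t in transactions:
        totals[t["customer"]] += t["amount"]
    return sorted(c for c, total in totals.items() if total > THRESHOLD)

clean = [
    {"customer": "A", "amount": 12_500},  # genuinely reportable
    {"customer": "B", "amount": 6_000},
]
dirty = [
    {"customer": "A", "amount": 0},       # amount lost in a bad feed: false negative
    {"customer": "B", "amount": 6_000},
    {"customer": "B", "amount": 6_000},   # duplicated record: false positive
]
```

On the clean data the rule flags only customer A; on the dirty copy it misses A entirely and wrongly flags B, illustrating both failure modes at once.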


Designing an IT department for a world defined by change

For years, inside the technology department, we’ve referred to “the business”, meaning everything other than IT, and for its part, “the business” has spent years trying to isolate IT as a mere cost centre. But nearly a quarter of the way through the 21st century, the need to integrate digital tools into every part of business means this splendid isolation no longer really works. Even in the most performant IT departments, centralised decision-making around tooling and technology choices falls behind the galloping pace of the market, and it's difficult to imagine how it could ever be otherwise in a department of finite resources. ... Still, by refusing to allow the curation of information technology outside of that provided by the technology function, it becomes a self-fulfilling prophecy - the only people expected to understand the challenges of technology are those hired into the IT department. The only way to break that vicious cycle is to discard the model of the central IT department. Once you start questioning the existence of the IT department, it’s only a short journey to rejecting the concept of functional structures entirely.


Building a Better Analytics Team

Once you have the right group of people on your team, your challenge is to build an environment where they can work effectively. Ensure that they are clear about the organization’s mission, vision, values, and objectives. Create an environment where the connection between your team’s objectives, projects, and tasks and the targets and objectives of the organization are clear. Team members need to understand how the data architecture being built will have a direct impact on the decisions the organization is making and how their activities further the goal of becoming a more data-driven organization. Team building is not a one-time activity. It is built into everyday activities, including goal setting, performance reviews, project planning and execution, team meetings, and one-on-one meetings between you and your direct reports. The clarity of purpose is something that increases each time it is reviewed and as the team’s objectives and goals are refined and realigned. As a leader, one of your main objectives is to remove obstacles that arise. 


The Rise of Dual Ransomware Attacks

Two ransomware attacks mean that companies could be facing more data encryption, more data exfiltration, more data leaks, and multiple demands for payment. If an enterprise is dealing with a single attacker behind two attacks, they could be facing the challenge of different ransomware strains. How did the two strains impact an organization’s systems, and what will it take to remediate and return to normal operations? How much data was taken? Ransomware groups can take that exfiltrated data to leak sites to up the pressure on victims to pay. If two different attackers are in play, recovery can be even more complicated. Two different attackers may encrypt the same files. “We've had a couple of cases where the first ransomware incident occurs. They encrypted a bunch of files. In the midst of dealing with that, a second attack…those attackers encrypt the encrypted files of the other attackers,” Minder shares. “If that happens, where you get encrypted files encrypted again, the likelihood of corruption goes up a thousand percent, and you may not get your files back at all.”



Quote for the day:

“Our greatest fear should not be of failure but of succeeding at things in life that don't really matter.” -- Francis Chan

Daily Tech Digest - January 08, 2024

Can Generative AI and Data Quality Coexist?

Not only is it possible for generative AI and data quality to co-exist, it’s imperative that they do, says Marinela Profi, AI strategy advisor for analytics software developer SAS, in an email interview. Data for AI is like food for humans, she notes. “Based on the quality of the food you feed your body and your brain, you will receive a certain quality of outputs, such as higher performance or more focus.” Simply put, if you've neglected the quality of your enterprise data, or haven't defined a proper data strategy, you won't get value out of generative AI, Profi says. “On the flip side, those who have implemented a strong data management discipline are uniquely positioned to gain a competitive advantage with generative AI.” ... New training techniques that require less data or that can learn more effectively from existing datasets might reduce the pressure on data quantity but increase the need for highly representative and unbiased data samples, Profi says. “Self-supervised and unsupervised learning techniques, in which models generate their own labels or learn from unlabeled data, reduce reliance on manually labeled datasets,” she notes.


10 top priorities for CIOs in 2024

We are in a cybersecurity pandemic right now, warns Juan Orlandini, CTO for North America at solutions and systems integrator Insight Enterprises. He encourages CIOs to focus on cyber preparedness to ensure they’ve done everything possible to prevent assaults. “Attackers are getting more sophisticated, and all it takes is one mistake for them to get in,” Orlandini notes. “Assume that attacks are inevitable.” Work toward having the right cybersecurity team in place, Orlandini advises. “This could be an in-house team or trusted advisors who can make sure you’ve done what you can to protect yourself.” ... Caldas believes it’s important to renew and maintain existing workforce competencies as well as to establish a high-performance culture that’s ready to deliver results in today’s fast-paced technology ecosystem. “We’re shifting away from building development plans based on job profiles alone and are now pivoting to build plans on top of a foundation of skills,” she states. “Skill-building will inform how we provide training as well as how team members can grow their careers.”


Insights From Quantum Computing Could Create Light-Controlled Memory Tech

The discovery is tightly linked to the realm of quantum technologies and combines principles from two scientific communities that have so far had little overlap: “We arrived at this understanding by using principles that are well established within the quantum computing and quantum optics communities but less so in the spintronics and magnetism communities.” The interaction between a magnetic material and radiation is well established when the two are in perfect equilibrium. However, the situation where both radiation and a magnetic material are out of equilibrium has so far been described only partially. This non-equilibrium regime is at the core of quantum optics and quantum computing technologies. From our examination of this non-equilibrium regime in magnetic materials, while borrowing principles from quantum physics, we have underpinned the fundamental understanding that magnets can respond even to the short time scales of light. Moreover, the interaction turns out to be very significant and efficient.


IoT Data Governance: Taming the Deluge in Connected Environments

IoT environments typically comprise diverse devices and systems, often operating on different standards and protocols. Data Governance must address the challenge of integrating this disparate data to ensure seamless interoperability. This requires establishing common data models and communication protocols. For example, integrating data from wearable devices, electronic health records, and diagnostic equipment in a healthcare IoT setup is crucial for comprehensive patient monitoring and care. IoT Data Governance must also include policies for data storage and lifecycle management. This involves determining how long data should be stored and when it should be archived or deleted, as well as ensuring that storage solutions are scalable and cost-effective. Likewise, it’s important to understand that proper Data Governance doesn’t relate only to touchpoints between the device, the network, and the cloud. Instead, proper security protocols must be applied throughout the organization. From integrated document editors and AI assistants to plugins and VPNs, everything must be airtight.
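The storage-and-lifecycle policies described here reduce to per-class rules that can be checked mechanically. A minimal sketch follows; the record classes and day counts are made-up examples, not regulatory guidance.

```python
from datetime import date

# Sketch of data-lifecycle rules: each record class gets an archive window
# and a deletion window. Classes and day counts are made-up examples.

POLICY = {
    "telemetry":      {"archive_days": 30,  "delete_days": 365},
    "patient_record": {"archive_days": 365, "delete_days": 3650},
}

def lifecycle_action(record_class, created, today):
    # Decide what to do with a record based on its age and class.
    rule = POLICY[record_class]
    age_days = (today - created).days
    if age_days >= rule["delete_days"]:
        return "delete"
    if age_days >= rule["archive_days"]:
        return "archive"
    return "retain"

today = date(2024, 1, 12)
```

Encoding the policy as data rather than scattered code is what lets a governance team audit and update retention rules in one place.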


5 Reasons Not to Use Serverless Computing

Serverless computing is cost-efficient in the sense that you only pay for the time your workloads are active. However, the per-minute cost of serverless is almost always higher than the cost of running an equivalent workload on a VM. For this reason, serverless may result in greater total costs than other types of cloud services, especially for workloads that are active most of the time. ... Each serverless computing platform works in a different way. That makes it challenging to migrate workloads from one serverless environment (like AWS Lambda) to another (such as Azure Functions). By comparison, the differences between other types of cloud services (such as AWS EC2 and Azure Virtual Machines) are less pronounced, leading to lower levels of lock-in when you use those services. ... Although serverless workloads theoretically run on-demand, in practice there is typically a delay between when serverless code is triggered and when it actually runs. This is especially true in the case of "cold starts," which happen when serverless code hasn't run recently. Sometimes, the delays in startup time can be a second or longer.
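The cost argument reduces to simple arithmetic: serverless wins only below a break-even utilization equal to the VM rate divided by the serverless rate. The two prices below are illustrative, not any provider's actual rates.

```python
# Break-even arithmetic: serverless bills only for active time, but at a
# higher hourly rate. Both rates below are illustrative, not any
# provider's actual pricing.

VM_PER_HOUR = 0.04          # flat rate, billed around the clock
SERVERLESS_PER_HOUR = 0.15  # billed only while the workload is active

def monthly_cost_vm(hours_in_month=730):
    return VM_PER_HOUR * hours_in_month

def monthly_cost_serverless(active_fraction, hours_in_month=730):
    return SERVERLESS_PER_HOUR * active_fraction * hours_in_month

# Serverless is cheaper only below this utilization:
break_even = VM_PER_HOUR / SERVERLESS_PER_HOUR  # about 27% active time
```

At these example rates, a workload active 10% of the month is far cheaper serverless, while one active 90% of the month costs roughly three times the VM.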


Industry 4.0 at Scale Can Make Smart Manufacturing Pervasive

We're barely scratching the surface of realizing the potential of generative AI. Consider a worker's typical task: locating and utilizing manuals to address issues at a workstation. This process usually takes hours, sifting through pages and troubleshooting cryptic error codes. Now, picture digitizing these manuals, integrating them into an Amazon Bedrock system, and overlaying an Anthropic Claude interface. With this setup, a worker standing before a machine encountering an error can simply input the error code and receive a clear explanation of the problem. This saves an immense amount of time, directly translating to cost savings and increased uptime. But the real potential lies in integrating this knowledge into workflows seamlessly. Imagine moving from receiving an error message to being provided with repair instructions while simultaneously updating maintenance records and checking inventory for required spare parts - all automatically. This transformative capability eliminates the need for manual intervention, which currently involves hours of inventory checks, finding personnel for repairs, and managing associated logistics.
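The workflow sketched in that excerpt (error code in; explanation, repair steps, maintenance log, and parts check out) might look like this in miniature. The manual entries, part numbers, and inventory below are all invented.

```python
# Miniature version of the workflow above: error code in, explanation and
# repair steps out, with the maintenance log and spare-parts check handled
# automatically. Manual entries, part numbers, and stock are invented.

MANUAL = {
    "E042": {"problem": "Conveyor belt misalignment",
             "steps": ["Stop the line", "Re-tension the belt"],
             "parts": ["belt-clip"]},
}
INVENTORY = {"belt-clip": 3}
maintenance_log = []

def handle_error(code):
    entry = MANUAL.get(code)
    if entry is None:
        # Unknown code: hand off to a human instead of guessing.
        return {"status": "escalate", "code": code}
    missing = [p for p in entry["parts"] if INVENTORY.get(p, 0) == 0]
    maintenance_log.append({"code": code, "problem": entry["problem"]})
    return {"status": "repair", "steps": entry["steps"],
            "missing_parts": missing}

result = handle_error("E042")
```

In a real deployment the dictionary lookup would be an LLM query over the digitized manuals; the point is that logging and inventory checks ride along automatically.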


Embracing the future: Top 5 home automation trends for 2024

As we strive towards a sustainable future, the inclusion of energy-saving devices has become a priority. Energy-efficient smart home devices include unique features that make energy conservation easier than ever. Lighting systems, for example, independently dim or switch off when rooms are unused, saving electricity and extending bulb life. Schedules that switch lights off at a preset time every day ensure no energy is wasted. By incorporating energy-efficient smart home gadgets into daily life, people can not only reduce their environmental impact but also benefit from lower electricity costs. ... Artificial intelligence (AI) is stepping in to make people’s lives even more convenient and efficient. Virtual assistants powered by AI, such as Amazon Alexa and Google Assistant, have already become commonplace in many homes. These smart companions can answer inquiries and provide weather updates, as well as operate smart gadgets and much more. In the coming years, AI will become more important in home automation. Your home can learn your habits and adjust to create the perfect atmosphere.


AI Will Create Demand and Empower Developers, Not Replace Them

AI takes care of the annoying, tedious, routine tasks that may otherwise take up a significant amount of developers’ time, so they’re able to better concentrate on the real work at hand. In fact, 92% of developers are already using AI to lighten their load. ... AI can do this in seconds. All a developer needs to do is tell the technology what they want to accomplish and what language they want to use, and they can get some understanding of the best approach to their problem. Learning: although there’s a need to check for accuracy, AI can help developers understand code snippets and programming concepts without having to do the research themselves. Documentation: nobody likes documentation. It’s tedious and difficult. However, AI-generated documentation can help bring attention to things that didn’t work during the development process while reducing development times afterward. Code quick-starts: this gives a major leg up to developers who have an idea but don’t know exactly where to begin. AI can generate code within seconds, regardless of the language. Even if the parameters of the project need a review, it gives you a head start.


Future challenges and innovations in cloud security platforms

The role of identity and access management systems in cloud security has evolved from simple gatekeeping to intelligent filtering. These systems now serve as sophisticated sentinels equipped to discern legitimate users from intruders. They’re no longer static barriers but dynamic shields, adapting to new threats and user behaviors. The advancement in these systems means they can now offer personalised access based on user roles and context, adding a layer of security that’s both smart and user-centric. Essentially, they act as the first line of defence, ensuring that only the right people can access the right data at the right time. ... The adage “it takes a village” holds particular relevance in cloud security, emphasising the importance of collective effort and cooperation in this field. Achieving unbreakable security is a collaborative mission, relying not on individual effort but on collective strength and teamwork. This collaboration transcends organizational boundaries, bringing together experts, companies, and competitors to forge a united front against cyber threats.


Social engineer reveals effective tricks for real-world intrusions

My main social engineering trick is just walking into a location like you belong there. People underestimate how far confidence will get you into a location and how unsuspecting people are when they feel secure. I’ve always said the only thing worse than no security is a false sense of security, because it is tough to imagine something terrible will happen when you have that false sense of security. One of the main tricks I use when doing a phishing attack is not to tell them that something positive has happened. I always make the topic of the e-mail something unfortunate: something that may be a mistake, something important that has happened and, if not fixed immediately, could have dire consequences. People are very suspicious when they get an e-mail that something good has happened or will happen to them. Still, throughout history, humans have always craved information almost at any cost when they felt a threatening situation was occurring around them. They need to discover what is happening and how it could affect them.



Quote for the day:

"Ambition is the path to success. Persistence is the vehicle you arrive in." -- Bill Bradley

Daily Tech Digest - January 07, 2024

2024 cybersecurity forecast: Regulation, consolidation and mothballing SIEMs

CISOs’ jobs are getting harder. Many are grappling with an onslaught of security threats, and now the legal and regulatory stakes are higher. The new SEC cybersecurity disclosure requirements have many CISOs concerned they’ll be left with the liability when an attack occurs. ... After the Cyber Resilience Act, policymakers and developers drive adoption of security-by-design. The CRA wisely avoided breaking the open source software ecosystem, but now the hard work starts: helping manufacturers adopt modern software development practices that will enable them to ship secure products and comply with the CRA, and driving public investment in open source software security to efficiently raise all boats. ... With the increase in digital business-as-usual, cybersecurity practitioners are already feeling lost in a deluge of inaccurate information from mushrooming multiple cybersecurity solutions coupled with a lack of cybersecurity architecture and design practices, resulting in porous cyber defenses.


Expert Insight: Adam Seamons on Zero-Trust Architecture

Zero trust goes beyond restricting access by need to know and the principle of least privilege. It’s about properly verifying access and being 110% certain that the access is legitimate. That means things like limiting access to specific criteria, such as by port or protocol, time period, IP address and/or physical location. ... A zero-trust network is about verification or double-checking. You want to be verifying not just the person, but also the device and limiting that access to specific permissions and rights that have been approved in advance. And you’re also restricting data access, particularly in situations like the example I just gave. Think of it like the difference between a key to the front door that gives you access to the whole house, and needing a key for the front door as well as separate keys for all the different rooms. ... AI and machine learning have both been used in detecting anomalies and suspicious patterns for some time, and will only continue to be used more. I expect SOCs to become increasingly reliant on AI. Getting more specific, log analysis is a key area for AI to automate. 
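The criteria listed here (port, time period, IP address, plus verifying the device as well as the person) compose naturally into a single policy check. A minimal sketch follows, with an entirely invented policy.

```python
from datetime import time

# One access decision combining the criteria above: user, device, port,
# source network, and time window must all check out. Policy values
# are invented for illustration.

POLICY = {
    "user": "jsmith",
    "device_ids": {"laptop-7731"},
    "ports": {443},
    "networks": ("10.20.", "10.21."),       # simple prefix match
    "window": (time(8, 0), time(18, 0)),
}

def allowed(user, device_id, port, src_ip, at):
    start, end = POLICY["window"]
    return (user == POLICY["user"]
            and device_id in POLICY["device_ids"]
            and port in POLICY["ports"]
            and src_ip.startswith(POLICY["networks"])
            and start <= at <= end)
```

Like needing separate keys for every room, no single criterion is sufficient on its own: failing any one of them denies access.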


6 innovative and effective approaches to upskilling

Beverage maker Torani has been mixing up L&D by flipping the traditional performance review — which can be “demoralizing” — on its head. It puts the onus on future rather than past performance and on employee learning aspirations, rather than manager assessment. ... Devine adds: “With today’s shift to agile working, some firms believe yearly performance objectives and appraisals are insufficient and inflexible. They need something more frequent, nimble, and focused on feedback, skills and future needs. But you still need managers to assess performance to justify and provide transparency on promotions and pay decisions.” ... Microsoft is helping workers across its organization gain skills related to AI — from non-techies to IT professionals and leaders. Simon Lambert, chief learning officer at Microsoft UK, says: “One lesson we’ve learned from our AI learning journey is that upskilling means far more than merely equipping employees with skills. It requires an ecosystem that fosters adaptability and continuous learning. In the face of AI-upskilling demand, employees need faster, seamless access to learning infrastructure.”


2024 Data Center Un-Predictions: Five Unlikely Industry Forecasts

The potential impact of data centers on local communities is an important issue. At a recent conference in Virginia, we had activists from the community right alongside data center leaders to discuss the challenges and opportunities we face. While there were still some disconnects, we met in the middle on some critical topics around power, community engagement, and ensuring we create a more sustainable future. ... Leveraging self-driving technology, robots independently chart and traverse the data center, gathering real-time sensor data. This lets them immediately juxtapose present patterns against pre-defined norms, facilitating swift identification of deviations for human examination. In an ever more interconnected and intricate environment, this robotic technology grants decision-makers enhanced visibility, rapidity, and a breadth of intelligence that surpasses what humans or stationary cameras can provide. This advanced capability is vital for maintaining the efficiency and security of data centers in our increasingly digital world.
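The robots' comparison step, juxtaposing present readings against pre-defined norms and flagging deviations for human examination, can be sketched in a few lines. The sensor names, norms, and readings below are illustrative.

```python
# The robot's comparison step: real-time readings checked against
# pre-defined norms, with deviations flagged for human examination.
# Sensor names, norms, and readings are illustrative.

NORMS = {
    "rack12_temp_c": (18.0, 27.0),
    "rack12_humidity_pct": (40.0, 60.0),
}

def flag_deviations(readings):
    # Return every (sensor, value) pair outside its allowed range.
    flagged = []
    for sensor, value in readings.items():
        low, high = NORMS[sensor]
        if not (low <= value <= high):
            flagged.append((sensor, value))
    return flagged

readings = {"rack12_temp_c": 31.5, "rack12_humidity_pct": 48.0}
flags = flag_deviations(readings)
```

The robot's advantage over a stationary camera is simply that it carries this check to every rack, continuously refreshing the readings it compares.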


US DOD’s CMMC 2.0 rules lift burdens on MSPs, manufacturers

The proposed rules also let manufacturers off the hook for complying with NIST SP 800-171. SP 800-171 is a set of NIST cybersecurity rules to protect sensitive federal information. “The requirements of the 171 set of cyber standards are designed for IT networks and information systems,” Metzger says. “They were never really designed for a manufacturing environment. It’s now said clearly in the proposed rules that the assessments won’t apply to operational technology." "That, to me, should cause manufacturers to breathe a huge sigh of relief because being required to meet NIST standards that simply don’t fit a manufacturing or OT environment is a recipe for trouble of many forms," Metzger says. “The most important change is what did not change. The document has essentially the same structure and strategy that was in 1.0. It requires third-party assessments for a very large number of defense suppliers.” The proposed version 2.0 of the CMMC rules was published in the Federal Register December 26. Interested parties have until February 26 to file comments with the DOD before the agency finalizes the rules.


Banking Innovation is Paramount Even as Regulatory and Competitive Pressures Mount

Guiding technology-forward regulations can empower banks to harness innovation, enhancing security, transparency, and customer value. Regulators should seek thoughtful oversight that encourages innovation while safeguarding against excessive risks instead of attempting to prevent the recurrence of a once-in-a-century financial crisis. Banks face a growing challenge to their market share from alternative lending platforms, which poses an existential threat, as noted in McKinsey’s 2023 Global Banking Annual Review. Over 70% of the growth in global financial assets since 2015 has shifted away from traditional bank lending, finding its way into private markets, institutional investors and the realm of “shadow banking.” Near-zero interest rates have enabled private equity firms and non-bank lenders to offer lower-cost loans. With its digitally savvy consumer base, the fintech sector has further accelerated this transition, particularly during the pandemic.


IT and OT cybersecurity: A holistic approach

As OT becomes more interconnected, the need to safeguard OT systems against cyber threats is paramount. A growing number of cyber threats and vulnerabilities specifically target OT systems, underscoring the potential impact on industrial operations. Many OT systems still use legacy technologies and protocols with inherent vulnerabilities, as they were not designed with modern cybersecurity standards in mind. They may also rely on older or insecure communication protocols that do not encrypt data, making them susceptible to eavesdropping and tampering. Concerns about system stability often lead OT environments to avoid frequent updates and patches, which can leave systems exposed to known vulnerabilities. OT systems are not immune to social engineering attacks either: insufficient training and awareness among OT personnel can lead to unintentional security breaches, such as clicking on malicious links or divulging credentials to an attacker. Supply chain risks also pose a threat, as third-party suppliers and vendors may introduce vulnerabilities into OT systems if their products or services are not adequately secured.
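To make the legacy-protocol risk concrete, the sketch below flags assets in a simple inventory that communicate over protocols commonly deployed without encryption. This is an illustrative example, not a real scanner: the inventory format, asset names, and the particular protocol list are assumptions for the sake of the sketch.

```python
# Hypothetical asset inventory check: flag entries that use legacy OT
# protocols typically run in plaintext. Protocol/port pairs below reflect
# common defaults; actual deployments vary.
LEGACY_PLAINTEXT_PROTOCOLS = {
    "modbus-tcp": 502,     # no built-in encryption or authentication
    "dnp3": 20000,         # plaintext unless the secure authentication profile is used
    "ethernet-ip": 44818,  # plaintext by default
}

def flag_plaintext_assets(inventory):
    """Return the names of assets whose protocol is on the legacy list."""
    return [
        asset["name"]
        for asset in inventory
        if asset.get("protocol", "").lower() in LEGACY_PLAINTEXT_PROTOCOLS
    ]

inventory = [
    {"name": "plc-line-1", "protocol": "Modbus-TCP", "port": 502},
    {"name": "hmi-panel", "protocol": "HTTPS", "port": 443},
]
print(flag_plaintext_assets(inventory))  # → ['plc-line-1']
```

In practice such a check would be one small input into a broader OT asset-management and network-monitoring program, not a substitute for it.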


Exploring the Future of Information Governance: Key Predictions for 2024

In today’s rapidly evolving digital landscape, information governance has become a collective responsibility. Looking ahead to 2024, we can anticipate a significant shift towards closer collaboration between the legal, compliance, risk management, and IT departments. This collaborative effort aims to ensure comprehensive data management and robust protection practices across the entire organization. By adopting a holistic approach and providing cross-functional training, companies can empower their workforce to navigate the complexities of information governance with confidence, enabling them to make informed decisions and mitigate potential risks effectively. Embracing this collaborative mindset will be crucial for organizations to adapt and thrive in an increasingly data-driven world. ... Blockchain technology, with its decentralized and immutable nature, has tremendous potential to revolutionize information governance across industries. By 2024, as businesses continue to recognize the benefits, we can expect a significant increase in the adoption of blockchain for secure and transparent transaction ledgers.
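The immutability the excerpt refers to comes from hash chaining: each ledger entry records the hash of its predecessor, so altering any past entry invalidates every hash that follows. A minimal toy sketch of that idea (no networking, no consensus, purely illustrative):

```python
import hashlib
import json

def entry_hash(entry, prev_hash):
    """Hash an entry together with its predecessor's hash."""
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class MiniLedger:
    """Toy append-only ledger demonstrating tamper-evidence via hash chaining."""
    def __init__(self):
        self.chain = []  # list of (entry, prev_hash, this_hash)

    def append(self, entry):
        prev = self.chain[-1][2] if self.chain else "0" * 64
        self.chain.append((entry, prev, entry_hash(entry, prev)))

    def verify(self):
        """Recompute every hash; any edited entry breaks the chain."""
        prev = "0" * 64
        for entry, stored_prev, stored_hash in self.chain:
            if stored_prev != prev or entry_hash(entry, prev) != stored_hash:
                return False
            prev = stored_hash
        return True

ledger = MiniLedger()
ledger.append({"tx": "transfer", "amount": 100})
ledger.append({"tx": "transfer", "amount": 50})
print(ledger.verify())  # True

# Tampering with a past entry is detectable on the next verification pass.
ledger.chain[0] = ({"tx": "transfer", "amount": 999},
                   ledger.chain[0][1], ledger.chain[0][2])
print(ledger.verify())  # False
```

A production blockchain adds decentralization and consensus on top of this structure; the hash chain alone only makes tampering evident, not impossible.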


Data Professional Introspective: Demystifying Data Culture

We are discussing data culture from several points of view: what new content should be added to the DCAM, where it would fall within the current framework structure, what changes we propose to that structure, what modifications should be made to existing content, how the new or modified content would be assessed, and so on. ... One can begin decomposing data culture from a high-level vision, which summarizes what the organization has accomplished when it can feel confident in asserting that, “We have a strong data culture.” One can also compile a collection of activities and behaviors that demonstrate a developed data culture, and then categorize them and parse them into the DCAM. Or, one can apply a combination of the two approaches, which is the path the working group has followed. The working definition posited to date includes a summary description of a strong data culture: “A strong data culture promotes data-driven decision-making, data transparency, and the alignment of data and analytics to business objectives. It prioritizes strategic data use and encourages sharing and collaboration around data.”


Tackling technical debt in the insurance industry

The impact of technical debt on insurers spans various dimensions. Data inefficiencies arise, leading to compliance issues and difficulties in recruiting and retaining talent. Outdated processes hinder optimal decision-making, impacting both established and newer insurers. Addressing technical debt requires insurers to foster a culture of change, emphasising the risks of neglecting this issue and aligning strategies with broader organisational objectives. Tackling technical debt involves immediate action, prioritised backlog creation, and adaptive development processes. Insurers are advised to navigate technical debt through a combination of incremental and transformational changes. Incremental adjustments and breakthrough advancements should complement comprehensive restructuring efforts for sustained and effective resolution. The roadmap to a resilient, innovative future in insurance hinges on proactive management of technical debt. Insurers must embark on their journey towards pricing transformation to remain competitive and future-ready.



Quote for the day:

"In the end, it is important to remember that we cannot become what we need to be by remaining what we are." -- Max De Pree