Daily Tech Digest - December 08, 2024

Here’s the one thing you should never outsource to an AI model

One of the biggest dangers in letting AI take the reins of your product ideation process is that AI processes content — be it designs, solutions or technical configurations — in ways that lead to convergence rather than divergence. Because training datasets overlap so heavily, AI-driven R&D will result in homogenized products across the market. Yes, different flavors of the same concept, but still the same concept. Imagine this: Four of your competitors implement gen AI systems to design their phones’ user interfaces (UIs). Each system is trained on more or less the same corpus of information — data scraped from the web about consumer preferences, existing designs, bestseller products and so on. What do all those AI systems produce? Variations of a similar result. What you’ll see develop over time is a disturbing visual and conceptual cohesion where rival products start mirroring one another. ... On platforms like ArtStation, many artists have raised concerns about the influx of AI-produced content that, instead of showing unique human creativity, feels like recycled aesthetics remixing popular cultural references, broad visual tropes and styles. This is not the cutting-edge innovation you want powering your R&D engine.


How much capacity is in aging data centers?

Individual data centers have considerable differences between them, and one of the most critical is their size. With this weighting factor, the average moves — but not by much. The “average megawatt” is 10.2 years old. Whereas older data centers (10-plus years) represent 48 percent of the survey sample, they contain 38 percent of the total IT capacity — still a large minority. Interestingly, a more dramatic shift occurs within the population of data centers that have been operating for less than 10 years — well within the typical design lifespan. By facility count alone, there is an even split between the data centers that are one to five years old and those that have been in operation for six to ten years. But when measuring in megawatts, the newest data centers hold significantly more capacity (38 percent) than those with six to ten years of service. This is intuitive; in the past five years, some data center projects have reached unprecedented sizes. Very recent builds are overshadowing the capacity of data centers that are only slightly older, even though the designs are not dramatically different. However, the weighted figures above suggest that even this massive build-out has not yet overcome the moderating influence of much older, potentially less efficient facilities.
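The difference between the count-weighted and capacity-weighted averages described above can be sketched with made-up numbers. The facilities and figures below are purely illustrative, not the survey's actual data:

```javascript
// Illustrative only: hypothetical facilities, not the survey's actual data.
// Each entry: facility age in years and IT capacity in megawatts.
const facilities = [
  { ageYears: 2, capacityMW: 90 },  // one very large recent build
  { ageYears: 8, capacityMW: 30 },
  { ageYears: 15, capacityMW: 20 },
  { ageYears: 20, capacityMW: 10 },
];

// Simple (count-weighted) average age: every facility counts equally.
const countAvg =
  facilities.reduce((sum, f) => sum + f.ageYears, 0) / facilities.length;

// Capacity-weighted average: each facility contributes in proportion to its
// megawatts, so one giant new build pulls the "average megawatt" younger.
const totalMW = facilities.reduce((sum, f) => sum + f.capacityMW, 0);
const weightedAvg =
  facilities.reduce((sum, f) => sum + f.ageYears * f.capacityMW, 0) / totalMW;

console.log(countAvg.toFixed(2));    // "11.25"
console.log(weightedAvg.toFixed(2)); // "6.13"
```

Half the facilities here are more than ten years old, yet the average megawatt is only about six years old, which mirrors the survey's point that a handful of very large recent builds shift the weighted figure without erasing the older fleet.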


Generative AI is making traditional ways to measure business success obsolete

Often touted as the “iron triangle” of operational efficiency, the Cost x Time = Quality equation implies that, to attain a given level of quality, firms must balance cost against the time spent achieving it. ... AI has upended this thinking, as firms can now achieve both speed and accuracy at the same time by leveraging AI. This can enhance productivity and drive innovation without losing out on quality. Likewise, through generative AI, smaller companies with fewer resources are able to rub shoulders and compete with larger firms using AI-powered tools. They can do this by streamlining operations, creating cost-effective marketing content and delivering personalised customer experiences. This can make existing businesses more efficient, competitive and creative. It can also lower the barriers to entry into markets for prospective small and medium-sized business owners. ... The UK government’s recent autumn budget included a number of tax rises that will hit businesses, especially small and medium-sized enterprises (SMEs) that don’t have the financial buffers to weather severe economic challenges. Generative AI has reconfigured the Cost x Time = Quality formula, enabling firms to do things both quickly and accurately without a trade-off.


UK Cyber Risks Are ‘Widely Underestimated,’ Warns Country’s Security Chief

“What has struck me more forcefully than anything else since taking the helm at the NCSC is the clearly widening gap between the exposure and threat we face, and the defences that are in place to protect us,” he said. “And what is equally clear to me is that we all need to increase the pace we are working at to keep ahead of our adversaries.”  ... Horne added that the guidance and frameworks drawn up by the NCSC are not widely used. Ultimately, businesses need to change their perspective on cyber security from a “necessary evil” or “compliance function” to “an integral part of achieving their purpose.” ... “The defence and resilience of critical infrastructure, supply chains, the public sector and our wider economy must improve” to protect against these nation-state threats, Horne said. Ian Birdsey, partner and cyber specialist at law firm Clyde & Co, told TechRepublic in an email: “The UK has increasingly become a target for hostile nations due to the redrawing of geopolitical battle lines and the rise in global conflicts in recent years. In turn, threat actors based in those territories are increasingly launching more severe and sophisticated cyberattacks on UK organisations, particularly within critical national infrastructure and its supply chain.”


5 JavaScript Libraries You Should Say Goodbye to in 2025

jQuery is the grandparent of modern JavaScript libraries, loved for its cross-browser support, simple DOM manipulation, and concise syntax. However, in 2025, it’s time to officially let go. Native JavaScript APIs and modern frameworks like React, Vue, and Angular have rendered jQuery’s core utilities obsolete. Vanilla JavaScript now includes native methods such as querySelector, addEventListener, and fetch that more conveniently provide the functionality we once relied on jQuery to deliver. Also, modern browsers have standardized their behavior, making a cross-browser compatibility layer like jQuery redundant. On top of that, bundling jQuery into an application today adds unnecessary bloat, slowing down load times in an age when speed is king. ... Moment.js was the default date-handling library for a long time, celebrated for its ability to parse, validate, manipulate, and display dates. However, it’s now heavy and inflexible compared to newer alternatives, and it has been formally deprecated by its maintainers. Moment.js clocks in at around 66 KB (minified), a significant payload in an era where smaller bundle sizes lead to faster performance and better UX.
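To make the point concrete, here are a few common jQuery and Moment.js patterns next to their built-in replacements. The selector names and dates are placeholders, not from any particular codebase:

```javascript
// jQuery patterns and their native DOM equivalents (selectors are placeholders):
//   $('.item')                 → document.querySelectorAll('.item')
//   $('#save').on('click', fn) → document.querySelector('#save').addEventListener('click', fn)
//   $.getJSON(url, cb)         → fetch(url).then(r => r.json()).then(cb)

// Moment.js-style date handling with only built-ins (no 66 KB dependency).
const d = new Date(Date.UTC(2024, 11, 8)); // months are 0-based: 11 = December

// moment(d).format('MMMM D, YYYY') equivalent:
const formatted = new Intl.DateTimeFormat('en-US', {
  month: 'long', day: 'numeric', year: 'numeric', timeZone: 'UTC',
}).format(d);
console.log(formatted); // "December 8, 2024"

// moment(d).add(7, 'days') equivalent:
const nextWeek = new Date(d);
nextWeek.setUTCDate(nextWeek.getUTCDate() + 7);
console.log(nextWeek.toISOString().slice(0, 10)); // "2024-12-15"
```

The DOM one-liners cover most day-to-day jQuery usage, and Intl.DateTimeFormat handles locale-aware formatting that once justified pulling in Moment.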


How media, publishing and entertainment organizations can master Data Governance in the age of AI

One of the reasons AI governance has proven to be such a challenging new discipline is that it’s so multifaceted. Tiankai explained that it comprises several key elements:

- Ownership and stewardship: AI models need ownership, and so does AI governance. The right people must be accountable for ensuring AI models are used in the right ways.
- Cross-functional decision-making: A cross-domain thinking and decision-making model is essential. One central function can’t make every AI-relevant governance decision, so you need ways to bring the accountable people together.
- Processes and metadata: Teams must make their models explainable, so everyone can understand the quality of their outputs and the root causes of any negative outcomes.
- Technology enablement: Technology must support governance frameworks and make them work at scale.

This shows that AI governance requires a combination of people, process and technology change. The panel agreed that the ‘people’ element is the toughest to manage effectively. Nathalie Berdat, Head of Data and AI Governance at the BBC, explained some of the people-specific challenges that she has encountered along the BBC’s AI governance journey.


5 ways to tell people what to do at work

Nick Woods, CIO of airport group MAG, said dialogue is the priority for any professional who wants to avoid ambiguity. "If you're telling somebody what to do, you're already in the wrong place," he said. "Success is about a coaching, conversational dialogue that you need to have that ultimately comes down to a handshake on, 'Are we clear on what's next?'" Woods told ZDNET that most management decisions involve an ongoing debate. He doesn't believe in being directive about outputs and telling people what they need to go and do. "I think I'm much more in a space of, 'Actually, I've hired good people. I'm going to allow you to go and tell me what we need to do, and then we're going to have a dialogue about it,'" he said. ... Niall Robinson, head of product innovation at the Met Office, said talented staff should be given space to express their creativity. "There's a temptation as a leader to tell people how to do stuff -- and that can be a trap," he said. Robinson told ZDNET that he focuses on avoiding that problem by trusting his staff to generate recommended actions. "A habit I've been trying to practice is to tell people what success looks like and then giving them the agency to describe the options to me because they're closer to many of the solutions. So, success is about giving people the power to advise me."


Navigating NextGen Enterprise Architecture with GenAI

GenAI can modernize technology architecture by helping teams select best-of-breed solutions through deep analysis of diverse criteria. It offers tailored guidance aligned with business requirements as well as key capabilities such as scalability, resilience, and reversibility. This capacity adapts to evolving IT landscapes and business requirements, continuously refining recommendations as needs and the technological state of the art change. Moreover, GenAI accelerates in-house solution development by generating code snippets. It can produce function and class code segments in any programming language, which improves efficiency and reduces manual coding effort. This boosts developers' productivity and allows teams to focus more on high-level design. It can also help keep generated code aligned with coding standards for maintainability, readability, collaboration, and consistency. GenAI has impressive advantages, but it also faces some major challenges. One of them is sustainability, which is increasingly important in technology adoption. In fact, many enterprises now include this criterion in their technology architecture principles and assess it when selecting a new solution to enhance their IT landscape.


The 7 R's of cloud migration: How to choose the right method

The R's model isn't new, but it has evolved significantly over the years. Its genesis is usually attributed to Gartner, which came up with the 5 R's model back in 2010. The original five were rehost, refactor, revise, rebuild and replace. As the cloud continued to evolve and more diverse workloads were migrated to it, AWS added a sixth R -- retire -- and eventually, a seventh, for retain. This seventh R is effectively an acknowledgment that not all workloads are suited to being hosted in the cloud. ... Rehosting can be done in a few ways, but it often means creating cloud-based virtual machines that mimic the infrastructure an application is currently running on. ... Rehosting an application requires you to create a cloud VM instance and then move the application onto that instance. Relocating, on the other hand, involves moving an existing VM from an on-premises environment to the cloud without making significant changes to it. ... A workload might be suitable for retirement if it is no longer actively supported by the vendor. In such cases, it's important to make sure you have a workaround before retiring an application the organization still uses. That might mean adopting a competing application that offers similar functionality or developing one in-house.
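As a rough sketch, the choice between several of the R's can be expressed as a heuristic. The attribute names and the ordering below are illustrative assumptions, not Gartner's or AWS's official decision logic, and some R's (such as refactor variants like revise and rebuild) are collapsed for brevity:

```javascript
// Illustrative heuristic only: real migration assessments weigh many more
// factors (cost, licensing, dependencies, team skills) than shown here.
function suggestMigrationStrategy(w) {
  if (!w.stillUsed) return 'retire';                // nobody depends on it anymore
  if (w.mustStayOnPrem) return 'retain';            // e.g. latency or compliance constraints
  if (w.saasAlternativeExists) return 'replace';    // buy instead of migrate ("repurchase")
  if (w.cloudIncompatible) return 'refactor';       // rework it for cloud-native services
  if (w.runsInVm) return 'relocate';                // move the existing VM as-is
  return 'rehost';                                  // "lift and shift" onto a new cloud VM
}

console.log(suggestMigrationStrategy({ stillUsed: false }));                 // "retire"
console.log(suggestMigrationStrategy({ stillUsed: true, runsInVm: true }));  // "relocate"
```

The ordering matters: checking "is it still used?" first keeps you from paying to migrate a workload that should simply be retired.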


Evolving Your Architecture: Essential Steps and Tools for Modernization

Tech debt and a lack of modernization can also land you in the news, and not in a good way, as we saw with SWA a couple of years ago when they had a pretty huge meltdown with their booking systems. It damaged their image, but it also hit their revenue plans hard, and they are still facing the consequences of that meltdown, which came down to ignoring and putting aside the conversations about tech debt and application modernization as a whole. ... It's basically looking at the inventory of applications that you have in your organization and understanding: what are the critical ones? What value does each one add? How well does it align with the business goals? Really, like, is it commodity? Can I just go and buy one off the shelf? Then it's fine, go and buy it. If it's something that differentiates you, where you've got to innovate, then it might be worth building it and hence modernizing it. ... The other thing is the age of technology. If you have outdated technology, you very likely have vulnerabilities. If you lack support, either from the community or the vendors, there is a security vulnerability there, but no security patch being released, because there is no support anymore.



Quote for the day:

"Do something today that your future self will thank you for." -- Unknown

Daily Tech Digest - December 07, 2024

In the recent past, people had the perception that HDD storage is slow and can only be used for backup. However, in the last 2 years, we have demonstrated in our European HDD laboratory how to combine multiple HDDs to test function and performance. If you have 100s of HDDs in your large-scale storage system, you also have around a billion different configuration possibilities. ... The demand for HDDs in surveillance applications continues to surge, with an increasing number of digital video recorder manufacturers entering the market. From relatively cheap surveillance systems for private homes, to medium priced surveillance systems to expensive surveillance systems for large-scale infrastructures like smart cities. The sequential nature of video surveillance data and the fact that it is over-written at some point in time, makes HDDs the uncontested choice at all levels for surveillance storage. ... At the very least, preserving a duplicate of one’s data using an alternative technology is a sensible measure. This could be a combination of cloud services or a mix of cloud and external storage, such as a USB-connected portable HDD like a Toshiba Canvio. It’s a small price to pay for peace of mind that your data is safe.


Top 3 Strategies for Leveraging AI to Transform Customer Intelligence

Transitioning from reactive to proactive engagement is one of AI's most transformative capabilities for customer intelligence. Predictive models trained on historical data allow organizations to anticipate customer needs, helping them deliver timely, relevant solutions. By recognizing patterns and trends, AI empowers businesses to forecast future customer actions — whether that's product preferences, the likelihood of churn, or upcoming purchase intent — enabling a more proactive approach to customer engagement. ... AI enables companies to personalize customer interactions dynamically across multiple channels. For instance, AI-powered chatbots can provide instant responses, creating a conversational experience that feels natural and responsive. By integrating these capabilities into CRM systems, companies ensure that every customer touchpoint — chat, email, or in-app messaging — is customized based on a customer's unique history and recent activities. This focus on personalization also extends to effective customer segmentation, as organizations aim to provide the right level of service to each customer based on their specific needs and entitlements.
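A toy version of the churn-propensity idea described above can illustrate how a predictive score turns into proactive engagement. The features and coefficients here are invented for illustration; a real system would learn them from historical customer data:

```javascript
// Toy propensity score, not a production model: features and weights are
// invented. Real systems train these coefficients on historical outcomes.
function churnScore(customer) {
  const z =
    -2.0 +
    1.5 * customer.supportTicketsLast90d +   // friction raises churn risk
    -0.8 * customer.loginsPerWeek +          // engagement lowers it
    2.2 * (customer.contractEndsSoon ? 1 : 0);
  return 1 / (1 + Math.exp(-z)); // logistic squash → probability in (0, 1)
}

const atRisk  = churnScore({ supportTicketsLast90d: 3, loginsPerWeek: 1,  contractEndsSoon: true });
const healthy = churnScore({ supportTicketsLast90d: 0, loginsPerWeek: 10, contractEndsSoon: false });

console.log(atRisk > 0.9);   // true  — flag this customer for proactive outreach
console.log(healthy < 0.01); // true
```

The business value is in the threshold: customers whose score crosses it get timely retention offers before they churn, rather than an apology afterward.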


Who’s the Bigger Villain? Data Debt vs. Technical Debt

Although data debt and tech debt are closely connected, there is a key distinction between them: you can declare bankruptcy on tech debt and start over, but doing the same with data debt is rarely an option. Reckless and unintentional data debt emerged from cheaper storage costs and a data-hoarding culture, where organizations amassed large volumes of data without establishing proper structures or ensuring shared context and meaning. It was further fueled by resistance to a design-first approach, often dismissed as a potential bottleneck to speed. ... With data debt, prevention is better than relying on a cure. Shift left is a practice that involves addressing critical processes earlier in the development lifecycle to identify and resolve issues before they grow into more significant problems. Applied to data management, shift left emphasizes prioritizing data modeling early, if possible — before data is collected or systems are built. Data modeling allows for following a design-first approach, where data structure, meaning, and relationships are thoughtfully planned and discussed before collection. This approach reduces data debt by ensuring clarity, consistency, and alignment across teams, enabling easier integration, analysis, and long-term value from the data.


Understanding NVMe RAID Mode: Unlocking Faster Storage Performance

While NVMe RAID mode offers excellent benefits, it’s not without its challenges. One of the most significant hurdles is the complexity of setting it up. RAID arrays, particularly with NVMe drives, require specialized hardware or software RAID controllers. Additionally, configuring RAID in the BIOS or UEFI settings can be tricky for less experienced users. Another challenge is cost. NVMe SSDs, while dropping in price over the years, are still generally more expensive than traditional SATA-based drives. Combining multiple NVMe drives into a RAID array can significantly increase the cost of the storage solution. For users on a budget, this might not be the most cost-effective option. Finally, RAID configurations that emphasize performance, like RAID 0, do not provide any data redundancy. If one drive fails, all data in the array is lost. ... NVMe RAID mode is ideal for users who need extremely fast read and write speeds, high storage capacity, and, in some cases, redundancy. This includes professionals who work with large video files, developers running complex simulations, and enthusiasts building high-end gaming PCs. Additionally, businesses that rely on fast access to large databases or those that run virtual machines may benefit from NVMe RAID configurations.
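The capacity and redundancy trade-offs above can be put into numbers. The drive specs below are illustrative, and real throughput also depends on the controller and available PCIe lanes:

```javascript
// Back-of-envelope RAID math for identical NVMe drives. Figures are
// illustrative; real-world numbers depend on controller and PCIe topology.
function raidProfile(level, driveCount, capacityTB, readGBps) {
  switch (level) {
    case 0: // striping: capacity and bandwidth aggregate, but zero redundancy
      return { capacityTB: driveCount * capacityTB,
               readGBps: driveCount * readGBps,
               survivesDriveFailure: false };
    case 1: // mirroring: usable capacity of one drive, reads served by any copy
      return { capacityTB: capacityTB,
               readGBps: driveCount * readGBps,
               survivesDriveFailure: true };
    default:
      throw new Error('only RAID 0 and RAID 1 are sketched here');
  }
}

const stripe = raidProfile(0, 2, 2, 7); // two 2 TB drives, ~7 GB/s reads each
console.log(stripe); // { capacityTB: 4, readGBps: 14, survivesDriveFailure: false }
```

The sketch makes the RAID 0 caveat explicit: you double capacity and nominal read bandwidth, but a single drive failure loses the entire array.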


Supply chain compromise of Ultralytics AI library results in trojanized versions

According to researchers from ReversingLabs, the attackers leveraged a known exploit via GitHub Actions to introduce malicious code during the automated build process, thereby bypassing the usual code review process. As a result, the code was present only in the package pushed to PyPI and not in the code repository on GitHub. The trojanized version of Ultralytics on PyPI (8.3.41) was published on Dec. 4. Ultralytics developers were alerted Dec. 5 and attempted to push a new version (8.3.42) to resolve the issue, but because they didn’t initially understand the source of the compromise, this version ended up including the rogue code as well. A clean and safe version (8.3.43) was eventually published the same day. ... According to ReversingLabs’ analysis of the malicious code, the attacker modified two files: downloads.py and model.py. The code injected in model.py checks the type of machine where the package is deployed in order to download a payload targeted for that platform and CPU architecture. The rogue code that performs the payload download is stored in downloads.py. “While in this case, based on the present information the RL research team has, it seems that the malicious payload served was simply an XMRig miner, and that the malicious functionality was aimed at cryptocurrency mining,” ReversingLabs’ researchers wrote.


Data Governance Defying Gravitas

When it comes to formalizing data governance in a complex organization, there’s often an expectation of gravitas — a sense of seriousness, authority, and weight that makes the effort seem formidable and unyielding. But let’s be honest: Too much gravitas can weigh down your data governance program before it even begins. Enter the Non-Invasive Data Governance approach, which flips the script on gravitas by delivering effectiveness without the unnecessary posturing, proving that you can have impact without the drama. ... Complex organizations are not static, and neither should their data governance approach be. NIDG defies the traditional concept of gravitas by embracing adaptability. While other frameworks crumble under the weight of organizational change, NIDG thrives in dynamic environments. It’s built to flex and evolve, ensuring governance remains effective as technologies, priorities, and personnel shift. This adaptability fosters a sense of trust. People know that NIDG isn’t a rigid set of rules, but a living framework designed to support their needs. It’s this trust that gives NIDG its gravitas — not the false authority of inflexible mandates, but the real authority that comes from being a program people believe in and rely on. 


Weaponized AI: Hot for Fraud, Not for Election Interference

"Criminals use AI-generated text to appear believable to a reader in furtherance of social engineering, spear phishing and financial fraud schemes such as romance, investment and other confidence schemes, or to overcome common indicators of fraud schemes," it said. More advanced use cases investigated by law enforcement include criminals using AI-generated audio clips to fool banks into granting them access to accounts, or using "a loved one's voice to impersonate a close relative in a crisis situation, asking for immediate financial assistance or demanding a ransom," the bureau warned. Key defenses against such attacks, the FBI said, include creating "a secret word or phrase with your family to verify their identity," which can also work well in business settings - for example, as part of a more robust defense against CEO fraud (see: Top Cyber Extortion Defenses for Battling Virtual Kidnappers). Many fraudsters attempt to exploit victims before they have time to pause and think. Accordingly, never hesitate to hang up the phone, independently find a phone number for a caller's supposed organization, and contact them directly, it said.


Data Assurance Changes How We Network

Today, the simplest way to control the path data takes between two points is to use a private network (leased lines, for example). But these private networks are extremely expensive and don’t offer much in the way of visibility. They also take months to provision, which slows business agility. Even with MPLS, IGP routing will always follow the shortest IGP path. If alternate paths are available, traffic engineering (TE) with segment routing (SR) can utilize non-shortest paths. However, if the decision is made within the Provider Edge (PE) router in the service provider's network, it necessitates source-based routing, which is not sustainable given the challenges of implementing source routing on a per-customer basis within the service provider network. This approach will not scale effectively in an MPLS environment, and moreover, 99% of MPLS private networks do not encrypt traffic, leading to significant performance and scalability issues. Another option is to move your operations to a public cloud that can guarantee you meet data assurance goals. This, too, can be prohibitively expensive and also lacks visibility.


Spotting the Charlatans: Red Flags for Enterprise Security Teams

Sadly, by the time most people catch on that there is a charlatan in the team, grave damage has been done to both the morale and progress of the security team. That being said, there are some clues that charlatans leave behind from time to time. If we are astute and perceptive, we can pick up on these clues and work to contain the damage that charlatans cause. ... Most talented security professionals I’ve worked with have a healthy amount of self-doubt and insecurity. This is completely normal, of course. Charlatans take advantage of this, cutting down talented professionals that they see as a threat. This causes those targeted to recoil in a moment of thought and introspection, which is all the charlatan needs to retake the spotlight. ... One of the strategies of a charlatan is to throw their perceived threat off their game. One way in which they do this is by taking pot shots. Charlatans throw subtle slights, passive-aggressive insults, and unpredictable surprises at their targets. If the targeted individual reacts to the tactic or calls the charlatan out, the target then seems like the aggressor. The best response is to ignore the pot shots and try to stay focused. In many cases, when the charlatan realizes they cannot rattle you, they will slowly lose interest.


Why ICS Cybersecurity Regulations Are Essential for Industrial Resilience

As the cybersecurity landscape becomes increasingly complex, industrial companies, especially those managing industrial control systems (ICS), face heightened risks. From protecting sensitive data to safeguarding critical infrastructure, compliance with cybersecurity regulations has become essential. Here, we explore why ICS cybersecurity is crucial, the risks involved, and key steps organizations can take to meet regulatory demands without compromising operational efficiency. ... Cybersecurity risks are no longer a secondary concern but a primary focus, especially for industries managing critical infrastructure such as energy, water, and transportation. Cyber threats targeting ICS environments have become more sophisticated, posing risks not only to individual companies but also to the broader economy and society. Regulatory adherence ensures these vulnerabilities are managed systematically, reducing potential downtime, data breaches, and even physical threats. ... Cybersecurity in ICS environments isn’t merely about meeting regulatory requirements; it’s a strategic priority that protects both assets and people. By focusing on identity management, automating updates, aligning with industry standards, and bridging IT-OT security gaps, organizations can enhance resilience against emerging threats.



Quote for the day:

“Identify your problems but give your power and energy to solutions.” -- Tony Robbins

Daily Tech Digest - December 06, 2024

Preparing for AI-Augmented Software Engineering

AI-augmented approaches will free software engineers to focus on tasks that require critical thinking and creativity, predicts John Robert, deputy director of the software solutions division of the Carnegie Mellon University Software Engineering Institute. "A key potential benefit that excites most enthusiasts of AI-augmented software engineering approaches is efficiency -- the ability to develop more code in less time and lower the barrier to entry for some tasks." Teaming humans and AI will shift the attention of humans to the conceptual tasks that computers aren't good at while reducing human error from tasks where AI can help, he observes in an email interview. ... Hall notes that GenAI can access vast amounts of data to analyze market trends, current user behavior, customer feedback, and usage data to help identify key features that are in high demand and have the potential to deliver significant value to users. "Once features are described and prioritized, multiple agents can create the software program's components." This approach breaks down big tasks into multiple activities with an overall architecture. "It truly changes how we solve complex issues and apply technology."


Code Busters: Are Ghost Engineers Haunting DevOps Productivity?

The assertion here is that almost 10% of software application developers do effectively nothing all day, or indeed all week. For wider clarification, the remote worker segment has more outlier positive performers, but in-office workers exhibit a higher average performance overall. ... “Many ghost engineers I’ve talked to share a common story, i.e. they become disengaged due to frustration or loss of motivation in their roles. Over time, they may test the limits of how much effort they can reduce without consequence. This gradual disengagement often results in them turning into ghosts; originally not out of malice, but as a by-product of their work environment.” He says that managers want to build high-performing teams but face conflicting incentives. A poorly performing team reflects badly on its leadership, leading some to downplay problems rather than address them head-on. Additionally, organizational politics may discourage reducing team sizes, even when smaller, more focused teams could be more effective. ... “There’s also the fact that senior leaders are often further removed from day-to-day operations. Their decisions are based on trust in middle management or flawed metrics, such as lines of code or commit counts. They, too, are sometimes not incentivized to reduce team sizes or deeply investigate performance issues, as their focus tends to be on higher-level strategic outcomes,” said Denisov-Blanch.


Why Data Centers Must Strengthen Network Resiliency In The Age of AI

If a network outage occurs, there will be widespread disruptions, negatively affecting businesses globally. In particular, network outages will compromise the accessibility of AI applications, the very thing data centers scaled to support. Outages—and even reduced performance—carry significant risks, both financial and reputational. Data centers must therefore adopt network solutions, like Failover to Cellular and out-of-band (OOB) management, to ensure AI services remain accessible amid disruptions to normal operations. ... OOB management capabilities and Failover to Cellular integration lay a solid foundation for network resilience. However, data centers don’t need to stop there. AI integrations promise further enhancements, elevating these tools to the next level through advanced intelligence and automation. While it may seem odd to use AI when the extra stress on data centers today comes from increased AI usage, the advanced capabilities and accompanying benefits of this technology speak for themselves. AI’s ability to analyze patterns allows it to detect connectivity issues that could cause failures. When combined with Failover to Cellular, for example, AI orchestrates a seamless Failover to Cellular backup, especially during peak traffic. AI can also automatically take proactive measures like predictive maintenance or rerouting traffic, reducing downtime and improving resilience.


Financial services need digital identity stitched together, investors take note

Financial institutions are all looking for a low friction, high accuracy way of authenticating customers, prospects and business partners that also keeps regulators happy. Some of the approaches and techniques used by established players in the digital identity market have achieved good volume and scale, and newer innovative methods are still proving themselves. Byunn highlights the opportunity in a third layer that’s “all about how you stitch these things together, because so far no one has produced a single solution that addresses everything.” This layer, he says, includes both “orchestration” and elements of holistic scoring (heuristics etc.) “that are not fully covered by what the market calls orchestration.” Earlier waves of technology serving financial services companies were thoroughly penetrated by fraudsters, and in some cases offered poor user experience, Byunn says. One example of this, knowledge-based authentication, remains “shockingly still prevalent in the industry.” ... The threat of deepfakes to financial service institutions seems to be commonly overstated at this time, according to Byunn, at least in part because conventional wisdom is also somewhat underestimating the effectiveness of market leaders’ defense against genAI and deepfakes. However, he notes that the threat has the potential to grow significantly.


The world is running short of copper - telecoms networks could be the answer

Copper remains foundational in older telecom networks, particularly in Europe and North America, with incumbent operators like AT&T, Orange, and BT. However, networks are actively transitioning from copper to fiber optics, particularly for ‘last mile’ connectivity and the replacement of infrastructure like Public Switched Telephone Networks (PSTN). While recycling from these sources may not completely plug the 20 percent gap in supply, it can go a long way. It almost goes without saying that metal reclaimed this way has a far smaller environmental impact - around 15 times less than newly mined copper. Purchasing copper from these sources is still often cheaper than mining it. ... Over the next eight to ten years, an estimated 800,000 tons of copper could be extracted from telecom networks as part of the global shift to fiber optics. ... Unlocking the value of reclaimed copper is both an environmental and strategic win, especially with the soaring demand for this vital resource. Through effective partnerships and advanced material recovery processes, telecom companies can transform what was once surplus to requirements into a valuable asset. Extracted copper can re-enter the supply chain, supporting the broader green transition and reducing reliance on new mining operations.


8 biggest cybersecurity threats manufacturers face

The manufacturing sector’s rapid digital transformation, complex supply chains, and reliance on third-party vendors make for a challenging cyber threat environment for CISOs. Manufacturers — often prime targets for state-sponsored malicious actors and ransomware gangs — face the difficult task of maintaining cost-effective operations while modernizing their network infrastructure. “Many manufacturing systems rely on outdated technology that lacks modern security measures, creating exploitable vulnerabilities,” says Paul Cragg, CTO at managed security services firm NormCyber. “This is exacerbated by the integration of industrial internet of things [IIoT] devices, which expand the attack surface.” ... “While industries like chemicals and semiconductors exhibit relatively higher cybersecurity maturity, others, such as food and beverage or textiles, lag significantly,” Belal says. “Even within advanced sectors, inconsistencies persist across organizations.” Operational technology systems — which may include complex robotics and automation components — are typically replaced far more slowly than components of IT networks are, contributing to the growing security debt that many manufacturers carry.


What is a data scientist? A key data analytics role and a lucrative career

Data scientists often work with data analysts, but their roles differ considerably. Data scientists are often engaged in long-term research and prediction, while data analysts seek to support business leaders in making tactical decisions through reporting and ad hoc queries aimed at describing the current state of reality for their organizations based on present and historical data. So the difference between the work of data analysts and that of data scientists often comes down to timescale. A data analyst might help an organization better understand how its customers use its product in the present moment, whereas a data scientist might use insights generated from that data analysis to help design a new product that anticipates future customer needs. ... Data scientists need to manipulate data, implement algorithms, and automate tasks, and proficiency in programming is essential. Van Loon notes that critical languages include Python, R, and SQL. ... They need a strong foundation in both to analyze data accurately and make informed decisions. They also need to understand statistical tests, distributions, likelihoods, and concepts such as hypothesis testing, regression analysis, and Bayesian inference. 
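Concepts like regression analysis can be illustrated concretely. The sketch below fits a simple ordinary least-squares line (y = a + bx) using only the Python standard library — the kind of statistical building block a data scientist reaches for before scaling up to libraries like scikit-learn.

```python
from statistics import mean

def ols_fit(xs, ys):
    """Return intercept a and slope b minimizing squared error for y = a + b*x."""
    mx, my = mean(xs), mean(ys)
    # Slope: covariance of x and y divided by variance of x.
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx  # line passes through the mean point
    return a, b

# Data generated from exactly y = 1 + 2x, so the fit recovers a=1, b=2.
a, b = ols_fit([1, 2, 3, 4], [3, 5, 7, 9])
```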


How Active Archives Address AI’s Growing Energy and Storage Demands

Archives were once considered repositories of data that would only be accessed occasionally, if at all. The advent of modern AI has changed the equation. Almost all enterprise data could be valuable if made available to an AI engine. Therefore, many enterprises are turning to archiving to gather organizational data in one place and make it available for AI and GenAI tools to access. Massive data archives can be stored in an active archive at a cost-efficient price and at very low energy consumption levels, all while keeping that data readily available on the network. Decades of archived data can then be analyzed as part of an LLM or other machine learning or deep learning algorithm. ... An intelligent data management software layer is the foundation of an active archive. This software layer plays a vital role in automatically moving data according to user-defined policies to where it belongs for cost, performance, and workload priorities. High-value data that is often accessed can be retained in memory. Other data can reside on SSDs, lower tiers of disks, and within a tape- or cloud-based active archive. This allows AI applications to mine all that data without being subjected to delays due to content being stored offsite or having to be transferred to where AI can process it.
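The policy-driven placement layer described can be sketched as a function mapping access recency and object size to a tier. The tier names and thresholds below are illustrative assumptions, not any product's actual policy schema.

```python
def choose_tier(days_since_access, size_gb):
    """Map access recency and object size to a storage tier."""
    if days_since_access <= 7:
        # Hot data: keep small objects in memory, larger ones on flash.
        return "memory" if size_gb < 1 else "ssd"
    if days_since_access <= 90:
        return "disk"  # warm data on lower-cost disk tiers
    return "tape_or_cloud_archive"  # cold data in the active archive
```

In a real active archive, a scheduler would evaluate such rules continuously and migrate objects between tiers while keeping them addressable on the network.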


The Growing Importance of AI Governance

The goal of AI governance is to ensure that the benefits of machine learning algorithms and other forms of artificial intelligence are available to everyone in a fair and equitable manner. AI governance is intended to promote the ethical application of the technology so that its use is transparent, safe, private, accountable, and free of bias. To be effective, AI governance must bring together government agencies, researchers, system designers, industry organizations, and public interest groups. ... The long-term success of AI depends on gaining public trust as much as it does on the technical capabilities of AI systems. In response to the potential threats posed by artificial intelligence, the U.S. Office of Science and Technology Policy (OSTP) has issued a Blueprint for an AI Bill of Rights that’s intended to serve as “a guide for a society that protects all people” from misuse of the technology. ... As AI systems become more powerful and complex, businesses and regulatory agencies face two formidable obstacles: The complexity of the systems requires rule-making by technologists rather than politicians, bureaucrats, and judges. The thorniest issues in AI governance involve value-based decisions rather than purely technical ones.


The Role of AI in Cybersecurity: 5 Trends to Watch in 2025

The integration of AI into Software-as-a-Service (SaaS) platforms is changing how businesses manage security. For example, AI-enhanced tools are helping organizations automate threat detection, analyze vast data sets more efficiently, and respond to breaches or incidents more quickly. However, this innovation also introduces new risks such as hallucinations and an over-reliance on potentially poor data quality, meaning AI-powered systems need to be carefully configured to avoid outputs that mislead or disadvantage defenders. ... AI auditing tools will help organizations assess whether AI models are making decisions based on biased or discriminatory data – a concern that could lead to legal and reputational challenges. As AI technology becomes more embedded in organizational operations, ethical considerations must be at the forefront of AI governance to help businesses avoid unintended consequences. Board members must be proactive in understanding the implications of AI on data security and ensuring that their companies are following best practices in AI governance for compliance with evolving legislation. Without C-suite support and understanding, and collaboration between executives and security teams, organizations will be more vulnerable to the potential risks AI poses to data and intellectual property.
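One concrete check an AI auditing tool might run is the "four-fifths" disparate-impact ratio: compare selection rates of model decisions across groups of a protected attribute. This is a hedged sketch of one well-known fairness metric, not a description of any particular auditing product.

```python
from collections import defaultdict

def disparate_impact(decisions):
    """decisions: list of (group, approved) pairs.

    Returns the ratio of the lowest group selection rate to the highest.
    A value below 0.8 is a conventional red flag for adverse impact.
    """
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    rates = {g: approved[g] / totals[g] for g in totals}
    return min(rates.values()) / max(rates.values())
```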



Quote for the day:

"Leadership is about making others better as a result of your presence and making sure that impact lasts in your absence." -- Sheryl Sandberg

Daily Tech Digest - December 05, 2024

Fintech Partnership Streamlines Banking Data Integrations

“We’re on the brink of enabling non-programmers to build integrations with minimal effort,” Skye Isard, Sandbox Banking co-founder and CTO, told The New Stack. “AI-driven tools can automate the creation of logic for integrations, seriously reducing the complexity and time required to deploy new workflows. “AI is empowering ‘citizen developers‘ — individuals without coding expertise — to create automations and integrations, further democratizing access to technology. AI allows us to leverage our vast library of API documentation and prebuilt integrations to create even more intelligent and automated solutions. We envision a future where AI can generate integration logic, making it easier for non-programmers to build and deploy integrations.” ... Given the sensitive nature of banking data, Sandbox Banking prioritizes security, Isard said. Its platform adheres to stringent compliance standards, including SOC2 audits, recurring penetration testing and advanced encryption protocols. Data persistence is minimized, with live databases retaining sensitive information for no more than 14 days. These measures ensure that Sandbox Banking’s solutions not only improve efficiency but also meet high standards of data protection and privacy, Isard said.
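The 14-day data-minimization rule described can be expressed as a small retention filter. The record layout here is a hypothetical illustration; it is not Sandbox Banking's actual implementation.

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=14)  # sensitive data kept no more than 14 days

def purge_expired(records, now=None):
    """Keep only records whose stored_at timestamp is within the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["stored_at"] <= RETENTION]
```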


Dear CEO: It’s time to rethink security leadership and empower your CISO

The stakes have never been higher. Every week, another breach makes headlines, costing millions in losses, irreparable damage to reputations, and a wave of uncertainty that ripples through customers and stakeholders alike. But consider this: Who is truly liable when things go wrong? You might assume the CISO holds the liability, but if they aren’t empowered with the authority, resources, and support to act effectively, can we honestly place the blame there? ... Giving the CISO a seat at the table isn’t a symbolic gesture — it’s a practical necessity. It allows us to align security strategies with business goals, identify risks before they become roadblocks, and ensure that opportunities are pursued without unnecessary exposure. When CISOs are integrated into the executive team, they’re not just protecting the business; they’re enabling it to grow with confidence. That said, some CEOs reading this may not have this type of CISO in their organization today. If that’s the case, it’s worth asking why. Is the person in the CISO seat there to simply tick a box? If so, that’s a recipe for disaster. The No. 1 core competency a CISO should possess is leadership — the ability to inspire, align, and drive a security strategy that supports and advances the business.


What are AI agents and why are they now so pervasive?

Agentic AI operates in two key ways. First, it offers specialized agents capable of autonomously completing tasks across the open web, in mobile apps, or as an operating system. A specific type of agentic AI, called conversational web agents, functions much like chatbots. In this case, the agentic AI engages users through multimodal conversations, extending beyond simple text chats to accompany them as they navigate the open web or use apps ... AI agents are already showing up in places you might not expect. For example, most self-driving vehicles today use sensors to collect data about their surroundings, which is then processed by AI agentic software to create a map and navigate the vehicle. AI agents play several other critical roles in autonomous vehicle route optimization, traffic management, and real-time decision-making — they can even predict when a vehicle needs maintenance. ... Notably, AI agents also have the ability to remember past interactions and behaviors. They can store those experiences and even perform “self-reflection” or evaluation to inform future actions, according to IDC. ... And while most agentic AI tools claim to be safe and secure, a lot depends on the information sources they use. That’s because the source of data used by the agents could vary — from more limited corporate data to the wide open internet.
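The memory-plus-self-reflection pattern described for AI agents can be sketched as a store of past (action, outcome) episodes that the agent consults before acting again. The success-rate heuristic below is an assumption for illustration, far simpler than production agent frameworks.

```python
class AgentMemory:
    """Toy episodic memory: record outcomes, reflect before repeating an action."""

    def __init__(self):
        self.episodes = []  # past (action, success) pairs

    def record(self, action, success):
        self.episodes.append((action, success))

    def reflect(self, action):
        """Return this action's historical success rate, or None if untried."""
        outcomes = [ok for a, ok in self.episodes if a == action]
        return sum(outcomes) / len(outcomes) if outcomes else None
```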


Failover vs. Failback: Two Disaster Recovery Methods

Failover is critical in a business continuity event because it keeps operations running. By having a system to which your business can transition when a primary system is unavailable, you're able to continue doing business. People can work, revenue streams are preserved, and customers can be served. Without failover, these functions could grind to a halt, leading to significant disruption. Many organizations depend on technology for critical processes, and when those processes are unavailable, analog alternatives may be insufficient or entirely obsolete. Failover ensures that even in a disaster, the business keeps moving. Failback comes into play once the need for failover ends. As the disaster is resolved, failback allows the organization to return to normal operations. Typically, failback is necessary when the standby system cannot sustain operations as effectively as the primary system. For instance, a standby system may not be a full replica of the primary system and might be designed only for temporary use during an emergency. In an ideal world, every business would maintain two fully operational environments: a primary environment and an identical standby environment. This setup would allow for seamless transitions during disasters, ensuring that business operations are completely unaffected.
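The failover/failback sequence above amounts to a small state machine: shift to standby when disaster strikes, return to primary only once it is verified healthy. This is a minimal sketch of the concept, not a production disaster-recovery controller.

```python
class DRController:
    """Tracks which environment serves traffic: primary or standby."""

    def __init__(self):
        self.active = "primary"

    def failover(self):
        # Disaster declared: shift operations to the standby environment.
        if self.active == "primary":
            self.active = "standby"
        return self.active

    def failback(self, primary_healthy):
        # Return to the primary only once it is verified healthy again.
        if self.active == "standby" and primary_healthy:
            self.active = "primary"
        return self.active
```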


Burnout: A chronic epidemic in the IT industry

For IT leaders aware of the impact burnout can have on their staff, the reality of exhaustion in IT and tech is further complicated by the fact that burnout isn’t caused by just one thing. It’s a problem that builds slowly over time, leading to disengaged and unmotivated employees with one foot out the door. It can be hard to spot, too. ... Another contributing factor to burnout is the rapid adoption of AI, which has left a lot of workers feeling overwhelmed by keeping up with the latest industry trends. While it’s often touted as a productivity booster, 85% of IT leaders plan to make AI technology mandatory or encourage its use in the coming year, which is increasing pressure on workers to upskill, according to Upwork. In fact, 77% of employees said AI has added to their workloads, rather than relieved their daily responsibilities. Cybersecurity professionals feel the pressure of AI, too, with 42% reporting they have concerns about AI-powered attacks, according to BlackFog. To help combat this, 41% also say they need bigger budgets for security tools, along with more support to alleviate pressure to keep the organization safe. Burnout can lead to dangerous results when it comes to security as 63% of respondents said their team experiences alert fatigue, which desensitizes them to the urgency of security events.


Why Banks Need Flexible Tech Architecture — and How to Build It

To operate and launch the banking experience of the future, banks and credit unions must implement “MACH” and “composable” technologies that allow digital teams to take advantage of future-proofed, in-the-moment innovations. Composable technology stresses a modular approach that enables organizations to obtain the best options for their needs in each aspect of their tech, all options working together regardless of the source. MACH — I’ll get into the details below — is an approach to achieving composability. ... Considered a more modern approach, MACH is a standard way of building technology that enables organizations to develop a flexible enterprise tech stack in which each component is modular, scalable and easily replaced. MACH technologies must be microservices-based, API-first, cloud-native SaaS, and “headless,” in which the customer’s front-end digital experience is decoupled from the back-end programming. Companies that leverage a composable approach using MACH principles can prepare for future innovation through a more resilient and modern tech stack. ... The advantage of a MACH architecture includes being able to select modular, best-of-breed solutions to integrate into the overall tech stack, while ensuring each of the pieces work together seamlessly.
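The composable idea — modules conform to a small interface so a best-of-breed piece can be swapped without touching the rest of the stack — can be sketched with structural typing. The `PaymentProvider` protocol and provider names below are hypothetical illustrations, not a real banking API.

```python
from typing import Protocol

class PaymentProvider(Protocol):
    """Any module exposing charge() can slot into the stack."""
    def charge(self, cents: int) -> str: ...

class ProviderA:
    def charge(self, cents: int) -> str:
        return f"A:charged:{cents}"

class ProviderB:
    def charge(self, cents: int) -> str:
        return f"B:charged:{cents}"

def checkout(provider: PaymentProvider, cents: int) -> str:
    # The calling code never changes when the provider behind the API does.
    return provider.charge(cents)
```

This is the headless/API-first principle in miniature: the front end depends only on the contract, so the component behind it is replaceable.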


Analysing Linus Torvalds’ Critique of Docker

First off, we ditch the shared-kernel approach entirely. We need to build a micro-hypervisor model, where each container runs its own minimal kernel. This ensures that every container is genuinely isolated, similar to a lightweight VM but without the bloat. By employing a microkernel architecture, you’re essentially granting each container its own mini-OS that only loads essential components, drastically reducing the attack surface. This step eliminates the primary flaw of Docker’s shared-kernel model. Next, leverage hardware-assisted virtualisation like Intel VT-x or AMD-V to handle isolation efficiently. This is where we’ll differentiate ourselves from Docker’s reliance on namespaces. With hardware support, each container will get near-native performance while maintaining strict separation. For example, instead of binding everything to a Linux kernel, containers will interact directly with hardware-level isolation, meaning exploits won’t have the chance to jump from one container to another. We can’t ignore orchestration. Rather than bolting on security later, build an orchestration layer that enforces strict security policies from the get-go. This orchestration tool, think Kubernetes but with security baked in, will enforce seccomp, AppArmor, and SELinux profiles automatically based on container configurations. 
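The "security baked in" orchestration step — deriving a restrictive syscall policy from a container's declared needs — can be sketched as policy generation. The field names and syscall list below are simplified assumptions, not the actual Kubernetes, Docker, or seccomp JSON schema.

```python
def build_seccomp_profile(needs_network, allow_exec=False):
    """Derive a default-deny, seccomp-style allowlist from declared capabilities."""
    base_allowed = ["read", "write", "exit", "futex"]  # bare minimum to run
    if needs_network:
        base_allowed += ["socket", "connect"]
    if allow_exec:
        base_allowed.append("execve")  # only if the workload truly spawns processes
    return {"defaultAction": "SCMP_ACT_ERRNO",  # deny everything not listed
            "allowed": sorted(base_allowed)}
```

The design choice mirrored here is default-deny: the orchestrator grants only what the container configuration declares, rather than bolting restrictions on afterwards.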


Leaders must balance humility with inspiration to foster a culture of curiosity and courage

The first important factor is how we, as leaders, show up. The second is culture. What kind of culture do we create as leaders? Fostering an environment that encourages adaptability, resilience, and openness to change, rather than rigidity or resistance. And, the third important factor is the system. What kind of systems do we establish to continuously adopt and adapt to change, ensuring the organization remains flexible and forward-looking? To inspire collaboration and trust among the team, Divya sees humility as a crucial factor. Leaders must first acknowledge that they don’t have all the answers. “When leaders demonstrate vulnerability, team members are more likely to step forward with their knowledge and ideas.” Citing the example of leading by example, she mentioned how her current CFO attended a two-month machine learning course at the London School of Economics, signaling his willingness to learn and adapt. This motivated the entire organisation to upskill and embrace new technologies. Creating the right culture is the next step. Leaders must foster curiosity by rewarding those who explore new knowledge and share their insights. For example, celebrating a retail employee who transitioned into data analytics inspires others to follow suit.


How to Keep IT Team Boredom From Killing Productivity

A bored IT team is a ticking time bomb, Herberger warns. "The risks are clear: increased turnover as talent walks out the door, underperformance that drags down productivity, and a contagious drop in morale that can spread like a virus across the organization," he says. "Worse, in a competitive industry, boredom kills innovation, leaving your company vulnerable to being outpaced by more engaged and agile competitors." A disengaged IT team, or team subset, can negatively impact business performance, since members are probably not contributing to their full abilities. ... To reinvigorate a sagging IT team, Herberger recommends shaking things up by introducing fresh challenges and innovation opportunities: "Whether it's rotating team roles, fostering a culture of collaboration, or carving out time for passion projects, the goal is clear: disrupt the routine, reawaken creativity, and make the team feel like they're part of something bigger than just punching the clock." ... Daly recommends that IT leaders stay close to their workforce in order to understand their engagement levels, manage mundane tasks effectively, and create space for more interesting assignments. To help prevent disengagement, he suggests offering learning opportunities and activities that promote development and growth.


Why and how to craft an effective hyperscale cloud exit strategy

If a business does choose to go down the hyperscale route, my advice is to formulate an exit plan before onboarding. It’s a key part of contingency planning and should be thought through and finalized before any vendor contract is signed. A cloud exit strategy acts as an insurance policy for events that are both inside and outside of an organization’s control. ... An organization should bring together representatives from each area of a business, ranging from the IT leadership and technology architecture teams, to procurement and sourcing, legal and compliance, and finance. Together, they need to understand how the current infrastructure set up is designed and the specific servers that are being used. They also need to carry out a detailed audit of what’s included in their monthly bills, any major inefficiencies, and details of platform integrations and tightly coupled systems. Having this information will make it far easier to plan out a phased exit from hyperscale cloud, or better facilitate a seamless move to a smaller, private cloud environment. ... And lastly, any exit plan should budget for migration costs, which are often overlooked. The budget should include the cost of hardware for on-prem and colocation options, the cost of hosting for infrastructure as a service (IaaS) options, data migration fees, labor costs, post migration expenses and costs of any service overlaps. 
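The budgeting advice above — sum every cost category, since migration costs are often overlooked — can be sketched as a simple roll-up. The line items and figures are placeholders showing the shape of the budget, not real estimates; the 10% contingency is an assumption.

```python
def migration_budget(items):
    """items: dict of cost category -> amount. Returns total plus 10% contingency."""
    subtotal = sum(items.values())
    return round(subtotal * 1.10, 2)

budget = migration_budget({
    "hardware": 120_000,        # on-prem / colocation equipment
    "hosting": 30_000,          # IaaS hosting
    "data_migration": 15_000,   # egress and transfer fees
    "labor": 60_000,
    "post_migration": 10_000,
    "service_overlap": 8_000,   # paying both environments during cutover
})
```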



Quote for the day:

"What seems to us as bitter trials are often blessings in disguise." -- Oscar Wilde

Daily Tech Digest - December 04, 2024

Will AI help doctors decide whether you live or die?

One of the things GPT-4 “was terrible at” compared to human doctors is causally linked diagnoses, Rodman said. “There was a case where you had to recognize that a patient had dermatomyositis, an autoimmune condition responding to cancer, because of colon cancer. The physicians mostly recognized that the patient had colon cancer, and it was causing dermatomyositis. GPT got really stuck,” he said. IDC’s Shegewi points out that if AI models are not tuned rigorously and with “proper guardrails” or safety mechanisms, the technology can provide “plausible but incorrect information, leading to misinformation. “Clinicians may also become de-skilled as over-reliance on the outputs of AI diminishes critical thinking,” Shegewi said. “Large-scale deployments will likely raise issues concerning patient data privacy and regulatory compliance. The risk for bias, inherent in any AI model, is also huge and might harm underrepresented populations.” Additionally, AI’s increasing use by healthcare insurance companies doesn’t typically translate into what’s best for a patient. Doctors who face an onslaught of AI-generated patient care denials from insurance companies are fighting back — and they’re using the same technology to automate their appeals.


The Rise Of ‘Quiet Hiring’: 5 Ways To Use Trend For A Career Advantage

Adaptability is key in quiet hiring. When I interviewed Ross Thornley, Co-founder of AQai, an organization that provides adaptability training, he said, "We’re entering a period of volatility where expanding adaptability skills is essential." Whether it’s learning to manage budgets, mastering new software, or brushing up on leadership skills, the more versatile you are, the more indispensable you become. ... You might feel uncomfortable tooting your own horn, but staying silent about your successes can hurt you in the long run. Keep track of your achievements as you take on extra responsibilities. Highlight the skills you’re building and the results you’re delivering. Then, share them in conversations with your manager or during performance reviews. By showcasing your value, you ensure your work doesn’t go unnoticed. ... When holding onto status-quo ways, employees limit themselves from reaching heights that might improve engagement. Without exploration, there’s a greater potential to be misaligned with a job or responsibility that isn’t motivating. Every new role—whether formal or not—is an opportunity to grow and explore. Use this time to test out roles you might not have considered. See if you enjoy the work or if it’s a stepping stone to something even better.


Creating a unified data, AI and infrastructure strategy to scale innovation ambitions

To effectively leverage data and AI, organisations must first shift their mindset from merely collecting data to actively connecting the dots. This involves identifying the core problem that needs to be addressed and focusing on use cases that will yield maximum business impact, rather than isolating data collection and AI model development. ... To enhance AI implementation, organisations should shift from a use-case-driven approach to a capability-driven strategy, focusing on building reusable AI capabilities such as conversational AI and voice analytics for both internal and external service desks. A company exploring numerous use cases can then group them into distinct capabilities for greater efficiency. Establishing a centralised team dedicated to data, AI and infrastructure is essential to create a robust foundation and platform while allowing business units to develop their own AI-powered applications on top, ensuring consistency across the organisation. ... To succeed in scaling innovation and AI, organisations must move from merely collecting data to actively connecting data, AI and infrastructure. Today’s advancements in cloud and data management technologies enable this integration, fostering collaboration and driving innovation at scale.


AWS introduces S3 Tables, a new bucket type for data analytics

The new bucket type is S3 Table, for storing data in Apache Iceberg format. Iceberg is an open table format (OTF) used for storing data for analytics, and with richer features than Parquet alone. Parquet is the format used by Hadoop and by many data processing frameworks. Parquet and Iceberg are already widely used on S3, so why a new bucket type? Warfield said the popularity of Parquet in S3 was the rationale for S3 Tables. "We actually serve about 15 million requests per second to Parquet tables," he told us, but there is a maintenance burden. Internally, he said, "the structure of them is a lot like git, a ledger of changes, and the mutations get added as snapshots. Even with a relatively low rate of updates into your OTF you can quickly end up with hundreds of thousands of objects under your table." The consequence is poor performance. "In the OTF world it was anticipated that this would happen, but it was left to the customer to do the table maintenance tasks," Warfield said. The Iceberg project includes code to expire snapshots and clean up metadata, but it is still necessary "to go and schedule and run those Spark jobs." Apache Spark is a SQL engine for large scale data. Parquet on S3 was "a storage system on top of a storage system," said Warfield, making it sub-optimal.
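The maintenance burden Warfield describes — a git-like ledger where every update appends a snapshot until objects pile up — can be modeled as a toy list that needs periodic expiry. This mimics the idea behind Iceberg's snapshot expiration, not its actual API or Spark procedures.

```python
def expire_snapshots(snapshots, keep_last):
    """snapshots: snapshot ids, oldest first. Retain only the newest keep_last."""
    return snapshots[-keep_last:]

# 100 updates have produced 100 snapshot objects under the table...
table = [f"snap-{i}" for i in range(100)]
# ...expiry is the maintenance task S3 Tables now runs for the customer.
live = expire_snapshots(table, keep_last=5)
```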


Innovation Is Fun, but Infrastructure Pays the Bills

Innovation and platform infrastructure are intertwined — each move affects the other. Yet, many companies are stumbling because they’re too focused on innovation. They’re churning out apps, features, and updates at breakneck speed, all while standing on a wobbly foundation. It’s a classic case of putting the cart before the horse, and it affects the intended impact of some really great ideas. A strong platform infrastructure is your ticket to scalability and flexibility. It lets you pivot quickly to meet new market demands, integrate cutting-edge technologies, and expand your services without tearing everything down and starting from scratch. Plus, it trims the fat off your development and deployment times, letting you bring innovative ideas to market faster. Sidestepping platform infrastructure is a recipe for disaster. It can make your application sluggish, prone to crashes, and a sitting duck for cyberattacks. This isn’t just a headache for users — it’s a surefire way to tarnish your product’s reputation and negatively affect its success. Think of it like building a mansion on a shaky foundation; it doesn’t matter how grand it looks if it’s doomed to collapse.


Open-washing and the illusion of AI openness

Open-washing in AI refers to companies overstating their commitment to openness while keeping critical components proprietary. This approach isn’t new. We’ve seen cloud-washing, AI-washing, and now open-washing, all called out here. Marketing firms want the concept of being “open” to put them in a virtuous category of companies that save baby seals from oil spills. I don’t knock them, but let’s not get too far over our skis, billion-dollar tech companies. ... At the heart of open-washing is a distortion of the principles of openness, transparency, and reusability. Transparency in AI would entail publicly documenting how models are developed, trained, fine-tuned, and deployed. This would include full access to the data sets, weights, architectures, and decision-making processes involved in the models’ construction. Most AI companies fall short of this level of transparency. By selectively releasing parts of their models—often stripped of key details—they craft an illusion of openness. Reusability, another pillar of openness, is much the same. Companies allow access to their models via APIs or lightweight downloadable versions but prevent meaningful adaptation by tying usage to proprietary ecosystems. 


Microsoft hit with more litigation accusing it of predatory pricing

“All UK businesses and organizations that bought licenses for Windows Server via Amazon’s AWS, Google Cloud Platform, and Alibaba Cloud may have been overcharged and will be represented in this new ‘opt-out’ collective action,” the law firm statement said. The accusations make sense when viewed from a compliance/regulatory perspective. Although companies are allowed to give volume discounts and to offer other pricing differences for different customers, compliance issues kick in when the company controls an especially high percentage of the market. ... “Put simply, Microsoft is punishing UK businesses and organizations for using Google, Amazon, and Alibaba for cloud computing by forcing them to pay more money for Windows Server. By doing so, Microsoft is trying to force customers into using its cloud computing service, Azure, and restricting competition in the sector,” Stasi said. “This lawsuit aims to challenge Microsoft’s anti-competitive behavior, push them to reveal exactly how much businesses in the UK have been illegally penalized, and return the money to organizations that have been unfairly overcharged.”


Balancing tradition and innovation in the digital age

It’s easy to get carried away by the hype of cutting-edge technology. For me, it’s about making sure that you always ask yourself if you’re solving an actual business problem. That has to be front of mind, as opposed to being solution- or tech-first. You also have to ask yourself if the business problem requires nascent or proven tech? Once you figure that out, the tech side answer is relatively straightforward. So, even with leveraging emerging tech, you need to think congruently about your business model. ... Security is the first thing I looked at. Even in my interview, I said it would be the first thing I looked at, and it has been. Security and privacy are the basic foundations of trust, and customer and community trust is what our business is built on. So, my approach is to spend money to bring in deep expertise, which I have, and empower them to go deep into our current state and be honest about any gaps we might have. And to think about where we implement both tactical and strategic ways to bridge those gaps. It’s also important to be clear about the risk we hold and how long we want to hold it for and focus on building a response plan. So, if and when an incident occurs, we can recover and respond gracefully and have solid comms plans and playbooks in place. 


Threat intelligence and why it matters for cybersecurity

Cyber threat intelligence – who needs it? The short answer is everyone. Cyber threat intelligence is for anyone with a vested interest in the cybersecurity infrastructure of an organization. Although CTI can be tailored to suit any audience, in most cases, threat intelligence teams work closely with the Security Operations Centre (SOC) that monitors and protects a business on a daily basis. Research shows that CTI has proved beneficial to people at all levels of government (national, regional or local), from security officers, police chiefs and policymakers, to information technology specialists and law enforcement officers. It also provides value to many other professionals, such as IT managers, accountants and criminal analysts. ... The creation of cyber threat intelligence is a circular process known as an “intelligence cycle”. In this cycle, which consists of five stages, data collection is planned, implemented and evaluated; the results are then analysed to produce intelligence, which is later disseminated and re-evaluated against new information and consumer feedback. The circularity of the process means that gaps are identified in the intelligence delivered, initiating new collection requirements and launching the intelligence cycle all over again.
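The five-stage cycle described above can be sketched as a simple state machine. A minimal illustration in Python; the stage names follow the conventional planning–collection–processing–analysis–dissemination breakdown, which the article does not enumerate explicitly, so treat them as an assumption:

```python
from enum import Enum

class Stage(Enum):
    # Conventional five-stage intelligence cycle (names are illustrative)
    PLANNING = 1        # define collection requirements
    COLLECTION = 2      # gather raw data from feeds, logs, OSINT
    PROCESSING = 3      # normalise and enrich the raw data
    ANALYSIS = 4        # turn processed data into finished intelligence
    DISSEMINATION = 5   # deliver intelligence and collect consumer feedback

def next_stage(stage: Stage) -> Stage:
    """Advance through the cycle; dissemination wraps back to planning,
    modelling how feedback and newly identified gaps restart the cycle."""
    return Stage(stage.value % len(Stage) + 1)
```

The wrap-around in `next_stage` is the point: dissemination is not the end of the process, because feedback on the delivered intelligence creates new collection requirements.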


Securing AI’s new frontier: Visibility, governance, and mitigating compliance risks

Securing and governing the use of data for AI/ML model training is perhaps the most challenging and pressing issue in AI security. Using confidential or protected information during the training or fine-tuning process carries the risk that the data could be recovered through model extraction techniques or common adversarial techniques (e.g., prompt injection, jailbreaks). Following data security and least-privilege access best practices is essential for protecting data during development, but bespoke AI runtime threat detection and response is required to prevent exfiltration of data via model responses. ... Securing AI applications in production is just as important as securing the underlying infrastructure and is a key component of maintaining a secure data and AI lifecycle. This requires real-time monitoring of both prompts and responses to identify, notify, and block security and safety threats. A robust AI security solution prevents adversarial attacks like prompt injection, masks sensitive data to prevent exfiltration via a model response, and also addresses safety concerns such as bias, fairness, and harmful content.
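The runtime monitoring described above — screening prompts on the way in and masking sensitive data on the way out — can be sketched as a pair of filter functions. This is a minimal illustration only: the pattern lists and function names are hypothetical, and a production system would rely on trained classifiers and broad PII coverage, not a handful of regexes.

```python
import re

# Hypothetical injection signatures; real detectors use ML classifiers,
# not keyword rules like these.
INJECTION_PATTERNS = [
    re.compile(r"ignore (all )?previous instructions", re.I),
    re.compile(r"reveal your system prompt", re.I),
]

# One example PII pattern (US SSN-style); real deployments cover many
# sensitive data types.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def screen_prompt(prompt: str) -> bool:
    """Return True if the incoming prompt looks like an injection attempt
    and should be blocked before it reaches the model."""
    return any(p.search(prompt) for p in INJECTION_PATTERNS)

def mask_response(response: str) -> str:
    """Mask sensitive data in a model response before it leaves the system,
    preventing exfiltration of protected training data via outputs."""
    return SSN.sub("[REDACTED]", response)
```

The two functions correspond to the two halves of the monitoring loop: `screen_prompt` addresses adversarial inputs such as prompt injection, while `mask_response` addresses exfiltration through model outputs.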



Quote for the day:

"Leading people is like cooking. Don't stir too much; it annoys the ingredients and spoils the food." -- Rick Julian