Daily Tech Digest - December 12, 2024

The future of AI regulation is up in the air: What’s your next move?

The problem, Jones says, is that the lack of regulation boils down to a lack of accountability when it comes to what large language models are doing — and that includes hoovering up intellectual property. Without regulations and legal ramifications, resolving issues of IP theft will either boil down to court cases or, more likely (especially where the LLM belongs to a company with deep pockets), see the responsibility slide downhill to the end users. And when profitability outweighs the risk of a financial hit, some companies are going to push the boundaries. “I think it’s fair to say that the courts aren’t enough, and the fact is that people are going to have to poison their public content to avoid losing their IP,” Jones says. “And it’s sad that it’s going to have to get there, but it’s absolutely going to have to get there if the risk is, you put it on the internet, suddenly somebody’s just ripped off your entire catalog and they’re off selling it directly as well.” ... “These massive weapons of mass destruction, from an AI perspective, they’re phenomenally powerful things. There should be accountability for the control of them,” Jones says. “What it will take to put that accountability onto the companies that create the products, I believe firmly that that’s only going to happen if there’s an impetus for it.”


Leading VPN CCO says digital privacy is "a game of chess we need to play"

Sthanu calls VPNs a first step, and puts forward secure browsers as a second. IPVanish recently launched a secure browser, an industry first not offered by other top VPN providers. "It keeps your browser private, blocking tracking, encrypting the sessions, but also protecting your device from any malware," Sthanu said. IPVanish's secure browser utilises the cloud: session tracking, cookies, and targeting are all eliminated, as web browsing operates in a cloud sandbox. ... Encrypting your data is a vital part of what VPNs do. AES 256-bit and ChaCha20 are currently the standard ciphers for the most secure VPNs, and they do an excellent job of encrypting and protecting your data. These ciphers can protect you against the vast majority of cyber threats out there right now – but as computers and threats develop, security will need to develop too. Quantum computers are the next stage in computing evolution, and a time is predicted to come, possibly within the next five years, when these machines can break the encryption that protects data today – a moment being referred to as "Q-day." Quantum computers are not readily available today, with most found in universities or research labs, but they will become more widespread.
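One commonly cited piece of the "Q-day" arithmetic is Grover's algorithm, which gives a quadratic speed-up on brute-force key search and so roughly halves a symmetric key's effective bit strength. A minimal, back-of-the-envelope sketch of that calculation (function names are illustrative, not part of any library):

```typescript
// Grover's algorithm searches an n-bit keyspace in ~2^(n/2) steps instead of
// the ~2^n a classical brute-force search needs, so a symmetric key's
// effective strength is roughly halved. A rough model, not a security tool.
function effectiveBitsUnderGrover(keyBits: number): number {
  return keyBits / 2;
}

// Classical vs. quantum search cost for an n-bit key, as powers of two.
function searchExponents(keyBits: number): { classical: number; quantum: number } {
  return { classical: keyBits, quantum: keyBits / 2 };
}
```

Under this model, AES-256 would retain roughly 128-bit effective strength against Grover, which is why the nearer-term quantum concern is generally public-key cryptography rather than well-sized symmetric ciphers.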


Kintsugi Leaders: Conservers of talent who convert the weak into winners

Profligate Leaders are not only gluttonous in their appetite for consuming resources, they are usually also choosy about the kind they will order. Not for them the tedious effort of using their own best-selling training cookbook for seasoning and stirring the youth coming out of the country’s stretched and creaking educational system. On the contrary, they push their HR to queue up for ready-cooked candidates outside the portals of elite institutes on day zero. ... Kintsugi Leaders can create nobility of a different kind if they follow three precepts. The central one is the willingness to bet big and take risks on untried talent. In one of my Group HR roles, eyebrows were raised when I placed the HR leadership of large businesses in the hands of young, internally groomed talent instead of picking stars from the market. ... There is a third (albeit rare) kind of HR leader: the Trusted Transformer who can convert a Profligate Leader into the Kintsugi kind. Revealable corporate examples are thin on the ground. In keeping with the Kintsugi theme, then, I have to fall back on Japan. Itō Hirobumi had a profound influence on Emperor Meiji and played a pivotal role in shaping the political landscape of Meiji-era Japan.


4 North Star Metrics for Platform Engineering Teams

“Acknowledging that DORA, SPACE and DevEx provide different slivers or different perspectives into the problem, our goal was to create a framework that encapsulates all the frameworks,” Noda said, “like one framework to rule them all, that is prescriptive and encapsulates all the existing knowledge and research we have.” DORA metrics don’t mean much at the team level, but, he continued, developer satisfaction — a key measurement of platform engineering success — doesn’t matter to a CFO. “There’s a very intentional goal of making especially the key metrics, but really all the metrics, meaningful to all stakeholders, including managers,” Noda said. “That enables the organization to create a single, shared and aligned definition of productivity so everyone can row in the same direction.” The Core 4 key metrics are: an average of diffs per engineer to measure speed; the Developer Experience Index, or homegrown developer experience surveys, to measure effectiveness; change failure rate to measure quality; and the percentage of time spent on new capabilities to measure impact. DX’s own DXI, which uses a standardized set of 14 Likert-scale questions — from strongly agree to strongly disagree — is currently only available to DX users.
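The arithmetic behind three of the four metrics is simple ratios over a reporting period. A sketch of how a team might compute them, assuming hypothetical field names (this is not DX's actual schema, and the survey-based effectiveness metric is omitted because it is not a ratio):

```typescript
// Illustrative inputs for one team over one reporting period.
interface TeamPeriodStats {
  engineers: number;               // headcount contributing in the period
  diffsMerged: number;             // merged diffs/PRs in the period
  deployments: number;             // total production deployments
  failedDeployments: number;       // deployments causing a failure/rollback
  hoursOnNewCapabilities: number;  // time spent on new feature work
  totalHours: number;              // total engineering time
}

// Speed, quality, and impact as described above; effectiveness comes from
// surveys (e.g. a DXI-style index) and is not computed here.
function core4(stats: TeamPeriodStats) {
  return {
    speed: stats.diffsMerged / stats.engineers,              // avg diffs per engineer
    quality: stats.failedDeployments / stats.deployments,    // change failure rate
    impact: stats.hoursOnNewCapabilities / stats.totalHours, // share of time on new work
  };
}
```

For example, a ten-person team merging 120 diffs with 5 failures across 50 deployments would report a speed of 12 diffs per engineer and a 10% change failure rate.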


The future of data: A 5-pillar approach to modern data management

To succeed in today’s landscape, every company — small, mid-sized or large — must embrace a data-centric mindset. This article proposes a methodology for organizations to implement a modern data management function that can be tailored to meet their unique needs. By “modern”, I refer to an engineering-driven methodology that fully capitalizes on automation and software engineering best practices. This approach is repeatable, minimizes dependence on manual controls, harnesses technology and AI for data management and integrates seamlessly into the digital product development process. ... Unlike the technology-focused Data Platform pillar, Data Engineering concentrates on building distributed parallel data pipelines with embedded business rules. It is crucial to remember that business needs should drive the pipeline configuration, not the other way around. For example, if preserving the order of events is essential for business needs, the appropriate batch, micro-batch or streaming configuration must be implemented to meet these requirements. Another key area involves managing the operational health of data pipelines, with an even greater emphasis on monitoring the quality of the data flowing through the pipeline. 
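The order-preservation requirement described above is commonly met by partitioning events on a key and ordering within each partition, which is the idea behind keyed Kafka partitions or ordered micro-batches. A toy sketch under those assumptions, with illustrative field names:

```typescript
// An event carrying a partition key (e.g. a customer or device ID) and a
// sequence number assigned at the source.
interface PipelineEvent {
  key: string;
  seq: number;
  payload: string;
}

// Group events by key and restore per-key order, so downstream business
// rules see each entity's events in the order they occurred. Cross-key
// ordering is intentionally not guaranteed, mirroring partitioned systems.
function groupInOrder(events: PipelineEvent[]): Map<string, PipelineEvent[]> {
  const byKey = new Map<string, PipelineEvent[]>();
  for (const e of events) {
    const bucket = byKey.get(e.key) ?? [];
    bucket.push(e);
    byKey.set(e.key, bucket);
  }
  for (const bucket of byKey.values()) bucket.sort((a, b) => a.seq - b.seq);
  return byKey;
}
```

The design point is the one the article makes: the business rule (per-entity ordering matters) drives the pipeline configuration (keyed partitioning), not the other way around.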


How and Why the Developer-First Approach Is Changing the Observability Landscape

First and foremost, developers aim to avoid issues altogether. They seek modern observability solutions that can prevent problems before they occur. This goes beyond merely monitoring metrics: it encompasses the entire software development lifecycle (SDLC) and every stage of development within the organization. Production issues don't begin with a sudden surge in traffic; they originate much earlier, when developers first implement their solutions, and begin to surface as those solutions are deployed to production and customers start using them. Observability solutions must shift to monitoring all aspects of the SDLC and every activity that happens throughout the development pipeline. This includes the production code and how it’s running, but also the CI/CD pipeline, development activities, and every single test executed against the database. Second, developers deal with hundreds of applications each day. They can’t waste their time manually tuning alerting for each application separately. Monitoring solutions must automatically detect anomalies, surface issues before they affect users, and tune alarms based on real traffic. They shouldn’t raise alarms based on hard limits like 80% of CPU load.
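A minimal sketch of the baseline-driven alerting described above: flag a sample as anomalous when it deviates sharply from the recent mean, rather than when it crosses a fixed threshold like 80% CPU. The window contents and the sensitivity parameter are assumptions for illustration:

```typescript
// Flag `sample` as anomalous when it sits more than `k` standard deviations
// from the mean of a recent history window. A fixed 80% threshold would fire
// constantly on a service that normally runs hot; a baseline-relative check
// adapts to each application's real traffic.
function isAnomaly(history: number[], sample: number, k = 3): boolean {
  const mean = history.reduce((a, b) => a + b, 0) / history.length;
  const variance =
    history.reduce((a, b) => a + (b - mean) ** 2, 0) / history.length;
  const std = Math.sqrt(variance);
  if (std === 0) return sample !== mean; // flat baseline: any change is notable
  return Math.abs(sample - mean) / std > k;
}
```

Real platforms use far more sophisticated models (seasonality, trend, multi-signal correlation), but the core idea of learning a per-application baseline is the same.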


We must adjust expectations for the CISO role

The sense of vulnerability CISOs feel today is compounded by a shifting accountability model in the boardroom. As cybersecurity incidents make front-page news more frequently, boards and executive teams are paying closer attention. This increased scrutiny is a double-edged sword: on the one hand, it can mean greater support and resources; on the other, it often translates to CISOs being in the proverbial hot seat. What’s more, cybersecurity is still a rapidly evolving field with few long-standing best practices. It’s a space marked by constant adaptation, bringing a certain degree of trial and error. When an error occurs—especially one that leads to a breach—the CISO’s role is scrutinized. While the entire organization might have a role in cybersecurity, CISOs are often expected to bear the brunt of accountability. This dynamic is unsettling for many in the position, and the 99% of CISOs who fear for their job security in the event of a breach clearly illustrates this point. So, what can be done? Both organizations and CISOs are responsible for recalibrating expectations and addressing the root causes of these pervasive job security fears. For organizations, a starting point is to shift cybersecurity from a reactive to a proactive stance. Investing in continuous improvement—whether through advanced security technologies, employee training, or cyber insurance—is crucial.


Bug bounty programs can deliver significant benefits, but only if you’re ready

The most significant benefit of a bug bounty program is finding vulnerabilities an organization might not have otherwise discovered. “A bug bounty program gives you another avenue of identifying vulnerabilities that you’re not finding through other processes,” such as internal vulnerability scans, Stefanie Bartak, associate director of the vulnerability management team at NCC Group, tells CSO. Establishing a bug bounty program signals to the broader security research community that an organization is serious about fixing bugs. “For an enterprise, it’s a really good way for researchers, or anyone, to be able to contact them and report something that may not be right in their security,” Louis Nyffenegger, CEO of PentesterLab, tells CSO. Moreover, a bug bounty program will offer an organization a wider array of talent to bring perspectives that in-house personnel don’t have. “You get access to a large community of diverse thinkers, which help you find vulnerabilities you may otherwise not get good access to,” Synack’s Lance says. “That diversity of thought can’t be underestimated. Diversity of thought and diversity of researchers is a big benefit. You get a more hardened environment because you get better or additional testing in some cases.”


Harnessing SaaS to elevate your digital transformation journey

The impact of AI-driven SaaS solutions can be seen across multiple industries. In retail, AI-powered SaaS platforms enable businesses to analyze consumer behavior in real-time, providing personalized recommendations that drive sales. In manufacturing, AI optimizes supply chain management, reducing waste and increasing productivity. In the finance sector, AI-driven SaaS automates risk assessment, improving decision-making and reducing operational costs. ... As businesses continue to adopt SaaS and AI-driven solutions, the future of digital transformation looks promising. Companies are no longer just thinking about automating processes or improving efficiency, they are investing in technologies that will help them shape the future of their industries. From developing the next generation of products to understanding their customers better, SaaS and AI are at the heart of this evolution. CTOs, like myself, are now not only responsible for technological innovation but are also seen as key contributors to shaping the company’s overall business strategy. This shift in leadership focus will be critical in helping organizations navigate the challenges and opportunities of digital transformation. By leveraging AI and SaaS, we can build scalable, efficient, and innovative systems that will drive growth for years to come. 


What makes product teams effective?

More enterprises are adopting a cross-functional team model, yet many still tend to underinvest in product management. While they make sure to fill the product owner role—a person accountable for translating business needs into technology requirements—they do not always choose the right individual for the product manager role. Effective product managers are business leaders with the mindset and technical skills to guide multiple product teams simultaneously. They shape product strategy, define requirements, and uphold the bar on delivery quality, usually partnering with an engineering or technology lead in a two-in-a-box model. ... Unsurprisingly, when organizations recognize individual expertise, provide options for career progression, and base promotions on capabilities, employees are more engaged and satisfied with their teams. Similarly, by standardizing and reducing the overall number of roles, organizations naturally shift to a balanced ratio of orchestrators (minority) to doers (majority), which increases team capacity without hiring more employees. This shift helps ensure teams can meet their delivery commitments and creates a transparent environment where individuals feel empowered and informed.



Quote for the day:

“Things come to those who wait, but only the things left by those who hustle” -- Abraham Lincoln

Daily Tech Digest - December 11, 2024

Low-tech solutions to high-tech cybercrimes

The growing quality of deepfakes, including real-time deepfakes during live video calls, invites scammers, criminals, and even state-sponsored attackers to convincingly bypass security measures and steal identities for all kinds of nefarious purposes. AI-enabled voice cloning has already proved to be a massive boon for phone-related identity theft. AI enables malicious actors to bypass face recognition protection. And AI-powered bots are being deployed to intercept and use one-time passwords in real time. More broadly, AI can accelerate and automate just about any cyberattack. ... Once established (not in writing… ), the secret word can serve as a fast, powerful way to instantly identify someone. And because it’s not digital or stored anywhere on the Internet, it can’t be stolen. So if your “boss” or your spouse calls to ask you for data or to transfer funds, you can ask for the secret word to verify it’s really them. ... Farrow emphasizes a simple way to foil spyware: reboot your phone every day. He points out that most spyware is purged by a reboot, so a daily restart ensures no spyware lingers on your phone. He also stresses the importance of keeping your OS and apps updated to the latest version.


7 Essential Trends IT Departments Must Tackle In 2025

Taking responsibility for cybersecurity will remain a key function of IT departments in 2025 as organizations face off against increasingly sophisticated and frequent attacks. Even as businesses come to understand that everyone from the boardroom to the shop floor has a part to play in preventing attacks, IT teams will inevitably be on the front line, with the job of securing networks, managing update and installation schedules, administering access protocols and implementing zero-trust measures. ... In 2025, AIOps will be critical to enabling businesses to benefit from real-time resource optimization, automated decision-making and predictive incident resolution. This should empower the entire workforce, from marketing to manufacturing, to focus on innovation and high-value tasks rather than repetitive technical work best left to machines. ... with technology functions playing an increasingly integral role in business growth, other C-level roles have emerged to take on some of the responsibilities. As well as Chief Data Officers (CDOs) and Chief Information Security Officers (CISOs), it’s increasingly common for organizations to appoint Chief AI Officers (CAIOs), and as the role of technology in organizations continues to evolve, more C-level positions are likely to become critical.


Passkey adoption by Australian govt, banks drives wider passwordless authentication

“A key change has been to the operation of the security protocols that underpin passkeys and passwordless authentication. As this has improved over time, it has engendered more trust in the technology among technology teams and organisations, leading to increased adoption and use.” “At the same time, users have become more comfortable with biometrics to authenticate to digital services.” Implementation and enablement have also improved, leveraging templates and no-code, drag-and-drop orchestration to “allow administrators to swiftly design, test and deploy various out-of-the-box passwordless registration and authentication experiences for diverse customer identity types, all at scale, with minimal manual setup.” ... Banks are among the major drivers of passkey adoption in Australia. According to an article in the Sydney Morning Herald, National Australia Bank (NAB) chief security officer Sandro Bucchianeri says passwords are “terrible” – and on the way out. ... Specific questions pertaining to passkeys include, “Do you agree or disagree with including use of a passkey as an alternative first-factor identity authentication process?” and “Does it pose any security or fraud risks? If so, please describe these in detail.”


Why crisis simulations fail and how to fix them

Communication gaps are particularly common between technical leadership and business executives. These teams work in silos, which often causes misalignment and miscommunication. Technical staff use jargon that executives don’t fully understand, while business priorities may be unclear to the technical team. As a result, it becomes difficult to discern what requires immediate attention and communication versus what constitutes noise. This slows down critical decisions. Now throw in third-party vendors or MSPs, and this just amplifies the confusion and adds to the chaos. Role confusion is an interesting challenge. Crisis management playbooks typically have roles assigned to tasks, but no detail on what these roles mean. I have seen teams come into an exercise confident about the name of their role, but no idea what the role means in terms of actual execution. Many times, teams don’t even know that a role exists within the team or who owns it. A fitting example is a “crisis simulation secretary” — someone tasked with recording the notes for the meetings, scheduling the calls, making sure everyone has the correct numbers to dial in, etc. This may seem trivial, but it is a critical role, as you do not want to waste precious minutes trying to dial into a call. 


What CIOs are in for with the EU’s Data Act

There are many things the CIO will have to perform in light of Data Act provisions. In the meantime, as explained by Perugini, CIOs must do due diligence on the data their companies collect from connected devices and understand where they are in the value chain — whether they are the owners, users, or recipients. “If the company produces a connected industrial machine and gives it to a customer and then maintains the machine, it finds itself collecting the data as the owner,” she says. “If the company is a customer of the machine, it’s a user and co-generates the data. But if it’s a company that acquires the data of the machine, it’s a recipient because the user or the manufacturer has allowed it to make them available or participates in a data marketplace. CIOs can also see if there’s data generated by others on the market that can be used for internal analysis, and procure it. Any use or exchange of data must be regulated by an agreement between the interested parties with contracts.” The CIO will also have to evaluate contracts with suppliers, ensuring terms are compliant, and negotiate with suppliers to access data in a direct and interoperable way. Plus, the CIO has to evaluate whether the company’s IT infrastructure is suitable to guarantee interoperability and security of data as per GDPR. 


How slowing down can accelerate your startup’s growth

In startup culture, there’s a pervasive pressure to say “yes” to every opportunity, to grow at all costs. But I’ve learned that restraint is an underrated virtue in business. At Aloha, we had to make tough choices to stay on the path of sustainable growth. We focused on our core mission and turned down attractive but potentially distracting opportunities that would have taken resources away from what mattered most. ... One of the most persistent traps for startups is the “growth at all costs” mindset. Top-line growth can be impressive, but if it’s achieved without a path to profitability, it’s a house of cards. When I joined Aloha, we refocused our efforts on creating a financially sustainable business. This meant dialing back on some of our expansion plans to ensure we were growing within our means. ... In a world that worships speed, it takes courage to slow down. It’s not easy to resist the siren call of hypergrowth. But when you do, you create the conditions for a business that can weather storms, adapt to change, and keep thriving. Building a company on these principles doesn’t mean abandoning growth—it means ensuring that growth is meaningful and sustainable. Slow and steady may not be glamorous, but it works. 


Why business teams must stay out of application development

Citizen development is when non-tech users build business applications using no-code/low-code platforms, which automate code generation. Imagine that you need a simple leave application tool within the organization. Enterprises can’t afford to deploy their busy and expensive professional resources to build an internal tool. So, they go the citizen development way. ... Proponents of citizen development argue that the apps built with low-code platforms are highly customizable. What they mean is that they have the ability to mix and match elements and change colors. For enterprise apps, this is all in a day’s work. True customizability comes from real editable code that empowers developers to hand-code parts to handle complex and edge cases. Business users cannot build these types of features because low-code platforms themselves are not designed to handle this. ... Finally, the most important loophole that citizen development creates is security. A vast majority of security attacks happen due to human error, such as phishing scams, downloading ransomware, or improper credential management. In fact, IBM found that there has been a 71% increase this year in cyberattacks that used stolen or compromised credentials.


The rise of observability: A new era in IT Operations

Observability empowers organisations to not just detect that a problem exists, but to understand why it’s happening and how to resolve it. It’s the difference between knowing that a car has broken down and having a detailed diagnostic report that pinpoints the exact issue and suggests an effective repair. The transition from monitoring to observability is not without its challenges. Some organisations find themselves struggling with legacy systems and entrenched processes that resist change. Observability represents a shift from traditional IT operations, requiring a new mindset and skill set. However, the benefits of implementing observability practices far outweigh the initial challenges. While there may be concerns about skill gaps, modern observability platforms are designed to be user-friendly and accessible to team members at all levels. ... Implementing observability results in clear, measurable benefits, especially around improved service reliability. Because teams can identify and resolve issues quickly and proactively, downtime is minimised or eradicated. Enhanced reliability leads to better customer experiences, which is a crucial differentiator in a competitive market where user satisfaction is key.


5 Trends Reshaping the Data Landscape

With increased interest in generative AI and predictive AI, as well as supporting traditional analytical workloads, “we’re seeing a pretty massive increase of data sprawl across industries,” he observed. “They track with the realization among many of our customers that they’ve created a lot of different versions of the truth and silos of data which have different systems, both on-prem and in the cloud.” ... If a data team “can’t get the data where it needs to go, they’re not going to be able to analyze it in an efficient, secure way,” he said. “Leaders have to think about scale in new ways. There are so many systems downstream that consume data. Scaling these environments as the data is growing in many cases by almost double-digit percentages year over year is becoming unwieldy.” A proactive approach is to address these costs and silos through streamlining and simplification on a single common platform, Kethireddy urged, noting Ocient’s approach to “take the path to reducing the amount of hardware and cloud instances it takes to analyze compute-intensive workloads. We focus on minimizing costs associated with the system footprint and energy consumption.”


Serverless Computing: The Future of Programming and Application Deployment Innovations

Serverless computing automates scaling: the cloud provider adds and removes function instances in response to workload and distributes incoming traffic across them, leaving developers free to focus on code. Because the platform handles scaling, applications can absorb large volumes of traffic without developers provisioning or managing infrastructure. Serverless functions are, however, designed to run for limited periods, typically from milliseconds to several minutes, so application code must be optimized to fit that execution window. ... Cloud providers build security features such as encryption and access control into the underlying infrastructure, and apply automated security updates and patches, which also supports rapid prototyping. Serverless computing has drawbacks, though. The first invocation of a function after it has been idle incurs a delay, known as a cold start, and the limited function lifecycle imposed by serverless architectures can significantly affect performance.
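To make the programming model concrete, here is a minimal sketch of a serverless function handler: the platform invokes the handler once per event and manages instances itself. The shape loosely follows common FaaS conventions (an event in, a response out) but is illustrative and not tied to any provider's actual API:

```typescript
// A hypothetical HTTP-style event as a platform might deliver it.
interface ApiEvent {
  body: string; // raw request body, e.g. a JSON string
}

// The unit of deployment in a serverless model: a stateless function the
// platform can start, stop, and replicate freely. Anything kept in module
// scope may vanish between invocations, which is why handlers avoid
// relying on local state.
function handler(event: ApiEvent): { statusCode: number; body: string } {
  const data = JSON.parse(event.body);
  return { statusCode: 200, body: JSON.stringify({ echoed: data }) };
}
```

Because the platform may spin up a fresh instance for the first request after idleness, everything that runs before the handler (module initialization, connection setup) contributes to the cold-start delay the article mentions.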
 


Quote for the day:

"If you want to be successful prepare to be doubted and tested." -- @PilotSpeaker

Daily Tech Digest - December 08, 2024

Here’s the one thing you should never outsource to an AI model

One of the biggest dangers in letting AI take the reins of your product ideation process is that AI processes content — be it designs, solutions or technical configurations — in ways that lead to convergence rather than divergence. Given the overlapping bases of training data, AI-driven R&D will result in homogenized products across the market. Yes, different flavors of the same concept, but still the same concept. Imagine this: Four of your competitors implement gen AI systems to design their phones’ user interfaces (UIs). Each system is trained on more or less the same corpus of information — data scraped from the web about consumer preferences, existing designs, bestseller products and so on. What do all those AI systems produce? Variations of a similar result. What you’ll see develop over time is a disturbing visual and conceptual cohesion where rival products start mirroring one another. ... In platforms like ArtStation, many artists have raised concerns regarding the influx of AI-produced content that, instead of showing unique human creativity, feels like recycled aesthetics remixing popular cultural references, broad visual tropes and styles. This is not the cutting-edge innovation you want powering your R&D engine.


How much capacity is in aging data centers?

Individual data centers have considerable differences between them, and one of the most critical is their size. With this weighting factor, the average moves — but not by much. The “average megawatt” is 10.2 years old. Whereas older data centers (10-plus years) represent 48 percent of the survey sample, they contain 38 percent of the total IT capacity — still a large minority. Interestingly, a more dramatic shift occurs within the population of data centers that have been operating for less than 10 years — well within the typical design lifespan. By facility count alone, there is an even split between the data centers that are one to five years old and those that have been in operation for six to ten years. But when measuring in megawatts, the newest data centers hold significantly more capacity (38 percent) than those with six to ten years of service. This is intuitive; in the past five years, some data center projects have reached unprecedented sizes. Very recent builds are overshadowing the capacity of data centers that are only slightly older, even though the designs are not dramatically different. However, the weighted figures above suggest that even this massive build-out has not yet overcome the moderating influence of much older, potentially less efficient facilities.
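The capacity weighting the survey describes is a straightforward weighted average, with each facility's age weighted by its IT capacity in megawatts. A small sketch of that calculation (the fleet numbers are made up for illustration):

```typescript
// One surveyed facility: its age and its IT capacity in megawatts.
interface Facility {
  ageYears: number;
  capacityMW: number;
}

// The "average megawatt" age: each facility's age contributes in proportion
// to its capacity, so one huge new build can pull the average down more than
// many small old sites pull it up.
function weightedAverageAge(fleet: Facility[]): number {
  const totalMW = fleet.reduce((sum, f) => sum + f.capacityMW, 0);
  return fleet.reduce((sum, f) => sum + f.ageYears * f.capacityMW, 0) / totalMW;
}
```

For instance, a 12-year-old 10 MW site alongside a 2-year-old 30 MW site gives a simple average age of 7 years but a capacity-weighted age of only 4.5 years, which is exactly the shift the survey observes.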


Generative AI is making traditional ways to measure business success obsolete

Often touted as the “iron triangle” from the perspective of operational efficiency, this equation implies that, in order to attain a degree of quality, firms must balance cost with the time spent to achieve that level of quality. ... AI has upended this thinking, as firms can now achieve both speed and accuracy at the same time by leveraging AI. This can enhance productivity and drive innovation without losing out on quality. Likewise, through generative AI, smaller companies with fewer resources are able to rub shoulders and compete with larger firms using AI-powered tools. They can do this by streamlining operations, creating cost-effective marketing content and delivering personalised customer experiences. This can make existing businesses more efficient, competitive and creative. It can also lower the barriers to entry into markets for prospective small and medium-sized business owners. ... The UK government’s recent autumn budget included a number of tax rises that will hit businesses, especially some small and medium-sized enterprises (SMEs) that don’t have the financial buffers to weather severe economic challenges. Generative AI has reconfigured the Cost x Time = Quality formula and has enabled firms to do things both quickly and accurately without a trade-off.


UK Cyber Risks Are ‘Widely Underestimated,’ Warns Country’s Security Chief

“What has struck me more forcefully than anything else since taking the helm at the NCSC is the clearly widening gap between the exposure and threat we face, and the defences that are in place to protect us,” he said. “And what is equally clear to me is that we all need to increase the pace we are working at to keep ahead of our adversaries.”  ... Horne added that the guidance and frameworks drawn up by the NCSC are not widely used. Ultimately, businesses need to change their perspective on cyber security from a “necessary evil” or “compliance function” to “an integral part of achieving their purpose.” ... “The defence and resilience of critical infrastructure, supply chains, the public sector and our wider economy must improve” to protect against these nation-state threats, Horne said. Ian Birdsey, partner and cyber specialist at law firm Clyde & Co, told TechRepublic in an email: “The UK has increasingly become a target for hostile nations due to the redrawing of geopolitical battle lines and the rise in global conflicts in recent years. In turn, threat actors based in those territories are increasingly launching more severe and sophisticated cyberattacks on UK organisations, particularly within critical national infrastructure and its supply chain.”


5 JavaScript Libraries You Should Say Goodbye to in 2025

jQuery is the grandparent of modern JavaScript libraries, loved for its cross-browser support, simple DOM manipulation, and concise syntax. However, in 2025, it’s time to officially let go. Native JavaScript APIs and modern frameworks like React, Vue, and Angular have rendered jQuery’s core utilities obsolete. Vanilla JavaScript now includes native methods such as querySelector, addEventListener, and fetch that conveniently provide the functionality we once relied on jQuery to deliver. Modern browsers have also standardized their APIs, making a cross-browser compatibility layer like jQuery redundant. On top of that, bundling jQuery into an application today adds unnecessary bloat, slowing down load times in an age when speed is king. ... Moment.js was the default date-handling library for a long time, celebrated for its ability to parse, validate, manipulate, and display dates. However, it’s now heavy and inflexible compared to newer alternatives, and it has been deprecated. Moment.js clocks in at around 66 KB (minified), which can be a significant payload in an era where smaller bundle sizes lead to faster performance and better UX.
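Much of Moment.js’s day-to-day surface now has native equivalents. A minimal sketch (the dates, formats, and the `daysBetween` helper here are invented for illustration, not part of any library):

```typescript
// Native replacements for common Moment.js idioms. Moment equivalents are
// shown in comments for comparison.

const d = new Date(Date.UTC(2024, 11, 12)); // months are 0-based: 11 = December

// moment(d).format("YYYY-MM-DD") → slice the ISO string
const iso = d.toISOString().slice(0, 10); // "2024-12-12"

// moment(d).format("MMMM D, YYYY") → Intl.DateTimeFormat
const long = new Intl.DateTimeFormat("en-US", {
  dateStyle: "long",
  timeZone: "UTC",
}).format(d); // "December 12, 2024"

// moment(b).diff(a, "days") → plain arithmetic on millisecond timestamps
const daysBetween = (a: Date, b: Date): number =>
  Math.round((b.getTime() - a.getTime()) / 86_400_000);

console.log(iso, long, daysBetween(new Date(Date.UTC(2024, 11, 6)), d));
```

For anything heavier (time zones, calendars), small tree-shakeable libraries exist, but for formatting and diffing, the platform alone usually suffices.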


How media, publishing and entertainment organizations can master Data Governance in the age of AI

One of the reasons AI governance has proven to be such a challenging new discipline is that it’s so multifaceted. Tiankai explained that it comprises several key elements:

Ownership and stewardship: AI models need ownership, and so does AI governance. The right people must be accountable for ensuring AI models are used in the right ways.

Cross-functional decision-making: A cross-domain thinking and decision-making model is essential. One central function can’t make every AI-relevant governance decision, so you need ways to bring the accountable people together.

Processes and metadata: Teams must make their models explainable, so everyone can understand the quality of their outputs and the root causes of any negative outcomes.

Technology enablement: Technology must support governance frameworks and make them work at scale.

This shows that AI governance requires a combination of people, process and technology change. The panel agreed that the ‘people’ element is the toughest to manage effectively. Nathalie Berdat, Head of Data and AI Governance, BBC, explained some of the people-specific challenges that she has encountered along the BBC’s AI governance journey.


5 ways to tell people what to do at work

Nick Woods, CIO of airport group MAG, said dialogue is the priority for any professional who wants to avoid ambiguity. "If you're telling somebody what to do, you're already in the wrong place," he said. "Success is about a coaching, conversational dialogue that you need to have that ultimately comes down to a handshake on, 'Are we clear on what's next?'" Woods told ZDNET that most management decisions involve an ongoing debate. He doesn't believe in being directive about outputs and telling people what they need to go and do. "I think I'm much more in a space of, 'Actually, I've hired good people. I'm going to allow you to go and tell me what we need to do, and then we're going to have a dialogue about it,'" he said. ... Niall Robinson, head of product innovation at the Met Office, said talented staff should be given space to express their creativity. "There's a temptation as a leader to tell people how to do stuff -- and that can be a trap," he said. Robinson told ZDNET that he focuses on avoiding that problem by trusting his staff to generate recommended actions. "A habit I've been trying to practice is to tell people what success looks like and then giving them the agency to describe the options to me because they're closer to many of the solutions. So, success is about giving people the power to advise me."


Navigating NextGen Enterprise Architecture with GenAI

GenAI can modernize technology architecture by facilitating the selection of optimal best-of-breed solutions through deep analysis of diverse criteria. It offers tailored guidance aligned with business requirements as well as key capabilities such as scalability, resilience, and reversibility. This dynamic capacity adapts to evolving IT landscapes and business requirements, continuously refining recommendations as needs and the technological state of the art change. Moreover, GenAI accelerates in-house solution development by generating code snippets. It produces function and class code segments in virtually any programming language, which improves efficiency and reduces manual coding effort. This frees developers to focus more on high-level design. It also helps ensure that generated code aligns with coding standards for maintainability, readability, collaboration, and consistency. GenAI has remarkable advantages, but it also poses major challenges. One of them is sustainability, which is increasingly important in technology adoption. In fact, many enterprises take this criterion into account in their technology architecture principles and assess it when selecting a new solution to enhance their IT landscape.


The 7 R's of cloud migration: How to choose the right method

The R's model isn't new, but it has evolved significantly over the years. Its genesis is usually attributed to Gartner, which came up with the 5 R's model back in 2010. The original five were rehost, refactor, revise, rebuild and replace. As the cloud continued to evolve and more diverse workloads were being migrated to the cloud, AWS added a sixth R -- retire -- and eventually, a seventh, for retain. This seventh R is effectively an acknowledgment that not all workloads are suited to being hosted in the cloud. ... Rehosting can be done in a few ways, but it often means creating cloud-based virtual machines that mimic the infrastructure an application is currently running on. ... Rehosting an application requires you to create a cloud VM instance and then move the application onto that instance. Relocating, on the other hand, involves moving an existing VM from an on-premises environment to the cloud without making significant changes to it. ... A workload might be suitable for retirement if it is no longer actively supported by the vendor. In such cases, it's important to make sure you have a workaround before retiring an application the organization still uses. That might mean adopting a competing application that offers similar functionality or developing one in-house.
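The choice among the R's can be boiled down to a rough decision order. The sketch below is one illustrative prioritization, not an official Gartner or AWS flowchart; the workload traits and their ordering are simplifying assumptions:

```typescript
// One illustrative way to order the R's as a decision list.
type Workload = {
  stillUsed: boolean;            // does anyone still depend on it (or is there a workaround)?
  cloudSuitable: boolean;        // do compliance/latency constraints allow cloud hosting?
  offTheShelfAvailable: boolean; // could a competing product replace it outright?
  worthReArchitecting: boolean;  // does it justify refactor/revise/rebuild effort?
  packagedAsVm: boolean;         // is it already a VM image that can move as-is?
};

function chooseStrategy(w: Workload): string {
  if (!w.stillUsed) return "retire";
  if (!w.cloudSuitable) return "retain";
  if (w.offTheShelfAvailable) return "replace";
  if (w.worthReArchitecting) return "refactor"; // or revise/rebuild, depending on depth of change
  if (w.packagedAsVm) return "relocate";        // move the existing VM unchanged
  return "rehost";                              // recreate on cloud VMs mimicking today's infra
}

const legacyReportingTool: Workload = {
  stillUsed: false, cloudSuitable: true,
  offTheShelfAvailable: true, worthReArchitecting: false, packagedAsVm: false,
};
console.log(chooseStrategy(legacyReportingTool)); // "retire"
```

Real migration assessments weigh many more factors (cost, licensing, team skills), but the value of the model is exactly this: forcing an explicit, per-workload decision rather than a blanket lift-and-shift.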


Evolving Your Architecture: Essential Steps and Tools for Modernization

Tech debt and lack of modernization can also land you in the news, and not in a good way, as we saw with SWA a couple of years ago when they had a pretty huge meltdown in their booking systems. It damaged their image, but it also hit their revenue plans hard, and they are still facing the consequences of that meltdown today, which came down to ignoring and putting aside the conversations about tech debt and application modernization as a whole. ... It's basically looking at the inventory of applications that you have in your organization and understanding: what are the critical ones? What is the value that each adds? How aligned is it with the business goals? Really, is it a commodity? Can I just go and buy one off the shelf? Then it's fine, go and buy it. If it's something that differentiates you, where you've got to innovate, then it might be worth building it and hence modernizing it. ... The other thing is the age of technology. If you have outdated technology, you very likely have vulnerabilities. If you lack support, either from the community or the vendors, there is a security vulnerability there, but there is no security patch being released because there is no support anymore.



Quote for the day:

"Do something today that your future self will thank you for." -- Unknown

Daily Tech Digest - December 07, 2024

In the recent past, people had the perception that HDD storage is slow and can only be used for backup. However, in the last two years, we have demonstrated in our European HDD laboratory how to combine multiple HDDs to test function and performance. If you have hundreds of HDDs in your large-scale storage system, you also have around a billion different configuration possibilities. ... The demand for HDDs in surveillance applications continues to surge, with an increasing number of digital video recorder manufacturers entering the market, ranging from relatively cheap surveillance systems for private homes through mid-priced systems to expensive systems for large-scale infrastructure like smart cities. The sequential nature of video surveillance data, and the fact that it is overwritten at some point in time, makes HDDs the uncontested choice at all levels for surveillance storage. ... At the very least, preserving a duplicate of one’s data using an alternative technology is a sensible measure. This could be a combination of cloud services or a mix of cloud and external storage, such as a USB-connected portable HDD like a Toshiba Canvio. It’s a small price to pay for peace of mind that your data is safe.


Top 3 Strategies for Leveraging AI to Transform Customer Intelligence

Transitioning from reactive to proactive engagement is one of AI's most transformative capabilities for customer intelligence. Predictive models trained on historical data allow organizations to anticipate customer needs, helping them deliver timely, relevant solutions. By recognizing patterns and trends, AI empowers businesses to forecast future customer actions — whether that's product preferences, the likelihood of churn, or upcoming purchase intent — enabling a more proactive approach to customer engagement. ... AI enables companies to personalize customer interactions dynamically across multiple channels. For instance, AI-powered chatbots can provide instant responses, creating a conversational experience that feels natural and responsive. By integrating these capabilities into CRM systems, companies ensure that every customer touchpoint — chat, email, or in-app messaging — is customized based on a customer's unique history and recent activities. This focus on personalization also extends to effective customer segmentation, as organizations aim to provide the right level of service to each customer based on their specific needs and entitlements.


Who’s the Bigger Villain? Data Debt vs. Technical Debt

Although data debt and tech debt are closely connected, there is a key distinction between them: you can declare bankruptcy on tech debt and start over, but doing the same with data debt is rarely an option. Reckless and unintentional data debt emerged from cheaper storage costs and a data-hoarding culture, where organizations amassed large volumes of data without establishing proper structures or ensuring shared context and meaning. It was further fueled by resistance to a design-first approach, often dismissed as a potential bottleneck to speed. ... With data debt, prevention is better than relying on a cure. Shift left is a practice that involves addressing critical processes earlier in the development lifecycle to identify and resolve issues before they grow into more significant problems. Applied to data management, shift left emphasizes prioritizing data modeling early, if possible — before data is collected or systems are built. Data modeling allows for following a design-first approach, where data structure, meaning, and relationships are thoughtfully planned and discussed before collection. This approach reduces data debt by ensuring clarity, consistency, and alignment across teams, enabling easier integration, analysis, and long-term value from the data.


Understanding NVMe RAID Mode: Unlocking Faster Storage Performance

While NVMe RAID mode offers excellent benefits, it’s not without its challenges. One of the most significant hurdles is the complexity of setting it up. RAID arrays, particularly with NVMe drives, require specialized hardware or software RAID controllers. Additionally, configuring RAID in the BIOS or UEFI settings can be tricky for less experienced users. Another challenge is cost. NVMe SSDs, while dropping in price over the years, are still generally more expensive than traditional SATA-based drives. Combining multiple NVMe drives into a RAID array can significantly increase the cost of the storage solution. For users on a budget, this might not be the most cost-effective option. Finally, RAID configurations that emphasize performance, like RAID 0, do not provide any data redundancy. If one drive fails, all data in the array is lost. ... NVMe RAID mode is ideal for users who need extremely fast read and write speeds, high storage capacity, and, in some cases, redundancy. This includes professionals who work with large video files, developers running complex simulations, and enthusiasts building high-end gaming PCs. Additionally, businesses that rely on fast access to large databases or those that run virtual machines may benefit from NVMe RAID configurations.
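The capacity/redundancy trade-off described above reduces to simple arithmetic. A back-of-the-envelope sketch for RAID 0 vs. RAID 1 (idealized numbers; real arrays lose some throughput to controller and software overhead, and the drive counts and speeds are just examples):

```typescript
// Idealized RAID math for two-way striping (RAID 0) vs. mirroring (RAID 1).
type RaidLevel = 0 | 1;

function usableCapacityTB(level: RaidLevel, drives: number, perDriveTB: number): number {
  // RAID 0 sums all members; RAID 1 mirrors, so usable space is one drive's worth.
  return level === 0 ? drives * perDriveTB : perDriveTB;
}

function idealReadGBps(level: RaidLevel, drives: number, perDriveGBps: number): number {
  // Both striping and mirroring can, in the ideal case, read from all members in parallel.
  return drives * perDriveGBps;
}

function survivesSingleDriveFailure(level: RaidLevel): boolean {
  return level === 1; // RAID 0 loses the entire array if one member dies
}

// Two 2 TB NVMe drives at ~7 GB/s each:
console.log(usableCapacityTB(0, 2, 2)); // 4
console.log(usableCapacityTB(1, 2, 2)); // 2
console.log(idealReadGBps(0, 2, 7));    // 14
console.log(survivesSingleDriveFailure(0)); // false
```

The numbers make the decision concrete: RAID 0 doubles capacity and ideal throughput but doubles the blast radius of a failure, which is why it suits scratch and render workloads rather than primary storage.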


Supply chain compromise of Ultralytics AI library results in trojanized versions

According to researchers from ReversingLabs, the attackers leveraged a known exploit via GitHub Actions to introduce malicious code during the automated build process, thereby bypassing the usual code review process. As a result, the code was present only in the package pushed to PyPI and not in the code repository on GitHub. The trojanized version of Ultralytics on PyPI (8.3.41) was published on Dec. 4. Ultralytics developers were alerted Dec. 5, and attempted to push a new version (8.3.42) to resolve the issue, but because they didn’t initially understand the source of the compromise, this version ended up including the rogue code as well. A clean and safe version (8.3.43) was eventually published on the same day. ... According to ReversingLabs’ analysis of the malicious code, the attacker modified two files: downloads.py and model.py. The code injected in model.py checks the type of machine where the package is deployed to download a payload targeted for that platform and CPU architecture. The rogue code that performs the payload download is stored in downloads.py. “While in this case, based on the present information the RL research team has, it seems that the malicious payload served was simply an XMRig miner, and that the malicious functionality was aimed at cryptocurrency mining,” ReversingLabs’ researchers wrote.
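One mitigation available to consumers of a package like this is pinning both the version and the artifact digest, so pip refuses anything whose hash differs from what was originally reviewed. This can't vouch for a release that was malicious from the start, but it does fail loudly if a pinned artifact is ever silently swapped out. A sketch of a requirements.txt entry (the digest shown is a placeholder, not the real hash of this release):

```text
# requirements.txt -- install with: pip install --require-hashes -r requirements.txt
# pip aborts if the downloaded artifact's SHA-256 differs from the pinned value.
# The digest below is a placeholder, not the real digest for this release.
ultralytics==8.3.43 \
    --hash=sha256:<digest-of-the-reviewed-wheel>
```

With --require-hashes, every requirement (including transitive dependencies) must carry a hash, which is why teams typically generate these files with a lock tool rather than by hand.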


Data Governance Defying Gravitas

When it comes to formalizing data governance in a complex organization, there’s often an expectation of gravitas — a sense of seriousness, authority, and weight that makes the effort seem formidable and unyielding. But let’s be honest: Too much gravitas can weigh down your data governance program before it even begins. Enter the Non-Invasive Data Governance approach, which flips the script on gravitas by delivering effectiveness without the unnecessary posturing, proving that you can have impact without the drama. ... Complex organizations are not static, and neither should their data governance approach be. NIDG defies the traditional concept of gravitas by embracing adaptability. While other frameworks crumble under the weight of organizational change, NIDG thrives in dynamic environments. It’s built to flex and evolve, ensuring governance remains effective as technologies, priorities, and personnel shift. This adaptability fosters a sense of trust. People know that NIDG isn’t a rigid set of rules, but a living framework designed to support their needs. It’s this trust that gives NIDG its gravitas — not the false authority of inflexible mandates, but the real authority that comes from being a program people believe in and rely on. 


Weaponized AI: Hot for Fraud, Not for Election Interference

"Criminals use AI-generated text to appear believable to a reader in furtherance of social engineering, spear phishing and financial fraud schemes such as romance, investment and other confidence schemes, or to overcome common indicators of fraud schemes," it said. More advanced use cases investigated by law enforcement include criminals using AI-generated audio clips to fool banks into granting them access to accounts, or using "a loved one's voice to impersonate a close relative in a crisis situation, asking for immediate financial assistance or demanding a ransom," the bureau warned. Key defenses against such attacks, the FBI said, include creating "a secret word or phrase with your family to verify their identity," which can also work well in business settings - for example, as part of a more robust defense against CEO fraud (see: Top Cyber Extortion Defenses for Battling Virtual Kidnappers). Many fraudsters attempt to exploit victims before they have time to pause and think. Accordingly, never hesitate to hang up the phone, independently find a phone number for a caller's supposed organization, and contact them directly, it said.


Data Assurance Changes How We Network

Today, the simplest way to control the path data takes between two points is to use a private network (leased lines, for example). But today’s private networks are extremely expensive and don’t offer much in the way of visibility. They also take months to provision, which slows business agility. Even with MPLS, IGP shortest path routing will always follow the shortest IGP path. If alternate paths are available, traffic engineering (TE) with segment routing (SR) can utilize non-shortest paths. However, if the decision is made within the Provider Edge (PE) router in the service provider's network, it will necessitate source-based routing, which is not sustainable due to the challenges of implementing source routing on a per-customer basis within the service provider network. This approach will not scale effectively in an MPLS environment, and moreover, 99% of MPLS private networks do not encrypt traffic, leading to significant performance and scalability issues. Another option is to move your operations to a public cloud that can guarantee you meet data assurance goals. This, too, can be prohibitively expensive and also lacks visibility. 


Spotting the Charlatans: Red Flags for Enterprise Security Teams

Sadly, by the time most people catch on that there is a charlatan in the team, grave damage has been done to both the morale and progress of the security team. That being said, there are some clues that charlatans leave behind from time to time. If we are astute and perceptive, we can pick up on these clues and work to contain the damage that charlatans cause. ... Most talented security professionals I’ve worked with have a healthy amount of self-doubt and insecurity. This is completely normal, of course. Charlatans take advantage of this, cutting down talented professionals that they see as a threat. This causes those targeted to recoil in a moment of thought and introspection, which is all the charlatan needs to retake the spotlight. ... One of the strategies of a charlatan is to throw their perceived threat off their game. One way in which they do this is by taking pot shots. Charlatans throw subtle slights, passive-aggressive insults, and unpredictable surprises at their targets. If the targeted individual reacts to the tactic or calls the charlatan out, the target then seems like the aggressor. The best response is to ignore the pot shots and try to stay focused. In many cases, when the charlatan realizes they cannot rattle you, they will slowly lose interest.


Why ICS Cybersecurity Regulations Are Essential for Industrial Resilience

As the cybersecurity landscape becomes increasingly complex, industrial companies, especially those managing industrial control systems (ICS), face heightened risks. From protecting sensitive data to safeguarding critical infrastructure, compliance with cybersecurity regulations has become essential. Here, we explore why ICS cybersecurity is crucial, the risks involved, and key steps organizations can take to meet regulatory demands without compromising operational efficiency. ... Cybersecurity risks are no longer a secondary concern but a primary focus, especially for industries managing critical infrastructure such as energy, water, and transportation. Cyber threats targeting ICS environments have become more sophisticated, posing risks not only to individual companies but also to the broader economy and society. Regulatory adherence ensures these vulnerabilities are managed systematically, reducing potential downtime, data breaches, and even physical threats. ... Cybersecurity in ICS environments isn’t merely about meeting regulatory requirements; it’s a strategic priority that protects both assets and people. By focusing on identity management, automating updates, aligning with industry standards, and bridging IT-OT security gaps, organizations can enhance resilience against emerging threats.



Quote for the day:

“Identify your problems but give your power and energy to solutions.” -- Tony Robbins

Daily Tech Digest - December 06, 2024

Preparing for AI-Augmented Software Engineering

AI-augmented approaches will free software engineers to focus on tasks that require critical thinking and creativity, predicts John Robert, deputy director of the software solutions division of the Carnegie Mellon University Software Engineering Institute. "A key potential benefit that excites most enthusiasts of AI-augmented software engineering approaches is efficiency -- the ability to develop more code in less time and lower the barrier to entry for some tasks." Teaming humans and AI will shift the attention of humans to the conceptual tasks that computers aren't good at while reducing human error from tasks where AI can help, he observes in an email interview. ... Hall notes that GenAI can access vast amounts of data to analyze market trends, current user behavior, customer feedback, and usage data to help identify key features that are in high demand and have the potential to deliver significant value to users. "Once features are described and prioritized, multiple agents can create the software program's components." This approach breaks down big tasks into multiple activities with an overall architecture. "It truly changes how we solve complex issues and apply technology."


Code Busters: Are Ghost Engineers Haunting DevOps Productivity?

The assertion here is that almost 10% of software application developers do effectively nothing all day, or indeed all week. For wider clarification, the remote worker segment has more outlier positive performers, but in-office workers exhibit a higher average performance overall. ... “Many ghost engineers I’ve talked to share a common story, i.e. they become disengaged due to frustration or loss of motivation in their roles. Over time, they may test the limits of how much effort they can reduce without consequence. This gradual disengagement often results in them turning into ghosts; originally not out of malice, but as a by-product of their work environment.” He says that managers want to build high-performing teams but face conflicting incentives. A poorly performing team reflects badly on its leadership, leading some to downplay problems rather than address them head-on. Additionally, organizational politics may discourage reducing team sizes, even when smaller, more focused teams could be more effective. ... “There’s also the fact that senior leaders are often further removed from day-to-day operations. Their decisions are based on trust in middle management or flawed metrics, such as lines of code or commit counts. They, too, are sometimes not incentivized to reduce team sizes or deeply investigate performance issues, as their focus tends to be on higher-level strategic outcomes,” said Denisov-Blanch.


Why Data Centers Must Strengthen Network Resiliency In The Age of AI

If a network outage occurs, there will be widespread disruptions, negatively affecting businesses globally. In particular, network outages will compromise the accessibility of AI applications, the very thing data centers scaled to support. Outages—and even reduced performance—carry significant risks, both financial and reputational. Data centers must therefore adopt network solutions, like Failover to Cellular and out-of-band (OOB) management, to ensure AI services remain accessible amid disruptions to normal operations. ... OOB management capabilities and Failover to Cellular integration lay a solid foundation for network resilience. However, data centers don’t need to stop there. AI integrations promise further enhancements, elevating these tools to the next level through advanced intelligence and automation. While it may seem odd to use AI when the extra stress on data centers today comes from increased AI usage, the advanced capabilities and accompanying benefits of this technology speak for themselves. AI’s ability to analyze patterns allows it to detect connectivity issues that could cause failures. When combined with Failover to Cellular, for example, AI orchestrates a seamless Failover to Cellular backup, especially during peak traffic. AI can also automatically take proactive measures like predictive maintenance or rerouting traffic, reducing downtime and improving resilience.


Financial services need digital identity stitched together, investors take note

Financial institutions are all looking for a low friction, high accuracy way of authenticating customers, prospects and business partners that also keeps regulators happy. Some of the approaches and techniques used by established players in the digital identity market have achieved good volume and scale, and newer innovative methods are still proving themselves. Byunn highlights the opportunity in a third layer that’s “all about how you stitch these things together, because so far no one has produced a single solution that addresses everything.” This layer, he says, includes both “orchestration” and elements of holistic scoring (heuristics etc.) “that are not fully covered by what the market calls orchestration.” Earlier waves of technology serving financial services companies were thoroughly penetrated by fraudsters, and in some cases offered poor user experience, Byunn says. One example of this, knowledge-based authentication, remains “shockingly still prevalent in the industry.” ... The threat of deepfakes to financial service institutions seems to be commonly overstated at this time, according to Byunn, at least in part because conventional wisdom is also somewhat underestimating the effectiveness of market leaders’ defense against genAI and deepfakes. However, he notes that the threat has the potential to grow significantly.


The world is running short of copper - telecoms networks could be the answer

Copper remains foundational in older telecom networks, particularly in Europe and North America, with incumbent operators like AT&T, Orange, and BT. However, networks are actively transitioning from copper to fiber optics, particularly with ‘last mile connectivity’ and the replacement of infrastructure like Public Switched Telephone Networks (PSTN). While recycling from these sources may not completely plug the 20 percent gap in supply, it can go a long way. It almost goes without saying that metals reclaimed this way have far less environmental impact: around 15 times less. Purchasing copper from these sources is still often cheaper than mining it. ... Over the next eight to ten years, an estimated 800,000 tons of copper could be extracted from telecom networks as part of the global shift to fiber optics. ... Unlocking the value of reclaimed copper is both an environmental and strategic win, especially with the soaring demand for this vital resource. Through effective partnerships and advanced material recovery processes, telecom companies can transform what was once surplus to requirements into a valuable asset. Extracted copper can re-enter the supply chain, supporting the broader green transition and reducing reliance on new mining operations.


8 biggest cybersecurity threats manufacturers face

The manufacturing sector’s rapid digital transformation, complex supply chains, and reliance on third-party vendors make for a challenging cyber threat environment for CISOs. Manufacturers — often prime targets for state-sponsored malicious actors and ransomware gangs — face the difficult task of maintaining cost-effective operations while modernizing their network infrastructure. “Many manufacturing systems rely on outdated technology that lacks modern security measures, creating exploitable vulnerabilities,” says Paul Cragg, CTO at managed security services firm NormCyber. “This is exacerbated by the integration of industrial internet of things [IIoT] devices, which expand the attack surface.” ... “While industries like chemicals and semiconductors exhibit relatively higher cybersecurity maturity, others, such as food and beverage or textiles, lag significantly,” Belal says. “Even within advanced sectors, inconsistencies persist across organizations.” Operational technology systems — which may include complex robotics and automation components — are typically replaced far more slowly than components of IT networks are, contributing to the growing security debt that many manufacturers carry.


What is a data scientist? A key data analytics role and a lucrative career

Data scientists often work with data analysts, but their roles differ considerably. Data scientists are often engaged in long-term research and prediction, while data analysts seek to support business leaders in making tactical decisions through reporting and ad hoc queries aimed at describing the current state of reality for their organizations based on present and historical data. So the difference between the work of data analysts and that of data scientists often comes down to timescale. A data analyst might help an organization better understand how its customers use its product in the present moment, whereas a data scientist might use insights generated from that data analysis to help design a new product that anticipates future customer needs. ... Data scientists need to manipulate data, implement algorithms, and automate tasks, and proficiency in programming is essential. Van Loon notes that critical languages include Python, R, and SQL. ... They need a strong foundation in both to analyze data accurately and make informed decisions. They also need to understand statistical tests, distributions, likelihoods, and concepts such as hypothesis testing, regression analysis, and Bayesian inference. 
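In practice a data scientist would run a regression in Python or R, as noted above, but the core of the technique fits in a few lines of any language. A minimal ordinary-least-squares sketch with invented toy data:

```typescript
// Ordinary least squares for y = a + b*x, the simplest case of the
// regression analysis mentioned above. Data is purely illustrative.
function linearFit(xs: number[], ys: number[]): { intercept: number; slope: number } {
  const n = xs.length;
  const mx = xs.reduce((s, v) => s + v, 0) / n; // mean of x
  const my = ys.reduce((s, v) => s + v, 0) / n; // mean of y
  let sxy = 0, sxx = 0;
  for (let i = 0; i < n; i++) {
    sxy += (xs[i] - mx) * (ys[i] - my); // covariance numerator
    sxx += (xs[i] - mx) ** 2;           // variance numerator
  }
  const slope = sxy / sxx;
  return { intercept: my - slope * mx, slope };
}

// Perfectly linear toy data: y = 1 + 2x
const { intercept, slope } = linearFit([1, 2, 3, 4], [3, 5, 7, 9]);
console.log(intercept, slope); // 1 2
```

The statistical judgment the article calls for lies less in this arithmetic than in knowing when a linear model is appropriate and how to test the fit's significance.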


How Active Archives Address AI’s Growing Energy and Storage Demands

Archives were once considered repositories of data that would only be accessed occasionally, if at all. The advent of modern AI has changed the equation. Almost all enterprise data could be valuable if made available to an AI engine. Therefore, many enterprises are turning to archiving to gather organizational data in one place and make it available for AI and GenAI tools to access. Massive data archives can be stored in an active archive at a cost-efficient price and at very low energy consumption levels, all while keeping that data readily available on the network. Decades of archived data can then be analyzed as part of an LLM or other machine learning or deep learning algorithm. ... An intelligent data management software layer is the foundation of an active archive. This software layer plays a vital role in automatically moving data according to user-defined policies to where it belongs for cost, performance, and workload priorities. High-value data that is often accessed can be retained in memory. Other data can reside on SSDs, lower tiers of disks, and within a tape- or cloud-based active archive. This allows AI applications to mine all that data without being subjected to delays due to content being stored offsite or having to be transferred to where AI can process it.
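The policy-driven placement that the data management layer performs can be sketched as a simple rule set. The tier names, access thresholds, and function below are invented for illustration, not an actual product's API:

```typescript
// Illustrative tiering policy: route each object to a storage tier based on
// how hot it is. Real active-archive software layers user-defined policies
// over many more signals (size, workload priority, cost targets).
type Tier = "memory" | "ssd" | "hdd" | "tape-or-cloud-archive";

function placeObject(accessesPerMonth: number, daysSinceLastRead: number): Tier {
  if (accessesPerMonth > 100) return "memory";   // hot working set
  if (accessesPerMonth > 10) return "ssd";       // frequently read
  if (daysSinceLastRead < 180) return "hdd";     // warm, still online
  return "tape-or-cloud-archive";                // cold, but still reachable on the network
}

console.log(placeObject(500, 1));  // "memory"
console.log(placeObject(0, 400));  // "tape-or-cloud-archive"
```

The point of the active archive is the last branch: even the coldest tier stays addressable, so an AI pipeline can pull decades of data without a manual restore step.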


The Growing Importance of AI Governance

The goal of AI governance is to ensure that the benefits of machine learning algorithms and other forms of artificial intelligence are available to everyone in a fair and equitable manner. AI governance is intended to promote the ethical application of the technology so that its use is transparent, safe, private, accountable, and free of bias. To be effective, AI governance must bring together government agencies, researchers, system designers, industry organizations, and public interest groups. ... The long-term success of AI depends on gaining public trust as much as it does on the technical capabilities of AI systems. In response to the potential threats posed by artificial intelligence, the U.S. Office of Science and Technology Policy (OSTP) has issued a Blueprint for an AI Bill of Rights that’s intended to serve as “a guide for a society that protects all people” from misuse of the technology. ... As AI systems become more powerful and complex, businesses and regulatory agencies face two formidable obstacles: the complexity of the systems requires rule-making by technologists rather than politicians, bureaucrats, and judges; and the thorniest issues in AI governance involve value-based decisions rather than purely technical ones.


The Role of AI in Cybersecurity: 5 Trends to Watch in 2025

The integration of AI into Software-as-a-Service (SaaS) platforms is changing how businesses manage security. For example, AI-enhanced tools are helping organizations automate threat detection, analyze vast data sets more efficiently, and respond to breaches or incidents more quickly. However, this innovation also introduces new risks, such as hallucinations and over-reliance on poor-quality data, so AI-powered systems need to be carefully configured to avoid misleading outputs that put defenders at a disadvantage. ... AI auditing tools will help organizations assess whether AI models are making decisions based on biased or discriminatory data – a concern that could lead to legal and reputational challenges. As AI technology becomes more embedded in organizational operations, ethical considerations must be at the forefront of AI governance to help businesses avoid unintended consequences. Board members must be proactive in understanding the implications of AI for data security and in ensuring that their companies follow best practices in AI governance to comply with evolving legislation. Without C-suite support and understanding, and without collaboration between executives and security teams, organizations will be more vulnerable to the risks AI poses to data and intellectual property.



Quote for the day:

"Leadership is about making others better as a result of your presence and making sure that impact lasts in your absence." -- Sheryl Sandberg