Daily Tech Digest - October 23, 2024

What Is Quantum Networking, and What Might It Mean for Data Centers?

Conventional networks shard data into packets and move them across wires or radio waves using long-established networking protocols, such as TCP/IP. In contrast, quantum networks move data using photons or electrons. They leverage unique aspects of quantum physics to enable powerful new features like entanglement, which effectively makes it possible to verify the source of data based on the quantum state of the data itself. ... Because quantum networking remains a theoretical and experimental domain, it's challenging to say at present exactly how quantum networks might change data centers. What does seem clear, however, is that data center operators seeking to offer full support for quantum devices will need to implement fundamentally new types of network infrastructure. They'll need to deploy infrastructure resources like quantum repeaters, while also ensuring that they can support whichever networking standards might emerge in the quantum space. The good news for the fledgling quantum data center ecosystem is that true quantum networks aren't a prerequisite for connecting quantum computers. It's possible for quantum machines themselves to send and receive data over classical networks by using traditional computers and networking devices as intermediaries.


Unmasking Big Tech’s AI Policy Playbook: A Warning to Global South Policymakers

Rather than a genuine, inclusive discussion about how governments should approach AI governance, what we are witnessing instead is a clash of seemingly competing narratives swirling together to obfuscate the real aspirations of big tech. The advocates of open-source large language models (LLMs) present themselves as civic-minded, democratic, and responsible, while closed-source proponents position themselves as the responsible stewards of secure, walled-garden AI development. Both sides dress their arguments with warnings about dire consequences if their views aren’t adopted by policymakers. ... For years, tech giants have employed scare tactics to convince policymakers that any regulation will stifle innovation, lead to economic decline, and exclude countries from the prestigious digital vanguard. These dire warnings are frequently targeted, especially in the Global South, where policymakers often lack the resources and expertise to keep pace with rapid technological advancements, including AI. Big tech’s polished lobbyists offer what seems like a reasonable solution, “workable regulation” — which translates to delayed, light-touch, or self-regulation of emerging technologies.


AI Agents: A Comprehensive Introduction for Developers

The best way to think about an AI agent is as a digital twin of an employee with a clear role. When any individual takes up a new job, there is a well-defined contract that establishes the essential elements — such as job definition, success metrics, reporting hierarchy, access to organizational information, and whether the role includes managing other people. These aspects ensure that the employee is most effective in their job and contributes to the overall success of an organization. ... The persona of an AI agent is the most crucial aspect that establishes the key trait of an agent. It is the equivalent of a title or a job function in the traditional environment. For example, a customer support engineer skilled in handling complaints from customers is a job function. It is also the persona of an individual who performs this job. You can easily extend this to an AI agent. ... A task is an extension of the instruction that focuses on a specific, actionable item within the broader scope of the agent’s responsibilities. While the instruction provides a general framework covering multiple potential actions, a task is a direct, concrete action that the agent must take in response to a particular user input.
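
To make the persona-instruction-task split concrete, here is a minimal Python sketch of such an agent "contract". The class names, fields, and the refund example are illustrative assumptions, not anything prescribed in the article.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    """A concrete, actionable item triggered by a specific user input."""
    user_input: str
    action: str

@dataclass
class SupportAgent:
    """Illustrative agent 'contract': persona, general instruction, and tasks."""
    persona: str                      # the job function the agent embodies
    instruction: str                  # general framework covering many actions
    tasks: List[Task] = field(default_factory=list)

    def assign(self, user_input: str, action: str) -> Task:
        task = Task(user_input=user_input, action=action)
        self.tasks.append(task)
        return task

# Usage: a customer-support persona handling a specific complaint.
agent = SupportAgent(
    persona="Customer support engineer skilled in handling complaints",
    instruction="Resolve customer complaints politely and escalate billing disputes",
)
agent.assign(user_input="My invoice was charged twice", action="open_refund_ticket")
```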


AI in compliance: Streamlining HR processes to meet regulatory standards

With the increasing focus on data protection laws like the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and India’s Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 under the Information Technology Act, 2000, maintaining the privacy and security of employee data has become paramount. The Indian IT Privacy Law mandates that companies ensure the protection of sensitive personal data, including employee information, and imposes strict guidelines on how data must be collected, processed, and stored. AI can assist HR teams by automating data management processes and ensuring that sensitive information is stored securely and only accessed by authorized personnel. AI-driven tools can also help monitor compliance with data privacy regulations by tracking how employee data is collected, processed, and shared within the organization. ... This proactive monitoring reduces the likelihood of non-compliance and minimizes risks associated with data breaches, helping organizations align with both international and domestic privacy laws like the Indian IT Privacy Law.


Are humans reading your AI conversations?

Tools like OpenAI’s ChatGPT and Google’s Gemini are being used for all sorts of purposes. In the workplace, people use them to analyze data and speed up business tasks. At home, people use them as conversation partners, discussing the details of their lives — at least, that’s what many AI companies hope. After all, that’s what Microsoft’s new Copilot experience is all about — just vibing and having a chat about your day. But people might share data that’d be better kept private. Businesses everywhere are grappling with data security amid the rise of AI chatbots, with many banning their employees from using ChatGPT at work. They might have specific AI tools they require employees to use. Clearly, they realize that any data fed to a chatbot gets sent to that AI company’s servers. Even if it isn’t used to train genAI models in the future, the very act of uploading data could be a violation of privacy laws such as HIPAA in the US. ... Companies that need to safeguard business data and follow the relevant laws should carefully consider the genAI tools and plans they use. It’s not a good idea to have employees using a mishmash of tools with uncertain data protection agreements or to do anything business-related through a personal ChatGPT account.


CIOs recalibrate multicloud strategies as challenges remain

Like many enterprises, Ally Financial has embraced a primary public cloud provider, adding in other public clouds for smaller, more specialized workloads. It also runs private clouds from HPE and Dell for sensitive applications, such as generative AI and data workloads requiring the highest security levels. “The private cloud option provides us with full control over our infrastructure, allowing us to balance risks, costs, and execution flexibility for specific types of workloads,” says Sathish Muthukrishnan, Ally’s chief information, data, and digital officer. “On the other hand, the public cloud offers rapid access to evolving technologies and the ability to scale quickly, while minimizing our support efforts.” Yet, he acknowledges a multicloud strategy comes with challenges and complexities — such as moving gen AI workloads between public clouds or exchanging data from a private cloud to a public cloud — that require considerable investments and planning. “Aiming to make workloads portable between cloud service providers significantly limits the ability to leverage cloud-native features, which are perhaps the greatest advantage of public clouds,” Muthukrishnan says.


DevOps and Cloud Integration: Best Practices

CI/CD practices are crucial for DevOps implementation with cloud services. Continuous integration regularly merges code changes into a shared repository, where automated tests are run to spot issues early. On the other hand, continuous deployment improves this practice by automatically deploying changes (once they pass tests) to production. The CI/CD approach can accelerate the release cycle and enhance the overall quality of the software. ... Infrastructure as Code (IaC) empowers teams to oversee and provision infrastructure via code rather than manual processes. This DevOps methodology guarantees uniformity across environments and facilitates infrastructure scalability in cloud-based settings. It represents a pivotal element in transforming any enterprise's DevOps strategy. ... According to DevOps experts, security needs to be a part of every step in the DevOps process, called DevSecOps. This means adding security checks to the CI/CD pipeline, using security tools for the cloud, and always checking for security issues. DevOps professionals usually stress how important it is to tackle security problems early in the development process, called "shifting left."
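
As a rough illustration of how CI/CD and "shifting left" fit together, the following Python sketch models a pipeline gate that deploys only when tests and a security scan both pass. The pytest and pip-audit commands and the deploy script are assumptions chosen for the example, not a prescribed toolchain.

```python
import subprocess
import sys

def run_step(name: str, command: list[str]) -> bool:
    """Run one pipeline step and report whether it passed."""
    print(f"--- {name} ---")
    result = subprocess.run(command)
    return result.returncode == 0

def main() -> None:
    # Continuous integration: automated tests catch issues early.
    if not run_step("unit tests", ["pytest", "-q"]):
        sys.exit("Tests failed; blocking the pipeline.")

    # DevSecOps / "shift left": a security scan runs in the same pipeline.
    if not run_step("dependency audit", ["pip-audit"]):
        sys.exit("Security scan failed; blocking the pipeline.")

    # Continuous deployment: only changes that pass all checks are released.
    run_step("deploy", ["./deploy.sh", "production"])  # hypothetical deploy script

if __name__ == "__main__":
    main()
```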


Data Resilience & Protection In The Ransomware Age

Backups are considered the primary way to recover from a breach, but are they enough to ensure that the organisation will be up and running with minimal impact? Testing is a critical component of ensuring that a company can recover after a breach: it provides valuable insight into the steps the company will need to take across a variety of scenarios, what works, which areas need attention in the recovery process, and how long it will take to recover the files. Unfortunately, many organisations implement measures to recover but fail on the last step of their resilience approach, namely testing. Without this step, they cannot know if their recovery strategy is effective, what processes to follow to restore data following a breach, or how long recovery will take. Equally, they will not know if they have backed up their data correctly before an attack if they have not performed adequate testing. Although many IT teams are stretched and struggle to find the time to do regular testing, it is possible to automate the testing process to ensure that it occurs frequently.
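
One way to automate that last step is a scheduled restore test that pulls files back from backup and verifies them against the originals. The Python sketch below compares checksums file by file; the restore command and directory paths are hypothetical placeholders, not a specific backup product's interface.

```python
import hashlib
import pathlib
import subprocess

def sha256_of(path: pathlib.Path) -> str:
    """Hash a file in chunks so large backups don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_restore(source_dir: pathlib.Path, restore_dir: pathlib.Path) -> bool:
    """Compare every source file against its restored copy by checksum."""
    ok = True
    for src in source_dir.rglob("*"):
        if not src.is_file():
            continue
        restored = restore_dir / src.relative_to(source_dir)
        if not restored.exists() or sha256_of(src) != sha256_of(restored):
            print(f"MISMATCH: {src}")
            ok = False
    return ok

if __name__ == "__main__":
    # Hypothetical restore command and paths; schedule this to run frequently.
    subprocess.run(["./restore_backup.sh", "/tmp/restore_test"], check=True)
    if verify_restore(pathlib.Path("/data/critical"), pathlib.Path("/tmp/restore_test")):
        print("Restore test passed.")
```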


Is data gravity no longer centered in the cloud?

The need for data governance and security is escalating as AI becomes more prevalent. Organizations are increasingly aware of the risks associated with cloud environments, especially regarding regulatory compliance. Maintaining sensitive data on premises allows for tighter controls and adherence to industry standards, which are often critical in AI applications dealing with personal or confidential information. The convergence of these factors signals a broader reevaluation of cloud-first strategies, leading to hybrid models that balance the benefits of cloud computing with the reliability of traditional infrastructures. This hybrid approach facilitates a tailored fit for various workloads, optimizing performance while ensuring compliance and security. ... Data can exist on any platform, and accessibility should not be problematic regardless of whether data resides on public clouds or on premises. Indeed, the data location should be transparent. Storing data on-prem or with public cloud providers affects how much an enterprise spends and the data’s accessibility for major strategic applications, including AI. Currently, on-prem is the most cost-effective AI platform—for most data sets and most solutions. 


Choosing Between Cloud and On-Prem MLOps: What's Best for Your Needs?

The big benefit of cloud MLOps is the availability of virtually unlimited quantities of CPU, memory, and storage resources. Unlike on-prem environments, where resource capacity is limited by the number of servers available and the resources each one provides, you can always acquire more infrastructure in the cloud. This makes cloud MLOps especially beneficial for ML use cases where resource needs vary widely or are unpredictable. ... On-prem MLOps may also offer better performance. On-prem environments don't require you to share hardware with other customers (which the cloud usually does), so you don't have to worry about "noisy neighbors" slowing down your MLOps pipeline. The ability to move data across fast local network connections can also boost on-prem MLOps performance, as can running workloads directly on bare metal, without a hypervisor layer reducing the amount of resources available to your workloads. ... Under a hybrid MLOps approach, you could also go on to deploy your model either on-prem or in the cloud, depending on factors like how many resources inference will require.



Quote for the day:

"You'll never get ahead of anyone as long as you try to get even with him." -- Lou Holtz

Daily Tech Digest - October 22, 2024

GenAI surges in law firms: Will it spell the end of the billable hour?

All areas of law will use genAI, according to Joshua Lenon, Clio’s Lawyer in Residence. That’s because AI content generation and task automation tools can help the business side and practice efforts of law firms. However, areas that have repetitive workflows and large document volumes – like civil litigation – will adopt genAI e-discovery tools more quickly. Practice areas that charge exclusively flat fees – like traffic offenses and immigration – are already the largest adopters of genAI. ... Nearly three-quarters of a law firm’s hourly billable tasks are exposed to AI automation, with 81% of legal secretaries’ and administrative assistants’ tasks being automatable, compared to 57% of lawyers’ tasks, according to a Clio survey of legal professionals (1,028) and adults in the U.S. general population (1,003). Hourly billing has long been the preference of many professionals, from lawyers to consultants, but AI adoption is upending this model where clients are charged for the time spent on services. ... People have been talking about the demise of the billable hour for about 30 years “and nothing’s killed it yet,” said Ryan O’Leary, research director for privacy and legal technology at IDC. “But if anything will, it’ll be this.”


IT security and government services: Balancing transparency and security

For cyber defenses, government IT leaders should invest in website hosting services with Secure Sockets Layer (SSL) encryption, and further enhance security with HTTP Strict Transport Security (HSTS). These measures ensure that all data exchanged via government sites is encrypted, protecting resident self-service features such as online voter registration, permit submissions, utility bill payments, and more. By enforcing HSTS, websites are also protected from protocol downgrade attacks and cookie hijacking, ensuring that all connections remain secure, and reducing the risk of data interception. Other marks of a reliable website hosting solution provider include DDoS mitigation coverage and reliability around regular software patching and updates. For all digital partners, it’s essential to consider third-party risk. Some of the most valuable information residents should be able to access – meeting minutes, agendas, and other documents pertaining to local governing decisions – is hosted by document management vendors. To ensure this access is secure, each vendor must be vetted on its security capabilities, so that critical data is always protected, and hackers are not able to prevent access for residents or laterally move further into government networks.
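
As a small illustration of enforcing HSTS at the application layer, the following Python (Flask) sketch attaches the Strict-Transport-Security header to every response. The framework choice, route, and max-age value are assumptions for the example; the article itself is describing hosting-level controls rather than any specific stack.

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def enforce_hsts(response):
    # HSTS tells browsers to use HTTPS only, blocking protocol downgrade attacks.
    response.headers["Strict-Transport-Security"] = (
        "max-age=31536000; includeSubDomains"
    )
    return response

@app.route("/pay-utility-bill")
def pay_utility_bill():
    return "Resident self-service page served over TLS only."
```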


Software buying trends are changing: From SaaS to outcome as a service

The last decade saw the rise of Software-as-a-Service (SaaS), transforming how businesses approached software deployment. This decade belongs to Outcomes-as-a-Service. CIOs are no longer interested in building large internal developer teams or experimenting with different platforms. They seek business-impacting solutions with tangible outcomes that drive business success. Business teams need solutions that deliver results today, not tomorrow. ... AI-powered hyperautomation combines generative AI, BPM, RPA, integrations, analytics, and app-building to drive end-to-end outcomes. In today’s dynamic business environment, an integrated approach is essential. Siloed automation with narrowly focused platforms is no longer sufficient. ... AI platforms excel in delivering outcomes at speed and scale. Leveraging automation expertise, they ensure outcomes linked to growth, efficiency, and compliance. These platforms implement continuous cycles of process mining, implementation, adoption, and solution refinement until desired objectives are met. They also offer a comprehensive solution, managing everything from process definition and refinement to platform implementation, support, application development, and adoption.


How Retailers Are Using Tech for Competitive Advantage

“While technology can streamline operations, an overreliance on automation without human touch can sometimes backfire,” Peters says. “Consumers still value human interaction, especially in complex support scenarios. It’s crucial for retailers to balance automation with human agents, particularly in areas that require empathy and nuanced decision-making.” ... Companies of all sizes benefit from greater organizational efficiency, and tech has been the fuel powering digital transformation. For example, Lowe’s uses AR for home improvement shopping while Sephora uses it for virtual makeup try-ons. Walmart is stepping up automation in its battle against Amazon. But smaller retailers are benefiting, too. ... “One of our customer’s last large-scale automation took them five years from the time they started the concept to deployment,” Naslund says. “For context, the pandemic was four and a half years, and the amount of volatility that the supply chain saw over the four years was insane. We saw inventory gluts, inventory shortages, and panic buying. Then you saw a warehouse capacity shortage, everybody's panicking to get warehouses. Then, they suddenly have too much space.”


Why and How IT Leaders Can Embrace the AI Revolution

AI software certainly has some consequences for IT departments. There may be some new types of workflows to manage, new user requests to support, and new application deployments to track. But unless your business is actually building complex AI solutions from scratch — which it probably isn't or shouldn't because sophisticated, mature AI tools and services are available from external vendors, complete with support plans and SLAs — implementing AI is not actually that challenging. That's because most third-party AI solutions boil down to SaaS apps that work just like any other SaaS: The vendor builds, manages, and supports them, with few resources and little effort necessary on the part of customers' IT departments. From the perspective of IT, implementing AI isn't all that different from implementing any other type of software. ... For IT, there are really not any novel data privacy or security risks at stake here. The app ingests financial data, but so do plenty of non-AI applications. IT's responsibility when it comes to managing data security for this type of app boils down to vetting the vendor by reviewing its data management and compliance practices. The fact that the app uses AI doesn't change this process.


Has the time come for integrated network and security platforms?

Interest in platformization is growing among enterprises, asserts Extreme Networks, which recently surveyed 200 CIOs and senior IT leaders for its research, CIO Insights Report: Priorities and Investment Plans in the Era of Platformization. ... A platform that helps organizations transition their network to the cloud to streamline IT efficiency and lower total cost of ownership is important, respondents said. In addition, 55% of respondents emphasized the need to integrate from a broad ecosystem of networking and security offerings, indicating a clear demand for unified platforms, Extreme concluded. ... “The message I got from the survey was that customers are operating in a world where there’s a massive proliferation of products, or applications, and that’s really translating into complexity. Complexity is equal to risk, and that complexity is happening in multiple places,” said Extreme Networks CTO Nabil Bukhari. Complexity is an interesting topic because it changes, Bukhari said. The first Ford cars were basically just an engine with brakes, but they were complicated to start and drive. “Now, if you look at a car, they are like data centers on wheels. But driving and owning them is exponentially easier,” Bukhari said.


How legacy IT systems can hold your business back

While legacy IT systems may still be functional, they can hold a business back from reaching its full potential – especially if market competitors are busy upgrading their own systems. Companies need to carefully evaluate the costs and benefits of keeping legacy systems in place and develop a plan to modernize their IT infrastructure. Investing in a modern data center solution can, over time, improve business agility, security, and your organization’s bottom line. ... This is especially true when it comes to next-generation applications using LLMs and machine learning (ML) for AI-dependent applications. Enterprise servers, storage and networking hardware, and software manufactured before about 2016 were not designed with scaled-up data workloads in mind – especially workloads for genAI, which just started to take off in 2021. This can hinder growth and force companies to invest in additional hardware or software just to maintain their current operations. Legacy systems are also more prone to failures and outages due to aging hardware and software. This downtime disrupts operations and leads to lost revenue, especially for critical business functions. Additionally, data loss from system crashes can be costly to recover from.


Architecture Inversion: Scale by Moving Computation, Not Data

Now why should the rest of us care, blessed as we are with a lack of most of the billions of users TikTok, Google and the likes are burdened with? A number of factors are becoming relevant: ML algorithms are improving and so is local compute capacity, meaning fully scoring items gives a larger boost in quality and ultimately profit than used to be the case. With the advent of vector embeddings, the signals consumed by such algorithms have grown by one to two orders of magnitude, making the network bottleneck more severe. Applying ever more data to solve problems is increasingly cost effective, which means more data needs to be rescored to maintain a constant quality loss. As the consumers of data from such systems move from being mostly humans to mostly LLMs in RAG solutions, it becomes beneficial to deliver larger amounts of scored data faster in more applications than before. ... For these reasons, the scaling tricks of the very biggest players are becoming increasingly relevant for the rest of us, which has led to the current proliferation of architecture inversion, going from traditional two-tier systems where data is looked up from a search engine or database and sent to a stateless compute tier to inserting that compute into the data itself.


The secret to successful digital initiatives is pretty simple, according to Gartner

As with all technologies, seeing results from AI comes down to focusing like a laser beam on the problem at hand: "In my experience, the businesses that start with a real use case and problem are seeing an ROI," Julian LaNeve, chief technology officer at Astronomer, a data platform company, told ZDNET. "They define a well-scoped, impactful problem and use gen AI to solve [it], and it's easy to measure success and ROI. The most successful business cases identify how to solve a problem that the business already cares deeply about and [will] deliver additional value to customers." Technology maturity also makes a difference in success rates. "Previous generations of AI were narrower in scope but have been successful," said Dominic Sartorio, vice president at Denodo, a data management provider. "AI is helping with predictive maintenance of manufactured goods, predicting demand spikes in [the] markets, and finding the optimal routes for logistics, and [has] been successful for many years." Furthermore, according to Gartner, companies that treat their digital initiatives in a collaborative fashion -- between business and IT leaders -- rather than leaving all things digital up to their IT departments are successful with technology. 


Showing AI users diversity in training data can boost perceived fairness and trust

The work investigated whether displaying racial diversity cues—the visual signals on AI interfaces that communicate the racial composition of the training data and the backgrounds of the typically crowd-sourced workers who labeled it—can enhance users' expectations of algorithmic fairness and trust. Their findings were recently published in the journal Human-Computer Interaction. AI training data is often systematically biased in terms of race, gender and other characteristics, according to S. Shyam Sundar, Evan Pugh University Professor and director of the Center for Socially Responsible Artificial Intelligence at Penn State. "Users may not realize that they could be perpetuating biased human decision-making by using certain AI systems," he said. Lead author Cheng "Chris" Chen, assistant professor of communication design at Elon University, who earned her doctorate in mass communications from Penn State, explained that users are often unable to evaluate biases embedded in the AI systems because they don't have information about the training data or the trainers. "This bias presents itself after the user has completed their task, meaning the harm has already been inflicted, so users don't have enough information to decide if they trust the AI before they use it," Chen said.



Quote for the day:

"It takes courage and maturity to know the difference between a hoping and a wishing." -- Rashida Jourdain

Daily Tech Digest - October 21, 2024

Choosing the Right Tech Stack: The Key to Successful App Development

Choosing the right tech stack is critical because the tech stack you opt to use will shape virtually every aspect of your development project. It determines which programming language you can use, as well as which modules, libraries, and other pre-built components you can take advantage of to speed development. It has implications for security, since some tech stacks are easier to secure than others. It influences the application performance and operating cost because it plays an important role in determining how many resources the application will consume. And so on. ... Building a secure application is important in any context. But if you face special compliance requirements — for example, if you're building a finance or healthcare app, which are subject to special compliance mandates in many places — you may need to guarantee an extra level of security. To that end, make sure the tech stack you choose offers whichever level of security controls you need to meet your compliance requirements. A tech stack alone won't guarantee that your app is compliant, but choosing the right tech stack makes it easier for you to build a compliant app.


What is hybrid AI?

Rather than relying on a single method, hybrid AI integrates various systems, such as rule-based symbolic reasoning, machine learning and deep learning, to create systems that can reason, learn, and adapt more effectively than AI systems that have not been integrated with others. ... Symbolic AI, which is often referred to as rule-based AI, focuses on using logic and explicit rules to solve problems. It excels in reasoning, structured data processing and interpretability but struggles with handling unstructured data or large-scale problems. Machine learning (ML), on the other hand, is data-driven and excels at pattern recognition and prediction. It works well when paired with large datasets, identifying trends without needing explicit rules. However, ML models are often difficult to interpret and may struggle with tasks requiring logical reasoning. Hybrid AI that combines symbolic AI with machine learning makes the most of the reasoning power of symbolic systems as well as the adaptability of machine learning. For instance, a system could use symbolic AI to follow medical guidelines for diagnosing a patient, while machine learning analyses patient records and test results to offer individual recommendations.
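
A minimal sketch of that combination might look like the Python below: an explicit, interpretable rule is checked first, and a learned classifier handles everything the rule does not cover. The feature names, thresholds, and toy training data are invented purely for illustration, not drawn from any real guideline.

```python
from sklearn.tree import DecisionTreeClassifier

# Symbolic, rule-based layer: an explicit, interpretable guideline.
def guideline_rule(temperature_c: float, oxygen_saturation: float) -> str | None:
    if temperature_c >= 39.5 or oxygen_saturation < 90.0:
        return "urgent_referral"   # the hard rule always wins
    return None                    # otherwise defer to the learned model

# Machine-learning layer: pattern recognition from (toy) historical records.
X_train = [[37.0, 98.0], [38.2, 95.0], [39.0, 92.0], [36.8, 99.0]]
y_train = ["routine", "follow_up", "follow_up", "routine"]
model = DecisionTreeClassifier().fit(X_train, y_train)

def hybrid_triage(temperature_c: float, oxygen_saturation: float) -> str:
    decision = guideline_rule(temperature_c, oxygen_saturation)
    if decision is not None:
        return decision
    return model.predict([[temperature_c, oxygen_saturation]])[0]

print(hybrid_triage(38.4, 94.0))   # falls through to the learned model
print(hybrid_triage(40.1, 97.0))   # caught by the explicit guideline
```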


6 Roadblocks to IT innovation

Innovation doesn’t happen by happenstance, says Sean McCormack, a seasoned tech exec who has led innovation efforts at multiple companies. True, someone might have an idea that seemingly comes out of the blue, but that person needs a runway to turn that inspiration into innovation that takes flight. That runway is missing in a lot of organizations. “Oftentimes there’s no formal process or approach,” McCormack says. Consequently, inspired workers must try to muscle through their bright ideas as best they can; they often fail due to the lack of support and structure that would bring the money, sponsors, and skills needed to build and test it. “You have to be purposeful with how you approach innovation,” says McCormack, now CIO at First Student, North America’s largest provider of student transportation. ... Taking a purposeful approach enables innovation in several ways, McCormack explains. First, it prioritizes promising ideas and funnels resources to those ideas, not weaker proposals. It also ensures promising ideas get attention rather than be put on a back burner while everyone deals with day-to-day tasks. And it prevents turf wars between groups, so, for example, a business unit won’t run away with an innovation that IT proposed.


Cyber Criminals Hate Cybersecurity Awareness Month

In the world of enterprises, the expectations for restoring data and backing up data at multi-petabyte scale have changed. IT teams need to increase next-generation data protection capabilities, while reducing overall IT spending. It gets even more complicated when you consider all the applications, databases, and file systems that generate different types of workloads. No matter what, the business needs the right data at the right time. To deliver this consistency, the data needs to be secured. Next-generation data protection starts when the data lands in the storage array. There needs to be high reliability with 100% availability. There also needs to be data integrity. Each time data is accessed, the storage system should check and verify the data to ensure the highest degree of data integrity. Cyber resilience best practices require that you ensure data validity, as well as near-instantaneous recovery of primary storage and backup repositories, regardless of the size. This accelerates disaster recovery when a cyberattack happens. Greater awareness of best practices in cyber resilience would be one of the crowning achievements of this October as Cybersecurity Awareness Month. Let’s make it so.


6 Strategies for Maximizing Cloud Storage ROI

Rising expenses in cloud data storage have prompted many organizations to reconsider their strategies, leading to a trend of repatriation as enterprises seek more control during these unpredictable economic times. A February 2024 Citrix poll revealed that 94% of organizations had shifted some workloads back to on-premises systems, driven by concerns over security, performance, costs, and compatibility. ... Common tactics of re-architecting applications, managing cloud sprawl and monitoring spend using the tools each cloud provides are a great first start. However, these methods are not the full picture. Storage optimization is an integral piece. Focusing on cloud storage costs first is a smart strategy since storage constitutes a large chunk of the overall spend. More than half of IT organizations (55%) will spend more than 30% of their IT budget on data storage and backup technology, according to our recent State of Unstructured Data Management report. The reality is that most organizations don’t have a clear idea on current and predicted storage costs. They do not know how to economize, how much data they have, or where it resides. 


As Software Code Proliferates, Security Debt Becomes a More Serious Threat

As AI-generated code proliferates, it compounds an already common problem, filling code bases with insecure code that will likely become security debt, increasing the risks to organizations. Just like financial debt, security debt can accrue quickly over time, the result of organizations compromising security measures in favor of convenience, speed or cost-cutting measures. Security debt, introduced by both first-party and third-party code, affects organizations of all sizes. More than 70% of organizations have security debt ingrained in their systems — and nearly half have critical debt. Over time, this accumulated debt poses serious risks because, as with financial debt, the bill will become due — potentially in the form of costly and consequential security breaches that can put an organization's data, reputation and overall stability at stake. ... Amid the dark clouds gathering over security debt, there is one silver lining. The number of high-severity flaws in organizations has been cut in half since 2016, which is clear evidence that organizations have made some progress in implementing secure software practices. It also demonstrates the tangible impact of quickly remediating critical security debt.


Why Liability Should Steer Compliance with the Cyber Security and Resilience Bill

First and foremost, the regulations are likely to involve an overhaul that will require a management focus. In the case of NIS2, for example, the board is tasked with taking responsibility for and maintaining oversight of the risk management strategy. This will require management bodies to undergo training themselves as well as to arrange training for their employees in order to equip themselves with sufficient knowledge and skills to identify risks and assess cybersecurity risk management practices. Yet NIS2 also breaks new ground in that it not only places responsibility for oversight of the risk strategy firmly at the feet of the board but goes on to state individuals could be held personally liable if they fail to exercise those responsibilities. Under article 32, authorities can temporarily prohibit any person responsible for discharging managerial responsibilities at CEO or a similar level from exercising managerial functions – in other words they can be suspended from office. We don’t know if the Cyber Security and Resilience Bill will take a similar tack but NIS2 is by no means alone in this approach. 


Tackling operational challenges in modern data centers

Supply chain bottlenecks continue to plague data centers, as shortages of critical components and materials lead to delays in shipping, sliding project timelines, and increased costs for customers. Many data center operators have become unable to meet their need for affected equipment such as generators, UPS batteries, transformers, servers, building materials, and other big-ticket items. This gap in availability is leading many to settle for any readily available items, even if not from their preferred vendor. ... The continuous heavy power consumption of data centers can strain local electrical utility systems with limited supply or transmission capacity. This poses a question of whether areas heavily populated with data centers, like Northern Virginia, Columbus, and Pittsburgh, have enough electricity capacity, and if they should only be permitted to use a certain percentage of grid power. ... Like the rest of the world, data centers are now facing a climate crisis as temperatures and weather events soar. Data centers are also seeking ways to increase their power load and serve higher client demand, without significantly increasing their electricity and emissions burdens. 


The AI-driven capabilities transforming the supply chain

In today’s supply chain environment, there really is no room for disruption — be it labor shortages, geopolitical strife or malfunctions within manufacturing. To keep up with demand, supply chain teams are focused on continuous improvement and finding ways to remove the burden on expensive manual labor in favor of automated, digital solutions. When faulty products come off the production line, it must be addressed quickly. AI can accelerate the resolution process faster than human labor in many instances — preventing production standstills and even catching errors before they occur. Engineers who are creating a product can lean on these insights too, using AI to assess all the errors that have happened in the past to make sure that they don’t happen in the future. ... Through camera footage and visual inspections, AI models can help detect errors, faults or defects in equipment before they happen. If the technology identifies an issue — or predicts the need for maintenance — teams can arrange for a technician to perform repairs. This predictive maintenance minimizes unplanned outages, reduces disruptions across the supply chain and optimizes asset performance.


What makes a great CISO

Security settings were once viewed as binary — on or off — but today, security programs need to be designed to help organizations adapt and respond with minimal impact when incidents occur. Response and resilience planning now involves cybersecurity and business operations teams, requiring the CISO to engage across the organization, especially during incidents. ... In the past, those with a SecOps background often focused on operational security, while those with a GRC background leaned toward prioritizing compliance to manage risk, according to Paul Connelly, former CISO now board advisor, independent director and CISO mentor. “Infosec requires a base competence in technology, but a CISO doesn’t have to be an engineer or developer,” says Connelly. A broad understanding of infosec responsibilities is needed, but the CISO can come from any part of the team, including IT or even internal audit. Exposure to different industries and companies brings a valuable diversity of thinking. Above all, modern CISOs must prioritize aligning security efforts with business objectives. “Individuals who have zig-zagged through an organization, getting wide exposure, are better prepared than someone who rose through the ranks focused in SecOps or another single area of focus,” says Connelly.



Quote for the day:

"The great leaders are like best conductors. They reach beyond the notes to reach the magic in the players." -- Blaine Lee

Daily Tech Digest - October 20, 2024

6 Strategies for Overcoming the Weight of Process Debt

While technical debt is a more familiar concept stemming from software development that describes the cost of taking shortcuts or using quick fixes in code, process debt relates to inefficiencies and redundancies within organizational workflows and procedures. Process debt can also have far-reaching effects that are often less obvious to business leaders, making it an insidious force that can silently undermine business operations. ... Rather than simply adding a new technology into an old process or duplicating legacy steps in a new application, organizations need to undertake a detailed audit of existing processes to uncover inefficiencies, redundancies, and inaccuracies that contribute to process debt. This audit should involve a systematic review of all workflows, procedures, and operational activities to identify areas where performance is falling short or where resources are being wasted. To gain a deeper understanding, leverage process mapping tools to create visual representations of workflows. These tools allow you to document each step of a process, highlight how tasks flow between different departments or systems, and uncover hidden bottlenecks or points of friction.


Domain-specific GenAI is Coming to a Network Near You

Now, we're seeing domain-specific models crop up. These are specialized models that focus on some industry or incorporate domain best practices that can be centrally trained and then deployed and fine-tuned by organizations. They are built on specific knowledge sets rather than the generalized corpus of information on which conversational AI is trained. ... By adopting domain-specific generative AI, companies can achieve more accurate and relevant outcomes, reducing the risks associated with general-purpose models. This approach not only enhances productivity but also aligns AI capabilities with specific business needs. ... The question now is whether this specialization can be applied to domains like networking, security, and application delivery. Yes, but no. The truth is that predictive (classic) AI is going to change these technical domains forever. But it will do so from the inside-out; that is, predictive AI will deliver real-time analysis of traffic that enables an operational AI to act. That may well be generative AI if we are including agentic AI in that broad category. But GenAI will have an impact on how we operate networking, security, and application delivery. 


The human factor: How companies can prevent cloud disasters

A company’s post-mortem process reveals a great deal about its culture. Each of the top tech companies require teams to write post-mortems for significant outages. The report should describe the incident, explore its root causes and identify preventative actions. The post-mortem should be rigorous and held to a high standard, but the process should never single out individuals to blame. Post-mortem writing is a corrective exercise, not a punitive one. If an engineer made a mistake, there are underlying issues that allowed that mistake to happen. Perhaps you need better testing, or better guardrails around your critical systems. Drill down to those systemic gaps and fix them. Designing a robust post-mortem process could be the subject of its own article, but it’s safe to say that having one will go a long way toward preventing the next outage. ... If engineers have a perception that only new features lead to raises and promotions, reliability work will take a back seat. Most engineers should be contributing to operational excellence, regardless of seniority. Reward reliability improvements in your performance reviews. Hold your senior-most engineers accountable for the stability of the systems they oversee.


Ransomware siege: Who’s targeting India’s digital frontier?

Small and medium-sized businesses (SMBs) are often the most vulnerable. This past July, a ransomware attack forced over 300 small Indian banks offline, cutting off access to essential financial services for millions of rural and urban customers. This disruption has severe consequences in a country where digital banking and online financial services are becoming lifelines for people’s day-to-day transactions. According to a report by Kaspersky, 53% of Indian SMBs experienced ransomware attacks in 2023, with 559 million attacks occurring between April and May of this year, making them the most targeted segment. ... For SMBs, the cost of paying ransomware, retrieving proprietary data, returning to full operations, and recovering lost revenue can be too much to bear. For this reason, many businesses opt to pay the ransom, even when there is no guarantee that their data will be fully restored. The Indian financial sector, in particular, has been a favourite target. This year the National Payment Corporation of India (NPCI), which runs the country’s digital payment systems, was forced to take systems offline temporarily due to an attack. Beyond the financial impact, these incidents erode trust in India’s push for a digital-first economy, impacting the country’s progress toward digital banking adoption.


What AMD and Intel’s Alliance Means for Data Center Operators

AMD and Intel’s alliance was a surprise for many. But industry analysts said their partnership makes sense and is much needed, given the threat that Arm poses in both the consumer and data center space. While x86 processors still dominate the data center space, Arm has made inroads with cloud providers Amazon Web Services, Google Cloud and Microsoft Azure building their own Arm-based CPUs and startups like Ampere having entered the market in recent years. Intel and AMD’s partnership confirms how strong Arm is as a platform in the PC, data center and smartphone markets, the Futurum Group's Newman said. But the two giant chipmakers still have the advantage of having a huge installed base and significant market share. Through the new x86 advisory group, AMD and Intel can benefit by making it easier for data center operators to leverage x86, he said. “This partnership is about the experience of the x86 customer base, trying to make it stickier and trying to give them less reason to potentially move off of the platform is valuable,” Newman said. “x86’s longevity will benefit meaningfully from less complexity and making it easier for customers.”


Cyber resilience is improving but managing systemic risk will be key

“Cyber insurance is recognised as a core component of a robust cyber risk management strategy. While we have seen fluctuations in cyber rates and capacity over the last five years, more recently we have seen rates softening in the market,” Cotelle said. “The emergence and adoption of AI has clear potential to revolutionise how businesses operate, which will create new opportunities but also new exposures. “In the cyber risk context, AI is a double-edged sword. First, it can be exploited by threat actors to conduct more sophisticated attacks between agencies to address ransomware,” he said. ... He stressed, however, that one of the biggest challenges facing the cyber market is how it understands and manages systemic cyber risks. He said there is a case for considering the use of reinsurance pools and public/private partnerships to do this. “The continued attractiveness of the cyber insurance solution is paramount to the sustainability and growth of the market. “In recent years, we have seen work by insurers to clarify particular aspects of coverage relating to areas such as cyber-related property damage, cyber war or infrastructure which has led to coverage restrictions.”


Cyber resilience vs. cybersecurity: Which is more critical?

A common misconception is that cyber resilience means strong cybersecurity and that the organization won’t be compromised because its defenses are impenetrable. No defense is ever 100 percent secure, because IT products have flaws and cybercriminals and nation state-sponsored threat actors are continually changing their tactics, techniques and procedures (TTPs) to take advantage of any weaknesses they can find. And, of course, any organization with cyber resilience still needs quality cybersecurity in the first place. Resilience isn’t promising that bad things won’t happen; resilience promises that when they do, the organization can overcome that and continue to thrive. Cybersecurity is one of the foundations upon which resilience stands. Although cyber threats have increased in frequency and sophistication in recent years, there’s a huge amount that businesses in every sector can do to reduce the chances of being compromised and to prepare for the worst. The investment in time, energy and resources to prepare for a cyber incident is well worth it for the results you’ll see. Being cyber resilient is becoming a selling point as well.


Building Digital Resilience: Insider Insights for a Safer Cyber Landscape

These “basics” sound simple and are not difficult to implement, but we (IT, Security teams, and the Business) routinely fail at it. We tend to focus on the fancy new tool, the shiny new dashboard, quarterly profits, or even the latest analytical application. Yes, these are important and have their place, but we should ensure we have the “basics” down to protect the business so it can focus on profit and growth. Using patching as an example, if we can patch our prioritized vulnerabilities promptly, we reduce our threat landscape, which, in turn, offers attackers fewer doors and windows into our environment. The term may seem a little dated, but defense in depth is a solid method used to defend our often-porous environments. Using multiple levels of security, such as strong passwords, multi-factor authentication, resilience training, and patching strategies, makes it harder for threat actors, so they tend to move to another target with weaker defenses. ... In an increasingly digital world, robust recovery capabilities are not just a safety net but a strategic advantage and a tactical MUST. The actions taken before and after a breach are what truly matter to reduce the costliest impacts—business interruption. 


Information Integrity by Design: The Missing Piece of Values-Aligned Tech

To have any chance of fixing our dysfunctional relationship with information, we need solutions that can take on the powerful incentives, integration scale, and economic pull of the attention economy as we know it, and realign the market. One good example is the emerging platform Readocracy, designed from the outset with features that allow users to have much more control and context over their information experience. This includes offering users control over the algorithm, providing nudges to direct attention more mindfully, and providing information on how informed commenters are on subjects on which they are commenting. ... An information integrity by design initiative can focus on promoting the six components of information integrity outlined above so readers and researchers can make informed decisions on the integrity of the information provided. Government promotion and support can drive and support corporate adoption of the concept much like it's done for security by design, privacy by design, and, most recently, safety by design. ... Information integrity deserves fierce advocacy from governments, the intellectual ingenuity of civil society, and the creative muscle of industry. 


The backbone of security: How NIST 800-88 and 800-53 compliance safeguards data centers

When discussing data center compliance, it’s important not to leave out a key player: the National Institute of Standards and Technology (NIST). NIST is a non-regulatory federal agency whose cybersecurity framework is among the most widely recognized and adopted, offering one of the industry’s most comprehensive and in-depth sets of framework controls. NIST’s mission is to educate citizens on information system security for all applications outside of national security, including industry, government, academia, and healthcare on both a national and global scale. Their strict and robust standards and guidelines are widely recognized and adopted by data centers and government entities alike seeking to improve their processes, quality, and security. ... NIST 800-88 covers various types of media, including hard drives (HDDs), solid-state drives (SSDs), magnetic tapes, optical media, and other media storage devices. NIST 800-88 has quickly become the foremost standard for the U.S. Government and has been continuously referenced in federal data privacy laws. Moreover, NIST 800-88 regulations have been increasingly adopted by private companies and organizations, especially data centers.



Quote for the day:

"To have long-term success as a coach or in any position of leadership, you have to be obsessed in some way." -- Pat Riley

Daily Tech Digest - October 19, 2024

DevSecOps: Building security into each stage of development

While open-source code is important and becoming invaluable, it’s difficult to know how well it has been maintained, Faus noted. A developer might incorporate third-party code and inadvertently introduce a vulnerability. DevSecOps allows security teams to flag that vulnerability and work with the development team to identify whether the code should be written differently or if the vulnerability is even dangerous. Ultimately, all parties can be assured that they did everything they could to produce the most secure code possible. In both DevOps and DevSecOps, “the two primary principles are collaboration and transparency,” Faus said. Another core tenet is automation, which creates repeatability and reuse. If a developer knows how to resolve a specific vulnerability, they can reuse that fix across every other project with that same vulnerability. ... One of the biggest challenges in implementing security throughout the development cycle is the legacy mindset in how security is treated, Faus pointed out. Organizations must be willing to embrace cultural change and be open, transparent, and collaborative about fixing security issues. Another challenge lies in building in the right type of automation. “One of the first things is to make security a requirement for every new project,” Faus said.


Curb Your Hallucination: Open Source Vector Search for AI

Vector search—especially implementing a RAG approach utilizing vector data stores—is a stark alternative. Instead of relying on a traditional search engine approach, vector search uses the numerical embeddings of vectors to resolve queries. Therefore, searches examine a limited data set of more contextually relevant data. The results include improved performance, earned by efficiently utilizing massive data sets, and greatly decreased risk of AI hallucinations. At the same time, the more accurate answers that AI applications provide when backed by vector search enhance the outcomes and value delivered by those solutions. Combining both vector and traditional search methods into hybrid queries will give you the best of both worlds. Hybrid search ensures you cover all semantically related context, and traditional search can provide the specificity required for critical components ... Several open source technologies offer an easy on-ramp to building vector search capabilities and a path free from proprietary expenses, inflexibility, and vendor lock-in risks. To offer specific examples, Apache Cassandra 5.0, PostgreSQL (with pgvector), and OpenSearch are all open source data technologies that now offer enterprise-ready vector search capabilities and underlying data infrastructure well-suited for AI projects at scale.
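
At its core, the vector search these stores perform is a nearest-neighbour ranking over embeddings. The short Python sketch below shows the idea with cosine similarity over a toy in-memory "store"; the random embeddings stand in for output from a real embedding model, and the document texts are placeholders.

```python
import numpy as np

# Toy "vector store": each row is a precomputed embedding for one document.
documents = [
    "Reset your password from the account settings page.",
    "Invoices are generated on the first day of each month.",
    "Contact support if a payment fails twice in a row.",
]
embeddings = np.random.default_rng(0).normal(size=(len(documents), 8))

def vector_search(query_embedding: np.ndarray, top_k: int = 2) -> list[str]:
    """Rank documents by cosine similarity to the query embedding."""
    norms = np.linalg.norm(embeddings, axis=1) * np.linalg.norm(query_embedding)
    scores = embeddings @ query_embedding / norms
    best = np.argsort(scores)[::-1][:top_k]
    return [documents[i] for i in best]

# In a RAG pipeline the query embedding comes from the same model used to
# embed the documents; here it is random purely to keep the sketch runnable.
query = np.random.default_rng(1).normal(size=8)
for hit in vector_search(query):
    print(hit)
```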


Driving Serverless Productivity: More Responsibility for Developers

First, there are proactive controls, which prevent deployment of non-compliant resources by instilling best practices from the get-go. Second, there are detective controls, which identify violations that are already deployed and then provide remediation steps. It’s important to recognize these controls must not be static. They need to evolve over time, just as your organization, processes, and production environments evolve. Think of them as checks that place more responsibility on developers to meet high standards, and also make it far easier for them to do so. Going further, a key -- and often overlooked -- part of any governance approach is its notification and supporting messaging system. As your policies mature over time, it is vitally important to have a sense of lineage. If we’re pushing for developers to take on more responsibility, and we’ve established that the controls are constantly evolving and changing, notifications cannot feel arbitrary or unsupported. Developers need to be able to understand the source of the standard driving the control and the symptoms of what they’re observing.
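
A proactive control of this kind is often expressed as policy-as-code that runs before deployment. The Python sketch below is one hedged illustration: each rule inspects a hypothetical serverless function definition, and any violation blocks the rollout. The resource fields, rule names, and thresholds are invented for the example.

```python
from typing import Callable

# Each control returns an error message when a resource violates the policy.
Control = Callable[[dict], str | None]

def require_tracing(resource: dict) -> str | None:
    if not resource.get("tracing_enabled", False):
        return "Tracing must be enabled on every function."
    return None

def limit_timeout(resource: dict) -> str | None:
    if resource.get("timeout_seconds", 0) > 300:
        return "Function timeout may not exceed 300 seconds."
    return None

CONTROLS: list[Control] = [require_tracing, limit_timeout]

def proactive_check(resource: dict) -> list[str]:
    """Run before deployment; a non-empty result blocks the rollout."""
    return [msg for control in CONTROLS if (msg := control(resource)) is not None]

violations = proactive_check(
    {"name": "order-processor", "tracing_enabled": False, "timeout_seconds": 900}
)
for violation in violations:
    print("BLOCKED:", violation)
```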


Mastering Observability: Unlocking Customer Insights

If we do something and the behaviour of our users changes in a negative way, if they start doing things slower, less efficiently, then we're not delivering value to the market. We're actually damaging the value we're delivering to the market. We're disrupting our users' flows. So a really good way to think about whether we are creating value or not is how is the behavior of our users, of our stakeholders or our customers changing as a result of us shipping things out? And this kind of behavior change is interesting because it is a measure of whether we are solving the problem, not whether we're delivering a solution. And from that perspective, I can then offer five different solutions for the same behavior change. I can say, "Well, if that's the behavior change we want to create, this thing you proposed is going to cost five man-millennia to make, but I can do it with a shell script and it's going to be done tomorrow. Or we can do it with an Excel export or we can do it with a PDF or we can do it through a mobile website not building a completely new app". And all of these things can address the same behavior change.


AI-generated code is causing major security headaches and slowing down development processes

The main priorities for DevSecOps in terms of security testing were the sensitivity of the information being handled, industry best practice, and easing the complexity of testing configuration through automation, all cited by around a third of respondents. Most survey respondents (85%) said they had at least some measures in place to address the challenges posed by AI-generated code, such as potential IP, copyright, and license issues that an AI tool may introduce into proprietary software. However, fewer than a quarter said they were 'very confident' in their policies and processes for testing this code. ... The big conflict here appears to be security versus speed, with around six-in-ten reporting that security testing significantly slows development. Half of respondents also said that most projects are still being added manually. Another major hurdle for teams is the dizzying number of security tools in use, the study noted. More than eight-in-ten organizations said they're using between six and 20 different security testing tools. This growing array of tools makes it harder to integrate and correlate results across platforms and pipelines, respondents noted, and is making it harder to distinguish between genuine issues and false positives.


How digital twins are revolutionising real-time decision making across industries

Despite the promise of digital twins, Bhonsle acknowledges that there are challenges to adoption. “Creating and maintaining a digital twin requires substantial investments in infrastructure, including sensors, IoT devices, and AI capabilities,” he points out. Security is another concern, particularly in industries like healthcare and energy, where compromised data streams could lead to life-threatening consequences. However, Bhonsle emphasises that the rewards far outweigh the risks. “As digital twin technology matures, it will become more accessible, even to smaller organisations, offering them a competitive edge through optimised operations and data-driven decisions.” ... Digital twins are transforming how businesses operate by providing real-time insights that drive smarter decisions. From manufacturing floors to operating rooms, and from energy grids to smart cities, this technology is reshaping industries in unprecedented ways. As Bhonsle aptly puts it, “The rise of digital twins signals a new era of efficiency and agility—an era where decisions are no longer based on assumptions but driven by data in real time.” As organisations embrace this evolving technology, they unlock new opportunities to optimise performance and stay ahead in a fast-changing world.


AI and tech in banking: Half the industry lags behind

Gupta emphasised that a superficial approach to digitalisation—what he called “putting lipstick on a pig”—is common in many institutions. These banks often adopt digital tools without rethinking the processes behind them, resulting in inefficiencies and missed opportunities for transformation. In addition, the culture of risk aversion in many financial institutions makes them slow to experiment with new technologies. According to a Deloitte survey, 62% of banking executives cited corporate culture as the biggest barrier to successful digital transformation. A fear of regulatory hurdles and data privacy issues also compounds this reluctance to fully embrace AI. ... The rise of fintech companies is also reshaping the financial landscape. Digital-first challengers like Revolut and Monzo are making waves by offering streamlined, customer-centric services that appeal to tech-savvy users. These companies, unencumbered by legacy systems, have been able to rapidly adopt AI, providing highly personalised products and services through their digital platforms. The UK fintech sector alone saw record investment in 2021, with $11.6 billion pouring into the industry, according to Innovate Finance. This influx of capital has enabled fintech firms to invest in AI technologies, providing stiff competition to traditional banks that are slower to adapt. 


The Era Of Core Banking Transformation Trade-offs Is Finally Over

There must be a better way than forcing banks to choose their compromises. Banking today needs a next-generation solution that blends the best of configuration neo cores – speed to market, lower cost, derisked migration – and combines it with the benefits of framework neo cores – full customization of products and even the core, with massive scale built in as standard. If banks and financial services aren't forced to compromise because of their choice of cloud-native core solution, they can accelerate their innovation. Our research reveals that, while AI remains front of mind for many IT decision makers in banking and financial services, only one in three (32%) have so far integrated AI into their core solution. This is concerning. According to McKinsey, banking is one of the top four sectors set to capitalize on the AI market opportunity – but that forecast will remain a pipe dream if banks can't integrate AI efficiently, securely and at massive scale. ... One thing is certain, whether configuration or framework, neo cores are not the final destination for banking. They have been a helpful stepping stone over the last decade to cloud-native technology, but banks and financial services now need a next-generation core technology that doesn't demand so many compromises.


10 Risks of IoT Data Management

IoT data management faces significant security risks due to the large attack surface created by interconnected devices. Each device presents a potential entry point for cyberattacks, including data breaches and malware injections. Attackers may exploit vulnerabilities in device firmware, weak authentication methods, or unsecured network protocols. To mitigate these risks, implementing end-to-end encryption, device authentication, and secure communication channels is essential. ... IoT devices often collect sensitive personal information, which raises concerns about user privacy. The lack of transparency in how data is collected, processed, and shared can erode user trust, especially when data is shared with third parties without explicit consent. Addressing privacy concerns requires the anonymization and pseudonymization of personal data. ... The massive influx of data generated by IoT devices can overwhelm traditional storage systems, leading to data overload. Managing this data efficiently is a challenge, as continuous data generation requires significant storage capacity and processing power. To solve this, organizations can adopt edge computing, which processes data closer to the source, reducing the need for centralized storage. 
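
As a small illustration of the pseudonymization point, the following Python sketch replaces a raw device identifier with a keyed HMAC so telemetry records stay linkable for analytics without exposing the original ID. The field names and key handling are illustrative only; a real deployment would keep the key in a secrets manager and pair this with a retention and consent policy.

```python
# Pseudonymize IoT telemetry: swap the raw device ID for a keyed HMAC so the
# record remains linkable but the identifier cannot be reversed without the
# secret key, and drop direct identifiers such as the owner's email entirely.
import hashlib
import hmac

SECRET_KEY = b"example-key-store-me-in-a-secrets-manager"

def pseudonymize_device_id(device_id: str) -> str:
    return hmac.new(SECRET_KEY, device_id.encode(), hashlib.sha256).hexdigest()[:16]

def pseudonymize_record(record: dict) -> dict:
    cleaned = dict(record)
    cleaned["device_id"] = pseudonymize_device_id(record["device_id"])
    cleaned.pop("owner_email", None)  # remove direct identifiers outright
    return cleaned

reading = {"device_id": "thermostat-42", "owner_email": "user@example.com", "temp_c": 21.5}
print(pseudonymize_record(reading))
```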


Managing bank IT spending: Five questions for tech leaders

The demand for development resources and the need to manage tech debt are only expected to increase. Tech talent has never been cheap, and inflation is pushing up salaries. Cybersecurity threats are also becoming more urgent, demanding greater funds to address them. And figuring out how to integrate generative AI takes time, personnel, and money. Despite these competing priorities and challenges, bank IT leaders have an opportunity to make their mark on their organizations and position themselves as central to their success—if they can address some key problems. ... In our experience, IT leaders should never underestimate the importance of controlling and reducing tech debt whenever possible. Actions to correct course could include conducting frequent assessments to determine which areas are accumulating tech debt and developing plans to reduce it as much as possible. More than many other industries, banking is a hotbed of new app development. Leaders who address these key questions can ensure they are directing their talent and resources to game-changing app development that directly contributes to their bank’s bottom line.



Quote for the day:

“It's failure that gives you the proper perspective on success.” -- Ellen DeGeneres

Daily Tech Digest - October 18, 2024

Breaking Barriers: The Power of Cross-Departmental Collaboration in Modern Business

In an era of rapid change and increasing complexity, cross-departmental collaboration is no longer a luxury but a necessity. By dismantling silos, fostering trust, and leveraging technology, organizations can unlock their full potential, drive innovation, and enhance customer satisfaction. While industry leaders have shown the way, the journey to a truly collaborative culture requires sustained effort and adaptation. To embark on this collaborative journey, organizations must prioritize collaboration as a core value, invest in leadership development, empower employees, leverage technology, and measure progress. Creating a collaborative culture is like building a bridge between departments: it requires strong foundations, continuous maintenance, and a shared vision. By doing so, they can create a culture where innovation thrives, employees are engaged, and customers benefit from improved products and services. Looking ahead, successful organizations will not only embrace collaboration but also anticipate its evolution in response to emerging trends like remote work, artificial intelligence, and data privacy. By proactively addressing these challenges and opportunities, businesses can position themselves as leaders in the collaborative economy.


Singapore releases guidelines for securing AI systems and prohibiting deepfakes in elections

"AI systems can be vulnerable to adversarial attacks, where malicious actors intentionally manipulate or deceive the AI system," said Singapore's Cyber Security Agency (CSA). "The adoption of AI can also exacerbate existing cybersecurity risks to enterprise systems, [which] can lead to risks such as data breaches or result in harmful, or otherwise undesired model outcomes." "As such, AI should be secure by design and secure by default, as with all software systems," the government agency said. ... "The Bill is scoped to address the most harmful types of content in the context of elections, which is content that misleads or deceives the public about a candidate, through a false representation of his speech or actions, that is realistic enough to be reasonably believed by some members of the public," Teo said. "The condition of being realistic will be objectively assessed. There is no one-size-fits-all set of criteria, but some general points can be made." These encompass content that "closely match[es]" the candidates' known features, expressions, and mannerisms, she explained. The content also may use actual persons, events, and places, so it appears more believable, she added.


2025 and Beyond: CIOs' Guide to Stay Ahead of Challenges

As enterprises move beyond the "experiment" or the "proof of concept" stage, it is time to design and formalize a well-thought-out AI strategy that is tailored to their unique business needs. According to Gartner, while 92% of CIOs anticipate AI will be integrated into their organizations by 2025 - broadly driven by increasing pressure from CEOs and boards - 49% of leaders admit their organizations struggle to assess and showcase AI's value. That's where the strategy kicks in. ... Forward-looking CIOs are focused on using data for decision-making while tackling challenges related to its quality and availability. Data governance is a crucial aspect to deal with, and as data systems become more interconnected, so is managing complexity. Going forward, CIOs will have to focus on optimizing current systems, raising data literacy, managing complexity, and establishing strong governance. The importance of shifting IT from a cost center to a profit driver lies in focusing on data-driven revenue generation, said Eric Johnson ... CIOs should be able to communicate the strategic use of IT investment and present it as a core enabler for competitiveness.


5 Ways to Reduce SaaS Security Risks

It's important to understand what corporate assets are visible to attackers externally and, therefore, could be a target. Arguably, the SaaS attack surface extends to every SaaS, IaaS and PaaS application, account, user credential, OAuth grant, API, and SaaS supplier used in your organization—managed or unmanaged. Monitoring this attack surface can feel like a Sisyphean task, given that any user with a credit card, or even just a corporate email address, has the power to expand the organization's attack surface in just a few clicks. ... Single sign-on (SSO) provides a centralized place to manage employees' access to enterprise SaaS applications, which makes it an integral part of any modern SaaS identity and access governance program. Most organizations strive to ensure that all business-critical applications (i.e., those that handle customer data, financial data, source code, etc.) are enrolled in SSO. However, when new SaaS applications are introduced outside of IT governance processes, this makes it difficult to truly assess SSO coverage. ... Multi-factor authentication adds an extra layer of security to protect user accounts from unauthorized access. By requiring multiple factors for verification, such as a password and a unique code sent to a mobile device, it significantly decreases the chances of hackers gaining access to sensitive information. 
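
One common form of that second factor is a time-based one-time password (TOTP) generated on the user's device. Below is a minimal Python sketch of the RFC 6238 scheme using only the standard library; the shared secret is a made-up example, and production systems should use a vetted library and tolerate clock drift.

```python
# Minimal TOTP (RFC 6238) sketch: server and authenticator app share a secret
# and derive a short code from the current 30-second time window.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time() // interval)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

shared_secret = "JBSWY3DPEHPK3PXP"  # example base32 secret, not a real credential
print("Current MFA code:", totp(shared_secret))
```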


World’s smallest quantum computer unveiled, solves problems with just 1 photon

In the new study, the researchers successfully implemented Shor’s algorithm using a single photon by encoding and manipulating 32 time-bin modes within its wave packet. This achievement highlights the strong information-processing capabilities of a single photon in high dimensions. According to the team, with commercially available electro-optic modulators capable of 40 GHz bandwidth, it is feasible to encode over 5,000 time-bin modes on long single photons. While managing high-dimensional states can be more challenging than working with qubits, this work demonstrates that these time-bin states can be prepared and manipulated efficiently using a compact programmable fiber loop. Additionally, high-dimensional quantum gates can enhance manipulation, using multiple photons for scalability. Reducing the number of single-photon sources and detectors can improve the efficiency of counting coincidences over accidental counts. Research indicates that high-dimensional states are more resistant to noise in quantum channels, making time-bin-encoded states of long single photons promising for future high-dimensional quantum computing.
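
For context on what "implementing Shor's algorithm" involves, the quantum hardware described above performs the order-finding step; the rest is classical post-processing. The Python sketch below stands in for the quantum part by brute-forcing the period of a^x mod N for a tiny modulus, then recovers the factors from gcd(a^(r/2) ± 1, N). This is the textbook classical half of the algorithm, not anything specific to this study.

```python
# Classical post-processing of Shor's algorithm. The period-finding step is
# brute-forced here; in the experiment it is the job of the quantum processor.
from math import gcd

def find_period(a: int, n: int) -> int:
    """Stand-in for quantum order finding: smallest r with a**r % n == 1."""
    x, r = a % n, 1
    while x != 1:
        x, r = (x * a) % n, r + 1
    return r

def shor_postprocess(n: int, a: int):
    if gcd(a, n) != 1:
        return gcd(a, n), n // gcd(a, n)      # base already shares a factor
    r = find_period(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None                           # unlucky base; retry with another a
    return gcd(pow(a, r // 2) - 1, n), gcd(pow(a, r // 2) + 1, n)

print(shor_postprocess(15, 7))  # period r = 4, factors (3, 5)
```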


Google creates the Mother of all Computers: One trillion operations per second and a mystery

The capability of exascale computing to handle massive amounts of data and run through simulation has created new avenues for scientific modeling. From mimicking black holes and the birth of galaxies to introducing newer and evolved treatments and diagnoses through customized genome mapping across the globe, this technology has all the potential to burst open newer frontiers of knowledge about the cosmos. While current supercomputers would otherwise spend years solving computations, exascale machines will pave the way to areas of knowledge that were previously uncharted. For instance, the exascale solution in astrophysics holds the prospect of modeling many phenomena, such as star and galaxy formation, with higher accuracy. For example, these simulations could reveal new detections of the fundamental laws of physics and be used to answer questions about the universe’s formation. In addition, in fields like particle physics, researchers could analyze data from high-energy experiments far more efficiently and perhaps discover more about the nature of matter in the universe. AI is another area to benefit from exascale computing for a supercharge in performance. Present models of AI are very efficient, but the current computing machines constrain them. 


Taming the Perimeter-less Nature of Global Area Networks

The availability of data and intelligence from across the global span of the network goes a long way toward helping ITOps teams understand all the component services and providers their business has exposure to or reliance on. It means being able to pinpoint an impending problem or the root cause of a developing issue within their global area network and to pursue remediation with the right third-party provider ... Certain traffic engineering actions taken on owned infrastructure can alter connectivity and performance by altering the path that traffic takes in the unowned infrastructure portion of the global area network. Consider these actions as adjustments to a network segment that is within your control, such as a network prefix or a BGP route change to bypass a route hijack happening downstream in the unowned Internet-based segment. These traffic engineering actions are manageable tasks that ITOps teams or their automated systems can execute within a global area network setup. While they are implemented in the parts of the network directly controlled by ITOps, their impact is designed to span the entire service delivery chain and its performance.


Firms use AI to keep reality from unreeling amid ‘global deepfake pandemic’

Seattle-based Nametag has announced the launch of its Nametag Deepfake Defense product. A release quotes security technologist and cryptography expert Bruce Schneier, who says “Nametag’s Deepfake Defense engine is the first scalable solution for remote identity verification that’s capable of blocking the AI deepfake attacks plaguing enterprises.” And make no mistake, says Nametag CEO Aaron Painter: “we’re facing a global deepfake pandemic that’s spreading ransomware and disinformation.” The company cites numbers from Deloitte showing that over 50 percent of C-suite executives expect an increase in the number and size of deepfake attacks over the next 12 months. Deepfake Defense consists of three core proprietary technologies: Cryptographic Attestation, Adaptive Document Verification and Spatial Selfie. The first “blocks digital injection attacks and ensures data integrity using hardware-backed keystore assurance and secure enclave technology from Apple and Google.” The second “prevents ID presentation attacks using proprietary AI models and device telemetry that detect even the most sophisticated digital manipulation or forgery.” 


Evolving Data Governance in the Age of AI: Insights from Industry Experts

While evolving existing data governance to meet AI needs is crucial, many organizations need to advance their DG first, before delving into AI governance. Existing data quality efforts do not cover AI requirements. As mentioned in the previous section, current DG programs enforce roles, procedures, and tools for some structured data throughout the company. Yet AI models learn from and use very large data sets containing both structured and unstructured data. All of this data needs to be of good quality too, so that the AI model can respond accurately, completely, consistently, and relevantly. Companies frequently struggle to determine if their unstructured data, including videos and PowerPoint slides, is of sufficient quality for AI training and implementation. If organizations don't address this issue, Haskell said, they "throw dollars at AI and AI tools," because poor-quality input data simply produces poor-quality output. For this reason, the fundamentals of data quality and clean-up take precedence over the drive to implement AI. O'Neal likened AI and its governance to an iceberg. The CEO and senior management see only the tip, visible with all of AI's promise and reward.


On the Road to 2035, Banking Will Walk One of These Three Paths

Economist Impact’s latest report walks through three different potential scenarios that the banking sector will zero in on by 2035. Each paints a vivid picture of how technological advancements, shifting consumer expectations and evolving global dynamics could reshape the financial world as we know it. ... Digital transformation will be central to banking’s future, regardless of which scenario unfolds. Banks that fail to innovate and adapt to new technologies risk becoming obsolete. Trust will be a critical currency in the banking sector of 2035. Whether it’s through enhanced data protection, ethical AI use, or commitment to sustainability, banks must find ways to build and maintain customer trust in an increasingly complex world. The role of banks is likely to expand beyond traditional financial services. In all scenarios, we see banks taking on new responsibilities, whether it’s driving sustainable development, bridging geopolitical divides, or serving as the backbone for broader digital ecosystems. Flexibility and adaptability will be crucial for success. The future is uncertain and potentially fragmented, requiring banks to be agile in their strategies and operations to thrive in various possible environments.



Quote for the day:

"The secret of my success is a two word answer: Know people." -- Harvey S. Firestone