Daily Tech Digest - November 16, 2024

New framework aims to keep AI safe in US critical infrastructure

According to a release issued by DHS, “this first-of-its kind resource was developed by and for entities at each layer of the AI supply chain: cloud and compute providers, AI developers, and critical infrastructure owners and operators — as well as the civil society and public sector entities that protect and advocate for consumers.” ... Naveen Chhabra, principal analyst with Forrester, said, “while average enterprises may not directly benefit from it, this is going to be an important framework for those that are investing in AI models.” ... Asked why he thinks DHS felt the need to create the framework, Chhabra said that developments in the AI industry are “unique, in the sense that the industry is going back to the government and asking for intervention in ensuring that we, collectively, develop safe and secure AI.” ... David Brauchler, technical director at cybersecurity vendor NCC, sees the guidelines as a beginning, pointing out that frameworks like this are just a starting point for organizations, providing them with big-picture guidelines, not roadmaps. He described the DHS initiative in an email as “representing another step in the ongoing evolution of AI governance and security that we’ve seen develop over the past two years. It doesn’t revolutionize the discussion, but it aligns many of the concerns associated with AI/ML systems with their relevant stakeholders.”


Building an Augmented-Connected Workforce

An augmented workforce can work faster and more efficiently thanks to seamless access to real-time diagnostics and analytics, as well as live remote assistance, observes Peter Zornio, CTO at Emerson, an automation technology vendor serving critical industries. "An augmented-connected workforce institutionalizes best practices across the enterprise and sustains the value it delivers to operational and business performance regardless of workforce size or travel restrictions," he says in an email interview. An augmented-connected workforce can also help fill some of the gaps many manufacturers currently face, Gaus says. "There are many jobs unfilled because workers aren't attracted to manufacturing, or lack the technological skills needed to fill them," he explains. ... For enterprises that have already invested in advanced digital technologies, the path leading to an augmented-connected workforce is already underway. The next step is ensuring a holistic approach when looking at tangible ways to achieve such a workforce. "Look at the tools your organization is already using -- AI, AR, VR, and so on -- and think about how you can scale them or connect them with your human talent," Gaus says. Yet advanced technologies alone aren't enough to guarantee long-term success.


DORA and why resilience (once again) matters to the board

DORA, though, might be overlooked because of its finance-specific focus. The act has not attracted the same attention as NIS2, which sets out cybersecurity standards for 15 critical sectors in the EU economy. And NIS2 came into force in October; CIOs and hard-pressed compliance teams could be forgiven for not focusing on another piece of legislation that is due in the New Year. But ignoring DORA altogether would be short-sighted. Firstly, as Rodrigo Marcos, chair of the EU Council at cybersecurity body CREST points out, DORA is a law, not a framework or best practice guidelines. Failing to comply could lead to penalties. But DORA also covers third-party risks, which includes digital supply chains. The legislation extends to any third party supplying a financial services firm, if the service they supply is critical. This will include IT and communications suppliers, including cloud and software vendors. ... And CIOs are also putting more emphasis on resilience and recovery. In some ways, we have come full circle. Disaster recovery and business continuity were once mainstays of IT operations planning but moved down the list with the move to the cloud. Cyber attacks, and especially ransomware, have pushed both resilience and recovery right back up the agenda.


Data Is Not the New Oil: It’s More Like Uranium

Comparing data to uranium is an accurate analogy. Uranium is radioactive and it is imperative to handle it carefully to avoid radiation exposure, the effects of which are linked to serious health and safety concerns. Issues with the deployment of uranium, such as in reactors, for instance, can lead to radioactive fallout that is expensive to contain and has long-term health consequences for impacted individuals. The possibility of uranium being stolen poses significant risks and global repercussions. Data exhibits similar characteristics. It is critical for it to be stored safely, and those who experience data theft are forced to deal with long-term consequences – identity theft and financial concerns, for example. An organization experiencing a cyberattack must deal with regulatory oversight and fines. In some cases, losing sensitive data can trigger significant global consequences. ... Maintaining a data chain of custody is paramount. Some companies allow all employees access to all records, which widens the attack surface; even a single compromised employee computer can lead to a more extensive hack. Consider the case of the nonprofit healthcare network Ascension, which operates 140 hospitals and 40 senior care facilities.
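The chain-of-custody point above is, at its core, an access-control and auditing problem. The sketch below is a minimal, hypothetical illustration (all role names and record types are invented for the example): each read is checked against a role's permissions and logged, so every access attempt is attributable.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical role-to-permission mapping; in practice this would come
# from an identity provider or policy engine.
ROLE_PERMISSIONS = {
    "billing": {"invoices"},
    "clinician": {"patient_records"},
    "admin": {"invoices", "patient_records"},
}

@dataclass
class AccessGateway:
    audit_log: list = field(default_factory=list)

    def read(self, user: str, role: str, record_type: str) -> bool:
        allowed = record_type in ROLE_PERMISSIONS.get(role, set())
        # Log every attempt, allowed or denied, with a timestamp --
        # this trail is the "chain of custody" for the data.
        self.audit_log.append({
            "user": user, "role": role, "record": record_type,
            "allowed": allowed, "at": datetime.now(timezone.utc).isoformat(),
        })
        return allowed

gw = AccessGateway()
print(gw.read("alice", "billing", "patient_records"))  # False: not permitted
print(gw.read("bob", "clinician", "patient_records"))  # True
```

Limiting each role to the records it actually needs is what shrinks the blast radius of a single compromised account.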


Palo Alto Reports Firewalls Exploited Using an Unknown Flaw

Palo Alto said the flaw is being remotely exploited, has a "critical" severity rating of 9.3 out of 10 on the CVSS scale and that mitigating the vulnerability should be treated with the "highest" urgency. One challenge for users: no patch is yet available to fix the vulnerability. Also, no CVE identifier has been assigned for tracking it. "As we investigate the threat activity, we are preparing to release fixes and threat prevention signatures as early as possible," Palo Alto said. "At this time, securing access to the management interface is the best recommended action." The company said it doesn't believe its Prisma Access or Cloud NGFW are at risk from these attacks. Cybersecurity researchers confirm that real-world details surrounding the attacks and flaws remain scant. "Rapid7 threat intelligence teams have also been monitoring rumors of a possible zero-day vulnerability, but until now, those rumors have been unsubstantiated," the cybersecurity firm said in a Friday blog post. Palo Alto first warned customers on Nov. 8 that it was investigating reports of a zero-day vulnerability in the management interface for some types of firewalls and urged them to lock down the interfaces.


Award-winning palm biometrics study promises low-cost authentication

“By harnessing high-resolution mmWave signals to extract detailed palm characteristics,” he continued, “mmPalm presents an ubiquitous, convenient and cost-efficient option to meet the growing needs for secure access in a smart, interconnected world.” The mmPalm method employs mmWave technology, which is widely used in 5G networks, to capture a person’s palm characteristics by sending and analyzing reflected signals, thereby creating a unique palm print for each user. mmPalm also addresses difficulties that commonly arise in authentication technology, such as varying distance and hand orientation. The system uses a type of AI called a conditional generative adversarial network (cGAN) to learn different palm orientations and distances, generating virtual profiles to fill in gaps. In addition, the system adapts to different environments using a transfer-learning framework so that mmPalm is suited to various settings. The system also builds virtual antennas to increase the spatial resolution of a commercial mmWave device. Tested with 30 participants over six months, mmPalm displayed a 99 percent accuracy rate and was resistant to impersonation, spoofing and other potential breaches.


Scaling From Simple to Complex Cache: Challenges and Solutions

To scale a cache effectively, you need to distribute data across multiple nodes through techniques like sharding or partitioning. This improves storage efficiency and ensures that each node only stores a portion of the data. ... A simple cache can often handle node failures through manual intervention or basic failover mechanisms. A larger, more complex cache requires robust fault-tolerance mechanisms. This includes data replication across multiple nodes, so if one node fails, others can take over seamlessly. This also includes more catastrophic failures, which may lead to significant downtime as the data is reloaded into memory from the persistent store, a process known as warming up the cache. ... As the cache gets larger, pure caching solutions struggle to provide linear performance in terms of latency while also allowing for the control of infrastructure costs. Many caching products were written to be fast at small scale. Pushing them beyond what they were designed for exposes inefficiencies in underlying internal processes. Latency issues may arise as more and more data is cached; lookup times can increase as the cache devotes more resources to managing the increased scale than to serving traffic.
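One common way to do the sharding described above is consistent hashing: keys are mapped onto a hash ring so each node owns a slice of the data, and adding or removing a node remaps only a fraction of the keys. A minimal, illustrative sketch (node names are hypothetical, and this is not production code):

```python
import hashlib
from bisect import bisect_right

class HashRing:
    """Consistent-hashing ring for spreading cache keys across nodes."""

    def __init__(self, nodes, vnodes=64):
        self.ring = []  # sorted list of (hash, node)
        for node in nodes:
            # Virtual nodes smooth out the key distribution per node.
            for i in range(vnodes):
                self.ring.append((self._hash(f"{node}#{i}"), node))
        self.ring.sort()

    @staticmethod
    def _hash(key: str) -> int:
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def node_for(self, key: str) -> str:
        # Walk clockwise to the first virtual node at or past the key's hash.
        h = self._hash(key)
        idx = bisect_right(self.ring, (h, "")) % len(self.ring)
        return self.ring[idx][1]

ring = HashRing(["cache-a", "cache-b", "cache-c"])
owner = ring.node_for("user:1234")
print(owner)  # one of the three nodes, stable for this key
```

Because only the keys adjacent to a departed node move, a replica on a neighboring node can take over that slice without a full cache rebuild.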


Understanding the Modern Web and the Privacy Riddle

The main question is users’ willingness to surrender their data and not question the usage of this data. This could be attributed to the effect of the virtual panopticon, where users believe they are cooperating with agencies (government or private) that claim to respect their privacy in exchange for services. The Universal ID project (Aadhar project) in India, for instance, began as a means to provide identity to the poor in order to deliver social services, but has gradually expanded its scope, leading to significant function creep. Originally intended for de-duplication and preventing ‘leakages,’ it later became essential for enabling private businesses, fostering a cashless economy, and tracking digital footprints. ... In the modern web, users occupy multiple roles—as service providers, users, and visitors—while adopting multiple personas. This shift requires greater information disclosure, as users benefit from the web’s capabilities and treat their own data as currency. The unraveling of privacy has become the new norm, where withholding information is no longer an option due to the stigmatization of secrecy. Over the past few years, there has been a significant shift in how consumers and websites view privacy. Users have developed a heightened sensitivity to the use of their personal information and now recognize their basic right to internet privacy.


Databases Are a Top Target for Cybercriminals: How to Combat Them

Ransomware families such as Mailto, Sodinokibi (REvil), and Ragnar Locker can encrypt and destroy pages within a database. This means the slow, unnoticed encryption of everything from sensitive customer records to critical network resources, including Active Directory, DNS, and Exchange, as well as lifesaving patient health information. Because databases can continue to run even with corrupted pages, it can take longer to realize that they have been attacked. Most often, the wreckage of the attack is discovered only when the database is taken down for routine maintenance, and by that time thousands of records could be gone. Databases are an attractive target for cybercriminals because they offer a wealth of information that can be used or sold on the dark web, potentially leading to further breaches and attacks. Industries such as healthcare, finance, logistics, education, and transportation are particularly vulnerable. The information contained in these databases is highly valuable, as it can be exploited for spamming, phishing, financial fraud, and tax fraud. Additionally, cybercriminals can sell this data for significant sums of money on dark web auctions or marketplaces.
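Because corrupted pages can sit unnoticed while the database keeps running, one standard defense is page-level checksums verified during maintenance scans, the idea behind features like SQL Server's CHECKSUM page verification or PostgreSQL's data checksums. A toy sketch of the concept (the page contents here are invented):

```python
import hashlib

def checksum(page: bytes) -> str:
    """Content hash recorded when a page is legitimately written."""
    return hashlib.sha256(page).hexdigest()

# Pages as last written by the database engine.
pages = {1: b"customer records...", 2: b"patient health records..."}
stored = {pid: checksum(data) for pid, data in pages.items()}

# Simulate ransomware encrypting page 2 in place, outside the engine.
pages[2] = b"\x8f\x1a encrypted garbage"

# A maintenance scan re-hashes every page and compares against the
# stored checksums, surfacing silent corruption early.
corrupted = [pid for pid, data in pages.items() if checksum(data) != stored[pid]]
print(corrupted)  # [2]
```

Running such a scan on a schedule, rather than waiting for routine downtime, shortens the window in which an attack goes undetected.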


The Impact of Cloud Transformation on IT Infrastructure

With digital transformation accelerating across industries, the IT ecosystem comprises traditional and cloud-native applications. This mixed environment demands a flexible, multi-cloud strategy to accommodate diverse application requirements and operational models. The ability to move workloads between public and private clouds has become essential, allowing companies to dynamically balance performance and cost considerations. We are committed to delivering cloud solutions supporting seamless workload migration and interoperability, empowering businesses to leverage the best of public and private clouds. ... With today’s service offerings and various tools, migrating between on-premises and cloud environments has become straightforward, enabling continuous optimization rather than one-time changes. Cloud-native applications, particularly containerization and microservices, are inherently optimized for public and private cloud setups, allowing for dynamic scaling and efficient resource use. To fully optimize, companies should adopt cloud-native principles, including automation, continuous integration, and orchestration, which streamline performance and resource efficiency. Robust tools like identity and access management (IAM), encryption, and automated security updates address security and reliability, ensuring compliance and data protection.



Quote for the day:

“The elevator to success is out of order. You’ll have to use the stairs… One step at a time.” -- Rande Wilson

Daily Tech Digest - November 15, 2024

Beyond the breach: How cloud ransomware is redefining cyber threats in 2024

Unlike conventional ransomware that targets individual computers or on-premises servers, attackers are now setting their sights on cloud infrastructures that host vast amounts of data and critical services. This evolution represents a new frontier in cyber threats, requiring Indian cybersecurity practitioners to rethink and relearn defence strategies. Traditional security measures and last year’s playbooks are no longer sufficient. Attackers are exploiting misconfigured or poorly secured cloud storage platforms such as Amazon Web Services (AWS) Simple Storage Service (S3) and Microsoft Azure Blob Storage. By identifying cloud storage buckets with overly permissive access controls, cybercriminals gain unauthorised entry, copy data to their own servers, encrypt or delete the original files, and then demand a ransom for their return. ... Collaboration and adaptability are essential. By understanding the unique challenges posed by cloud security, Indian organisations can implement comprehensive strategies that not only protect against current threats but also anticipate future ones. Proactive measures—such as strengthening access controls, adopting advanced threat detection technologies, training employees, and staying informed—are crucial steps in defending against these evolving attacks.
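The "overly permissive access controls" the article describes are often a matter of bucket policies that grant access to any principal. A simplified, hypothetical checker for that pattern (real audits would use the cloud provider's own tooling, such as AWS's Block Public Access settings; the policy documents below are invented examples):

```python
def is_publicly_readable(policy: dict) -> bool:
    """Flag policies that grant access to everyone ('*')."""
    for stmt in policy.get("Statement", []):
        principal = stmt.get("Principal")
        if stmt.get("Effect") == "Allow" and principal in ("*", {"AWS": "*"}):
            return True
    return False

open_policy = {"Statement": [{
    "Effect": "Allow", "Principal": "*",
    "Action": "s3:GetObject", "Resource": "arn:aws:s3:::demo-bucket/*",
}]}
locked_policy = {"Statement": [{
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::123456789012:root"},
    "Action": "s3:GetObject", "Resource": "arn:aws:s3:::demo-bucket/*",
}]}

print(is_publicly_readable(open_policy))    # True  -> exactly what attackers scan for
print(is_publicly_readable(locked_policy))  # False -> scoped to one account
```

Sweeping every bucket with a check like this, and alerting on any wildcard principal, is one of the proactive access-control measures the article recommends.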


Harnessing AI’s Potential to Transform Payment Processing

There are many use cases that show how AI increases the speed and convenience of payment processing. For instance, Apple Pay now offers biometric authentication, which uses AI facial recognition and fingerprint scanning to authenticate users. This enables mobile payment customers to use quick and secure authentication without remembering passwords or PINs. Similarly, Apple Pay’s competitor, PayPal, uses AI for real-time fraud detection, employing ML algorithms to monitor transactions for signs of fraud and ensure that customers’ financial information remains secure. ... One issue is AI systems rely on massive amounts of data, including sensitive data, which can lead to data breaches, identity theft, and compliance issues. In addition, AI algorithms trained on biased data can perpetuate those biases. Making matters worse, many AI systems lack transparency, so the bias may grow and lead to unequal access to financial services. Another issue is the potential dependence on outside vendors, which is common with many AI technologies. ... To reduce the current risks associated with AI and safely unleash its full potential to improve payment processing, it is imperative for organizations to take a multi-layered approach that includes technical safeguards, organizational policies, and regulatory compliance. 
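The real-time fraud detection mentioned above boils down to scoring each transaction against a customer's historical pattern. Production systems like PayPal's use far more sophisticated ML models; the toy sketch below (with invented amounts) only illustrates the anomaly-scoring concept using a simple z-score:

```python
from statistics import mean, stdev

def flag_anomalies(history, candidates, threshold=3.0):
    """Flag candidate amounts more than `threshold` standard
    deviations away from the customer's historical mean."""
    mu, sigma = mean(history), stdev(history)
    return [amt for amt in candidates if abs(amt - mu) / sigma > threshold]

# A customer's recent transaction amounts (hypothetical).
history = [12.5, 9.9, 14.2, 11.0, 13.3, 10.7, 12.1, 9.5]

# An incoming batch: one normal purchase, one wildly out of pattern.
suspicious = flag_anomalies(history, [11.8, 250.0])
print(suspicious)  # [250.0]
```

A flagged transaction would then be routed to step-up authentication or manual review rather than blocked outright, balancing fraud prevention against customer friction.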


Do you need an AI ethicist?

The goal of advising on ethics is not to create a service desk model, where colleagues or clients always have to come back to the ethicist for additional guidance. Ethicists generally aim for their stakeholders to achieve some level of independence. “We really want to make our partners self-sufficient. We want to teach them to do this work on their own,” Sample said. Ethicists can promote ethics as a core company value, no different from teamwork, agility, or innovation. Key to this transformation is an understanding of the organization’s goal in implementing AI. “If we believe that artificial intelligence is going to transform business models…then it becomes incumbent on an organization to make sure that the senior executives and the board never become disconnected from what AI is doing for or to their organization, workforce, or customers,” Menachemson said. This alignment may be especially necessary in an environment where companies are diving head-first into AI without any clear strategic direction, simply because the technology is in vogue. A dedicated ethicist or team could address one of the most foundational issues surrounding AI, notes Gartner’s Willemsen. One of the most frequently asked questions at a board level, regardless of the project at hand, is whether the company can use AI for it, he said. 


Why We Need Inclusive Data Governance in the Age of AI

Inclusive data governance processes involve multiple stakeholders, giving equal space in this decision making to diverse groups from civil society, as well as space for direct representation of affected communities as active stakeholders. This links to, but is an idea broader than, the concept of multi-stakeholder governance for technology, which first came to prominence at the international level, in institutions such as the Internet Corporation for Assigned Names and Numbers and the Internet Governance Forum. ... Involving the public and civil society in decisions about data is not cost-free. Taking the steps that are needed to surmount the practical challenges, and skepticism about the utility of public involvement in a technical and technocratic field, frequently requires arguments that go beyond it being the right thing to do. ... The risks for people, communities and society, but also for organizations operating within the data and AI marketplace and supply chain, can be reduced through greater inclusion earlier in the design process. But organizational self-interest will not motivate the scope or depth that is required. Reducing the reality and perception of “participation-washing” means requirements for consultation in the design of data and AI systems need to be robust and enforceable. 


Strategies to navigate the pitfalls of cloud costs

If cloud customers spend too much money, it’s usually because they created cost-ineffective deployments. It’s common knowledge that many enterprises “lifted and shifted” their way to the clouds with little thought about how inefficient those systems would be in the new infrastructure. ... Purposely or not, public cloud providers created intricate pricing structures that are nearly incomprehensible to anyone who does not spend each day creating cloud pricing structures to cover every possible use. As a result, enterprises often face unexpected expenses. Many of my clients frequently complain that they have no idea how to manage their cloud bills because they don’t know what they’re paying for. ... Cloud providers often encourage enterprises to overprovision resources “just in case.” Enterprises still pay for that unused capacity, so the misalignment dramatically elevates costs without adding business value. When I ask my clients why they provision so much more storage or computing resources beyond what their workload requires, the most common answer is, “My cloud provider told me to.” ... One of the best features of public cloud computing is autoscaling so you’ll never run out of resources or suffer from bad performance due to insufficient resource provisioning. However, autoscaling often leads to colossal cloud bills because it is frequently triggered without good governance or purpose.
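The governance gap in autoscaling is usually the absence of hard bounds: scale on load, but cap the fleet size so a traffic spike or a runaway feedback loop cannot produce a colossal bill. A minimal sketch of that policy (all numbers are hypothetical; real platforms express the same idea through min/max capacity settings):

```python
def next_capacity(current, cpu_pct, *, target=60, min_n=2, max_n=10):
    """Target-tracking scaling with governed bounds: size the fleet so
    average CPU approaches `target`, clamped to [min_n, max_n]."""
    desired = max(1, round(current * cpu_pct / target))
    return max(min_n, min(max_n, desired))  # the cost guardrail

print(next_capacity(4, cpu_pct=90))   # 6  -> scale out under load
print(next_capacity(4, cpu_pct=900))  # 10 -> capped, not 60 instances
print(next_capacity(4, cpu_pct=10))   # 2  -> scale in, but keep a floor
```

Pairing a ceiling like `max_n` with a billing alarm turns autoscaling from an open-ended cost risk into a bounded one.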


Your IT Team Isn't Ready For Change Management If They Can't Answer These 3 Questions

Testing software before failures occur is key, but you should never be exposed to failure rates with this level of real-world impact. Whether the fault lies with third-party systems or with the companies themselves, it is their brand that ends up in tatters because of the end-customer experience. Enter Change Management and, if done right, the possibility of preventing these kinds of enormous IT failures. ... The ever-evolving nature of technology, including cloud scaling, infrastructure as code, and frequent updates such as ‘Patch Tuesday’, means that organisations must constantly adapt to change. However, this constant change introduces challenges such as “drift”, a term that refers to unplanned deviations from standard configurations or expected states within an IT environment. Think of it like a pesky monkey in the machine. Drift can occur subtly and often goes unnoticed until it causes significant disruptions. It also increases uncertainty and doubt in the organisation, making Change Management and Release Management harder and creating difficulties in planning and executing changes safely. ... To be effective, Change Management needs to be able to detect and understand drift in the environment to have a full understanding of Current State, Risk Assessment and Expected Outcomes.
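Drift detection, at its simplest, is a comparison between the declared (desired) state of an environment and what is actually running. A minimal sketch (the configuration keys and values here are hypothetical; real tooling diffs infrastructure-as-code state against live infrastructure):

```python
def detect_drift(desired: dict, actual: dict) -> dict:
    """Report every key whose live value deviates from the declared one."""
    drift = {}
    for key in desired.keys() | actual.keys():
        if desired.get(key) != actual.get(key):
            drift[key] = {"desired": desired.get(key), "actual": actual.get(key)}
    return drift

# Declared baseline vs. what a scan of the live environment found.
desired = {"tls_min_version": "1.2", "patch_level": "2024-11", "debug": False}
actual  = {"tls_min_version": "1.0", "patch_level": "2024-11", "debug": True}

print(detect_drift(desired, actual))
# two unplanned deviations: tls_min_version and debug
```

Running a diff like this before approving a change gives Change Management the Current State picture it needs; a non-empty report means the risk assessment is being made against assumptions that no longer hold.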


RIP Open Core — Long Live Open Source

Open-core was originally popular because it allowed companies to build a community around a free product version while charging for a fuller, enterprise-grade version. This setup thrived in the 2010s, helping companies like MongoDB and Redis gain traction. But times have changed, and today, instead of enhancing a company’s standing, open-core models often create more problems than they solve. ... While open-core and source-available models had their moment, companies are beginning to realize the importance of true open source values and are finding their way back. This return to open source is a sign of growth, with businesses realigning with the collaborative spirit at the core (wink) of the OSS community. More companies are adopting models that genuinely prioritize community engagement and transparency rather than using them as marketing or growth tactics. ... As the open-core model fades, we’re seeing a more sustainable approach take shape: the Open-Foundation model. This model allows the open-source offering to be the backbone of a commercial offering without compromising the integrity of the OSS project. Rather, it reinforces it as a valuable, standalone product that supports the commercial offering instead of competing against it.


Why SaaS Backup Matters: Protecting Data Beyond Vendor Guarantees

Most IT departments have long recognized the importance of backup and recovery for applications and data that they host themselves. When no one else is managing your workloads and backing them up, having a recovery plan to restore them if necessary is essential for minimizing the risk that a failure could disrupt business operations. But when it comes to SaaS, IT operations teams sometimes think in different terms. That's because SaaS applications are hosted and managed by external vendors, not the IT departments of the businesses that use SaaS apps. In many cases, SaaS vendors provide uptime or availability guarantees. They don't typically offer details about exactly how they back up applications and data or how they'll recover data in the event of a failure, but a backup guarantee is typically implicit in SaaS products. ... Historically, SaaS apps haven't featured prominently, if at all, in backup and recovery strategies. But the growing reliance on SaaS apps — combined with the many risks that can befall SaaS application data even if the SaaS vendor provides its own backup or availability guarantees — makes it critical to integrate SaaS apps into backup and recovery plans. The risks of not backing up SaaS have simply become too great.


Biometrics in the Cyber World

Biometrics strengthens security in several ways, including stronger authentication, greater user convenience, and a reduced risk of identity theft. Because a biometric trait is unique to the user, it adds a layer of security to authentication. Traditional authentication methods such as passwords often rely on weak combinations that can easily be breached; biometrics avoids that weakness. People constantly forget their passwords and end up resetting them, but since biometrics is tied directly to one’s identity, there is no password to forget. Hackers commonly try to break into accounts by guessing passwords, and biometrics closes off that route: what authentication seeks is the uniqueness of one’s identity, not a guessable secret. This is one of the advantages of biometric systems that should reduce the risk of identity theft. Challenges for biometrics include privacy concerns, false positives and negatives, and bias. Because biometric data is personal information, mishandling it can lead to privacy violations, and storage of such sensitive data must comply with privacy regulations such as GDPR and CCPA.


Data Architectures in the AI Era: Key Strategies and Insights

Results in data architecture initiatives can be achieved much more quickly if you start with the minimum needed for your data storage and build from there. Begin by considering all use cases and finding the one component needed so that a data product can be delivered. Expansion can happen over time with use and feedback, which will actually create a more tailored and desirable product. ... Educate your key personnel on the importance of being able and ready to make the shift from familiar legacy data systems to modern architectures like data lakehouses or hybrid cloud platforms. Migration to a unified, hybrid, or cloud-based data management system may seem challenging initially, but it is essential for enabling comprehensive data lifecycle management and AI-readiness. By investing in continuous education and training, organizations can enhance data literacy, simplify processes, and improve long-term data governance, positioning themselves for scalable and secure analytics practices. ... By preparing for the typical challenges of AI, problems can be predicted and anticipated, which helps reduce downtime and frustration in the modernization of data architecture.



Quote for the day:

“The final test of a leader is that he leaves behind him in other men the conviction and the will to carry on.” – Walter Lippmann

Daily Tech Digest - November 14, 2024

Where IT Consultancies Expect to Focus in 2025

“Much of what’s driving conversations around AI today is not just the technology itself, but the need for businesses to rethink how they use data to unlock new opportunities,” says Chaplin. “AI is part of this equation, but data remains the foundation that everything else builds upon.” West Monroe also sees a shift toward platform-enabled environments where software, data, and platforms converge. “Rather than creating everything from scratch, companies are focusing on selecting, configuring, and integrating the right platforms to drive value. The key challenge now is helping clients leverage the platforms they already have and making sure they can get the most out of them,” says Chaplin. “As a result, IT teams need to develop cross-functional skills that blend software development, platform integration and data management. This convergence of skills is where we see impact -- helping clients navigate the complexities of platform integration and optimization in a fast-evolving landscape.” ... “This isn’t just about implementing new technologies, it’s about preparing the workforce and the organization to operate in a world where AI plays a significant role. ...”


How Is AI Shaping the Future of the Data Pipeline?

AI’s role in the data pipeline begins with automation, especially in handling and processing raw data – a traditionally labor-intensive task. AI can automate workflows and allow data pipelines to adapt to new data formats with minimal human intervention. With this in mind, Harrisburg University is actively exploring AI-driven tools for data integration that leverage LLMs and machine learning models to enhance and optimize ETL processes, including web scraping, data cleaning, augmentation, code generation, mapping, and error handling. These adaptive pipelines, which automatically adjust to new data structures, allow companies to manage large and evolving datasets without the need for extensive manual coding. ... Beyond immediate operational improvements, AI is shaping the future of scalable and sustainable data pipelines. As industries collect data at an accelerating rate, traditional pipelines often struggle to keep pace. AI’s ability to scale data handling across various formats and volumes makes it ideal for supporting industries with massive data needs, such as retail, logistics, and telecommunications. In logistics, for example, AI-driven pipelines streamline inventory management and optimize route planning based on real-time traffic data. 
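One building block of the adaptive pipelines described above is schema inference: examining incoming records so the pipeline can absorb new fields and messy types without manual re-coding. The AI-driven tools the article mentions go much further, but this small, hypothetical sketch shows the underlying idea:

```python
def infer_schema(records: list[dict]) -> dict:
    """Map each field seen in the batch to the set of types it arrived as."""
    schema = {}
    for record in records:
        for field, value in record.items():
            schema.setdefault(field, set()).add(type(value).__name__)
    # A field observed with several types is a cleaning/normalization target.
    return {f: sorted(ts) for f, ts in schema.items()}

# Hypothetical incoming batch: a new field appears, and 'price' arrives
# inconsistently typed -- exactly what an adaptive pipeline must tolerate.
batch = [
    {"id": 1, "price": 9.99},
    {"id": 2, "price": "12.50", "currency": "EUR"},
]
print(infer_schema(batch))
# {'currency': ['str'], 'id': ['int'], 'price': ['float', 'str']}
```

A pipeline that re-runs this inference per batch can route newly appearing fields to a staging area and flag mixed-type columns for automated cleaning, rather than failing on the first unexpected record.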


Innovating with Data Mesh and Data Governance

Companies choose a data mesh to overcome the limitations of “centralized and monolithic” data platforms, as noted by Zhamak Dehghani, the director of emerging technologies at Thoughtworks. Technologies like data lakes and warehouses try to consolidate all data in one place, but enterprises can find that the data gets stuck there. A company might have only one centralized data repository – typically a team such as IT – that serves the data up to everyone else in the company. This slows down data access because of bottlenecks. For example, having already taken days to get HR privacy approval, the finance department’s data access requests might then sit in the inbox of one or two people in IT for additional days. Instead, a data mesh puts data control in the hands of each domain that serves that data. Subject matter experts (SMEs) in the domain control how this data is organized, managed, and delivered. ... Data mesh with federated Data Governance balances expertise, flexibility, and speed with data product interoperability among different domains. With a data mesh, the people with the most knowledge about their subject matter take charge of their data. In the future, organizations will continue to face challenges in providing good, federated Data Governance to access data through a data mesh.


The Agile Manifesto was ahead of its time

A fundamental idea of the agile methodology is to alleviate this and allow for flexibility and changing requirements. The software development process should ebb and flow as features are developed and requirements change. The software should adapt quickly to these changes. That is the heart and soul of the whole Agile Manifesto. However, when the Agile Manifesto was conceived, the state of software development and software delivery technology was not flexible enough to fulfill what the manifesto was espousing. But this has changed with the advent of the SaaS (software as a service) model. It’s all well and good to want to maximize flexibility, but for many years, software had to be delivered all at once. Multiple features had to be coordinated to be ready for a single release date. Time had to be allocated for bug fixing. The limits of the technology forced software development teams to be disciplined, rigid, and inflexible. Delivery dates had to be met, after all. And once the software was delivered, changing it meant delivering all over again. Updates were often a cumbersome and arduous process. A Windows program of any complexity could be difficult to install and configure. Delivering or upgrading software at a site with 200 computers running Windows could be a major challenge.


Improving the Developer Experience by Deploying CI/CD in Databases

Characteristically less mature than CI/CD for application code, CI/CD for databases enables developers to manage schema updates such as changes to table structures and relationships. This management ability means developers can execute software updates to applications quickly and continuously without disrupting database users. It also helps improve quality and governance, creating a pipeline everyone follows. The CI stage typically involves developers working on code simultaneously, helping to fix bugs and address integration issues in the initial testing process. With the help of automation, businesses can move faster, with fewer dependencies and errors and greater accuracy — especially when backed up by automated testing and validation of database changes. Human intervention is not needed, resulting in fewer hours spent on change management. ... Deploying CI/CD for databases empowers developers to focus on what they do best: Building better applications. Businesses today should decide when, not if, they plan to implement these practices. For development leaders looking to start deploying CI/CD in databases, standardization — such as how certain things are named and organized — is a solid first step and can set the stage for automation in the future. 
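A core building block of database CI/CD is versioned, automated schema migrations that a pipeline can apply and validate without human intervention. The following is a minimal sketch of that idea using SQLite from the standard library; the table names and migration steps are invented for illustration.

```python
import sqlite3

# Ordered, versioned schema migrations -- the unit of change in database CI/CD.
MIGRATIONS = [
    (1, "CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)"),
    (2, "ALTER TABLE users ADD COLUMN created_at TEXT"),
]

def migrate(conn: sqlite3.Connection) -> int:
    """Apply any migrations newer than the database's recorded version."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    current = row[0] or 0
    for version, ddl in MIGRATIONS:
        if version > current:
            conn.execute(ddl)                  # the schema change itself
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
            current = version
    conn.commit()
    return current

# In a CI pipeline this would run against a disposable copy of the database
# before ever touching production.
conn = sqlite3.connect(":memory:")
assert migrate(conn) == 2
assert migrate(conn) == 2   # idempotent: re-running applies nothing new
```

Because every environment replays the same ordered list, the pipeline everyone follows is the migration history itself, which is what gives the quality and governance benefits the article describes.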


To Dare or not to Dare: the MVA Dilemma

Business stakeholders must understand the benefits of technology experiments in terms they are familiar with, regarding how the technology will better satisfy customer needs. Operations stakeholders need to be satisfied that the technology is stable and supportable, or at least that stability and supportability are part of the criteria that will be used to evaluate the technology. Avoiding technology experiments altogether is usually a mistake, because doing so forfeits opportunities to solve business problems in better ways, leaving solutions less effective than they could be. Over time, this can increase technical debt. ... These trade-offs are constrained by two simple truths: the development team doesn’t have much time to acquire and master new technologies, and they cannot put the business goals of the release at risk by adopting unproven or unsustainable technology. This often leads the team to stick with tried-and-true technologies, but this strategy also has risks, most notably those of the hammer-nail kind, in which old technologies unsuited to novel problems are used anyway – as when relational databases are used to store graph-like data structures.


2025 API Trend Reports: Avoid the Antipattern

Modern APIs aren’t all durable, full-featured products, and don’t need to be. If you’re taking multiple cross-functional agile sprints to design an API you’ll use for less than a year, you’re wasting resources building a system that will probably be overspecified and bloated. The alternative is to use tools and processes centered around an API developer’s unit of work, which is a single endpoint. No matter the scope or lifespan of an API, it will consist of endpoints, and each of those has to be written by a developer, one at a time. It’s another way that turning back to the fundamentals can help you adapt to new trends. ... Technology will keep evolving, and the way we employ AI might look quite different in a few years. Serverless architecture is the hot trend now, but something else will eventually overtake it. No doubt, cybercriminals will keep surprising us with new attacks. Trends evolve, but underlying fundamentals — like efficiency, the need for collaboration, the value of consistency and the need to adapt — will always be what drives business decisions. For the API industry, the key to keeping up with trends without sacrificing fundamentals is to take a developer-centric approach. Developers will always create the core value of your APIs. 
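Centering tooling on the endpoint as the unit of work can be illustrated with a framework-agnostic sketch: a route table where each endpoint is a single, independently written handler. The paths, decorator, and dispatch function below are invented for illustration.

```python
import json
from typing import Callable, Dict

# Each endpoint is one small, independently written unit of work.
ROUTES: Dict[str, Callable[[dict], dict]] = {}

def endpoint(path: str):
    """Register a handler for a single endpoint."""
    def register(handler):
        ROUTES[path] = handler
        return handler
    return register

@endpoint("/orders/status")
def order_status(params: dict) -> dict:
    # A short-lived endpoint can stay this simple -- no product-scale ceremony.
    return {"order_id": params["order_id"], "status": "shipped"}

def dispatch(path: str, params: dict) -> str:
    return json.dumps(ROUTES[path](params))

print(dispatch("/orders/status", {"order_id": "A17"}))
```

Whether the API lives for a month or a decade, each endpoint is still written one at a time in this shape, which is the developer-centric fundamental the article points back to.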


The targeted approach to cloud & data - CIOs' need for ROI gains

AI and DaaS are part of the pool of technologies that Pacetti also draws on, and the company also uses AI provided by Microsoft, both with ChatGPT and Copilot. Plus, AI has been integrated into the e-commerce site to support product research and recommendations. But there’s an even more essential area for Pacetti. “With the end of third-party cookies, AI is now essential to exploit the little data we can capture from internet users who accept tracking,” he says. “We use Google’s GA4 to compensate for missing analytics data, for example, by exploiting data from technical cookies.” ... CIOs discuss sales targets with CEOs and the board, cementing the IT and business bond. But another even more innovative aspect is to not only make IT a driver of revenues, but also to measure IT with business indicators. This is a form of advanced convergence achieved by following specific methodologies. Sondrio People’s Bank (BPS), for example, adopted business relationship management, which translates requests from operational functions to IT and, vice versa, brings IT into operational functions. BPS also adopts proactive thinking, a risk-based framework for strategic alignment and compliance with business objectives.


Hidden Threats Lurk in Outdated Java

How important are security updates? After all, Java is now nearly 30 years old; haven’t we eliminated all the vulnerabilities by now? Sadly not, and realistically, that will never happen. OpenJDK contains 7.5 million lines of code and relies on many external libraries, all of which can be subject to undiscovered vulnerabilities. ... Since Oracle changed its distributions and licensing, there have been 22 updates. Of these, six PSUs required a modification and new release to address a regression that had been introduced. The time to create the new update has varied from just under two weeks to over five weeks. At no time have any of the CPUs been affected like this. Access to a CPU is essential to maintain the maximum level of security for your applications. Since all free binary distributions of OpenJDK provide only the PSU version, some users may consider a couple of weeks before being able to deploy as an acceptable risk. ... When an update to the JDK is released, all vulnerabilities addressed are disclosed in the release notes. Bad actors then have information enabling them to try to find ways to exploit unpatched applications.


How to defend Microsoft networks from adversary-in-the-middle attacks

Depending on the impact of the attack, start the cleanup process. Begin by forcing a password change on the user account, ensuring that you have revoked all tokens to block the attacker’s fake credentials. If the consequences of the attack were severe, consider disabling the user’s primary account and setting up a new temporary account while you investigate the extent of the intrusion. You may even consider quarantining the user’s devices and, if you are unsure of the original source of the intrusion, taking forensic-level backups of workstations so you can best investigate. Next, review all app registrations, changes to service principals, enterprise apps, and anything else the user may have changed or impacted since the time the intrusion was noted. You’ll want to do a deep investigation into the mailbox’s access and permissions. Mandiant has a PowerShell-based script that can assist you in investigating the impact of the intrusion. “This repository contains a PowerShell module for detecting artifacts that may be indicators of UNC2452 and other threat actor activity,” Mandiant notes. “Some indicators are ‘high-fidelity’ indicators of compromise, while other artifacts are so-called ‘dual-use’ artifacts.”
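The token-revocation step maps to Microsoft Graph's `revokeSignInSessions` action on the user object. As a hedged sketch, the helper below only builds the request it would send; the user ID and bearer token are placeholders, and in practice you would authenticate (for example via MSAL) and POST the request with proper error handling.

```python
# Sketch of the token-revocation step via Microsoft Graph. The user ID and
# token are placeholders; this builds the call rather than sending it.
GRAPH = "https://graph.microsoft.com/v1.0"

def revoke_sessions_request(user_id: str, token: str) -> dict:
    """Describe the Graph call that invalidates all of a user's refresh
    tokens, cutting off an attacker holding stolen session material."""
    return {
        "method": "POST",
        "url": f"{GRAPH}/users/{user_id}/revokeSignInSessions",
        "headers": {"Authorization": f"Bearer {token}"},
    }

req = revoke_sessions_request("compromised.user@example.com", "<token>")
assert req["url"].endswith("/revokeSignInSessions")
```

Pairing this with the forced password change matters: revoking refresh tokens invalidates the attacker's stolen sessions, while the new password prevents them from simply signing in again.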



Quote for the day:

"To think creatively, we must be able to look afresh at what we normally take for granted." -- George Kneller

Daily Tech Digest - November 13, 2024

In response to current digital transformation demands, organizations are integrating emerging technologies at an unprecedented rate. Despite their numerous benefits, securing these technologies is challenging for technology leaders. The white paper identified more than 200 critical and emerging technologies reshaping the digital ecosystem. Beyond AI and IoT, technologies such as blockchain, biotechnology and quantum computing are rising on the hype cycle, introducing new cybersecurity risks. ... Quantum computing, while promising breakthrough computational power, presents grave cybersecurity risks. It threatens to break current encryption standards, and quantum computers can potentially decrypt data collected now for future access. "The threat of quantum computing underscores the need for quantum-resistant cryptographic solutions to secure our digital future," the white paper stated. ... The cybersecurity industry faces a critical shortage of skilled professionals capable of managing emerging technology security. Cybersecurity Ventures projected a shortfall of 3.5 million cybersecurity professionals by 2025. Gartner predicted this skills gap would cause more than 50% of significant incidents by 2025. 


Do You Need a Solution or Enterprise Architect?

A Solution Architect is more like a surgeon who operates on someone to fix a problem, and the patient returns to normal life in a short time. An Enterprise Architect is more like an internal medicine specialist who treats a patient with a chronic illness over a number of years to improve the person’s quality of life. ... Architects are most successful when they help projects to succeed. Commonality of process and technology can be beneficial for an organization. But once architects are merely policing projects and rejecting aspects based on strict criteria, they lose the ability to positively influence the initiatives. Solution alignment is best achieved through working collaboratively with projects early to convince them of the advantages of various design choices. The first deliverable many architecture teams produce is what I call the “red/yellow/green list”. You’ve all seen these. Each technology classification is listed down the page – for example: server type, operating system, network software, database technology, and programming language. Three “colour” columns follow across the page. “Red” items are forbidden to be used by new projects. Although some legacy applications may still use them, they need to be phased out. “Yellow” items can be used under certain circumstances, but must be pre-approved by some kind of review committee. 
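The “red/yellow/green list” deliverable is easy to picture as data: technology classifications down the page, a verdict per technology. The classifications and verdicts below are invented examples, sketching the shape of the artifact rather than any real standard.

```python
# Illustrative "red/yellow/green list": classifications and verdicts are
# invented examples of the deliverable the article describes.
TECH_STANDARDS = {
    "database": {"PostgreSQL": "green", "Oracle 11g": "red", "MongoDB": "yellow"},
    "language": {"Java": "green", "COBOL": "red", "Rust": "yellow"},
}

def review(classification: str, technology: str) -> str:
    # Anything not yet classified defaults to committee review.
    verdict = TECH_STANDARDS.get(classification, {}).get(technology, "yellow")
    return {
        "green": "approved for new projects",
        "yellow": "requires review-committee approval",
        "red": "forbidden for new projects; phase out legacy use",
    }[verdict]

assert review("database", "Oracle 11g").startswith("forbidden")
```

As the article argues, a list like this only helps when it is the start of a collaborative conversation, not a policing checklist applied after designs are already fixed.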


DataRobot launches Enterprise AI Suite to bridge gap between AI development and business value

The agentic AI approach is designed to help organizations handle complex business queries and workflows. The system employs specialist agents that work together to solve multi-faceted business problems. This approach is particularly valuable for organizations dealing with complex data environments and multiple business systems. “You ask a question to your agentic workflow, it breaks up the questions into a set of more specific questions, and then it routes them to agents which are specialists in various different areas,” Saha explained. For instance, a business analyst’s question about revenue might be routed to multiple specialized agents – one handling SQL queries, another using Python – before combining results into a comprehensive response. ... “We have put together a lot of instrumentation which lets people visually understand, for example, if you have a lot of clustering of data in the vector database, you can get a spurious answer,” Saha said. “You would be able to see that, if you see your questions are landing in areas where you don’t have enough information.” This observability extends to the platform’s governance capabilities, with real-time monitoring and intervention features. 
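The routing pattern Saha describes can be sketched as a coordinator that decomposes a broad question and hands each piece to a specialist agent. Everything here (agent names, the canned decomposition) is invented for illustration; a real system would use an LLM to decompose, route, and combine.

```python
# Illustrative agentic routing: a coordinator splits a question and routes
# the pieces to specialist agents, then collects their answers.

def sql_agent(question: str) -> str:
    return "SELECT SUM(amount) FROM revenue;  -- answers: " + question

def python_agent(question: str) -> str:
    return "df['amount'].sum()  # answers: " + question

SPECIALISTS = {"database": sql_agent, "analysis": python_agent}

def route(question: str) -> list:
    # Stand-in decomposition; in practice an LLM would generate these
    # sub-questions and pick the specialists.
    subquestions = [
        ("database", f"total revenue for: {question}"),
        ("analysis", f"trend over time for: {question}"),
    ]
    return [SPECIALISTS[kind](q) for kind, q in subquestions]

answers = route("How did Q3 revenue compare to Q2?")
assert len(answers) == 2 and answers[0].startswith("SELECT")
```

The final step the article mentions, combining the specialists' results into one comprehensive response, would sit on top of `route`, typically as another LLM call that synthesizes the collected answers.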


Using AI for DevOps: What Developers and Ops Need To Know

“AI can be incredibly powerful in DevOps when it’s implemented with a clear framework that makes it easy for developers to do the right thing and hard for them to do the wrong thing,” says Durkin. “Making it easy to do the right thing starts with standardizing templates and policies to streamline workflows. Create templates and enforce policies that support easy, repeatable integration of AI tools. By establishing policies that automate security and compliance checks, AI tools can operate within these boundaries, providing valuable support without compromising standards. This approach simplifies adoption and makes it harder to skip essential steps, reinforcing best practices across teams.” ... While having a well-considered strategy in place before embracing AI and DevOps is a must, Durkin and Govrin both offered up some additional tips and advice for getting AI tools and technologies to integrate with DevOps ambitions more easily. “In enterprise environments, deploying AI applications locally can significantly improve adoption and integration,” said Govrin. “Unlike consumer apps, enterprise AI benefits greatly from self-hosted setups, where solutions like local inference, support for self-hosted models and edge inferencing play a key role. These methods keep data secure and mitigate risks associated with data transfer across public clouds.”


The CISO paradox: With great responsibility comes little or no power

The absence of command makes cybersecurity decision-making a tedious and often frustrating process for CISOs. They are expected to move fast, to anticipate and address security issues before they become realized. But without command, they’re stuck in a cycle of “selling” the importance of security investments, waiting for approvals, and relying on others to prioritize those investments. This constant need for buy-in slows down response times and creates opportunities for something bad to happen. In cybersecurity, where timing is everything, these delays can be costly. Beyond timing, the concept of command is critical for strategic alignment and empowerment. In organizations where the CISO lacks true command, they’re forced to operate reactively rather than proactively. ... If organizations want to truly protect themselves, they need to recognize that CISOs require true command. The most effective CISOs are those who can operate with full authority over their domain, free from constant internal roadblocks. As companies consider how best to secure their data, they should ask themselves whether they are genuinely setting their CISOs up for success. Are they empowering them with the resources, authority, and autonomy to act? Or are they merely assigning a high-stakes responsibility without the power to fulfill it?


Harnessing SaaS to elevate your digital transformation journey

While SaaS provides the infrastructure, AI is the catalyst that powers digital transformation at scale. Companies are increasingly adopting AI-driven SaaS platforms to streamline workflows, automate tasks, and make data-driven decisions. In the B2B SaaS sector, this combination is revolutionising how businesses operate, helping them personalize customer interactions, predict outcomes, and optimize operations. ... In manufacturing, AI optimizes supply chain management, reducing waste and increasing productivity. In the finance sector, AI-driven SaaS automates risk assessment, improving decision-making and reducing operational costs. The benefits of adopting AI and SaaS are clear: enhanced customer experience, streamlined operations, and the ability to innovate faster than ever before. Companies that fail to integrate these technologies risk falling behind as competitors capitalize on these advancements to deliver superior products and services. As businesses continue to adopt SaaS and AI-driven solutions, the future of digital transformation looks promising. Companies are no longer just thinking about automating processes or improving efficiency, they are investing in technologies that will help them shape the future of their industries. 


Tackling ransomware without banning ransom payments

Despite these somewhat muddied waters, the correct response to ransomware attacks is clear: paying demands should almost always be a last resort. The only exception should be where there is a risk to life. Paying because it’s easy, costs less, and causes less disruption to the business is not a good enough reason, regardless of whether it’s the business or an insurer handing over the cash. However, while a step in the right direction, totally banning ransom payments addresses only one form of attack and feels a bit like a ‘whack-a-mole’ strategy. It may ease the rise in attacks for a short while, but attackers will inevitably switch tactics – to compromising business email, perhaps, or something we’ve not even heard of yet. So, what else can be done to slow the rise in ransomware attacks? Well, we can consider a few options, such as shutting down vulnerability-trading brokers and regulating cryptocurrency transactions. To take the latter as an example, most cybercrime monetizes through cryptocurrency, so rather than simply banning payments, it could be a better option to regulate the crypto industry and the flow of money. Alongside this kind of regulatory change, governments could also consider moving the decision of whether to pay to an independent body.


CISOs in 2025: Balancing security, compliance, and accountability

The scope of the CISO role has expanded significantly over the past 10-15 years, and has moved from mainly technical oversight to strategic leadership, risk management, and regulatory compliance. The constant pressure to prevent breaches and manage incidents can lead to high stress and burnout, making the role less appealing. This also means that modern CISOs must possess a blend of technical expertise, strategic thinking, and strong interpersonal skills. The requirement for such a diverse skill set can limit the pool of qualified candidates, as not all cybersecurity professionals have the necessary combination of skills. ... CISOs will need to be able to effectively communicate complex cybersecurity issues to non-technical board members and executives. This involves translating technical jargon into business language, and clearly articulating the impact of cybersecurity risks on the organization’s overall business strategy. And as cybersecurity becomes integral to business strategy, CISOs must be able to think beyond immediate threats, and focus on long-term strategic planning. This includes understanding how cybersecurity initiatives align with business goals and contribute to competitive advantage.


Emergence of Preemptive Cyber Defense: The Key to Defusing Sophisticated Attacks

The frequency of attacks is only part of the problem. Perhaps the biggest concern is the sophistication of incidents. Right now, cybercriminals are using everything from AI and machine learning to polymorphic malware coupled with sophisticated psychological tactics that play off of breaking world events and geopolitical tension. ... The clear limitations of these reactive systems have many businesses looking to shift away from the “one-size-fits-all” approach to more dynamic options. ... With redundancy, security, and resiliency in mind, many companies are following the lead of government agencies and diversifying their cybersecurity investments across multiple providers. This includes the option of a preemptive cyber defense solution, which, rather than relying on a single offering, blends in three — a triad that addresses the complexities of modern cybersecurity challenges. ... The preemptive cyber defense triad offers businesses the ultimate protection—a security ecosystem where the attack surface is constantly changing (AMTD), the security controls are always optimized (ASCA), and the overall threat exposure is continuously managed and minimized (CTEM).


Insurance Firm Introduces Liability Coverage for CISOs

“CISOs are the front line of defense against cyber threats, yet their role may leave them exposed to personal liabilities – particularly in light of the Securities and Exchange Commission’s (SEC) new cyber disclosure rules,” Nick Economidis, senior vice president of eRisk at Crum and Forster, said in a statement. “Our CISO Professional Liability Insurance is designed to bridge that gap, providing an essential safety net by offering CISOs the protection they need to perform their jobs with confidence.” ... The new insurance program by the Morristown, New Jersey-based insurer comes in the wake of a federal court judge dismissing charges against software maker SolarWinds and its CISO, Tim Brown. The charges were made in connection with the massive software supply chain attack in 2020 by a threat group supported by Russia’s foreign intelligence services. ... “As personal liability risks for CISOs continue to evolve, the availability and scope of D&O insurance will remain a critical factor in recruiting and retaining top cybersecurity talent,” Fehling wrote. “Companies that offer robust insurance protection may gain a competitive advantage in the tight market for skilled security leaders.”



Quote for the day:

"If you want to achieve excellence, you can get there today. As of this second, quit doing less-than-excellent work." -- Thomas J. Watson

Daily Tech Digest - November 12, 2024

Researchers Focus In On ‘Lightcone Bound’ To Develop An Efficiency Benchmark For Quantum Computers

The researchers formulated this bound by first reinterpreting the quantum circuit mapping challenge through quantum information theory. They focused on the SWAP “uncomplexity,” the lowest number of SWAP operations needed, which they determined using graph theory and information geometry. By representing qubit interactions as density matrices, they applied concepts from network science to simplify circuit interactions. To establish the bound, in an interesting twist, the team employed a Penrose diagram — a tool from theoretical physics typically used to depict spacetime geometries — to visualize the paths required for minimal SWAP-gate application. They then compared their model against a brute-force method and IBM’s Qiskit compiler, with consistent results affirming that their bound offers a practical minimum SWAP requirement for near-term quantum circuits. The researchers acknowledge the lightcone model has some limitations that could be the focus of future work. For example, it assumes ideal conditions, such as a noiseless processor and indefinite parallelization, conditions not yet achievable with current quantum technology. The model also does not account for single-qubit gate interactions, focusing only on two-qubit operations, which limits its direct applicability for certain quantum circuits.
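To make the idea of a SWAP lower bound concrete, here is a deliberately naive sketch (not the paper's lightcone bound): on a hardware coupling graph, each two-qubit gate needs its operands adjacent, so bringing them together costs at least distance minus one SWAPs. Summing that per gate, ignoring any reuse of movement between gates, gives a crude floor on SWAP count. The coupling map and gate list are invented examples.

```python
from collections import deque

def distances(coupling: dict, start: str) -> dict:
    """BFS shortest-path distances on the hardware coupling graph."""
    dist, queue = {start: 0}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in coupling[node]:
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

def naive_swap_lower_bound(coupling: dict, gates: list) -> int:
    """Each two-qubit gate needs its qubits adjacent; moving them together
    costs at least (distance - 1) SWAPs, ignoring any movement reuse."""
    return sum(distances(coupling, a)[b] - 1 for a, b in gates)

# A 4-qubit line topology: q0 - q1 - q2 - q3
line = {"q0": ["q1"], "q1": ["q0", "q2"], "q2": ["q1", "q3"], "q3": ["q2"]}
assert naive_swap_lower_bound(line, [("q0", "q3"), ("q1", "q2")]) == 2
```

The paper's contribution is a much tighter, information-theoretic version of this kind of floor; the sketch only shows why graph distance on the coupling map is the natural starting point.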


Evaluating your organization’s application risk management journey

One way CISOs can articulate application risk in financial terms is by linking security improvement efforts to measurable outcomes, like cost savings and reduced risk exposure. This means quantifying the potential financial fallout from security incidents and showing how preventative measures mitigate these costs. CISOs need to equip their teams with tools that will help them protect their business in the short and long term. A study we commissioned with Forrester found that putting application security measures in place could save the average organization millions in avoided breach costs. ... To keep application risk management a dynamic, continuous process, CISOs must integrate security into every stage of software development. Instead of relying on periodic assessments, organizations should implement real-time risk analysis, continuous monitoring, and feedback mechanisms that enable teams to address vulnerabilities promptly as they arise, rather than waiting for scheduled evaluations. Incorporating automation can also play a key role in streamlining this process, enabling quicker remediation of identified risks. Building on this, creating a security-first mindset across the organization – through training and clear communication – ensures risk management adapts to new threats, supporting both innovation and compliance.


How a Second Trump Presidency Could Shape the Data Center Industry

“We anticipate that the incoming administration will have a keen focus on AI and our nation’s ability to be the global leader in the space,” Andy Cvengros, managing director, co-lead of US data center markets for JLL, told Data Center Knowledge. He said to do that, the industry will need to solve the transmission delivery crisis and continue to increase generation capacity rapidly. This may include reactivating decommissioned coal and nuclear power plants, as well as commissioning more of them. “We also anticipate that state and federal governments will become much more active in enabling the utilities to proactively expand substations, procure long lead items and support key submarket expansion through planned developments,” Cvengros said. ... Despite the federal government’s likely hands-off approach, Harvey said he believes large corporations might support consistent, global standards – especially since European regulations are far stricter. “US companies would prefer a unified regulatory framework to avoid navigating a complex patchwork of rules across different regions,” he said. Still, Europe’s stronger regulatory stance on renewable power might lead some companies to prioritize US-based expansions, where subsidies and fewer regulations make operations more economically feasible.


Data Breaches are a Dime a Dozen: It’s Time for a New Cybersecurity Paradigm

The modern-day ‘stack’ includes many disparate technology layers—from physical and virtual servers to containers, Kubernetes clusters, DevOps dashboards, IoT, mobile platforms, cloud provider accounts, and, more recently, large language models for GenAI. This has created the perfect storm for threat actors, who are targeting the access and identity silos that significantly broaden the attack surface. The sheer volume of weekly breaches reported in the press underscores the importance of protecting the whole stack with Zero Trust principles. Too often, we see bad actors exploiting some long-lived, stale privilege that allows them to persist on a network and pivot to the part of a company’s infrastructure that houses the most sensitive data. ... Zero Trust access for modern infrastructure benefits from being coupled with a unified access mechanism that acts as a front-end to all the disparate infrastructure access protocols – a single control point for authentication and authorization. This provides visibility, auditing, enforcement of policies, and compliance with regulations, all in one place. These solutions already exist on the market, deployed by security-minded organizations. However, adoption is still in early days. 


AI’s math problem: FrontierMath benchmark shows how far technology still has to go

Mathematics, especially at the research level, is a unique domain for testing AI. Unlike natural language or image recognition, math requires precise, logical thinking, often over many steps. Each step in a proof or solution builds on the one before it, meaning that a single error can render the entire solution incorrect. “Mathematics offers a uniquely suitable sandbox for evaluating complex reasoning,” Epoch AI posted on X.com. “It requires creativity and extended chains of precise logic—often involving intricate proofs—that must be meticulously planned and executed, yet allows for objective verification of results.” This makes math an ideal testbed for AI’s reasoning capabilities. It’s not enough for the system to generate an answer—it has to understand the structure of the problem and navigate through multiple layers of logic to arrive at the correct solution. And unlike other domains, where evaluation can be subjective or noisy, math provides a clean, verifiable standard: either the problem is solved or it isn’t. But even with access to tools like Python, which allows AI models to write and run code to test hypotheses and verify intermediate results, the top models are still falling short.


Can Wasm replace containers?

One area where Wasm shines is edge computing. Here, Wasm’s lightweight, sandboxed nature makes it especially intriguing. “We need software isolation on the edge, but containers consume too many resources,” says Michael J. Yuan, founder of Second State and the Cloud Native Computing Foundation’s WasmEdge project. “Wasm can be used to isolate and manage software where containers are ‘too heavy.’” Whereas containers take up megabytes or gigabytes, Wasm modules take mere kilobytes or megabytes. Compared to containers, a .wasm file is smaller and agnostic to the runtime, notes Bailey Hayes, CTO of Cosmonic. “Wasm’s portability allows workloads to run across heterogeneous environments, such as cloud, edge, or even resource-constrained devices.” ... Wasm has a clear role in performance-critical workloads, including serverless functions and certain AI applications. “There are definitive applications where Wasm will be the first choice or be chosen over containers,” says Luke Wagner, distinguished engineer at Fastly, who notes that Wasm brings cost-savings and cold-start improvements to serverless-style workloads. “Wasm will be attractive for enterprises that don’t want to be locked into the current set of proprietary serverless offerings.”


Authentication Actions Boost Security and Customer Experience

Authentication actions can be used as effective tools for addressing the complex access scenarios organizations must manage and secure. They can be added to workflows to implement convenience and security measures after users have successfully proven their identity during the login process. ... When using authentication actions, first take some time to fully map out the customer journey you want to achieve, and most importantly, all of the possible variations of this journey. Think of your authentication requirements as a flowchart that you control. Start by mapping out your requirements for different users and how you want them to sign up and authenticate. Understand the trade-off between security and user experience. Consider using actions to enable a frictionless initial login with a simple authentication method. You can use step-up authentication as a technique that increases the level of assurance when the user needs to perform higher-privilege operations. You can also use actions to implement dynamic behavior per user. For instance, you can use an action that captures an identifier like an email to identify the user. Then you can use another action to look up the user’s preferred authentication method or methods to give each user a personalized experience.
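The flow described above, identify the user, look up their preferred method, then step up assurance only for higher-privilege operations, can be sketched as a pipeline of small actions. The action names, user store, and methods below are invented to illustrate the pattern, not any vendor's API.

```python
# Hedged sketch of an authentication-action pipeline. Users, methods, and
# the privileged-operation rule are invented examples.
USERS = {
    "ana@example.com": {"preferred_method": "passkey"},
    "bob@example.com": {"preferred_method": "sms_otp"},
}

def lookup_preference(ctx: dict) -> dict:
    # Action 1: identify the user and fetch their preferred method,
    # giving each user a personalized, low-friction first factor.
    ctx["method"] = USERS.get(ctx["email"], {}).get("preferred_method", "password")
    return ctx

def step_up_if_privileged(ctx: dict) -> dict:
    # Action 2: raise the level of assurance only when the operation demands it.
    if ctx["operation"] == "wire_transfer":
        ctx["method"] = "passkey+otp"
    return ctx

def authenticate(email: str, operation: str) -> str:
    ctx = {"email": email, "operation": operation}
    for action in (lookup_preference, step_up_if_privileged):
        ctx = action(ctx)
    return ctx["method"]

assert authenticate("ana@example.com", "view_balance") == "passkey"
assert authenticate("bob@example.com", "wire_transfer") == "passkey+otp"
```

Keeping each action tiny and composable is what lets the flowchart you mapped out earlier change without rewriting the whole login flow.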


How Businesses use Modern Development Platforms to Streamline Automation

APIs are essential for streamlining data flows between different systems. They enable various software applications to communicate with each other, automating data exchange and reducing manual input. For instance, integrating an API between a customer relationship management (CRM) system and an email marketing platform can automatically sync contact information and campaign data. This not only saves time, but also minimizes errors that can occur with manual data entry. ... Workflow automation tools are designed to streamline business processes by automating repetitive steps and ensuring smooth transitions between tasks. These tools help businesses design and manage workflows, automate task assignments, and monitor progress. For example, tools like Asana and Monday.com allow teams to automate task notifications, approvals, and status updates. By automating these processes, businesses can improve collaboration and reduce the risk of missed deadlines or overlooked tasks. Workflow automation tools also provide valuable insights into process performance, enabling companies to identify bottlenecks and optimize their operations. This leads to more efficient workflows and better resource management.
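The CRM-to-email-platform example above can be sketched end to end. Both clients here are hypothetical in-memory stand-ins for real REST APIs (which you would call over HTTP in practice); the point is the automated, idempotent hand-off that replaces manual data entry.

```python
# Illustrative API-driven sync between a CRM and an email platform.
# Both client classes are invented stand-ins for real REST clients.

class CRMClient:
    def list_contacts(self) -> list:
        return [{"email": "ana@example.com", "name": "Ana"},
                {"email": "bob@example.com", "name": "Bob"}]

class EmailPlatformClient:
    def __init__(self):
        self.audience = {}

    def upsert_subscriber(self, contact: dict) -> None:
        # Keyed on email, so re-running the sync never creates duplicates.
        self.audience[contact["email"]] = contact

def sync_contacts(crm: CRMClient, email_platform: EmailPlatformClient) -> int:
    """Automate the hand-off that would otherwise be manual data entry."""
    contacts = crm.list_contacts()
    for contact in contacts:
        email_platform.upsert_subscriber(contact)
    return len(contacts)

mailer = EmailPlatformClient()
assert sync_contacts(CRMClient(), mailer) == 2
```

A workflow automation tool would typically run a job like this on a schedule or trigger it from a CRM webhook, then report the count for monitoring.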

“Micromanagement is one of the fastest ways to destroy IT culture,” says Jay Ferro, EVP and chief information, technology, and product officer at Clario. “When CIOs don’t trust their teams to make decisions or constantly hover over every detail, it stifles creativity and innovation. High-performing professionals crave autonomy; if they feel suffocated by micromanagement, they’ll either disengage or leave for an environment where they’re empowered to do their best work.” ... One of the most challenging issues facing transformational CIOs is the overwhelming demand to take on more initiatives, deliver to greater scope, or accept challenging deadlines. Overcommitting beyond what IT can reasonably accomplish is a problem, but what kills IT culture is when the CIO leaves program leaders defenseless when stakeholders are frustrated or when executive detractors block progress. “It demoralizes IT when there is a lack of direction, no IT strategy, and the CIO says yes to everything the business asks for regardless of whether the IT team has the capacity,” says Martin Davis, managing partner at Dunelm Associates. “But it totally kills IT culture when the CIO doesn’t shield teams from angry or disappointed business senior management and stakeholders.”


Understanding Data Governance Maturity: An In-Depth Exploration

Maturity in data governance is typically assessed through various models that measure different aspects of data management, such as data quality and compliance, and examine processes for managing data’s context (metadata) and its security. Maturity models provide a structured way to evaluate where an organization stands and how it can improve for a given function. ... Many maturity models are complex and may require significant time and resources to implement. Organizations need to ensure they have the capacity to effectively handle the complexity involved in using these models. Additionally, some data governance maturity models do not address related data management functions, such as metadata management, data quality management, or data security, in sufficient detail for some organizations. ... Implementing changes based on maturity model assessments can face resistance; organizational culture may not accept the findings of an assessment. Adopting and sustaining effective change management strategies and choosing a maturity model carefully can help overcome resistance and ensure successful implementation.



Quote for the day:

"Whenever you see a successful person, you only see the public glories, never the private sacrifices to reach them." -- Vaibhav Shah