Daily Tech Digest - November 10, 2023

The promise of collective superintelligence

The goal is not to replace human intellect, but to amplify it by connecting large groups of people into superintelligent systems that can solve problems no individual could solve on their own, while also ensuring that human values, morals and interests are inherent at every level. This might sound unnatural, but it’s a common step in the evolution of many social species. Biologists call the phenomenon Swarm Intelligence, and it enables schools of fish, swarms of bees and flocks of birds to skillfully navigate their world without any individual being in charge. They don’t do this by taking votes or polls the way human groups make decisions. Instead, they form real-time interactive systems that push and pull on the decision-space and converge on optimized solutions. ... Can we enable conversational swarms in humans? It turns out we can, using a concept developed in 2018 called hyperswarms that divides real-time human groups into overlapping subgroups. ... Of course, enabling parallel groups is not enough to create a Swarm Intelligence. That’s because information needs to propagate across the population. This was solved using AI agents to emulate the function of the lateral line organ in fish.
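The mechanism described above (overlapping subgroups exchanging information through shared members) can be illustrated with a toy simulation. This is a sketch only, not the actual hyperswarm design; the function names, group sizes, and parameters are illustrative assumptions:

```python
import random

def converge(opinions, subgroups, rounds=50, pull=0.3):
    """Each round, every subgroup pulls its members a fraction `pull`
    toward the subgroup mean. Members shared between subgroups carry
    information across the population, playing the lateral-line role
    described above, so the whole group converges."""
    ops = list(opinions)
    for _ in range(rounds):
        for group in subgroups:
            mean = sum(ops[i] for i in group) / len(group)
            for i in group:
                ops[i] += pull * (mean - ops[i])
    return ops

random.seed(1)
opinions = [random.uniform(0, 100) for _ in range(9)]
# Three overlapping subgroups; members 2-6 bridge between them.
subgroups = [[0, 1, 2, 3, 4], [2, 3, 4, 5, 6], [4, 5, 6, 7, 8]]
final = converge(opinions, subgroups)
print(f"spread before: {max(opinions) - min(opinions):.1f}, "
      f"after: {max(final) - min(final):.4f}")
```

Because each update moves members symmetrically toward a subgroup mean, the population mean is preserved while disagreement shrinks, which is the "converge on optimized solutions" behaviour in miniature.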


There's Only One Way to Solve the Cybersecurity Skills Gap

The plain truth is that it's not just a numbers game. Many of these roles are considered "hard to fill" because they are for specialist skill sets such as forensic analysis, security architecture, interpreting malicious code, or penetration testing. Or they're for senior roles with three to six years' experience. Even if companies recruit people with high potential but not the requisite background, it will take years for these recruits to upskill to reach a sufficient standard. Moreover, if we throw open the gates completely, we risk diluting the industry by introducing a whole swath of people with no technical skills. Yes, soft skills are valuable and in short supply too, but relying on these alone to fill the workforce gap does nothing to address the problem businesses have: a lack of trained, competent cybersecurity professionals, resulting, once again, in less resilience. Another major hurdle is that many organizations are reluctant to invest in training because the job market is so volatile. There's a fear that, by investing in new recruits, those staff members will become a flight risk and put themselves back into that talent pool. 


The Struggle for Microservice Integration Testing

Integration testing is crucial for microservices architectures. It validates the interactions between different services and components, and you can’t successfully run a large architecture of isolated microservices without integration testing. In a microservices setup, each service is designed to perform a specific function and often relies on other services to fulfill a complete user request. While unit tests ensure that individual services function as expected in isolation, they don’t test the system’s behavior when services communicate with each other. Integration tests fill this gap by simulating real-world scenarios where multiple services interact, helping to catch issues like data inconsistencies, network latency and fault tolerance early in the development cycle. Integration testing provides a safety net for CI/CD pipelines. Without comprehensive integration tests, it’s easy for automated deployments to introduce regressions that affect the system’s overall behavior. By automating these tests, you can ensure that new code changes don’t disrupt existing functionalities and that the system remains robust and scalable.
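As a minimal illustration of the gap integration tests fill, the sketch below wires two toy services together in-process and verifies their interaction, not just each unit in isolation. Real microservice integration tests would exercise network boundaries (for example, against containerized services); all class and function names here are hypothetical:

```python
class InventoryService:
    """Toy downstream service holding stock levels."""
    def __init__(self, stock):
        self.stock = stock

    def reserve(self, item, qty):
        if self.stock.get(item, 0) < qty:
            raise ValueError(f"insufficient stock for {item}")
        self.stock[item] -= qty


class OrderService:
    """Toy upstream service that depends on InventoryService."""
    def __init__(self, inventory):
        self.inventory = inventory  # the real collaborator, not a mock

    def place_order(self, item, qty):
        self.inventory.reserve(item, qty)
        return {"item": item, "qty": qty, "status": "confirmed"}


def test_order_reserves_inventory():
    inventory = InventoryService({"widget": 5})
    orders = OrderService(inventory)
    assert orders.place_order("widget", 3)["status"] == "confirmed"
    assert inventory.stock["widget"] == 2  # side effect crossed the service boundary


def test_order_rejected_when_out_of_stock():
    inventory = InventoryService({"widget": 1})
    orders = OrderService(inventory)
    try:
        orders.place_order("widget", 3)
        raise AssertionError("expected the order to be rejected")
    except ValueError:
        pass


test_order_reserves_inventory()
test_order_rejected_when_out_of_stock()
print("integration tests passed")
```

A unit test with a mocked inventory would pass even if the two services disagreed about stock semantics; only the integrated test catches that class of defect.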


Google Cloud’s Cybersecurity Trends to Watch in 2024 Include Generative AI-Based Attacks

Threat actors will use generative AI and large language models in phishing and other social engineering scams, Google Cloud predicted. Because generative AI can create natural-sounding content, employees may struggle to identify scam emails through poor grammar or spam calls through robotic-sounding voices. Attackers could use generative AI to create fake news or fake content, Google Cloud warned. LLMs and generative AI “will be increasingly offered in underground forums as a paid service, and used for various purposes such as phishing campaigns and spreading disinformation,” Google Cloud wrote. On the other hand, defenders can use generative AI in threat intelligence and data analysis. Generative AI could allow defenders to take action at greater speeds and scales, even when digesting very large amounts of data. “AI is already providing a tremendous advantage for our cyber defenders, enabling them to improve capabilities, reduce toil and better protect against threats,” said Phil Venables, chief information security officer at Google Cloud, in an email to TechRepublic.


OpenAI’s gen AI updates threaten the survival of many open source firms

The new API, according to OpenAI, is expected to provide new capabilities including a Code Interpreter, Retrieval Augmented Generation (RAG), and function calling to handle “heavy lifting” that would previously require developer expertise in order to build AI-driven applications. The Assistants API, specifically, may cause revenue losses for open source companies including LangChain, LlamaIndex, and ChromaDB, according to Andy Thurai, principal analyst at Constellation Research. “For organizations that want to standardize on OpenAI, the more their platform offers, the less organizations will need other frameworks such as Langchain and LlamaIndex. The new updates allow developers to create their applications within a single framework,” said David Menninger, executive director at Ventana Research. However, he pointed out that until the new features, such as the new API, are made generally available, enterprises will continue to put applications into production by relying on existing open source frameworks.


When net-zero goals meet harsh realities

There is a move towards greater precision and accountability at the non-governmental level, too. The principles of carbon emission measurement and reporting that underpin, for example, all corporate net-zero objectives tend to be agreed upon internationally by institutions such as the World Resources Institute and the World Business Council for Sustainable Development; in turn, these are used by bodies such as the SBTi and the CDP. Here too, standards are being rewritten, so that, for example, the use of carbon offsets is becoming less acceptable, forcing operators to buy carbon-free energy directly. With all these developments under way, there is a startling disconnect between many of the public commitments by countries and companies, and what most digital infrastructure organizations are currently doing or are able to do. ... The difference between the two surveys highlights a second disconnect. IBM’s findings, based on responses from senior IT and sustainability staff, show a much higher proportion of organizations collecting carbon emission data than Uptime’s.


CISOs Beware: SEC's SolarWinds Action Shows They're Scapegoating Us

The SEC had been trying to create accountability by holding boards accountable and liable for issues concerning the cybersecurity incidents that inevitably occur from time to time. But now, in the case of SolarWinds, the SEC has turned around and gone directly after an individual who only recently became the CISO. Brown wasn't the CISO when the breaches happened. He had been SolarWinds' VP of security and architecture and head of its information security group between July 2017 and December 2020, and he stepped into the role of CISO in January 2021. The result of the SEC's failure to mandate security leadership on corporate boards is that it has resorted to holding the CISO liable. This shift underscores a significant transformation in the CISO landscape. From my perspective as a CISO, it's increasingly clear that technical security expertise is an essential requirement for the role. Each day, CISOs are tasked with making critical decisions, such as approving or accepting timeline adjustments for security risks that have the potential to be exploited.


Security in the impending age of quantum computers

The timeline for developing a cryptographically relevant quantum computer is highly contested, with estimates often ranging between 5 and 15 years. Although such a quantum computer remains in the future, this does not mean it is a problem only for future CIOs and IT professionals. The threat is live today because of “harvest now, decrypt later” attacks, whereby an adversary stores encrypted communications and data gleaned through classical cyberattacks and waits until a cryptographically relevant quantum computer is available to decrypt the information. To further highlight this threat, the encrypted data could be decrypted long before a cryptographically relevant quantum computer is available if the data is secured via weak encryption keys. While some data clearly loses its value in the short term, social security numbers, health and financial data, national security information, and intellectual property retain value for decades, and the decryption of such data on a large scale could be catastrophic for governments and companies alike.


How the Online Safety Act will impact businesses beyond Big Tech

The requirements that apply to all regulated services, including those outside the special categories, are naturally the least onerous under the Act; however, because these still introduce new legal obligations, for many businesses these will require considering compliance through a new lens. ... Regulated services will have to conduct certain risk assessments at defined intervals. The type of risk assessments a service provider must conduct depends on the nature and users of the service. Illegal content assessment: all providers of regulated services must conduct a risk assessment of how likely users are to encounter and be harmed by illegal content, taking into account a range of factors including user base, design and functionalities of the service and its recommender systems, and the nature and severity of harm that individuals might suffer due to this content. ... all regulated services must carry out an assessment of whether the service is likely to be accessed by children, and if so they must carry out a children’s risk assessment of how likely children are to encounter and be harmed by content on the site, giving separate consideration to children in different age groups.


Enterprises vs. The Next-Generation of Hackers – Who’s Winning the AI Race?

Amidst a push for responsible AI development, major players in the space are on a mission to secure their tools from malicious use, but bad actors have already started to take advantage of the same tech to boost their skill sets. Enterprises are increasingly finding new ways to integrate AI into internal workflows and external offerings, which in turn has created a new attack vector for hackers. This expanded surface has opened the door for a new wave of sophisticated attacks using advanced methods and unsuspecting entry points that enterprises previously didn’t have to secure against. ... Today’s threat landscape is transforming — hackers have tools at their fingertips that can rapidly advance their impact and an entirely new attack vector to explore. With growing enterprise use of AI offering an opportunity to expedite attacks, now is the time to focus on transforming security defenses. ... Despite scrutiny for its ability to equip cybercriminals with more advanced techniques, AI models can be used just as effectively among security and IT teams to mitigate these mounting threats.



Quote for the day:

"Doing what you love is the cornerstone of having abundance in your life." -- Wayne Dyer

Daily Tech Digest - November 09, 2023

MIT Physicists Transform Pencil Lead Into Electronic “Gold”

MIT physicists have metaphorically turned graphite, or pencil lead, into gold by isolating five ultrathin flakes stacked in a specific order. The resulting material can then be tuned to exhibit three important properties never before seen in natural graphite. ... “We found that the material could be insulating, magnetic, or topological,” Ju says. The latter is somewhat related to both conductors and insulators. Essentially, Ju explains, a topological material allows the unimpeded movement of electrons around the edges of a material, but not through the middle. The electrons are traveling in one direction along a “highway” at the edge of the material separated by a median that makes up the center of the material. So the edge of a topological material is a perfect conductor, while the center is an insulator. “Our work establishes rhombohedral stacked multilayer graphene as a highly tunable platform to study these new possibilities of strongly correlated and topological physics,” Ju and his coauthors conclude in Nature Nanotechnology.


Conscientious Computing – Facing into Big Tech Challenges

The tech industry has driven incredibly rapid innovation by taking advantage of increasingly cheap and more powerful computing – but at what unintended cost? What collateral damage has been created in our era of “move fast and break things”? Sadly, it’s now becoming apparent we have overlooked the broader impacts of our technological solutions. As software proliferates through every facet of life and the scale of it increases, we need to think more about where this leads us from people, planet and financial perspectives. ... The classic Scope, Cost, Time pyramid – but often it’s the observable functional quality that is prioritised. For that I’ll use a somewhat surreal version of an iceberg – as so much technical debt (and, effectively, sustainability debt – a topic for a future blog) is hidden below the water line. Every engineering decision (or indecision) has ethical and sustainability consequences, often invisible from within our isolated bubbles. Just as the industry has had to raise its game on topics such as security, privacy and compliance, we desperately need to raise our game holistically on sustainability.


The CIO’s fatal flaw: Too much leadership, not enough management

So why does leadership get all the buzz? A cynic might suggest that the more respect doing-the-work gets, the more the company might have to pay the people who do that work, which in turn would mean those who manage the work would get paid more than those who think and charismatically express deep and inspirational thoughts. And as there are more people who do work than those who manage it, respecting the work and those who do it would be expensive. Don’t misunderstand. Done properly, leading is a lot of work, and because leading is about people, not processes or tools and technology, it’s time-consuming, too. And in fact, when I conduct leadership seminars, the biggest barrier to success for most participants is figuring out and committing to their time budget. Leadership, that is, involves setting direction, making or facilitating decisions, staffing, delegating, motivating, overseeing team dynamics, engineering the business culture, and communicating. Leaders who are committed to improving at their trade must figure out how much time they plan to devote to each of these eight tasks, which is hard enough.


The Next IT Challenge Is All about Speed and Self-Service

One of the most significant roadblocks to rapid cloud adoption is sheer complexity. Provisioning a cloud environment involves dozens of dependent services, intricate configurations, security policies and data governance issues. The cognitive load on IT teams is significant, and the situation is exacerbated by manual processes that are still in place. The vast majority of engineering teams still depend on legacy ticketing systems to request cloud environments from IT, which adds a significant load on IT and also slows engineering teams. This slows down the entire operation, making it difficult for IT and engineering to support business needs effectively. In fact, in one study conducted by Rafay Systems, application developers at enterprises revealed that 25% of organizations reportedly take three months or longer to deploy a modern application or service after its code is complete. The real goal for any IT department is to support the needs of the business. Today, they do that better, faster and more cost-effectively by leveraging cloud technologies to realize all the business benefits of the modern applications being deployed.


The DPDP Act: Bolstering data protection & privacy, making India future-ready

The DPDP Act has a direct impact across industries. Organisations not only need to reassess their existing compliance status and gear up to cope with the new norms but also create a phased action plan for various processes. Moreover, if labeled as a Significant Data Fiduciary (SDF), organisations also need to appoint a Data Protection Officer (DPO). In addition, organisations need to devise an appropriate data protection and privacy policy framework in alignment with the DPDP Act. Further, consent forms and mechanisms have to be developed to ensure standard procedures as laid out in the legislation. Companies have to additionally invest to adopt the necessary changes in compliance with the law. They need to list down their third-party data handlers, consent types and processes, privacy notices, contract clauses, categorise data, and develop breach management processes. Sharing his perspective on the DPDP Act, Amit Jaju, Senior Managing Director, Ankura Consulting Group (India) says, “The Digital Personal Data Protection Act 2023 has ushered in a new era of data privacy and protection, compelling solution providers to realign their business strategies with its mandates.


Will AI hurt or help workers? It's complicated

Here's what is certain: CIOs see AI as being useful, but not replacing higher-level workers. JetRockets recently surveyed US CIOs. In its report, How Generative AI is Impacting IT Leaders & Organizations, the custom-software firm found that CIOs are primarily using AI for cybersecurity and threat detection (81%), with predictive maintenance and equipment monitoring (69%) and software development / product development (68%) in second and third place, respectively. Security, you ask? Yes, security. CrowdStrike, a security company, sees a huge demand building for AI-based security virtual assistants. A Gartner study on virtual assistants predicted, "By 2024, 40% of advanced virtual assistants will be industry-domain-specific; by 2025, advanced virtual assistants will provide advisory and intervention roles for 30% of knowledge workers, up from 5% in 2021." By CrowdStrike's reckoning, AI will "help organizations scale their cybersecurity workforce by three times and reduce operating costs by close to half a million dollars." That's serious cash.


From Chaos to Confidence: The Indispensable Role of Security Architecture

Beyond mere firefighting, security architecture embraces the proactive art of strategic defense. It takes a risk-based approach to identifying potential threats, assessing weak points in an organization's IT stack, architecting forward-looking designs and prioritizing security initiatives. By aligning security investments with the organization's risk tolerance and business priorities, security architecture ensures that precious resources are optimally allocated for maximum security defense designed with in-depth zero trust security principles in mind. This reduces enterprise application deployment and operational security costs. It is similar to designing high-rise buildings in a standard manner, following all safety codes and by-laws while still allowing individual apartment owners to design and create their homes as they would prefer. Cyberattacks have become increasingly sophisticated and frequent. As a result, it is imperative for defense systems to have comprehensive, purpose-built architectures and designs in place to protect against such threats. Security architecture provides a complete defense framework by integrating various security components.


Top 5 IT disaster scenarios DR teams must test

Failed backups are some of the most frequent IT disasters. Businesses can replace hardware and software, but if the data and all backups are gone, bringing them back might be impossible or incredibly expensive. Sys admins must periodically test their ability to restore from backups to ensure backups are working correctly and the restore process does not have some unseen fatal flaw. At the same time, there should always be multiple generations of backups, with some of those backup sets off site. ... Hardware failure can take many forms, including a system not using RAID, a single disk loss taking down a whole system, faulty network switches and power supply failures. Most hardware-based IT disaster scenarios can be mitigated with relative ease, but at the cost of added complexity and a price tag. One example is a database server. Such a server can be turned into a database cluster with highly available storage and networking. The cost for doing this would easily double the cost of a single nonredundant server. Administrators would also have to undergo training to manage such an environment.
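A periodic restore test of the kind described can start as simply as the sketch below, which backs up a file, restores it, and confirms the copies are byte-identical. Paths and file names are illustrative; a production check would restore onto separate infrastructure and validate application-level integrity, not just checksums:

```python
import hashlib
import os
import shutil
import tempfile

def checksum(path):
    """SHA-256 of a file, read in chunks to handle large backups."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(source, backup_dir, restore_dir):
    """Back up `source`, restore the backup elsewhere, and confirm the
    restored copy matches the original byte for byte."""
    backup = shutil.copy2(source, backup_dir)
    restored = shutil.copy2(backup, restore_dir)
    return checksum(source) == checksum(restored)

# Illustrative run against a throwaway file.
tmp = tempfile.mkdtemp()
backup_dir = os.path.join(tmp, "backup")
restore_dir = os.path.join(tmp, "restore")
os.makedirs(backup_dir)
os.makedirs(restore_dir)
source = os.path.join(tmp, "critical.dat")
with open(source, "wb") as f:
    f.write(os.urandom(4096))
print("restore verified:", verify_restore(source, backup_dir, restore_dir))
```

The point of running this on a schedule is to catch the "unseen fatal flaw" in the restore path before a real disaster does.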


Mastering AI Quality: Strategies for CDOs and Tech Leaders

Most chief data officers (CDOs) work hard to make their data operations into “glass boxes” – transparent, explainable, explorable, trustworthy resources for their companies. Then comes artificial intelligence and machine learning (AI/ML), with their allure of using that data for ever-more impressive strategic leaps, efficiencies, and growth potential. However, there’s a problem. Nearly all AI/ML tools are “black boxes.” They are so inscrutable even their creators are concerned about how they produce their results. The speed and depth at which these tools can process data without human intervention or input presents a danger to technology leaders seeking control of their data and who want to ensure and verify the quality of analytics that use it. Combine this with a push to remove humans from the decision loop and you have a potent recipe for decisions to go off the rails. ... With a human collaborator or a human-designed algorithm, it is generally easy to elicit a meaningful response to the question, “Why is this result what it is?” With AI – and generative AI in particular – that may not be the case.


Revamping IT for AI System Support

“It’s important for everybody to understand how fast this [AI] is going to change,” said Eric Schmidt, former CEO and chairman of Google. “The negatives are quite profound.” Among the concerns is that AI firms still have “no solutions for issues around algorithmic bias or attribution, or for copyright disputes now in litigation over the use of writing, books, images, film, and artworks in AI model training. Many other as yet unforeseen legal, ethical, and cultural questions are expected to arise across all kinds of military, medical, educational, and manufacturing uses.” The challenge for companies and for IT is that the law always lags technology. There will be few hard and fast rules for AI as it advances relentlessly. So, AI runs the risk of running off ethical and legal guardrails. In this environment, legal cases are likely to arise that define case law and how AI issues will be addressed. The danger for IT and companies is that they don’t want to become the defining cases for the law by getting sued. CIOs can take action by raising awareness of AI as a corporate risk management concern to their boards and CEOs.



Quote for the day:

"Holding on to the unchangeable past is a waste of energy and serves no purpose in creating a better future." -- Unknown

Daily Tech Digest - November 08, 2023

Iconic Singapore hotel caught up in major data breach

The breach was first identified on 20 October, having begun a day previously when an undisclosed third party gained unauthorised access to the firm’s systems. “Upon discovery of the incident, our teams immediately took action to resolve it. Investigations have since determined that an unknown third party accessed customer data of about 665,000 non-casino rewards programme members,” MBS said in a statement. “Based on our investigation, we do not have evidence to date that the unauthorised third party has misused the data to cause harm to customers. “We do not believe that membership data from our casino rewards programme, Sands Rewards Club, was affected. “After learning of the issue, we quickly launched an investigation, have been working with a leading external cyber security firm, and have taken action to further strengthen our systems and protect data,” said the organisation. The compromised data is understood to include names, email addresses, mobile phone and landline numbers, countries of residence, and membership numbers and tier status. MBS is reaching out to those affected.


8 ways to fix open source funding

Richard Stallman famously said, “Free as in speech, not as in beer.” Now, some developers are creating licenses that don’t offer either sense of freedom—but they’re still delivering just enough of the kind of openness that satisfies their users’ curiosity. One version is the “free tier” that offers enough access to test new ideas and maybe run a small, personal website while still charging for more substantial use. Developers encounter no impediment when they’re just experimenting, but if they want to start something serious, they need to pay. Another example is the license that lets users read but not distribute. One developer told me that he routinely lets paying customers get full access to the code for audits or experimentation, but he does not release it into the open. The customers get to see what they want, but they can’t undercut the company or give away the software for free. These licenses deliver some of what made open source popular without sacrificing the ability to compel payment.


Meet Your New Cybersecurity Auditor: Your Insurer

The current state of cyber insurance offers some actionable opportunities for security decision-makers. First, don't underestimate the power of an accurate cyber-insurance self-assessment, which is how cyber insurers judge organizations during the auditing and claims processes. Current self-assessment surveys ask surprisingly challenging questions and cover a wide set of fields from backups to AD security to MFA. It is important not to treat this as a formality and to ensure that information is entirely accurate; insurers are more than willing to decline coverage and even sue if an enterprise falsely claims, for example, that it has MFA protection across all its digital assets. ... Therefore, the second step is for CISOs to prove they actually have the capabilities claimed on those forms. Luckily, this is a landscape familiar to seasoned CISOs. Creating and maintaining detailed records, building reporting systems, documenting all relevant business and security processes, and creating tamper-proof data for cyber forensics are all possible with sophisticated cybersecurity tools.


Green data centres: Efforts to push sustainable IT developments

Green data centres are spearheading a transformative wave in the IT industry, bringing substantial benefits to both businesses and the environment. From a financial perspective, these eco-conscious facilities deliver remarkable cost savings. By leveraging energy-efficient technologies and renewable energy sources, companies can significantly reduce operational expenses. Innovative solutions like data reduction technology and automated resource optimisation further bolster these financial advantages. However, the influence of green data centres extends far beyond financial gains. They play a pivotal role in mitigating greenhouse gas emissions, actively contributing to the fight against climate change. As data centres and communication sectors are anticipated to account for up to 3.9% of global emissions, the adoption of renewable energy sources and energy-efficient practices dramatically reduces their carbon footprint. In doing so, green data centres are setting a commendable example for other industries, driving the broader adoption of sustainable practices. 


The 3 key stages of ransomware attacks and useful indicators of compromise

Once hackers find key data, they will begin to download the actual ransomware payload. They may exfiltrate data, set up an encryption key, and then encrypt the vital data. IoCs at this stage include communication with a C2 server, data movement (if the attacker is exfiltrating important data before they encrypt it) and unusual activity around encrypted traffic. Detecting at this stage involves more advanced security products working in unison. Chaining different types of analytics models together is an efficient way to catch minor indicators of compromise when it comes to ransomware, because together they gather context on the network in real time, allowing SOC teams to identify anomalous behavior when it occurs. If a security alert is triggered, these other analytics can provide more context to help piece together if and how a larger attack is occurring. But many successful ransomware attacks will not trip antivirus at all, so assembling an accurate picture of user behaviors and compiling the numerous indicators into a coherent timeline is vital.
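The idea of combining weak indicators into a single coherent alert can be sketched as a simple weighted score. The indicators, weights, and threshold below are illustrative assumptions, not a real detection model; production systems chain ML-driven analytics rather than fixed rules:

```python
# Illustrative weights for weak indicators; none is conclusive on its own.
INDICATORS = {
    "c2_beacon": 4,      # periodic traffic to an unknown external host
    "bulk_upload": 3,    # unusual outbound data volume (possible exfiltration)
    "mass_rename": 3,    # many files renamed in a short window
    "entropy_spike": 2,  # file contents suddenly high-entropy (encrypted)
}
ALERT_THRESHOLD = 6  # assumed threshold, for illustration only

def ransomware_risk(observed_events):
    """Combine whichever weak indicators were observed into one score."""
    score = sum(w for name, w in INDICATORS.items() if name in observed_events)
    return score, score >= ALERT_THRESHOLD

# C2 beaconing plus mass renames together cross the threshold,
# even though neither would trigger an alert on its own.
print(ransomware_risk({"c2_beacon", "mass_rename"}))  # (7, True)
print(ransomware_risk({"entropy_spike"}))             # (2, False)
```

This is the "context" argument in miniature: each signal alone stays below the alert line, so only a system that correlates them produces a timely detection.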


What's possible in a zero-ETL future?

ETL frequently requires data engineers to write custom code. Then, DevOps engineers or IT administrators have to deploy and manage the infrastructure to make sure the data pipelines scale. And when the data sources change, the data engineers have to manually change their code and deploy it again. Furthermore, when data engineers run into issues, such as data replication lag, breaking schema updates, and data inconsistency between the sources and destinations, they have to spend time and resources debugging and repairing the data pipelines. ... Zero-ETL enables querying data in place through federated queries and automates moving data from source to target with zero effort. This means you can do things like run analytics on transactional data in near real-time, connect to data in software applications, and generate ML predictions from within data stores to gain business insights faster, rather than having to move the data to a ML tool. You can also query multiple data sources across databases, data warehouses, and data lakes without having to move the data. To accomplish these tasks, we've built a variety of zero-ETL integrations between our services to address many different use cases.
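The federated-query idea, querying multiple sources in place rather than moving the data through a pipeline, can be illustrated in miniature with SQLite's ATTACH. This is a toy stand-in; real zero-ETL integrations federate across databases, warehouses, and data lakes:

```python
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
orders_db = os.path.join(tmp, "orders.db")
crm_db = os.path.join(tmp, "crm.db")

# Populate two separate databases, stand-ins for two independent stores.
with sqlite3.connect(orders_db) as db:
    db.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
    db.execute("INSERT INTO orders VALUES (1, 10, 99.5), (2, 11, 25.0)")
with sqlite3.connect(crm_db) as db:
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    db.execute("INSERT INTO customers VALUES (10, 'Acme'), (11, 'Globex')")

# Federated-style query: join across both stores without copying data.
con = sqlite3.connect(orders_db)
con.execute(f"ATTACH DATABASE '{crm_db}' AS crm")
rows = con.execute(
    "SELECT c.name, o.total FROM orders o "
    "JOIN crm.customers c ON c.id = o.customer_id ORDER BY o.id"
).fetchall()
print(rows)  # [('Acme', 99.5), ('Globex', 25.0)]
```

No extract or load step ran: the join reads each source where it lives, which is the property zero-ETL integrations aim to provide at warehouse and lake scale.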


An Ethical Approach to Employee Poaching

The practice of employee poaching isn’t without risk, because it hurts companies to lose good employees and relationships can get fractured. This is one reason why many public sector organizations insist on notifying the organization that could potentially lose an employee in advance of even scheduling an interview with a job candidate from that organization. On the private sector side, there are no such rules, but there is an etiquette for employee poaching that seems to work. ... When it comes to poaching, your employees need to know about tampering, too. It’s often employees who start the poaching process. They develop relationships with employees in a partner organization, and it is natural to want to work together. Nevertheless, there is a fine line between just wanting to work together and a situation that escalates into aggressive recruitment (and unacceptable tampering). The best practice is to remind employees about tampering, and to explain what it is, so that employees don’t actively recruit individuals from partner organizations without going through proper channels.


Data Management for M&A: 14 Best Practices Before and After the Deal

After the deal is complete, you can begin executing your integration strategy. You have already performed due diligence on the data landscape of both parties to the merger, created the integration plan, and estimated the workload. The steps and practices below do not have to be executed in exact order or in full. They represent the best practices for ensuring data quality, accessibility, privacy, usability, and transparency. You should start with the activity that best corresponds to your data pains and business objectives. ... After you have set the foundations for the effective use of data, you need to focus on getting data into shape (and keeping it that way) for critical business processes, reports, models, and data products. After all, to get real benefits from the acquired data, it is necessary to integrate it. However, as we know, 88% of data integration projects fail or overrun their budgets because of poor data quality.


Keep it secret, keep it safe: the essential role of cybersecurity in document management

Solberg says security considerations should be an integral component of any strategic assessment for document management. “For example, when identifying the key objectives organizations may typically identify increased efficiency, reduced costs, increased collaboration,” he says. “Given the significant cyber risks organizations face in our rapidly digitized world, it's essential that the organization also clearly articulate an objective to protect the data, documents, and systems from the outset.” Security must also be incorporated in the phases of the document management assessment, including the analysis of the current state and the articulation of the roadmap, according to Solberg. “The integration of cybersecurity in these phases not only helps to identify the baseline compliance requirements that will inform the strategy but also the capabilities that the organization will need to meet those requirements,” he adds. Security is a key enabler of success within any organization and has become a top strategic priority for all successful Internet-connected companies, says Jeffrey Bernstein.


Many CIOs are better equipped to combat rising IT costs. Are you?

IT organizations can save substantial amounts on SaaS contracts by lowering service levels, CIOs say. “Too often we pay for the tier above what we need,” says McKee. But while Wiedenbeck did change service levels in one situation, he urges caution. “It’s dangerous to get so focused on cost that you start looking for ways to reduce it without better understanding the risks,” he says. “On the flip side, we shouldn’t be so fearful of any risk that we overpay for services and service levels. Inflation shouldn’t make us abandon balanced management of cost, risk, and value, [but] I do see it as a great opportunity to revisit those areas and see if we’re willing to adjust that balance.” Partnering with software vendors is another key to keeping costs under control. It should be a mutually beneficial relationship, CIOs say, so be prepared for some give and take. “There’s typically more flexibility on pricing if there’s added value that can be found, for example, by introducing other clients or integrating products together, creating a win-win situation,” says McKee.



Quote for the day:

“Good manners sometimes means simply putting up with other people's bad manners.” -- H. Jackson Brown, Jr.

Daily Tech Digest - November 07, 2023

Where businesses use honeypots as part of their defences, they typically rely on traditional honeypots, i.e. a non-existent computer on the network or perhaps an entire network range, and then alert on any attempt to connect to the computer or range. This can be effective, for instance by identifying an attacker who has gained access to the internal network and is port-scanning the entire range. However, many advanced attackers do not resort to ‘noisy’ techniques such as port scanning once on the internal network; instead, they often rely on subtle lateral movement such as obtaining network maps and connecting directly to servers of interest. To catch such advanced attackers requires more sophisticated honeypots. Attackers will often attempt to obtain administrative credentials to aid their movement around networks. They can do this by a number of means, from password-guessing attacks against administrative accounts, to more advanced attacks that allow them to carry out actions with the permissions of anyone using the computer they are accessing.
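The traditional honeypot described above can be reduced to a very small sketch: a listener on an address where no legitimate service lives, where any connection at all raises an alert. This is a minimal illustration, not a production decoy; real deployments add logging, protocol emulation, and SIEM integration.

```python
import socket
import threading

alerts = []

def honeypot(listener: socket.socket) -> None:
    """Accept one connection and record it. Nothing legitimate lives at
    this address, so any touch is, by definition, suspicious."""
    conn, addr = listener.accept()
    alerts.append(f"honeypot touched from {addr[0]}:{addr[1]}")
    conn.close()

# Bind an ephemeral localhost port to stand in for a decoy host on the LAN.
listener = socket.create_server(("127.0.0.1", 0))
port = listener.getsockname()[1]
t = threading.Thread(target=honeypot, args=(listener,), daemon=True)
t.start()

# Simulate an attacker's port scan touching the decoy.
probe = socket.create_connection(("127.0.0.1", port))
probe.close()
t.join(timeout=5)
listener.close()
print(alerts)
```

The same principle extends to the "more sophisticated honeypots" the excerpt calls for: instead of a fake host, plant fake administrative credentials (honeytokens) and alert on any authentication attempt that uses them.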


What Happens When You Lose Your Cyber Insurance?

If there has been outright fraud or misrepresentation on the application, the loss of coverage could be sudden. In most cases, companies will not find themselves unexpectedly without insurance. “You're going to have notice, whether that's 60 to 90 days out,” says Cigarroa. Even with notice, organizations will be working against the clock. Can they get new coverage in time to avoid a gap? If an enterprise does experience a gap in coverage, any costs associated with a data breach or cyberattack that occurs during that period will not be offset by insurance. The prospect of getting new coverage also means that companies will have a new retroactive date for coverage. If an incident that dates back months or even years is uncovered, the new policy is very unlikely to cover it. “You are not going to be able to go back and cover things that happened under the prior insurance or especially during that window of time between when the last policy was cancelled or lapsed to when the new policy is placed,” says Moss.


Platform Engineering — Navigating Today, Forecasting Tomorrow

Platform engineering emerges as a radical shift in the world of software development and DevOps. It’s where traditional coding meets modern operational practices, aiming to keep pace with our ever-evolving tech scene. Here’s a quick dive into what’s shaking things up in platform engineering: Early bird gets the worm — The shift-left approach: The idea is simple. Tackle tasks upfront in the development process. By catching potential issues early, we boost efficiency and save headaches down the line. Walking the golden path: Imagine giving developers a roadmap through the software delivery life cycle, one that encourages freedom but within clear guidelines. ... As we dive deeper into the world of platform engineering, it’s clear that AI is shaking things up in a big way, making our work smoother and more user-friendly. Just imagine, operations teams, the unsung heroes behind the scenes, are starting to think more like product managers. Their focus is shifting from just keeping the lights on to genuinely enhancing the tools and systems developers use. And as with any major change, it’s crucial to manage the transition well. 


The impact of public cloud price hikes

Price hikes are inevitable. Those that can produce, operate, and maintain cloud services also have their own increasing bills for talent, power, and regulatory issues that make running a public cloud service challenging. Switching cloud service providers is not easy for enterprises that spent much money and time leveraging services native to specific public clouds. Moving to another cloud provider is out of the budget for most enterprises. It just costs too much to switch. Lock-in was always a known risk, and the more we optimize our applications and data sets for a specific cloud provider, the more we’re stuck with that provider. It’s a catch-22. If we choose not to use native cloud features, we give up the ability to get the most from the public cloud platforms. As a result, customers are effectively locked into their cloud service provider. The lack of mobility leaves customers at the mercy of price increases. ... Most need better options. Cloud services are vital for their operations; they feel they must find the money when the price rises. We may even see this with businesses that consume cloud services indirectly, such as retail, streaming services, and personal cloud storage systems. Those higher cloud costs are passed along to indirect cloud consumers as well.


7 Ways to Become a Software Engineer Without a Degree

You don't need a CS degree to work as a software engineer, but having some type of document that attests to your ability to program is important for landing job interviews. That's why it's worth investing time in earning at least one or two technical certifications related to programming. Some online programming courses, such as Coursera's "Introduction to Software Engineering," offer certificates of completion. You can also pay for training and certification in software engineering from organizations like the IEEE Computer Society. ... Enrolling in a coding bootcamp — which is an accelerated program that promises to teach students how to code in as little as a couple of months — has become a popular path for folks who want to work in software engineering without obtaining college degrees. There are many potential downsides of coding bootcamps. They often require you to attend class full-time, which means you can't work a job while attending the bootcamp, and there is no guarantee that you'll finish successfully. Nor is there a guarantee that employers will view completion of a coding bootcamp as evidence that you're truly prepared for a software engineering job.


3 Leadership Secrets That Lead to Team Empowerment

The biggest obstacle to team empowerment is the failure of leaders to enable people to make their own decisions — most often, because they did not provide the strategic blueprint to guide those decisions. Even I can occasionally be guilty of this. When our company hits a point of change, I can be inclined to require the escalation of a decision up to me. Sometimes, that is necessary, but even after making that decision, I still need to remember to let go again and re-establish the blueprint so everyone else will be empowered to make that decision moving forward. ... Part of empowering a team is recognizing those capable of working with greater responsibility and those who might not be comfortable with the level of risk involved. It can also mean compensating high-performers at the level of their contribution if they do not want to rise to another level. In my first job with a software company, I worked with a colleague who was a phenomenal technical writer and was comfortable remaining in that role. The head of the department recognized his skill, too, and changed the pay structure so individuals could be paid at a level similar to a manager or above because of the superior level of their work.


2024 network plans dogged by uncertainty, diverging strategies

Every network operator and vendor will stand up and pledge standards support, but every vendor will at the same time design their products and strategies to pull through their whole portfolio. Add different strokes for all those vendor folks to different mission drivers, and you understand why 79 of those 83 CIOs say they don’t really have a “single network architecture model” in place, and 48 say they’re actually moving away from a standard approach. Virtualization can make the unreal look real, so why not allow multiple personalized “unreals”? Look into the virtual-network mirror and you see...yourself. Nowhere is this more visible than in the management space. A couple of decades ago, companies had a “network operations center,” and a “single pane of glass” showing network status was the goal. Only 14 of 83 enterprises said they really had a NOC today, and when asked about a single pane of glass, one CIO quipped “I have five single panes of glass!” CIOs say that the current craze in “observability” is a response to the fact that it’s become very difficult to determine what the cause of an outage is.


Overheating datacenter stopped 2.5 million bank transactions

DBS and Citibank, the banks involved, experienced outages in the mid-afternoon of October 14, 2023 that resulted in full or partial unavailability of online banking apps for around two days – leaving customers and vendors without a way to make payments in a city-state that is increasingly reliant on digital financial systems. ... The root cause of the outages was issues in the cooling system that caused the temperature to rise above optimal operating range at the Equinix datacenter used by both institutions. Equinix has reportedly blamed a contractor, alleging that person "incorrectly sent a signal to close the valves from the chilled water buffer tanks" during a planned system upgrade. Upon the outage, both banks immediately activated IT disaster recovery and business continuity plans. "However," according to Tan, "both banks encountered technical issues which prevented them from fully recovering their affected systems at their respective backup datacenters – DBS due to a network misconfiguration and Citibank due to connectivity issues." Tan concluded that the two banks had "fallen short" of MAS requirements to ensure critical IT systems are resilient. 


Breaking down data silos for digital success

To break down data and political silos to drive democratization and standardization of data, technology research and advisory firm ISG started by rolling out an initiative to build and deliver a common data platform, says Kathy Rudy, chief data and analytics officer at ISG. Those spearheading the effort briefed leaders of the business units about it and asked for their support, as they approached data owners across their businesses. In preparation for the rollout, “we inventoried our data across the organization and categorized by type, owner, platform, data usage, data formats, terminology, etc.,” Rudy says. “With that knowledge, we built a data dictionary and common taxonomy.” Having this information ahead of time was critical to build trust and cooperation from data owners, whose participation was key to the program success, Rudy says. “Our understanding of their data and its structure allowed us to have pragmatic conversations about the effort required to create the common taxonomy and data structure necessary to allow for better access, usage, and monetization of data across the company,” Rudy says.


A Practical Guide to Mitigating DevOps Backlog

Business requirements are dynamic and evolve over time, adding complexity to the DevOps backlog. ... Managing these new environments and ensuring compliance with relevant regulations adds to the workload of the DevOps team and adds to the DevOps backlog stockpile. ... The symbiotic relationship between business growth and technological advancements is undeniable. As time progresses, the architecture inevitably becomes more intricate, resulting in a growing number of tasks and challenges within the DevOps backlog. The complexities surrounding DevOps resonate with numerous companies, yet the proactive approach to addressing them remains an issue. ... Observability is the key to implementing effective monitoring practices for applications. This involves setting up alerts, capturing relevant metrics and configuring dashboards to track application performance and resource utilization. Prioritizing observability empowers teams to proactively identify and resolve any issues within the DevOps pipeline, ensuring a stable and reliable environment for continuous improvement.
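The observability practices named above (capturing metrics, setting up alerts) can be sketched with a small rolling-average threshold check. The class name, metric name, and thresholds here are invented for illustration; real teams would use a monitoring stack such as Prometheus rather than hand-rolled code.

```python
from collections import deque
from statistics import mean

class MetricAlert:
    """Fire an alert when the rolling average of a metric crosses a threshold.
    Averaging over a window avoids paging on a single noisy sample."""

    def __init__(self, name: str, threshold: float, window: int = 5):
        self.name = name
        self.threshold = threshold
        self.samples = deque(maxlen=window)
        self.alerts = []

    def record(self, value: float) -> None:
        self.samples.append(value)
        avg = mean(self.samples)
        if avg > self.threshold:
            self.alerts.append(
                f"{self.name}: rolling avg {avg:.1f} exceeds {self.threshold}")

# Hypothetical CPU utilization samples drifting past an 80% threshold.
cpu = MetricAlert("cpu_percent", threshold=80, window=3)
for sample in [40, 55, 70, 85, 95, 90]:
    cpu.record(sample)
print(cpu.alerts)
```

The design choice worth noting is the window: alerting on the smoothed value rather than each raw sample is what turns a stream of metrics into a signal a team can act on proactively.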



Quote for the day:

"Limitations live only in our minds. But if we use our imaginations, our possibilities become limitless." --Jamie Paolinetti

Daily Tech Digest - November 06, 2023

Business Continuity vs Disaster Recovery: A Guide to Key Differences

Business continuity is like an umbrella, covering every aspect of your business that could be impacted by disruptions – not just technology. Think of it as the master plan that keeps your entire operation functioning when faced with challenges. In contrast, IT disaster recovery is more specific; its focus lies in restoring systems, applications and data after an interruption occurs in tech infrastructure due to any number of causes – natural disasters, cyber-attacks or human error. The first major difference between these two concepts comes down to their scope. While business continuity covers all areas affected by potential disruptions, IT disaster recovery focuses on ensuring technological infrastructures remain functional following crises. Secondly, they have different end goals: while business continuity aims at maintaining essential functions across the organization during a crisis until normalcy returns, IT disaster recovery’s objective is getting systems back up and running post-interruption. A third distinction lies within timeframes: A Business Continuity Plan often has longer-term solutions compared to quicker response times expected from an effective Disaster Recovery Plan.


Unlocking the power of multi-cloud

In the era of digital transformation and widespread cloud migration, ensuring robust data security has become a paramount concern for enterprises. The introduction of regulations, such as the Digital Personal Data Protection Act 2023, extends the scope of compliance to smaller businesses, emphasizing the need for comprehensive data protection strategies. End-to-End Data Security Platforms: To address the evolving landscape of data security, businesses are advised to adopt full end-to-end data security platforms. These platforms serve a multifaceted role, helping organizations discover, protect, monitor, and respond to threats across on-premises and cloud environments. Structured and Unstructured Data Management: Platforms should enable the discovery and classification of both structured and unstructured data, providing a comprehensive view of data assets. This capability is crucial for effective data management and compliance efforts. Continuous Monitoring for Risk Mitigation: Implementing continuous monitoring practices is essential for reducing the risk of data breaches. This involves vigilant oversight of data access across on-premises and multiple cloud environments.


Shadow IT use at Okta behind series of damaging breaches

Okta CISO David Bradbury said: “The unauthorised access to Okta’s customer support system leveraged a service account stored in the system itself. This service account was granted permissions to view and update customer support cases. “During our investigation into suspicious use of this account, Okta Security identified that an employee had signed into their personal Google profile on the Chrome browser of their Okta-managed laptop. “The username and password of the service account had been saved into the employee’s personal Google account. The most likely avenue for exposure of this credential is the compromise of the employee’s personal Google account or personal device,” he said. Bradbury added: “We offer our apologies to those affected customers, and more broadly to all our customers that trust Okta as their identity provider. We are deeply committed to providing up-to-date information to all our customers.” Okta said its investigation had been complicated by a failure to identify file downloads in customer support vendor logs. 


Getting Aggressive with Cloud Cybersecurity

The best way to get started is by evaluating vendors that offer proactive cloud security tools and determining their capabilities, Dalling advises. He also suggests reviewing the existing cloud-native inventory and security techniques. “Work with your organization’s security operations center to determine the most effective way to integrate a proactive cloud security tool into their monitoring and incident response workflows,” Dalling adds. By adopting a proactive cloud security approach, organizations can safeguard themselves against security threats, ensure compliance, and increase customer trust, says Ravi Raghava, vice president of cloud solutions at technology integrator SAIC via email. “This approach is often more cost effective than dealing with the aftermath of a security breach, which can result in substantial financial and reputational losses.” He notes that business partners are more likely to trust organizations that prioritize the protection of their data through proactive security steps.


Lessons From 100+ Ransomware Recoveries

Your data retention policy is how long you keep data for regulatory or compliance reasons, and how you remove it when it’s no longer needed. Ransomware attackers have evolved their methods. They know you are less likely to pay out if you can quickly switch over to Disaster Recovery systems. They are now delaying detonation of ransomware to outlast typical retention policies. This is the limitation of DR solutions. While they are the fastest way to recover, they have a limited number of versions or days you can recover to. For one of our manufacturing customers – using both our BaaS and DRaaS products – the ransomware was present on their systems for around three months. That meant that every DR recovery point was compromised, and we had to recover from backups. The Recovery Time Objective (RTO) was a day. We recovered from backups, so it took longer than DR but relatively speaking, it was a fast recovery. The Recovery Point Objective (RPO), however, was from three months prior. The challenge that the organisation then faced was how to re-create that lost data. 
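The manufacturing customer's situation can be made concrete with a small sketch: given an infection date, find the newest recovery point that predates it. The dates and retention windows below are invented to mirror the scenario in the excerpt (a 90-day dwell time that outlasts a 30-day DR retention but not a year of weekly backups).

```python
from datetime import date, timedelta

def newest_clean_recovery_point(points, infection_date):
    """Return the newest recovery point taken before the malware arrived.
    Anything captured on or after infection_date is compromised."""
    clean = [p for p in points if p < infection_date]
    return max(clean) if clean else None

today = date(2023, 11, 6)
infection = today - timedelta(days=90)   # dwell time outlasted DR retention

# DR snapshots: daily, kept 30 days -- all inside the dwell window.
dr_points = [today - timedelta(days=d) for d in range(1, 31)]
# Backups: weekly, kept for a year.
backup_points = [today - timedelta(weeks=w) for w in range(1, 53)]

print(newest_clean_recovery_point(dr_points, infection))      # no clean DR point
print(newest_clean_recovery_point(backup_points, infection))  # ~13 weeks back
```

Run against these numbers, every DR snapshot is compromised, so recovery falls back to a backup roughly 13 weeks old: the RTO stays short, but the RPO becomes three months, which is exactly the data-recreation problem the customer then faced.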


Exploring the global shift towards AI-specific legislation

It is vital that the public – but moreover, all stakeholders – be involved in discussions around AI. The technology companies developing AI, for example, are likely the best placed to understand the technology fully and can help guide any such discussion. Those organizations deploying the technology must also be closely involved, as they have a particular viewpoint to offer. Governments also need to be a part of the discussion. The position of various nations can offer value and help steer the decision-making of all those governments represented in this context. Finally, let’s not forget the general public, the individuals whose data will likely be processed by the technology. All play valuable yet different roles and will come with different viewpoints that should be aired and considered. ... Legislation or any form of regulation is often seen as restrictive: by its very nature, it comprises a set of rules that govern. That is often interpreted as hindering development, innovation, and technological advancement in this context. That is a generalist, simplistic, and somewhat dismissive view.


The 10 Biggest Cyber Security Trends In 2024 Everyone Must Be Ready For Now

With the work-from-home revolution continuing, the risks posed by workers connecting or sharing data over improperly secured devices will continue to be a threat. Often, these devices are designed for ease of use and convenience rather than secure operations, and home consumer IoT devices may be at risk due to weak security protocols and passwords. The fact that industry has generally dragged its feet over the implementation of IoT security standards, despite the fact that the vulnerabilities have been apparent for many years, means it will continue to be a cyber security weak spot – though this is changing. ... Two terms that are often used interchangeably are cyber security and cyber resilience. However, the distinction will become increasingly important during 2024 and beyond. While the focus of cyber security is on preventing attacks, the growing value placed on resilience by many organizations reflects the hard truth that even the best security can’t guarantee 100 percent protection. Resilience measures are designed to ensure continuity of operations even in the wake of a successful breach. 


Andrew McAfee – ‘Human beings are chronically overconfident’

All of us, as human beings, are chronically overconfident. It’s the most common cognitive bias. That means that your brain children are going to be very, very dear to you, to the point that you’re probably unable to see the holes and the flaws. So that’s a problem. The solution is other people. This is how science works. This is why I describe one of the great geek norms simply as “science”. Science is really subjecting your ideas to the scrutiny of other people, and then having evidence-based discussions about the merits of those ideas. Is this good? Is this correct or not? One thing you can absolutely start doing is being a little less fond of your own ideas and stress testing those ideas early and often with other people. Another thing we can do is acknowledge other people’s good ideas. Just start saying, “That’s a really good idea, thanks. I hadn’t thought of that. Maybe we should take a different approach here.” Those kinds of statements are super powerful, especially when they come from leaders in an organisation, because as humans we are wired to take our cues from the people who have high status in an organisation.


IT leader’s survival guide: 8 tips to thrive in the years ahead

With so many disruptive technologies emerging at once, and IT leaders pulled in to solving so many more business challenges, it’s easy to get caught up in the fervor. But in addition to embracing change, IT leaders need to develop a multifaceted approach to navigating current technology and business challenges, says Sanjay Srivastava, chief digital strategist at Genpact. “IT leaders need to adapt by adopting a holistic approach that focuses on resilience, agility, diversification, and collaboration,” Srivastava says. “In this evolving IT investment landscape, the definition of risk has not changed, but the timeframe for response has shortened.” ... It can be difficult to adapt quickly as technology advances, while working to comply with varying regulations across state lines and borders. “The challenge is that the technology footprint — and our understanding of potentials and pitfalls — is still maturing, for instance with generative AI. It’s understandable and expected that regulations will evolve, and working through the changes coming in an otherwise long-term tech stack will be key to getting it right,” he says.


Empowered Agile Transformation – Beyond the Framework

The Executive team could be working 10 to 20 years out of date, because the expertise and experience that got them to their current position has lost its relevance in a world of accelerated change. Their approach can be to apply past experience to current problems. Their 20-year-old solutions are incompatible with contemporary problems. They need to retrain to adopt flexible systems that adapt to new challenges. Otherwise, the workers are constrained by the level of understanding of group executives, and progress is inhibited. They are impeding their teams’ potential. We have all the tools to work contemporaneously today. We have the technology, tools, and experience to leverage agility in delivering value. It is now the executive leaders and company boards resisting the new way: a more collaborative way to generate value for businesses and their customers. The solution is to understand their current customers’ problems, and identify threats to their business models, while gaining the skills and competencies to apply contemporary ways of working.



Quote for the day:

"The most difficult thing is the decision to act, the rest is merely tenacity." -- Amelia Earhart

Daily Tech Digest - November 05, 2023

Less Code Alternatives to Low Code

Embracing a “minimalist coding” philosophy is foundational. It’s anchored in a gravitation toward clarity, prompting you to identify the indispensable elements in your code, and then discard the rest. Is there a more succinct solution? Can a tool achieve this outcome with less code? Am I building something unique and valuable or rehashing solved problems? Every line of code must be viewed for the potential value it delivers and the future burden it represents. Reduce that burden by avoiding or removing code when you can and leveraging the work of others. ... Modern frameworks offer a significant enhancement to development productivity, primarily by reducing the amount of code written to perform common tasks. Additionally, the underlying code of the framework is tested and maintained by the community, alleviating peripheral maintenance burdens. The same goes for code generators; they’re not merely about avoiding repetitive keystrokes, but about ensuring that the generated code itself is consistent and efficient. 


Software Deployment Security: Risks and Best Practices

Blue-Green Deployment is a release management strategy designed to reduce downtime and risk by running two identical production environments, known as Blue and Green. At any time, only one of these environments is live, with the live environment serving all production traffic. The primary security implication of the Blue-Green deployment is the risk of data inconsistency during the switchover. If not properly managed, sensitive data could be exposed, lost or corrupted. Furthermore, because two environments are maintained, security measures must be duplicated, potentially leading to inconsistencies and vulnerabilities if not properly managed. ... Canary deployment is a strategy where new software versions are gradually rolled out to a small subset of users before being deployed to the entire infrastructure. This strategy allows teams to test and monitor the performance of the new release in a live environment with less risk. Canary deployment can expose new software versions to a smaller user base, potentially surfacing vulnerabilities before a full-scale release. If a vulnerability is exploited during this stage, it could lead to a security breach affecting a subset of users.
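The "small subset of users" in a canary rollout is usually chosen deterministically, so the same user always sees the same version. A common sketch is hash-based bucketing; the function and user IDs below are hypothetical, and production systems typically do this at the load balancer or service mesh rather than in application code.

```python
import hashlib

def route_to_canary(user_id: str, canary_percent: int) -> bool:
    """Deterministically route a fixed slice of users to the canary release.
    Hashing (rather than random choice) keeps each user's experience stable
    across requests while the rollout percentage is ramped up."""
    digest = hashlib.sha256(user_id.encode()).digest()
    bucket = (digest[0] << 8 | digest[1]) % 100  # stable bucket in 0..99
    return bucket < canary_percent

# Ramp example: send ~5% of a hypothetical user population to the canary.
users = [f"user-{i}" for i in range(1000)]
canary = sum(route_to_canary(u, 5) for u in users)
print(f"{canary} of {len(users)} users on the canary")
```

Widening the rollout is then just raising `canary_percent`; users already on the canary stay on it, which keeps monitoring comparisons clean and limits any exploited vulnerability to the bucketed subset.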


It’s time to take your genAI skills to the next level

The workforce of the future will learn AI in school, and during the next 15 years each successive generation of graduates will likely have much stronger AI kung fu than the last. In fact, my own son owns a Silicon Valley based startup called Chatterbox, which exists to teach AI literacy to kids as young as eight years old. Learning AI at that age is unimaginable to adults currently in the workforce. Young workers entering the workforce will have a vastly superior knowledge of, and ability with, AI than the workforce that went to school before the LLM-based genAI revolution of 2022 and 2023. That’s why one of the smartest things you can do now, regardless of your specific occupation, is to get very serious about learning a lot more about genAI. “Prompt engineering” — the ability to use words to get output from genAI tools — is the skill of the year. But it’s only a matter of time before basic proficiency in prompt engineering becomes commonplace and banal. It’s important to set yourself apart from the crowd by going further and really studying how generative AI works, its limitations and potentialities, and the ethical and legal issues around its output.


Why digital banking is a crucial financial literacy skill for kids

By starting early and providing guidance, parents and educators play a crucial role in helping children develop strong financial literacy skills. Digital banking not only enables children to understand the mechanics of money but also fosters a healthy relationship with finances. For instance, some innovative neobanks in India are currently providing prepaid cards that are exceptionally user-friendly and intuitive. These cards offer a unique opportunity for children to develop crucial financial literacy skills, such as prudent money management, efficient budgeting, and smart savings habits. ... The positive impact of early financial education, including digital banking literacy, on long-term financial well-being cannot be overstated. Introducing children to digital banking at a young age provides them with the knowledge and skills needed to make informed financial decisions throughout their lives. It not only equips them with the tools to navigate the cashless economy effectively but also fosters financial independence, responsibility, and resilience in the face of evolving financial challenges.


Mastering a multi-cloud environment

It is essential to understand the challenges that exist while creating a robust multi-cloud architecture. You need to incorporate the right set of tools and technologies to support workload placement across diverse platforms and services. A solid operating model to effectively manage multi-cloud use is imperative – breaking it down into process, security, technology, financial operations, and people and skills. One of the keys is aligning IT service management with your multi-cloud operating model – implementing the right technology to effectively operate, manage, monitor and secure resources and services among providers – from data management, governance and security to vendor licenses, contracts and more. ... In today’s fast-changing and threat-laden environment, a new approach to resilience is indispensable – one that helps ensure your ability to ‘bounce back’ quickly from disruptions and maintain application availability. New functional capabilities and skills to embed resilience through design are the way forward, and it will likely require businesses to give resilience greater priority as they invest in innovation.


Do we have enough GPUs to manifest AI’s potential?

The current production and availability of GPUs is insufficient to manifest AI’s ever-evolving potential. Many businesses face challenges in obtaining the necessary hardware for their operations, dampening their capacity for innovation. Even as manufacturers continue ramping up GPU unit production, many companies are already being hobbled by limited GPU availability. According to Fortune, OpenAI CEO Sam Altman privately acknowledged that GPU supply constraints were impacting the company’s business. ... Exploring alternative hardware to power AI applications presents a viable route for organizations striving for efficient processing. Depending on the specific AI workload requirements, CPUs, field-programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs) may be excellent alternatives. FPGAs, which are known for their customizable nature, and ASICs, specifically designed for a particular use case, both have the potential to effectively handle AI tasks. However, it’s crucial to note that these alternatives might exhibit different performance characteristics and trade-offs.
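In practice, "exploring alternative hardware" often starts with a simple dispatch pattern: probe for the preferred accelerator and fall back down a priority list. The sketch below is a minimal illustration in plain Python; the backend names and the `available` probe are hypothetical stand-ins (a real implementation would query an actual driver or runtime such as CUDA, an FPGA toolkit, or an ASIC vendor SDK).

```python
# Hypothetical dispatch sketch: try preferred accelerators in order,
# falling back to the CPU when nothing faster is available.

def available(backend: str) -> bool:
    """Stand-in probe. A real version would query the hardware runtime;
    here we simulate an environment where only the CPU is present."""
    return backend == "cpu"

def pick_backend(preferred=("gpu", "fpga", "asic", "cpu")) -> str:
    """Return the first backend in the preference list that is available."""
    for backend in preferred:
        if available(backend):
            return backend
    raise RuntimeError("no compute backend available")

print(pick_backend())
```

Structuring workload placement this way keeps application code unchanged when supply constraints force a switch from GPUs to one of the alternatives the excerpt mentions.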


The state of API security in 2023

Only 38% of organizations have solutions that enable them to understand the context between API activities, user behaviors, data streams, and code execution. In hyper-connected digital ecosystems, understanding this data is crucial. An anomaly in user behavior or a suspicious data flow might be early indicators of a breach attempt or a vulnerability exploitation. Moreover, the capability to tailor security responses based on dynamic threat parameters is indispensable. While generalized security protocols can thwart common threats, customized defenses based on threat actors, compromised tokens, IP abuse velocity, geolocations, IP ASNs, and specific attack patterns can be the difference between a repelled threat and a security breach. Yet most organizations do not have this capability. Lastly, companies continue to overlook the need to monitor and understand the communication patterns between API endpoints and application services. An API might be functioning as intended, but if its communication pattern is anomalous or its interactions with other services are unexpected, it could be an indicator of underlying vulnerabilities or misconfigurations.
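The kind of behavioral monitoring the survey describes — spotting an endpoint whose traffic suddenly departs from its baseline — can be reduced to a very small statistical check. The following is a minimal sketch in plain Python with invented endpoint names and counts, flagging any endpoint whose current request volume sits far above its historical mean; production systems would of course use richer features (tokens, geolocation, ASN) as the excerpt notes.

```python
# Minimal anomaly sketch: flag API endpoints whose current request count
# is more than `threshold` standard deviations above their history.
from statistics import mean, stdev

def anomalous_endpoints(history, current, threshold=3.0):
    """history: {endpoint: [past request counts]}; current: {endpoint: count}.
    Returns the endpoints whose current volume is anomalously high."""
    flagged = []
    for endpoint, counts in history.items():
        mu, sigma = mean(counts), stdev(counts)
        observed = current.get(endpoint, 0)
        if sigma > 0 and (observed - mu) / sigma > threshold:
            flagged.append(endpoint)
    return flagged

history = {
    "/api/login":  [100, 110, 95, 105, 98],   # stable baseline
    "/api/orders": [40, 42, 38, 41, 39],
}
current = {"/api/login": 2000, "/api/orders": 40}  # login traffic spikes
print(anomalous_endpoints(history, current))  # only /api/login is flagged
```

A spike like this on a login endpoint is exactly the "early indicator" the paragraph describes — possible credential stuffing rather than a functional failure.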


AI Safety? Rishi Sunak is all in for Elon Musk's work-free fantasy

“There will come a point where no job is needed,” Musk said. “You can have a job if you want to have a job, for personal satisfaction, but the AI will be able to do everything.” In this world, everyone would have what they wanted: “Not a universal basic income, we'll have universal high income.” Musk didn’t give any ideas on how that world will appear, either because he didn’t think he had to, or he didn’t want to face up to the idea that billionaires might have to learn to share. Instead, he cited the Culture science fiction novels of Iain M. Banks, which really just do the same thing better, placing people in a future quasi-Utopia without giving any suggestion of how society transitioned from capitalism to a world of freely available high-tech. The real frustration of the AI Safety Summit, on display in the Musk-Sunak show, is that knowledge is power. Musk and the tech billionaires have both, while our elected representatives have neither [even if Sunak and his wife are personally close to billionaire status themselves].

Digital risk: Time to merge cyber security and data privacy

Taking an integrated business approach to managing digital risk delivers a number of key benefits to organisations. Firstly, it can help to bring forward digital transformation initiatives because the data classification and compliance that companies are undertaking across the business for various purposes is aligned and coordinated. Secondly, a digital risk function that conducts comprehensive assessments of third-party and supply chain digital risk is better positioned to ensure that risk is considered across the organisation. One way to do this is by pre-approving vendors from a risk perspective. “Businesses can digitally transform quicker if they do the supplier approval process up front,” says James Arthur, Partner, Head of Cyber Consulting, Grant Thornton UK. “It’s a lot easier to do this if you have a single digital risk function that proactively assesses cyber security and privacy risk together.” Thirdly, businesses continue to use new technologies to seek out commercial advantage, meaning their approach to data privacy and cyber security also needs to continually evolve, to address new threats and vulnerabilities. 


Where Does Cybersecurity Fit Into the Acquisition Process?

Acquiring and integrating an outside company also means inheriting a brand-new set of cybersecurity risks -- both direct and third-party. “If we make an acquisition, a lot of our customers will request to gain some understanding of the security of the company [we] acquired,” Huber explains. How will a company manage those newly acquired risks? Answering that question takes time and comes with a learning curve. Due diligence plays a big role in uncovering those risks, but the possibility that an unknown risk will emerge following the closing of a deal is almost certain. “I think that is always going to happen,” says Huber. “It’s not [a challenge] you can really plan for other than knowing that something’s going to happen.” Acquisitions can take months or quarters from deal consideration to closing. The first part of that process involves vetting the potential fit from business and technical perspectives. Once an acquisition appears to be a promising fit, the acquiring organization must go through its entire due diligence playbook to understand the opportunities and risks associated with its target.



Quote for the day:

"If it wasn't hard, everyone would do it. The hard is what makes it great." -- Tom Hanks