Daily Tech Digest - October 11, 2023

The CIO at a crossroads: Evolve or become a dead-end job

While the CIO role is undoubtedly changing, no business can afford to let its staff go out and buy whatever technology they want. The potential risks of leaving professionals to their own devices range from burgeoning cloud-provision costs to the fear of sensitive enterprise data being pushed into public AI systems without due care and attention. Businesses need someone to ensure advanced digital technologies are exploited in a safe, secure, and cost-effective manner. And the person within the enterprise who holds that experience is still the CIO, says Richardson. “While things are now much more advanced, that core role — ensuring reliable, efficient, and secure business operations — is still crucially important,” he says. “There is certainly a very wide scope of functional and technical disciplines for modern CIOs to understand, such as cybersecurity, cloud infrastructure, AI and machine learning, end-user experience design, enterprise architecture, and more.” That’s a belief that chimes with Lily Haake, head of technology and digital executive search at recruiter Harvey Nash.


Victory by Surveillance Isn’t Possible—To Win, Engage the Adversary

Despite the profound intelligence exhibited by tools and methodologies originating in academia or from Silicon Valley innovators, they inherently maintain a passive stance. More intelligent surveillance simply will not get the job done. Moreover, our faith in AI-based solutions may well be (dramatically) overstated. As in any arms race, the enemy gains access to the same tools and techniques used in defense and turns them to offense. That leaves no room for “we are pulling ahead” thinking. As AI-designed offensive techniques come online, we might in fact fall further behind, a terrible thing to say after laying out US$200 billion on security tools last year. ... When each new tool we innovate adds to our burden as defenders but doesn’t, realistically, alter the awful trends, we need a rethink. Basic zero-sum game theory—my opponent’s gain is at my expense, and vice versa—ensures we will stay on the receiving end unless there is a cost associated with the attacker’s behavior.


Why are OpenAI, Microsoft and others looking to make their own chips?

“The obvious point is that they have some requirement nobody is serving, and I reckon it might be an inference part that’s cheaper to buy and cheaper to run than a big GPU, or even the top Sapphire Rapids CPUs, without making them beholden to either AWS or Google,” according to Omdia principal analyst Alexander Harrowell. He added that he was basing his opinion on CEO Sam Altman’s comments that GPT-4 is unlikely to scale further and would instead need enhancing. Scaling an LLM requires more compute power than inferencing a model; inferencing is the process of using a trained LLM to generate predictions or results. Further, analysts said that acquiring a large chip designer might not be a sound decision for OpenAI, as it would cost around $100 million to design the chips and get them ready for production. “While OpenAI can try and raise money from the market for the effort, the deal with Microsoft earlier this year essentially led to selling an option over half the company for $10 billion, of which some unspecified proportion is in non-cash Azure credits — not the move of a company that’s rolling in cash,” Harrowell said.


It’s Time to End the Battle Between Waterfall and Agile

Hybrid methodologies, such as the one implemented by Philips for its digital transformation initiatives, offer a mix of Agile’s flexibility and Waterfall’s structure. Philips adopted a hybrid approach for its HealthSuite digital platform, delivering rapid, iterative releases for software development while still adhering to strict documentation and safety guidelines. Combining the two gave Philips a process that was both flexible and structured, resulting in better product quality, reduced time to market, predictable costs and savings. ... A hybrid approach also allows for risk mitigation by blending Agile’s adaptability with Waterfall’s structured planning, as demonstrated by Tesla’s development of its Model 3. To build the Gigafactory, where the vehicle’s batteries are produced, Tesla utilized rigorous planning and risk assessment methods. At the same time, Tesla’s capability to update vehicle software over the air allows for rapid issue resolution and feature addition post-production. This dual strategy enables Tesla to mitigate risks effectively while maintaining flexibility in a high-stakes manufacturing landscape.


5 Focus Areas for Better Cloud Security Programs in Financial Services

Finserv organizations need a strategy for discovering employees’ usage of cloud apps or services that haven’t been authorized by the IT department – also known as Shadow IT – that may lead to unsecured data. One popular method to monitor Shadow IT usage is a Cloud Access Security Broker (CASB): an intermediary between cloud consumers and providers that enforces security policies as cloud resources are accessed. Secure web gateways (SWGs) and next-generation firewalls are other helpful tools used to inspect network traffic and provide advanced protection. But it’s not enough to just take stock of Shadow IT – organizations also need a plan for securing any unauthorized apps or services they discover. ... Finserv firms store an average of 61% of sensitive data in the public cloud – on par with other sectors. They also store similar types of vital data, though with a greater share of competitor data, confidential internal documents, personal staff information, intellectual property, government identification, payment card information and network passwords.


F5 Warns Australian IT of Social Engineering Risk Escalation Due to Generative AI

Australian IT teams can expect to be on the receiving end of social engineering attack growth. F5 said the main counter to changing bad-actor techniques and capabilities will be education, to ensure employees are made aware of increasing attack sophistication due to AI. “Scams that trick employees into doing something — like downloading a new version of a corporate VPN client or tricking accounts payable to pay some nonexistent merchant — will continue to happen,” Woods said. “They will be more persuasive and increase in volume.” Woods added that organisations will need to ensure protocols are put in place, similar to existing financial controls in an enterprise, to guard against criminals’ growing persuasive power. This could include measures such as payments over a certain amount requiring multiple people to approve. ... There have been warnings that armies of bots, supercharged by new AI tools, could be used by criminal organisations to launch more sophisticated automated attacks against enterprise cybersecurity defences, opening a new front in organisations’ war against cyber criminals.


Translating Failures into Service-Level Objectives

We can be proactive about failure, creating SLOs from chaos engineering and game days. Chaos engineering is the discipline of experimenting on a system to build confidence in the system’s capability to withstand turbulent conditions in production. We want to inject failure into our systems to see how they would react if that failure were to happen on its own. This allows us to learn from failure, document it and prepare for failures like it. We can start practicing these types of experiments with game days. A game day is a time when your team or organization comes together to do chaos engineering. It can look different for each organization, depending on the organization’s maturity and architecture: game days can take the form of tabletop exercises, open source or internal tooling such as Chaos Monkey or LitmusChaos, or vendors like Harness Chaos Engineering or Gremlin. However you go about starting this practice, you can get comfortable with failure and continue to build a culture of embracing failure at your organization. This also allows you to keep checking on those SLOs we just set up.
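The SLO bookkeeping the excerpt refers to reduces to simple error-budget arithmetic. Here is a minimal sketch (the function name, field names, and thresholds are illustrative assumptions, not anything from the article):

```python
# Minimal sketch of SLO error-budget arithmetic (illustrative names
# and thresholds, not from the article).
def error_budget_report(good_events: int, total_events: int, slo_target: float) -> dict:
    """Compare measured availability against an SLO target and report
    how much of the error budget remains."""
    availability = good_events / total_events
    budget = 1.0 - slo_target        # failure fraction the SLO allows
    burned = 1.0 - availability      # failure fraction actually observed
    return {
        "availability": availability,
        "budget_remaining": budget - burned,  # negative => SLO violated
        "slo_met": availability >= slo_target,
    }

# Example: 99.9% availability target, 9,985 good requests out of 10,000.
report = error_budget_report(9_985, 10_000, 0.999)
```

Re-running a report like this after each game day is one lightweight way to check whether an injected failure actually consumed budget, and whether the SLO targets were set realistically.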


Turning military veterans into cybersecurity experts

Cyber threat intelligence (CTI) is a specialism within cybersecurity built upon traditional military intelligence processes and theories. Because of this, anyone who has served in a traditional intelligence role in the military will already have a solid grounding in some of the hard skills needed for these roles. With support in upskilling, they can build their knowledge of IT networks and of some CTI tooling and platforms, so this career path can be very accessible. Information security management careers also require people to know how to manage and lead, which more often means managing people than managing tech. While a grounding in the technical aspects of cybersecurity is very important, ex-Forces personnel commonly possess management and leadership skills and experience, often developed over years of military service, which is evidently useful the moment they step into a cyber team. To enhance this further, most have worked with sensitive data, stored and processed on sensitive systems, shaping or at least adhering to policy, and sometimes even managing large IT accounts.


6 Pain Points for CISOs and CIOs and What to Do About Them

“Everybody is struggling with the sprawling technology stack,” says Carl Froggett, CIO of cybersecurity company Deep Instinct. Many companies are working with a mix of legacy technology, like on-premises servers, and new cloud and SaaS systems. CIOs and CISOs are faced with the operational and security challenges that come with this disparate tech stack and the migration to new systems. With that sprawl comes the challenge of data governance. What data does a company have? Where does it reside? How can it be safeguarded? If CIOs and CISOs can’t answer the first two questions, they can’t even begin to collaborate on an effective strategy for protecting their organizations’ data. ... While talent is a scarce commodity, CIOs and CISOs can leverage third parties to get the skills they do not have internally and have yet to hire. They can also find ways to automate lower-level tasks, freeing staff to spend more time on other more important, less repetitive tasks. IT leadership can also retrain and upskill existing team members.


Using Visual Studio Code for C# development

The C# Dev Kit adds more in the way of code navigation tooling, using the Solution Explorer to work with test frameworks and the Roslyn tools to jump quickly to specific parts of your application, peeking at definitions and references to understand how classes and methods are used. The Solution Explorer helps manage complex projects, using virtual solution folders to group files without affecting your underlying file system. Solution folders let you separate code from tests, as well as manage different UIs for different device targets. The IntelliCode extension adds AI-supported code completion to your editor, with the ability to predict entire lines of code based on what you’ve already written. This works alongside the normal IntelliSense features to guide code predictions, reducing the risk of errors. It will even highlight possible completions in IntelliSense and rank the members in a class based on your code to speed up selections. It’s important to understand that this is a local AI model. Unlike GitHub Copilot, IntelliCode operates disconnected from the internet, helping keep code secret and enabling you to work from anywhere.



Quote for the day:

“Identify your problems but give your power and energy to solutions.” -- Tony Robbins

Daily Tech Digest - October 10, 2023

Crafting Leaders: The finishing touches

The process of narrowing the funnel for identifying future leaders must commence soon after fresh talent is inducted into the organization, and certainly long before organizational knocks have bled the spirit, energy and desire-to-be-different from these young men and women. An earlier column explained how alternative fast-track schemes function and ways to choose and groom future leaders from early stages. More recently, I have added two codas to the exposition. When choosing leaders to face the uncertainties of tomorrow, it is not enough to capture their capabilities at the time of selection; one must also take into account the steepness of the slope they have traversed to get there. That is the best guarantee of future resilience and continued development in spite of handicaps. Moreover, constraints of time and a shortage of the right kind of teachers prevent those rising to the top of the pyramid from formally refreshing their knowledge and capabilities as frequently as they should. ... The grooming of Fast-Trackers (FTers) must vary substantially from company to company and from individual to individual.


The undeniable benefits of making cyber resiliency the new standard

"It's about practicing due care and due diligence from a cybersecurity standpoint and having a layered defense with a layered people-process-and-technology-driven program with the right governance and services and tools to enable the mission of the organization so that if there's an event, you can recover and adapt to keep business running," he adds. To do that, CISOs and their executive colleagues must have their cybersecurity basics well established -- basics such as knowing their tolerance for risk, understanding their IT environment, their security controls, their vulnerabilities, and how those all could impact the organization's operations. CISOs aren't limited to these frameworks or the assessment tools created specifically to measure cyber resiliency, says Tenreiro de Magalhaes and others. CISOs can also run tabletop drills and red-team exercises to test, measure and report on resiliency. Repeating such drills and exercises can then track whether the organization's cybersecurity program as well as specific additions to it help improve resiliency over time, experts say.


Hybrid work is in trouble. Here are 4 ways to make it work in the longer term

"We're all humans and we work with each other," he says. "To make hybrid working effective, there must be an element of interaction. There must be a connectivity, both to the business and your team." Warne says balance is essential, so find the right reasons for bringing people together in the office. "At River Island, it's about making sure that people are in for a purpose and not just presenteeism, and making sure that the people who need to work together are able to work together," he says. "If you work with a colleague, it's crucial you don't have a situation where one of you comes into the office and the other one works from home." Warne says his team doesn't have mandated days in the office. Instead, his organization's hybrid-working strategy is all about collaboration. ... However, hybrid working has allowed for an even higher level of flexibility in her organization -- and the key to success has been constant communication. Cousineau continues to listen to feedback from her team. One staff member suggested hybrid all-team meetings were creating a big divide between those who were present and those who weren't.


Evolution of stronger cyber threat actors: The flip side of Gen AI story

Deepfake technology, a subset of Generative AI, allows threat actors to create convincing video and audio forgeries. This presents a substantial threat to organisations as deepfake attacks can tarnish reputations, manipulate public opinion, and even influence financial markets. Imagine a scenario where a CEO’s voice is convincingly mimicked, disseminating false information that impacts stock prices; or consider a deepfake video of a prominent figure endorsing a product or idea they never actually supported. Such manipulations can lead to severe consequences for businesses and society at large. Generative AI is revolutionising the way malware is created. Threat actors can use AI algorithms to generate highly evasive and adaptable malware variants that can easily evade traditional signature-based antivirus solutions. These AI-generated malware strains constantly evolve, making detection and containment a significant challenge for cybersecurity professionals. Moreover, Generative AI allows for the customisation of malware based on the target environment. 


The CIO’s primary job: Developing future IT leaders

The challenge for IT management is to find people who are good at their current job but are also interested in the management side that is necessary for departmental success. In my opinion, the reason many IT departments have decided to go outside IT to bring in CIOs is that IT has not fostered the kind of environment that develops these types of professionals. IT has not traditionally tried very hard to develop strong managers from within. Most people learn to manage by watching what their managers do, and if people have bad managers, the results can be less than optimal. So how do we change that conundrum? First, we must commit our current managers and supervisors to a strong management training program. Once they have been trained in the subtleties of management, we will hopefully begin to see new managers with skills developed from within. Effective management training can, and should, be structured around techniques that current managers use to be successful. Delegating effectively and encouraging career growth among staff are two examples.


Evolution of Data Partitioning: Traditional vs. Modern Data Lakes

In modern data lakes, data is organized into logical partitions based on specific attributes or criteria, such as day, hour, year, or region. Each partition acts as a subset of the data, making it easier to manage, query, and optimize data retrieval. Partitioning enhances both data organization and query performance. Instead of relying solely on directory-based partitioning or basic column-based partitioning, these systems support complex, nested, and multi-level partitioning structures. This means that data can be partitioned using multiple attributes simultaneously, allowing for highly efficient data pruning during queries. ... Snapshots are a fundamental concept used to capture and manage different versions or states of a table at specific points in time. Snapshots are the key feature enabling time travel, data auditing, schema evolution, and query consistency within modern data lakes like Iceberg tables. Some important features of snapshots: each snapshot represents a specific version of the data table, and when you create a snapshot, it essentially freezes the state of the table at that moment.
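The pruning benefit of multi-level partitioning can be shown with a toy in-memory model. This is an illustration of the idea only, not how Iceberg or any real table format implements it (the record fields, function names, and partition keys are invented for the example):

```python
from collections import defaultdict

# Toy model of multi-level (year, region) partitioning and pruning.
# Each partition key maps to the rows stored under it.
partitions = defaultdict(list)

def write(record: dict) -> None:
    # Route each record to a partition keyed by multiple attributes.
    key = (record["year"], record["region"])
    partitions[key].append(record)

def query(year=None, region=None) -> list:
    # Prune: skip any partition whose key cannot match the predicates,
    # without ever reading the rows inside it.
    results = []
    for (y, r), rows in partitions.items():
        if (year is not None and y != year) or (region is not None and r != region):
            continue  # whole partition eliminated
        results.extend(rows)
    return results

write({"year": 2023, "region": "eu", "amount": 10})
write({"year": 2023, "region": "us", "amount": 20})
write({"year": 2022, "region": "eu", "amount": 30})

eu_2023 = query(year=2023, region="eu")  # touches exactly one partition
```

Because the partition key combines both attributes, a query constrained on year and region scans a single partition; a query constrained on only one attribute still prunes every partition that fails that predicate.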


Will Quantum Computers Become the Next Cyber-Attack Platform?

A quantum cyberattack would likely be similar to today’s identity theft and data breaches. “The only difference is that the damage would be more widespread, since quantum computers could attack a broad class of encryption algorithms rather than just the particular way that a company or data center implements the algorithm, which is how attacks are currently done,” explains Eric Chitambar, associate professor of electrical and computer engineering at the Grainger College of Engineering at the University of Illinois Urbana-Champaign. Chitambar also leads the college’s Quantum Information Group. ... Conducting an enterprise-wide quantum risk assessment to help identify systems that might be most vulnerable to a quantum attack would be a good place to start, Staab says. He also recommends deploying enterprise-wide Quantum Random Number Generator (QRNG) technology to generate quantum-resistant encryption keys. This approach promises crypto agility, implementation of Quantum Key Distribution (QKD) and the development of quantum-resistant algorithms. “As we head toward a quantum computing era, adopting a zero-trust architecture will become more important than ever,” Staab states.


6 Reasons Private LLMs Are Key for Enterprises

Private LLMs can be used with sensitive data — such as hospital patient records or financial data — and then use the power of generative AI to produce groundbreaking achievements in these fields. With the LLM running on your private infrastructure and exposed only to the people who should have access to it, you can build powerful customer-focused applications and chatbots, or simply provide an easier way for your employees to interact with your company data — without the risk of sending the data to a third party. ... With private LLMs, you can tailor the model and its responses to your company, industry or customers’ needs. Such specific information is not likely to be included in general or public LLMs. You can feed your LLM with customer support cases, internal knowledge-base articles, sales data, application usage data and much more, ensuring that the responses you receive are what you’re looking for. ... Controlling the versioning of the model you’re using is extremely important because if you change the model that you use to create embeddings, you will need to re-create (or version) all the embeddings you store.
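That last point about versioning embeddings can be made concrete with a small sketch: stamp each stored vector with the model version that produced it, and re-embed whenever the stamp is stale. Everything here (store layout, names, the stand-in embedding function) is an illustrative assumption, not any particular vector database's API:

```python
# Sketch of version-stamping stored embeddings so a model upgrade
# forces re-embedding (store layout and names are illustrative).
EMBED_MODEL_VERSION = "v2"

store = {}  # doc_id -> {"version": str, "vector": list[float]}

def fake_embed(text: str) -> list:
    # Stand-in for a real embedding-model call.
    return [float(len(text)), float(sum(map(ord, text)) % 97)]

def get_embedding(doc_id: str, text: str) -> list:
    entry = store.get(doc_id)
    if entry is None or entry["version"] != EMBED_MODEL_VERSION:
        # Missing, or created by an older model: (re)embed and stamp it.
        entry = {"version": EMBED_MODEL_VERSION, "vector": fake_embed(text)}
        store[doc_id] = entry
    return entry["vector"]

store["doc1"] = {"version": "v1", "vector": [0.0, 0.0]}  # stale entry
vec = get_embedding("doc1", "hello")  # stale stamp detected, re-embedded
```

Without the version stamp, vectors produced by different models would be compared against each other silently, degrading retrieval quality in ways that are hard to debug.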


Tech Revolution: The Rise of Automation and Its Impact on Society

To offset potential adverse effects, it is imperative for companies and governments to enact policies and initiatives supporting workers susceptible to automation’s impact. This may encompass training programs designed to furnish workers with the requisite skills to excel in the evolving job market, along with social support programs to aid those grappling with employment challenges. Public policy will emerge as a pivotal determinant of technological evolution’s trajectory and consequences. Economic incentives, education reforms, and immigration policies will directly influence productivity, employment levels, and economic mobility. ... Central and state government agencies ought to collaborate with industry partners and educational institutions to craft programs that equip new workers with the skills needed to thrive in an automation-driven world. These programs have the potential to combat emerging inequality by propelling education and training initiatives that foster success for all.


When open source cloud development doesn't play nice

Remember that the cloud provider is merely “providing” the open source software. They are not typically supporting it beyond that. For more, you’ll need to look internally or in other places. Open source users, whether in the cloud or not, often have to rely on community resources, typically provided through forums or message boards, which takes time. This can impede cloud development progress in urgent, time-sensitive scenarios or complex issues. A developer told me once that she needed to attend a meeting of the open source community before she could have a resolution to a specific problem—a meeting that was five weeks out. That won’t work. From a security standpoint, open source software can pose specific challenges. Although a community of developers regularly reviews such software, it can still harbor undetected vulnerabilities, primarily because its code is openly accessible. For instance, some open source supply chain issues arose a few years ago. These vulnerabilities can become severe security threats without stringent security measures and frequent updates. 



Quote for the day:

“Sometimes it takes a good fall to really know where you stand.” -- Hayley Williams

Daily Tech Digest - October 08, 2023

How AI is enhancing anti-money laundering strategies for improved financial security

Financial institutions collect massive volumes of transactional data daily, making it impractical for human experts to manually review each transaction for signs of money laundering. AI systems, on the other hand, can efficiently process this data, flagging transactions that exhibit unusual patterns or deviate from established norms. These AI systems utilise advanced algorithms to develop customer behaviour profiles, creating a baseline against which future transactions can be compared. Any deviation from the norm, such as sudden large transfers, frequent cash deposits, or transactions with high-risk jurisdictions, triggers an alert for further investigation. This allows institutions to focus their resources on genuinely suspicious activities rather than drowning in false positives. Analysing data to recognise suspicious activities: AI algorithms excel at analysing enormous datasets, identifying hidden patterns and correlations that could signify money laundering activities. By examining transaction history and customer behaviour, AI-enabled tools can uncover links between seemingly unrelated events.
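The baseline-and-deviation idea the excerpt describes can be sketched with a basic z-score rule. Real AML systems are far richer than this; the data, threshold, and function name below are illustrative assumptions only:

```python
import statistics

# Minimal sketch of baseline-vs-deviation transaction monitoring
# (threshold and data are illustrative, not a production AML model).
def flag_anomalies(history, new_txns, z_cut: float = 3.0):
    """Flag transactions that deviate sharply from a customer's baseline."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    flagged = []
    for amount in new_txns:
        z = (amount - mean) / stdev  # standard deviations from the baseline
        if abs(z) > z_cut:
            flagged.append(amount)   # e.g. a sudden large transfer
    return flagged

# Baseline: routine payments around 100; one 5,000 transfer stands out.
baseline = [95.0, 102.0, 98.0, 110.0, 99.0, 101.0, 97.0, 105.0]
alerts = flag_anomalies(baseline, [103.0, 5000.0])
```

The routine payment passes quietly while the outsized transfer is escalated, which is precisely the false-positive reduction the excerpt credits to behavioural baselines.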


Record Numbers of Ransomware Victims Named on Leak Sites

At current levels, 2023 is on course to be the biggest year on record for victim naming on so-called ‘name and shame’ sites since this practice began in 2019. It is expected the 10,000th victim name was posted to leak sites in late summer 2023, but this has not yet been confirmed by Secureworks. ... The 2023 report found that ransomware median dwell time was under 24 hours, representing a dramatic fall from 4.5 days during the previous 12 months. In 10% of cases, ransomware was deployed within five hours of initial access. Smith believes this trend is due to improved cyber detection capabilities, with cyber-criminals speeding up their operations to reduce the chances of being stopped before deploying ransomware. “As a result, threat actors are focusing on simpler and quicker to implement operations, rather than big, multi-site enterprise-wide encryption events that are significantly more complex. But the risk from those attacks is still high,” commented Smith.


Cloud backup and disaster recovery evolve toward maturity

At the end of the day, backup as a service is kind of just that. It operates like a regular backup application, using a schedule and point-in-time backups. DRaaS is more about failing over if something comes up as a disaster recovery process. It's designed to replicate or restore data environments automatically; it doesn't transform data in the same sense that a backup may have a particular data format. DRaaS is about moving the data from point A to point B and being able to get back to it as quickly as possible, especially in the context of a failover. ... But with the flexibility that cloud data protection affords, a lot of these solutions can essentially get updated whenever you log on because they're SaaS-based. Also, there's so much data in the cloud now and lots of investment in digital transformation, new platforms and cloud-native applications, which is triggering some rethinking of cloud data protection strategies. All of this I think is shortening the review cycles. It's actually a domino effect: Data protection follows data production. 


Mitigating Security Fatigue: Safeguarding Your Remote Team Against Cyberthreats

It’s easy for remote workers to feel disconnected from their teams and employers, which is why it’s important to keep communication consistent. Having the right collaboration tools can make all the difference in keeping remote workers engaged and more likely to follow security protocols. Video calls can help team members meet face-to-face, reducing miscommunication and misunderstandings. It’s also important to have an easy way to collaborate on projects so everyone can stay on the same page and work moves forward efficiently. Of course, any technology you use should be easy to use and easy to keep secure. With the right communication tools, your remote team members can collaborate effectively, stay connected with team members, and generally remember that they aren’t at home alone — they belong to a larger organization. This feeling of connection will encourage and remind them to implement the company’s security standards even though they work from home. As remote work becomes more popular, the need for strong security practices becomes even more vital. 

One might be inclined to believe (from the Trellix example) that the returns and competitive business risks of adopting, and of not adopting, AI in cyber-security processes are quite high from a sales perspective. This point can be rationalised by seminal academic theory in the strategic management sciences. Based on insights from the widely popular Five Forces strategy model by Michael Porter of the Harvard Business School, the threat of new entrants (Trellix competitors), product substitutes (competitor products churned from AI-driven platforms like HVS), the high bargaining power of customers (clients of Trellix-like products), and the low bargaining power of suppliers (Trellix) should push enterprises to adopt AI as a cyber-security strategy to boost sales. ... On top of everything, AI as a business strategy for modern IT/OT-driven business ecosystems has the potential to align well with certain elements of the seminal Eight-Fold strategy proposed by Michael Cusumano of the MIT Sloan School of Management for software-driven businesses.


How to Stay Ahead of the Regulatory Curve with Robust Data Governance?

Establishing a data governance culture requires the right combination of people, process, and technology. Defining the right roles and responsibilities (people) and developing the right data governance framework (process) are steps in the right direction. But without the right tools (technology), it becomes difficult at best for a data governance culture to succeed. A data catalog is a critical tool for organizations looking to establish a data governance culture. It gives business users, many of whom are not data experts, clarity on data definitions, synonyms, and essential business attributes so they can understand and use their data more effectively. Data catalogs show who owns the data, allowing for greater collaboration across the business. They provide a self-service way for everyone in the organization to find the data they need and turn what used to be tribal knowledge into useful and accessible information that they can use to make better business decisions.


Preparing for the Unexpected: A Proactive Approach to Operational Resilience

No firm can achieve operational resilience purely on its own. Intelligence sharing within the global financial community helps firms understand current and emerging threats and learn how others are mitigating them. It keeps larger institutions at the forefront of cybersecurity while arming smaller firms with knowledge and tools to protect themselves. It is so critical to operational resilience that DORA dedicates an entire article to it. Beyond regulation, the public sector is also increasingly collaborating with the private sector to protect critical infrastructure, which includes the financial sector. Around the world, organizations including the US Treasury Department's Hamilton Series and NATO's Locked Shields regularly conduct large-scale exercises to test that communication and coordination channels will function efficiently during major incidents. The goal is not only to minimize operational disruption but to proactively maintain public calm and trust. Operational risks are no longer geographically bound. Cross-border intelligence sharing and exercises help financial institutions build a comprehensive approach to operational resilience.


The Top 10 Hurdles to Blockchain Adoption

One of the most significant factors making blockchain adoption more difficult is the age profile of the average person using banking services. Unlike previous generations, the world’s current demographic is older than ever; advancements in healthcare and other factors have increased life expectancy in most regions of the world. ... Energy consumption issues remain a top problem in the market. Conservationists have repeatedly pointed out that networks that leverage the Proof-of-Work consensus algorithm are power-hungry. The reason for this consumption is that the PoW system requires users to exercise their computational power as part of the validation structure. To combat these issues, there has been a steady migration of mining farms to renewables. ... Another issue that has held back blockchain adoption is the lack of supportive legislation for these projects. When there is a lack of governmental support, financial institutions are wary of joining an industry; the main reason for the concern is that they fear later regulatory pushback.


Redefining the Framework of Innovation

The impact of ecosystems on digital disruption today does draw sharp parallels to another important technological evolution. Specifically, it brings to mind the evolution of manufacturing and distribution technology which enabled the transition from vertical integration to multi-tier supply networks. The twist is ecosystem models look forward, not back in the value chain, enabling entire new value chains. However, while there are many clear benefits of ecosystems, these business models are contractually, logistically, and commercially complex. This is especially true when you factor in the challenges of partnering with early-stage tech companies. So, where should leaders begin when considering a partnership or alliance? Take inventory of your most critical innovation paths and evaluate them against the ecosystem model. Key criteria may include needs for outside expertise and intellectual capital, a reduction in capital risk and accelerated innovation delivery to the market. Focus time and resources on selecting the right ecosystem partner. 


Identifying The Right Risk Appetite For Your Business

While risk appetite has a traditional outlook, risk tolerance (or impact tolerance) helps companies move closer to the path of resilience. If risk appetite tells us how much risk an organization is willing to take, risk tolerance expresses that risk in numbers. Essentially, tolerances are defined losses that an organization is willing to incur in meeting an objective. Every decision bears risks. If a business accepts risk or incurs loss due to a risk event that exceeds the agreed-upon risk appetite and tolerance levels, then serious fiscal, legal and reputational consequences can occur. For this reason, risk appetite should be reevaluated and reconciled whenever changes occur to strategic initiatives or the business environment. ... Risk appetite as a concept is not new, but what is trending is linking it to resilience programs so that organizations take the right amount of risk to meet business objectives while ensuring sustainability, employee health and safety and stakeholder well-being.



Quote for the day:

"The secret of success in life is for a man to be ready for his opportunity when it comes." -- Benjamin Disraeli

Daily Tech Digest - October 07, 2023

No Need to Have a 'FOBO' for AI

It is a well-known fact that before AI takes your job, someone using AI will. To stay relevant in the job market, it is therefore essential to adopt AI and automation tools to enhance one's productivity and ensure that one's job is not rendered obsolete. Here are some strategies that will help one stay ahead of the curve and compete and thrive in the fast-paced and dynamic world of employment. ... Being Human: Human beings have evolved over millennia, and embracing human emotions like empathy, gratitude, compassion and the zeal to strive for the betterment of our fellow human beings will always keep us ahead of the game. This is what distinguishes us from machines. Interdisciplinary skills: Developing skills across multiple disciplines and combining them will make one more versatile and valuable to employers. Problem Solving: It cannot be overstated that problem solving and our ability to think critically about the complex problems around us will keep us ahead of the machines.


Driving Digital Transformation Through Model-Based Systems Engineering

Digital engineering is revolutionizing important areas such as the health care industry. From sophisticated imaging devices and robotic surgical systems to telemedicine platforms that connect doctors and patients across vast distances, each of these systems depends on the integration of numerous complex components, and each must operate seamlessly to ensure optimal performance. A key approach that relates systems engineering to digital transformation and digital engineering is model-based systems engineering (MBSE). Whereas traditional systems engineering relies on document-based approaches to support systems engineering activities (e.g., text-based requirements and design documents), MBSE does so by relying on digital system models instead. In essence, MBSE supports traditional systems engineering. It doesn’t replace it; rather, it offers an approach that aims to make systems engineering more efficient. 


Optimize Your Observability Spending in 5 Steps

You can’t use an observability agent on its own to put these steps into practice. Agents are simply neutral forwarders, sending out information to be processed downstream in the observability analysis tools. You could implement some of these steps using open source tools and in-house development, but this comes with increased operational cost and complexity, requiring your team to build expertise that is not core to your business. Overall, the main challenge with putting these steps into practice is that the available tools are either like agents, which simply send information, or like observability tools, which simply receive it. You need to be able to process telemetry data in stream, to be able to transform and route it as it passes from agent to tool, to optimize and shape it for your downstream requirements. Our Mezmo Telemetry Pipelines were conceived with the goal of helping organizations get better control of their data in stream. This approach enables you to control the flow between your data sources and your observability tools, and manage in detail the optimization of your data before it arrives downstream.
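The kind of in-stream processing described above can be sketched in a few lines. This is a purely illustrative toy (not the Mezmo API or any real pipeline product): each telemetry event is transformed to trim cost-driving fields, then routed to a downstream destination based on its content, before it reaches an observability tool.

```python
# Toy in-stream telemetry pipeline: transform each event, then route it,
# before it ever reaches a downstream observability tool.

def transform(event: dict) -> dict:
    """Drop a high-cardinality field and normalize the severity tag."""
    slimmed = {k: v for k, v in event.items() if k != "request_id"}
    slimmed["level"] = slimmed.get("level", "info").lower()
    return slimmed

def route(event: dict) -> str:
    """Send errors to the (pricier) analysis tool, everything else to archive."""
    return "analysis_tool" if event["level"] == "error" else "archive"

def pipeline(events):
    sinks = {"analysis_tool": [], "archive": []}
    for event in events:
        event = transform(event)
        sinks[route(event)].append(event)
    return sinks
```

In a real deployment the sinks would be network destinations and the transforms declarative pipeline steps, but the shape is the same: agent → transform/route layer → tools.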


Why AI Regulations Are Needed to Check Risk and Misuse

Adopting a new technology poses certain risks, especially if it has not been previously deployed. That calls for risk mitigation strategies such as testing, sandboxing, proofs of concept, and taking smaller steps such as a minimum viable product before complete adoption. Mahadevan believes there will always be risks and that we "amplify the risk" to a large extent today. "Companies need to follow a framework and put together a risk mitigation panel, rather than focus on the risk itself. I insist that AI and the risk mitigation should become a part of the blueprint. And this is not a job for a CIO alone, it is a job for a CHRO, the risk manager, and for operations," Mahadevan said. Deepfakes and the violation of privacy are hotly debated topics in the industry today. Thomas said deepfakes will lead to many scams, causing victims to lose a lot of money. They also violate individual privacy and pose a substantial risk at an individual level. Deepfake technology uses a form of artificial intelligence called deep learning to create convincing video, photo or audio clips of a subject, which are used for misinformation campaigns or to defraud or deceive relatives and friends.


New kind of quantum computer made using high-resolution microscope

It is unlikely to compete any time soon with the leading approaches to quantum computing, including those adopted by Google and IBM, as well as by many start-up companies. But the tactic could be used to study quantum properties in a variety of other chemical elements or even molecules, say the researchers who developed it. At some level, everything in nature is quantum and can, in principle, perform quantum computations. The hard part is to isolate quantum states called qubits — the quantum equivalent of the memory bits in a classical computer — from environmental disturbances, and to control them finely enough for such calculations to be achieved. Andreas Heinrich at the Institute for Basic Science in Seoul and his collaborators worked with nature’s ‘original’ qubit — the spin of the electron. Electrons act like tiny compass needles, and measuring the direction of their spin can yield only two possible values, ‘up’ or ‘down’, which correspond to the ‘0’ and ‘1’ of a classical bit. 
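The up/down-to-0/1 mapping can be made concrete with a toy simulation (illustrative only, unrelated to the microscope experiment): a qubit state is a pair of amplitudes, and measurement returns 'up' (0) or 'down' (1) with probabilities given by the squared amplitudes.

```python
import math
import random

def measure(state, rng=random.random):
    """state = (amp_up, amp_down), assumed normalized; return 0 ('up') or 1 ('down')."""
    p_up = abs(state[0]) ** 2
    return 0 if rng() < p_up else 1

# A definite 'up' state always reads 0; an equal superposition reads 0 or 1
# with 50/50 probability -- the freedom to hold such superpositions is what
# makes a qubit richer than a classical bit.
up = (1.0, 0.0)
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))
```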


Net-zero carbon data centers: Expanding capacity amid evolving policy and regulation

The sting in the tail for data center developers is that emissions associated with the IT process load are now to be included in the calculation. Given that the annual energy consumption of even a modestly sized facility could run to hundreds of thousands of megawatt hours (MWh), this represents a very substantial cost for developers – unless they can drive their on-site emissions down below the 35 percent threshold. Outside of London, there is currently no policy for carbon offsetting, but it seems likely that other local authorities will follow London’s lead and introduce similar schemes in the future. In some regions, particularly the Nordics, planning policy has been introduced requiring new data centers to provide waste heat to local district heating infrastructure, or to be ‘heat network ready’ for connection to future schemes. Whilst a policy of promoting heat reuse may not lead to a direct reduction in data center emissions, it is seen as an important step towards decarbonizing the wider community, by displacing other, more carbon intensive, sources of heat.


6 Key Personality Traits for Disruptive Innovation Leaders

“Disruptive innovators require a mindset focused on leapfrogging – creating or doing something radically new or different that produces a significant leap forward,” said Hightech Partners. “Disruptive leaders ensure that everything they do adds value to the market.” ... For companies, it is important that leaders understand how to continually push the limits of their teams, organizations, and partners. Some believe that disruptive leaders should also push boundaries. “Leaders who travel a lot, surrounding themselves with diverse people and entrepreneurs, are able to continually expand their mindset and creative problem solving abilities,” said the report. ... Disruptive leaders manage incredible levels of uncertainty. “Adaptive planning is an approach where actions lead to results and leaders take the opportunity to reflect on and learn from these actions and results,” said Hightech Partners. “Then, they can modify their assumptions and approaches accordingly.” ... The word “normal” doesn’t exist in a disruptive leader’s vocabulary, says the report. “Once something has become normal, it’s probably obsolete,” said Hightech Partners. 


Enterprise architecture creating sustainable business value

“If you imagine a company with a C-suite in the penthouse and the IT department maybe in the basement, and then the business department somewhere in between, enterprise architects are able to ride the elevator and they have the capability to exit the elevator on every floor. And they are also able to move around on that floor in a very free manner. “They do have their own office somewhere. Mostly it's on the floor where the IT department is, but they're barely in their office because they're constantly sitting in other people's offices to communicate, collaborate, bring together and enable people – riding the elevator up and down. ... “Business fluency and an understanding of how a business works, as well as the ability to have a holistic perspective on a complex problem, is crucial. It is important to not only look at one aspect, but also consider how that aspect might influence another aspect. That is also something that enterprise architects are trained for like nobody else. Therefore, I believe that the success of holistic sustainability will be a discipline of enterprise architecture.”


Achieving Scalable, Agile, and Comprehensive Data Management and Governance

“Data governance in general is fairly uneven,” he explained. “In terms of protecting sensitive data, there’s been improvement, though. Organizations have been more willing to shut down risky programs that may expose sensitive data even at the expense of losing competitive advantage rather than run afoul of regulations.” As a sign of this improvement, he added, 73% of survey respondents said they were at least somewhat successful at meeting their regulatory and compliance objectives. Another key concern Stodder discussed was the highly distributed nature of today’s data environment. “Creating data silos goes hand in hand with data democratization,” he said. “Forty-one percent of our survey respondents said managing data silos was one of their top three challenges.” To address this, he said, many are turning to solutions such as data virtualization, data fabrics, or data meshes. He also added that the research showed roughly 30% already using data virtualization, with about the same share planning to.


Global Cyberespionage Operations Surging, Microsoft Warns

Microsoft reports that when it comes to cyber operations and intelligence gathering, nominal allies target each other. Despite last month's meeting between Russian President Vladimir Putin and North Korean hereditary dictator Kim Jong Un, Pyongyang continues to run Moscow-focused espionage operations, especially focused on "nuclear energy, defense and government policy intelligence collection." Alongside the risk posed by nation-state groups, the threat posed by criminals also continues to intensify. "Ransomware-as-a-service and phishing-as-a-service are key threats to businesses, and cybercriminals have conducted business email compromise and other cybercrimes, largely undeterred by the increasing commitment of global law enforcement resources," Burt said. Microsoft said that from September 2022 through July, it saw the number of human-operated or "hands on keyboard" ransomware attacks double compared to less sophisticated, fully automated attacks. Since last November, it said, it saw the number of security incidents that appeared to lead to data exfiltration double.



Quote for the day:

"Success is a state of mind. If you want success, start thinking of yourself as a success." -- Joyce Brothers

Daily Tech Digest - October 06, 2023

Cloud infrastructure spending is growing

Although I love to be right about the strong cloud spending, that does not mean it’s suitable for all enterprises. Indeed, the trend will be to overspend, even after net-new finops deployments that closely monitor where the dollars are spent. We must focus on accountability, automation, and discipline around allocating and paying for cloud resources. I suspect many cloud deployments are hugely underoptimized and need a tune-up. Even though some of this shared infrastructure spending is unavoidable, CIOs need to review how the spending occurs and look for opportunities to save dollars without reducing the value generated by these systems. I suggest companies consider all other options, such as bringing some processing into enterprise data centers. Those prices have been falling while they have been stable or rising on the public cloud side. Also, many systems function in isolation and don’t benefit much from existing within a public cloud. Simple storage is one example, and many enterprises are putting those systems on-premises these days.


BAs are responsible for creating new models that support business decisions by working closely with finance and IT teams to establish initiatives and strategies aimed at improving revenue and/or optimizing costs. Business analysts need a “strong understanding of regulatory and reporting requirements as well as plenty of experience in forecasting, budgeting, and financial analysis combined with understanding of key performance indicators,” according to Robert Half Technology. ... Business analysts need to know how to pull, analyze and report data trends, share that information with others, and apply it to business goals and needs. Not all business analysts need a background in IT if they have a general understanding of how systems, products, and tools work. Alternatively, some business analysts have a strong IT background and less experience in business but are interested in shifting away from IT into this hybrid role. The role often acts as a communicator between the business and IT sides of the organization, so having extensive experience in either area can be beneficial for business analysts.


AI Needs Data More Than Data Needs AI

While data plays a foundational role in AI, the reverse is not true. Data doesn't inherently need AI to exist or be valuable. Data, in various forms, has been collected and analyzed for centuries without the need for sophisticated AI algorithms. Data on its own can provide valuable insights and inform decision-making processes. Therefore, organizations should not blindly chase the AI hype at the cost of ignoring the importance of data management and data quality. The role of AI is to take the computation and insights of good quality data to the next level and not necessarily attempt to fix the decades-old data management processes. ... While AI relies heavily on data for its operation and evolution, data can benefit from AI in several ways. Data Management: AI can help automate data management tasks, making it easier to process, clean and organize large datasets. Predictive Insights: AI can uncover patterns and insights in data that may not be immediately apparent to humans, enhancing the value of the data.


Enterprises see AI as a worthwhile investment

Despite prior industry research indicating that 90% of AI initiatives fail to produce substantial ROI and roughly half never leave the prototype stage, the overwhelming majority of respondents to this survey (92%) find business value from their models in production and 66% feel their models have delivered results that are outstanding or exceed expectations. Common use cases for AI among these leading-edge organizations include personalizing the customer experience, fraud detection, optimizing sales and marketing and improving real-time decision making. The success of this group offers a basic roadmap that other organizations should consider when developing their own best practices, including: Approach: A majority of responding organizations have a robust, defined approach and a dedicated team for monitoring ML models in production. In fact, among larger enterprises, 71% have at least 100 people working in ML, while over half have more than 250.


5 Strategies for Cloud Security in Health Care

Adopting data security in the cloud doesn’t mean merely uploading patient data to S3 and enabling encryption. There are many security controls that need to be in place before a single patient record is migrated. For instance, there is particular concern about data security on medical devices and wireless body area networks (devices that are embedded in a patient’s body). Obviously, it’s vital to secure such devices from exploits. When running services on the cloud, you should review all relevant data privacy considerations and encryption controls, including data encryption, public-key encryption, identity-based encryption, identity-based broadcast encryption and attribute-based encryption. Then adopt a framework for achieving secure and controlled identity access using federation (like OpenID Connect, which is not the same as OpenID, or SAML). Finally, you should ensure that monitoring and audit controls are in place to maintain confidentiality. You should also have an incident response plan in place to handle crisis scenarios in the event of an incident. 
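As a toy illustration of the attribute-based idea mentioned above (an access-check analogue, not attribute-based encryption itself, and not a real policy engine; names are hypothetical), a record is released only when the requester's attributes satisfy the record's policy:

```python
def satisfies(policy: dict, attributes: dict) -> bool:
    """Every attribute the policy requires must be present and match."""
    return all(attributes.get(k) == v for k, v in policy.items())

# Hypothetical policy attached to a patient record.
record_policy = {"role": "clinician", "department": "cardiology"}
```

In attribute-based encryption the same logic is enforced cryptographically: the ciphertext can only be decrypted by keys whose attributes satisfy the policy, rather than by a server-side check that could be bypassed.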


Financial Institutions Turn to AI and Cloud to Solve Data Challenges

In data management, the potential uses of GenAI, powered by large language models, have been recognised by many financial institutions, including State Street. For instance, it can help in the cross-mapping of datasets, the classifying of data and more generalist applications such as summarising reports and responding to plain English inquiries. ... The Alpha platform uses GenAI with Snowflake as a strategic partner providing the data foundation of the platform. Snowflake’s cloud-native architecture streamlines data sharing and governance, enables faster time to market for data-centric applications, and offers a rich environment of AI and machine learning-based capabilities for data scientists, quants and engineers. “Every few years, the technology landscape re-sets, creating a small window of opportunity that in turn enables a giant leap in innovation; GenAI is the opportunity that will define the new set of industry leaders over the next decade,” State Street Executive Vice President and Chief Architect Aman Thind tells A-Team Group.


Building data center networks for GenAI fabric enablement

Building GenAI data centers from a network perspective differs greatly from traditional data center buildouts -- or even those that were designed to support high-performance computing (HPC). ... After all, the pace of a GenAI application is only as fast as its slowest component. If properly built, the network can be eliminated as a potential performance bottleneck. Building a highly scalable network is also key to GenAI data centers as it enables future growth capacity. Network switch fabrics must include hardware that can expand horizontally and vertically, as well as use network OSes on switching hardware that include advanced features, such as packet spraying, load awareness and intelligent traffic redirection. These features provide automated rerouting of traffic within the network and between GPU processing units that may become overloaded. ... Early GenAI adopters have concluded that the use of multisite or micro data centers is the best option to accommodate this level of density. And, yet again, this puts pressure on the network interconnecting these sites to be as high-performing and resilient as possible.


Breach Roundup: Still Too Much ICS Exposed on the Internet

Apple responded to an actively exploited zero-day flaw in iOS and iPadOS on Wednesday with the release of security patches. The identified vulnerability, tracked as CVE-2023-42824, exists in the kernel and may allow an attacker to elevate privileges. "Apple is aware of a report that this issue may have been actively exploited against versions of iOS before iOS 16.6," the company said. The update also addresses CVE-2023-5217, a WebRTC component issue. WebRTC is an open-source project that supports real-time computing between browsers and mobile applications, powering uses such as video and voice calling. ... Sony Interactive Entertainment alerted around 6,800 individuals about a cybersecurity breach. The intrusion resulted from an unauthorized party exploiting a zero-day vulnerability, tracked as CVE-2023-34362, in the MOVEit file transfer platform. This critical-severity SQL injection flaw, leading to remote code execution, was used by the Clop ransomware gang in widespread attacks in late May. 


8 Ways to Combat Ageism in Your Job Search

Workplace experts say candidates can combat this by showing what efforts they've made to quickly pick up new skills and show enthusiasm for future learning. That might mean enrolling in extra training courses, getting new certifications and highlighting them in your résumé or interview, North said. Younger workers may need to show that they have taken proactive measures to learn new job skills they may lack. Older workers may want to show that they can keep up with fast-paced environments and various tech tools. ... "If you don't have to input this information, don't volunteer it," he said, adding that phrases like 40-plus years of experience also may not be best. Instead, stick to your skills and experiences. If you lack experience in one area, show how your skills are transferable for this specific job. You can also be clear about any kind of transition, like a career change, or gap in employment by placing it in an executive summary section at the top of your résumé, Freeman said. Quantify your previous work's impact with numbers or qualify it by explaining how it affected the results.


Ransomware Crisis, Recession Fears Leave CISOs in Tough Spot

With a new ransomware target being attacked every 14 seconds, organizations must prioritize ransomware prevention. With its developing sophistication, mitigating ransomware is increasingly more challenging. There's no silver bullet to eradicate attacks, and having to operate in a tight market adds a layer of complexity. CISOs and security leaders must focus on the best return on investment while building out a multilayered approach for improving their overall IT security. One strategy to accomplish this is managing attack vectors using encrypted channels with preventive technologies that can stop adversaries before they have a chance to compromise networks or while they are executing their multistep campaigns. ... Ransomware gangs also take advantage of legitimate websites encrypted with SSL/TLS to look secure, but have been infected with drive-by downloads. And cybercriminals leech onto browser vulnerabilities that can lead to infection when the entry point is encrypted, allowing encrypted threats embedded with malicious payloads to go unnoticed.



Quote for the day:

“People are not lazy. They simply have important goals – that is, goals that do not inspire them.” -- Tony Robbins

Daily Tech Digest - October 05, 2023

AI and Overcoming User Resistance

If users are concerned, and even worried about AI, it could lead to user resistance, which is a dynamic that IT pros are familiar with from their history of implementing new systems that alter business processes, require employee retraining, and may even change employee jobs. So, are process change and user resistance any different when you introduce AI? I would argue yes. You’re not just retraining an employee on a new set of steps for processing an invoice or taking an order. You’re actually introducing an automated thinking process into what an employee has been doing. Now, technology is going to make or recommend decisions that the employee used to make. This can lead to employees experiencing a loss of empowerment and control. ... This is exactly the “sweet spot” that companies (and IT) should aim for with AI projects: an environment where everyone sees beneficial value from AI, and where no one feels disenfranchised. This is an achievable environment if users are engaged early in business process redefinition and in how AI will work. 


Eyes everywhere: How to safely navigate the IoT video revolution

Users are rightfully wary of bringing even more cameras into their homes and offices. The good news is that they, too, can protect their camera-enabled devices with some simple steps. First, customize. This includes changing default usernames and passwords, updating the device’s firmware and software, and staying informed about the latest security threats. This is a simple yet effective way to create a barrier between yourself and would-be hackers. Next, take it to the edge. Processing and storing data at the edge instead of the cloud is another surefire way to protect your endpoints. After all, by storing the information under your own lock and key, you can be sure about who can access it and how. Users also benefit from reduced latency by storing the information closer to home, which is particularly important with heavy video feeds. Finally, buy trusted brands. Attack surfaces are only as strong as their weakest link. So, choose companies that have a proven track record when it comes to privacy and security.


Why HTTP Caching Matters for APIs

In some caching strategies, especially for dynamic resources, the cache can store not only the complete response but also the individual elements or changes that make up the response. This approach is known as “delta caching” or “incremental caching.” Instead of sending the complete response, delta caching sends only the changes or updates made to the cached version of the resource. ... Delta caching is particularly useful for scenarios where resources change frequently, but the changes are relatively small compared to the complete resource. For example, in a collaborative document editing application, delta caching can be employed to send only the changes made by a user to a shared document, instead of sending the entire document every time it is updated. ... Caching enhances application resilience by reducing the risk of service disruptions during periods of high demand. By serving cached responses, even if the backend servers experience temporary performance issues, the application can continue to respond to a significant portion of requests from the cache. The caching layer acts as a buffer between the backend servers and the clients.
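A minimal sketch of the delta idea (illustrative names, not a standard API): rather than resending the whole resource, the server records only the lines that changed, keyed by position, and the client patches its cached copy.

```python
def make_delta(cached_lines, current_lines):
    """Record only the lines that differ from the cached version, by index."""
    changes = {
        i: line
        for i, line in enumerate(current_lines)
        if i >= len(cached_lines) or cached_lines[i] != line
    }
    return {"changes": changes, "length": len(current_lines)}

def apply_delta(cached_lines, delta):
    """Rebuild the current version from the cached copy plus the delta."""
    merged = list(cached_lines)[: delta["length"]]
    merged += [None] * (delta["length"] - len(merged))
    for i, line in delta["changes"].items():
        merged[i] = line
    return merged
```

For the collaborative-document example above, a one-line edit to a thousand-line document ships one entry instead of the whole document; real implementations use more robust diff formats (unified diffs, operational transforms), but the bandwidth argument is the same.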


Author Talks: How to speak confidently when you’re put on the spot

People become nervous for many reasons. More than 75 percent of people report being nervous in high-stakes communication, be it planned or spontaneous. Past experience could be a factor, as well as high stakes and the importance of the goals you’re trying to achieve. Those of us who study this at an academic level believe that the nervousness is wired into being human. We see this across all cultures. We see it develop typically in the early teen years and progress from there. There’s an evolutionary component to it. One of the most helpful tips is normalizing the anxiety that you feel. You’re not alone. ... My anxiety management plan has three steps. The first thing I do is hold something cold in the palms of my hand before I speak. That cools me down. Secondly, I say tongue twisters to warm up my voice and also to get myself in the moment. Third, I remind myself, “I am in service of my audience. I am here to help them.” That really gets me other-focused rather than self-focused. That’s my anxiety management plan. I encourage everybody to find a plan that works for them.


Dell customizes GenAI and focuses on data lakehouse

Being able to fine-tune as well as train generative AI is a process that relies on data, lots and lots of data. For enterprise use cases, that data isn’t just generic data taken from a public source, but rather data that an organization already has in its data centers or cloud deployments, likely spread across multiple locations. To help enable enterprises to fully benefit from data for generative AI, Dell is building out an open data lakehouse platform. The data lakehouse concept was originally pioneered by Databricks as a way of enabling organizations to more easily query data stored in cloud object storage based data lakes. The Dell approach is a bit more nuanced in that it takes a hybrid approach to data, with a goal of being able to query data across on-premises as well as multi-cloud deployments. Greg Findlen, senior VP data management at Dell, explained during the press briefing that the open data lakehouse will be able to use Dell storage and compute capabilities as well as multi-cloud storage.


Don’t try running with data before you can walk

In South Africa, data governance tends to be a grudge investment based on regulatory issues. However, organisations that don’t do the basics well, and don’t have mature data governance and established frameworks in place, may well find they are spending on analytics technologies that don’t live up to expectations. What stands in the way of getting governance right? Firstly, it’s not easy. It involves all stakeholders across all domains. It may require a mindset change, and users may need to learn to use new technology. Secondly, it can be expensive, and it may take time before the organisation sees the value of it. One of the biggest problems is that the value of data governance investments is difficult to quantify in monetary terms. ... Data products should be supported by the entire CDO capability – including the CDO, data owners and data stewards – as well as IT, to ensure the data products will add the required business value. Owners and stewards need to identify and curate the required data for the products, while also ensuring good quality data and metadata management to make it more usable for broader business.


Yes, Software Development is an Assembly Line, but not Like That

Manufacturing engineers produce assembly lines and manufacturing processes that can produce those units of value. Software engineers are largely the same, also producing systems and processes that deliver units of value. The manufactured widget of software is the discrete user interaction with a feature, not the feature itself. The assembly line in software engineering isn’t, as many think, the engineers producing features. ... Systems like Total Quality Management, which are focused on driving a cultural mindset of continuous improvement and an entire company focused on achieving very low defect rates, translate easily to customer satisfaction in software organizations. Just to pick on TQM a bit: if we were to adapt it to software, we would focus on the number of times users are impacted by a defect rather than the number of open bugs. Instead of tracking the number of defects and searching for more, we would track the number of users who either failed to receive the promised value from the product or received severely diminished value.
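The shift the article proposes, from counting bugs to counting affected users, can be restated as a metric. A small sketch (the defect records and impact numbers are invented for illustration):

```python
# Hypothetical defect records: each open defect carries an estimate of
# how many users fail to receive the promised value because of it.
defects = [
    {"id": "BUG-1", "users_impacted": 12_000, "open": True},
    {"id": "BUG-2", "users_impacted": 3, "open": True},
    {"id": "BUG-3", "users_impacted": 450, "open": False},  # already fixed
]

# The traditional metric treats BUG-1 and BUG-2 as equal; the
# impact-weighted metric shows BUG-1 dominates and should be fixed first.
open_bug_count = sum(1 for d in defects if d["open"])
users_impacted = sum(d["users_impacted"] for d in defects if d["open"])
print(open_bug_count, users_impacted)  # 2 12003
```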


Cloud Services Without Servers: What’s Behind It

“The basic idea of serverless computing has been around since the beginning of cloud computing. However, it has not become widely accepted,” explains Samuel Kounev, who heads the JMU Chair of Computer Science II (Software Engineering). But a shift can currently be observed in both industry and science: the focus is increasingly moving towards serverless computing. A recent article in Communications of the ACM, the magazine of the Association for Computing Machinery (ACM), deals with the history, status, and potential of serverless computing. Among the authors are Samuel Kounev and Dr. Nikolas Herbst, who heads the JMU research group “Data Analytics Clouds”. ... The first principle is “NoOps”, which stands for “no operations”: as described above, technical server management, including the hardware and software layers, is entirely the responsibility of the cloud provider. The second principle is “utilisation-based billing”, meaning that only the time during which the customer actively uses the allocated computing resources is billed.
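The utilisation-based billing principle can be made concrete with a back-of-the-envelope calculation. The sketch below follows the GB-second model commonly used by serverless platforms; the rate constant is illustrative, not any provider's actual price:

```python
# Hedged sketch of utilisation-based billing: the customer pays only
# for the time allocated resources are actively in use. The rate is an
# assumed figure for illustration; real providers publish their own.
def invocation_cost(duration_ms: int, memory_mb: int,
                    rate_per_gb_second: float = 0.0000166667) -> float:
    """Cost of one function invocation under a GB-second pricing model."""
    gb_seconds = (memory_mb / 1024) * (duration_ms / 1000)
    return gb_seconds * rate_per_gb_second

# One million invocations of a 128 MB function running 100 ms each
# cost roughly 21 cents -- idle time between invocations costs nothing.
total = 1_000_000 * invocation_cost(100, 128)
print(round(total, 2))  # 0.21
```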


7 sins of software development

Some software development issues can be fixed later. Building an application that scales efficiently to handle millions or billions of events isn’t one of them. Creating effective code with no bottlenecks that surprise everyone when the app finally runs at full scale requires plenty of forethought and high-level leadership. It’s not something that can be fixed later with a bit of targeted coding and virtual duct tape. The algorithms and data structures need to be planned from the beginning. That means the architects and the management layer need to think carefully about the data that will be stored and processed for each user. When a million or a billion users show up, which layer does the flood of information overwhelm? How can we plan ahead for those moments? Sometimes this architectural forethought means killing some great ideas. Sometimes the management layer needs to weigh the benefits against the costs of delivering a feature at scale. Some data analysis just doesn’t work well at large scale. Some formulas grow exponentially with more users.
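A small illustration of the warning above: a feature whose cost is pairwise (every user compared against every other, as in naive matching or deduplication) grows quadratically, so it can look fine in testing and be ruinous at scale. The numbers below are just arithmetic, not benchmarks:

```python
# A pairwise feature performs n * (n - 1) / 2 comparisons.
def pairwise_comparisons(n_users: int) -> int:
    return n_users * (n_users - 1) // 2

# A 1,000x increase in users means roughly a 1,000,000x increase in work:
for n in (1_000, 1_000_000):
    print(f"{n:>9} users -> {pairwise_comparisons(n):,} comparisons")
# 1,000 users: ~500 thousand comparisons; 1,000,000 users: ~500 billion.
```

This is the kind of cost the architects and management layer need to surface before the feature ships, not after the flood of users arrives.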


Organizations grapple with detection and response despite rising security budgets

To aid understanding and evaluation, the study categorized the responding organizations into "secure creators" and "prone enterprises." The grouping was based on the number of solutions used, the adoption of emerging technologies, and the use of technologies to simplify their automation environments. The study found that secure creators are more satisfied with their approach to cybersecurity, experience fewer cybersecurity incidents, and can detect and respond to incidents more quickly. About 70% of them are early adopters of emerging technologies. Secure creators are also more focused on extracting the most value from specific advanced solutions, with 62% already using or in the late stages of implementing AI/ML solutions, compared to only 45% of prone enterprises. "When it comes to technology, the more clutter an organization has in its armory, the harder it is to pick up signals and get on top of issues quickly," Watson said.



Quote for the day:

"You’ll never achieve real success unless you like what you’re doing." -- Dale Carnegie