Daily Tech Digest - February 23, 2023

Trends in Data Governance in 2023: Maturation Toward a Service Model

Organizations will increasingly adopt a Data Governance service model as they expand their implementations of AI technologies. The EU and U.S. plan to impose new regulations that protect consumers and affect how algorithms can ingest, use, transform, and make recommendations based on datasets. Companies have a short window to ramp up their Data Governance responses to AI, because many algorithms adjust inputs and outputs in real time. Organizations need more Data Governance preparation: only 30% of respondents to a McKinsey AI study recognized potential legal risks as relevant. Firms that remain blind to the importance of AI regulations will face increased pressure to adapt their Data Governance approaches by the end of 2023. The EU’s draft AI regulations promise to impose more considerable fines on companies that fail to comply, at 6% of global revenue instead of the 4% levied under the GDPR. Consequently, worker adoption of Data Governance updates in preparation for AI regulations, along with their engagement and feedback, will play a crucial role in 2023.


Sci-fi magazine halts new submissions after a surge in AI-written stories

Clarke acknowledged there are tools available for detecting plagiarized and machine-written text, but noted they are prone to both false negatives and false positives. OpenAI recently released a free classifier tool to detect AI-generated text, but also noted it was "imperfect" and that it was still not known whether it was actually useful. The classifier correctly identifies 26% of AI-written text as "likely AI-written" -- its true positive rate. It incorrectly identifies human-written text as AI-written 9% of the time -- its false positive rate. Clarke outlines a number of approaches publishers could take besides implementing third-party detection tools, which he thinks most short fiction markets can't currently afford. Other techniques could include blocking submissions made over a VPN or blocking submissions from regions associated with a higher percentage of fraudulent submissions. "It's not just going to go away on its own and I don't have a solution. I'm tinkering with some, but this isn't a game of whack-a-mole that anyone can 'win.' The best we can hope for is to bail enough water to stay afloat," wrote Clarke.
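Those two rates only tell part of the story: how trustworthy a "likely AI-written" flag is also depends on how common AI-written submissions actually are. A small sketch of that arithmetic (the base rates below are hypothetical, not from the article):

```python
# Illustrative arithmetic: with a 26% true positive rate and a 9% false
# positive rate, the share of flagged texts that are actually AI-written
# depends heavily on the (hypothetical) base rate of AI submissions.

def flagged_precision(tpr: float, fpr: float, base_rate: float) -> float:
    """Probability that a flagged submission is actually AI-written."""
    true_flags = tpr * base_rate
    false_flags = fpr * (1 - base_rate)
    return true_flags / (true_flags + false_flags)

# Assumed base rates: 5% vs. 50% of submissions being AI-written.
for base_rate in (0.05, 0.50):
    p = flagged_precision(0.26, 0.09, base_rate)
    print(f"base rate {base_rate:.0%}: {p:.0%} of flagged texts are AI-written")
```

With a low base rate, most flags are false alarms, which is one way to see why Clarke and OpenAI both call such tools imperfect.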


Pairing AI with Tech Pros: A Roadmap to Successful Implementation

“The technology can also automatically check the quality and interpret data where metadata is not available, interpret tabular data and summarize them with natural text and jointly interpret image, text, and tabular data,” he says. Krishna cautions that while generative AI has exciting potential, the recent focus on the technology has also reinforced the importance of responsible AI. “Going forward, organizations will be using AI methodologies to make decisions for their customers, employees, vendors and everyone associated with them,” he says. “A responsibility charter needs to be sponsored by C-suite leaders and developed through dynamic and consistent discussions led by the leaders in compliance, risk and data analytics.” Lo Giudice adds that it is important for organizational leaders and IT workers, such as software developers, to come together and decide which AI-based tools could be deployed and the strategy behind that deployment. “Developers are influencers of this, because if they get excited about it, it will win,” he says.


Platforming the Developer Experience

With intuitive, self-service workflows and all the tools developers need, they rarely, if ever, have to think about ‘the how’ of getting their software into the hands of users. And this works if and when an organization does at least a couple of things right: it prioritizes the developer experience and empowers other parts of the organization to answer the question, “How can we create the optimal developer experience?” And it puts resources behind understanding and building the best developer experience; that’s where both the developer platform and the “DevOps teams as fixers” ideas emerge. Does this mean the “optimal experience” can’t be optimized further? Does it mean developers cannot have input into their own (or more general) developer experiences? No. In fact, part of what makes the developer platform idea compelling is that developers don’t have to weigh in or make decisions on the platform or tooling. Still, it’s possible to give them that freedom if the team or organization wants to. Bottom line: there is no one-size-fits-all developer platform, any more than there is a single developer experience.


How IT professionals can change careers to cyber security

While most IT professionals will have these skills on a basic level, many will only understand them as needed for their own day-to-day work, Teale says. Therefore, additional training is sometimes necessary. Many IT professionals may not need to fork out for a cyber security degree, although certifications might be a helpful way forward. Basic foundational books and courses can offer some guidance, and an apprenticeship or course from a certified body might make sense for IT professionals who are looking to switch early in their careers, Finch says. ... There are a number of entry level courses available, such as CISMP or CompTIA, says Freha Arshad, managing director, Accenture Security in the UK. “All of the major cloud service providers offer security courses for varied levels and skill sets. With enterprises increasingly focused on the cloud, this area is also a good place to start.” In addition, says McQuade, there are free resources online to support self-learning: “HackXpert and TryHackMe provide training labs, while Cybrary offers a library of helpful videos, labs and training exams. ...”


CISOs struggle with stress and limited resources

The lack of bandwidth and resources is not only impacting CISOs, but their teams as well. ... Relentless stress levels are also affecting recruitment efforts, with 83% of CISOs admitting they have had to compromise on the staff they hire to fill gaps left by employees who have quit their jobs. More than a third of the CISOs surveyed said they are either actively looking for or considering a new role. “The results from our mental health survey are devastating but it’s not all doom and gloom. Our research found that CISOs know exactly what they need to reduce stress levels: more automated tools to manage repetitive tasks, better training, and the ability to outsource some work responsibilities,” said Eyal Gruner, CEO, Cynet. “One of the most eye-opening insights from the report was the fact that more than 50% of the CISOs we surveyed said consolidating multiple security technologies on a single platform would decrease their work-related stress levels,” Gruner added.


Making Risk Management for Agile Projects Effective

Agile claims to be risk-driven, and based on its implicit practices it lends itself to an adaptive risk management style. For instance, the adaptability of sprint planning is a response to uncertainty, “biting off a small chunk at a time” to eventually deliver the finished solution. Due to its inherent nature, Agile can mitigate some risk that occurs during the sprint cycle, but this is not the only risk that may occur during a project’s lifespan. For example, in larger enterprises, there is more risk related to the external, organizational and project environments, including corporate reputation, project financing, user adoption of business changes and regulatory compliance. Management of this type of “project” risk is not addressed in most Agile literature, which focuses on risk that may occur at the sprint level. One recent proposal to address this limitation is to adopt an Agile risk management process that tailors Agile methodologies to include project and enterprise risk management approaches in line with the risk context for the project.


Robotic Process Automation: Confluence of Automation and AI

According to Deloitte, it can lead to improved service, fewer mistakes, increased auditability, increased productivity, and lower costs. It makes it possible to have a workforce that is automated in a variety of ways around the clock. More sophisticated tools are taking the place of the outdated methods that relied on Excel sheets and macros. Additionally, functions like dashboarding, workflow, and proactive system and process monitoring are becoming increasingly important components of technology infrastructures thanks to these new tools. These “new” tools, however, frequently need to interact with older systems, which is not always possible. Extracting, formatting, shaping, and distributing the data in a way that a downstream system can consume necessitates human interaction. RPA automates this process in a more controlled, efficient, and less labor-intensive manner. Put simply, RPA bots can completely automate human actions like opening files, entering data, and copy-pasting fields.
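The extract-format-distribute step described above can be sketched in a few lines: a bot reads a legacy system's CSV export and reshapes it into JSON that a downstream system can consume. The file contents and field names here are hypothetical, purely for illustration.

```python
# Minimal sketch of an RPA-style data reshaping step: extract rows from a
# legacy CSV export, clean them up, and emit JSON for a downstream system.
import csv
import io
import json

# Hypothetical export from an older system (note the untidy whitespace).
legacy_export = """invoice_id,amount,currency
1001, 250.00 ,EUR
1002, 99.50 ,USD
"""

def reshape(csv_text: str) -> str:
    """Turn the legacy CSV into the JSON shape a downstream API expects."""
    rows = csv.DictReader(io.StringIO(csv_text))
    records = [
        {
            "id": int(r["invoice_id"]),
            "amount": float(r["amount"]),      # float() tolerates stray spaces
            "currency": r["currency"].strip(),
        }
        for r in rows
    ]
    return json.dumps({"invoices": records}, indent=2)

print(reshape(legacy_export))
```

A real bot would add the surrounding automation (watching a folder, driving a UI, posting the result), but the core transformation is this kind of mechanical reshaping.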


The Future of Network Security: Predictive Analytics and ML-Driven Solutions

ML-driven network security solutions in cybersecurity refer to the use of self-learning algorithms and other predictive technologies (statistics, time-series analysis, correlations, etc.) to automate various aspects of threat detection. The use of ML algorithms is becoming increasingly popular in scalable technologies due to the limitations of traditional rule-based security solutions. These solutions process data through advanced algorithms that can identify patterns, anomalies, and other subtle indicators of malicious activity, including new and evolving threats that may have no known bad indicators or existing signatures. Detecting known threat indicators and blocking established attack patterns is still a crucial part of overall cyber hygiene. However, traditional approaches using threat feeds and static rules can become time-consuming to maintain across all the different log sources. In addition, Indicators of Attack (IoA) or Indicators of Compromise (IoC) may not be available at the time of an attack, or may quickly become outdated.
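The core idea of anomaly-based detection, flagging behavior that deviates sharply from a learned baseline rather than matching a known-bad signature, can be shown with a toy statistical example (the traffic numbers below are invented; production systems use far richer models):

```python
# Toy anomaly detection: learn a baseline from "normal" traffic, then flag
# observations that deviate by more than a few standard deviations.
from statistics import mean, stdev

baseline = [120, 131, 118, 125, 122, 130, 127, 119]  # requests/min on normal days
mu, sigma = mean(baseline), stdev(baseline)

def is_anomalous(observed: float, threshold: float = 3.0) -> bool:
    """Flag observations more than `threshold` standard deviations from baseline."""
    return abs(observed - mu) / sigma > threshold

print(is_anomalous(124))   # typical traffic level
print(is_anomalous(540))   # sudden surge with no known signature
```

The second case would be caught even though no threat feed lists it, which is exactly the gap in signature-only approaches that the excerpt describes.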


1 in 4 CISOs Wants to Say Sayonara to Security

CISOs aren't necessarily running down alerts constantly the way their employees are, but they're overloaded with other career fatigue factors. "CISOs are constantly trying to balance high expectations against an absence of the tools needed to meet those expectations," Gartner analysts wrote in the prediction piece. "Compliance-centric cybersecurity programs, significantly low executive support, and subpar industry-level maturity are all indicators of an organization that does not view security risk management as critical to business success." One of the big factors that could have CISOs reconsidering their career trajectory in cybersecurity altogether is the fear about what will happen to their professional reputation if their company gets breached, says Diana Kelley, a veteran cybersecurity executive and co-founder and CSO of Cybrize, a cybersecurity workforce planning platform. She says CISOs and CSOs worry about "having their name dragged through the mud" after a breach, or even facing criminal charges, which feels more possible in the fallout from the conviction of Uber's Joe Sullivan last year.



Quote for the day:

"Leadership is a two-way street, loyalty up and loyalty down." -- Grace Murray Hopper

Daily Tech Digest - February 20, 2023

How quantum computing threatens internet security

“Basically, the problem with our current security paradigm is that it relies on encrypted information and decryption keys that are sent over a network from sender to receiver. Regardless of the way the messages are encrypted, in theory, someone can intercept and use the keys to decrypt apparently secure messages. Quantum computers simply make this process faster,” Tanaka explains. “If we dispense with this key-sharing idea and instead find a way to use unpredictable random numbers to encrypt information, the system might be immune. [Muons] are capable of generating truly unpredictable numbers.” The proposed system is based on the fact that the speed of arrival of these subatomic particles is always random. This would be the key to encrypt and decrypt the message, if there is a synchronized sender and receiver. In this way, the sending of keys would be avoided, according to the Japanese team. However, muon detection devices are large, complex and power-hungry, limitations that Tanaka believes the technology could ultimately overcome.
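The idea of encrypting with shared, unpredictable random numbers instead of transmitted keys is essentially the one-time pad. A sketch, with Python's `secrets` module standing in for the muon-derived randomness (in the proposal, sender and receiver would derive the same pad from synchronized muon arrival measurements, so the pad itself never crosses the network):

```python
# One-time-pad sketch: XOR the message with a truly random pad of equal
# length. Here `secrets` simulates the unpredictable numbers; in Tanaka's
# proposal both parties would derive the pad locally from muon detections.
import secrets

message = b"meet at dawn"
pad = secrets.token_bytes(len(message))  # stand-in for muon-derived randomness

ciphertext = bytes(m ^ k for m, k in zip(message, pad))
recovered = bytes(c ^ k for c, k in zip(ciphertext, pad))

assert recovered == message
print(ciphertext.hex())
```

Because XOR with a truly random, never-reused pad is information-theoretically secure, a quantum computer gains nothing by intercepting the ciphertext; the scheme's practicality hinges on both sides generating identical pads without transmission.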


Considering Entrepreneurship After a Successful Corporate Career? Here Are 3 Things You Need to Know

Many of you may be concerned that a transition could alienate your audience and force you to wait before making a move. But this is a common misconception rooted in the idea that your personal brand reflects what you do professionally. At Brand of a Leader, we help our clients shift their thinking by showing them that their personal brand is who they are, not what they do. The goal of personal brand discovery is to understand your essence and package it in a way that appeals to others. Your vocation is only one of your key talking points, and when you pivot, you simply shift those points while maintaining the essence of your brand. So, when should you start building your personal brand? The answer is simple: the sooner, the better. Building a brand takes time — time to build an audience, create visibility and establish associations between your name and consistent perceptions in people's minds. Starting sooner means you'll start seeing results faster.


Establish secure routes and TLS termination with wildcard certificates

By default, the Red Hat OpenShift Container Platform uses the Ingress Operator to create an internal certificate authority (CA) and issue a wildcard certificate valid for applications under the .apps subdomain. The web console and the command-line interface (CLI) use this certificate. You can replace the default wildcard certificate with one issued by a public CA included in the CA bundle provided by the container userspace. This approach allows external clients to connect to applications running under the .apps subdomain securely. You can replace the default ingress certificate for all applications under the .apps subdomain. After replacing the certificate, all applications, including the web console and CLI, will be encrypted using the specified certificate. One clear benefit of using a wildcard certificate is that it minimizes the effort of managing and securing multiple subdomains. However, this convenience comes at the cost of sharing the same private key across all managed subdomains.


Overcoming a cyber “gut punch”: An interview with Jamil Farshchi

Your biggest enemies in a breach are time and perfection. Everyone wants everything done in a split second. And having perfect information to construct perfect solutions and make perfect decisions is impossible. Time and perfection will ultimately crush you. By contrast, your two greatest allies are communication and optionality. Communication is being able to lay out the story of where things are, and to make sure everyone is rowing in the same direction. It’s being able to communicate the current status, and your plans, to regulators—and at the same time being able to reassure your customers and make sure they have confidence that you’re going to be able to navigate to the other side. Optionality is critical, because no one makes perfect decisions in this kind of firefight. Unless you’re comfortable making decisions that might not be right at any given point in time, you’re going to fail. [As a leader,] you need to frame up a program and the decisions you’re making in such a way that you’re comfortable rolling them back or tailoring them as you learn more, and as things progress.


7 reasons to avoid investing in cyber insurance

Two things organizations might want to consider right off the bat when contemplating an insurance policy are the cost to and benefit for the business, SecAlliance Director of Intelligence Mick Reynolds tells CSO. “When looking at cost, the recent spate of ransomware attacks globally has seen massive increases in premiums for firms wishing to include coverage of such events. Renewal quotes have, in some cases, increased from around £100,000 ($120,000) to over £1.5 million ($1.8 million). Such massive increases in premiums, for no perceived increase in coverage, are starting now to be challenged by board risk committees as to the overall value they provide, with some now deciding that accepting exposure to major cyber events such as ransomware is preferable to the cost of the associated policy.” As for benefits to the business, insurance is primarily taken out to cover losses incurred during a major cyber event, and 99% of the time these losses are quantifiable and relate predominantly to response and recovery costs, Reynolds says.


The importance of plugging insurance cyber response gaps

The insurance industry is a lucrative target because organisations hold large amounts of private and sensitive information about their policy holders, who rightfully expect their data to be kept safe and secure. It is no surprise that the industry is a key target for cyber criminals, given the massive disruption an attack can cause and the potentially high financial reward on offer. Research shows that 82 per cent of the largest insurance carriers were the focus of ransom attacks in 2022. The insurance industry is expected to become an even more favoured target, and these types of disruption will become increasingly severe. The industry has embraced innovation and new forms of technology in its practices over recent years in order to offer customers a seamless experience. In doing so, alongside the onset of remote working catalysed by the pandemic, it has increased its threat surface. ... These are just the tip of the iceberg, so when cyber criminals look to exploit data, the insurance industry is a primary target due to its huge customer base.


Value Chain Analysis: Best Practices for Improvements

To stay competitive, organizations must ensure that they have picked the right partners for each of the functions in the value chain, and that appropriate value is captured by each participant. “In addition to ensuring each participant’s value and usefulness in the chain, value chain analysis enables organizations to periodically verify that functions are still necessary, and that value is being delivered efficiently without undue waste such as administrative burden, communications costs or transit or other ancillary functions,” he says. Business leaders and IT leaders like the chief information officer and chief data officer must prove that they are benefiting the bottom line. While it is time-consuming, value chain analysis is a key method for examining company value -- an essential practice during times of high stakes and economic uncertainty. Jon Aniano, senior vice president at Zendesk, adds that running a full VCA requires analyzing and tracking a massive amount of data across the entire company.


Cybersecurity takes a leap forward with AI tools and techniques

“An effective AI agent for cybersecurity needs to sense, perceive, act and adapt, based on the information it can gather and on the results of decisions that it enacts,” said Samrat Chatterjee, a data scientist who presented the team’s work. “Deep reinforcement learning holds great potential in this space, where the number of system states and action choices can be large.” DRL, which combines reinforcement learning and deep learning, is especially adept in situations where a series of decisions must be made in a complex environment. Good decisions leading to desirable results are reinforced with a positive reward (expressed as a numeric value); bad choices leading to undesirable outcomes are discouraged via a negative cost. It’s similar to how people learn many tasks. A child who does their chores might receive positive reinforcement with a desired playdate; a child who doesn’t do their work gets negative reinforcement, such as having a digital device taken away.
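The reward/cost loop described above can be sketched with tabular Q-learning, the simple ancestor of DRL (DRL replaces the table with a neural network). The environment below is a toy invented for illustration: two states (normal, under attack), two actions (monitor, isolate), and made-up rewards.

```python
# Tabular Q-learning sketch of the reinforcement loop: good decisions earn
# positive rewards, bad ones incur negative costs, and the agent's value
# estimates (Q-table) converge toward the better policy.
import random

random.seed(0)
Q = [[0.0, 0.0], [0.0, 0.0]]          # Q[state][action]
alpha, gamma, epsilon = 0.5, 0.9, 0.1  # learning rate, discount, exploration

def step(state, action):
    """Toy dynamics: isolating during an attack is rewarded, otherwise costly."""
    if state == 1 and action == 1:
        return 0, 10     # attack contained -> positive reward
    if state == 1 and action == 0:
        return 1, -10    # attack continues -> negative cost
    if state == 0 and action == 1:
        return 0, -2     # needless isolation disrupts users
    return random.choice([0, 1]), 0    # normal ops; attacks arrive at random

state = 0
for _ in range(2000):
    # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
    if random.random() < epsilon:
        action = random.randrange(2)
    else:
        action = max((0, 1), key=lambda a: Q[state][a])
    nxt, reward = step(state, action)
    Q[state][action] += alpha * (reward + gamma * max(Q[nxt]) - Q[state][action])
    state = nxt

print("best action when under attack:", max((0, 1), key=lambda a: Q[1][a]))
```

After training, the learned policy prefers isolation when under attack, purely because of the reward signal; no rule for that behavior was ever written down.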


9 ways ChatGPT will help CIOs

“ChatGPT is very powerful out of the box, so it doesn’t require extensive training or teaching to get up to speed and handle specific business processes. A valuable initial business application for ChatGPT should be directed towards routine tasks, such as filling out a contract. It can effectively review the document and answer the necessary fields using the data and context provided by the organization. With that said, ChatGPT has the potential to shoulder administrative burdens for CIOs quickly, but it’s important to regularly measure the accuracy of its work, especially if an organization plans to use it regularly. The best way for CIOs to get started with ChatGPT is to take the time to grasp how it would work within the context of their organization before rushing to widespread adoption. At these early stages of the technology, it’s better to let it complement existing workflows under close supervision instead of restructuring around it as an end-to-end solution. 


Art Of Knowledge Crunching In Domain Driven Design

Miscommunication during knowledge crunching sessions can have different causes, such as cognitive bias, a type of error in reasoning, decision-making, and perception that occurs because of the way our brains perceive and process information. This type of bias occurs when an individual’s cognitive processes lead them to form inaccurate conclusions or make irrational decisions. For example, when betting at a roulette table, if previous outcomes have landed on red, we might mistakenly assume that the next outcome will be black; however, these events are independent of each other (i.e., previous outcomes do not affect the probability of the next one). There is also apophenia, the tendency to perceive meaningful connections between unrelated things, such as conspiracy theories, or the moment we believe we understand something when we actually do not. A good example is an image sent from Mars that includes a shape on a rock you might take for the face of an alien, when it is just a randomly shaped rock.
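The roulette example (the gambler's fallacy) is easy to check by simulation. A quick sketch, simplified to red/black only with no green zero:

```python
# Simulate independent spins and measure P(black | three reds in a row).
# Because spins are independent, the streak changes nothing: the estimate
# stays close to 0.5.
import random

random.seed(42)
RED, BLACK = 0, 1
trials = 200_000
history = [random.choice([RED, BLACK]) for _ in range(trials)]

streaks = 0
streak_then_black = 0
for i in range(3, trials):
    if history[i-3:i] == [RED, RED, RED]:   # three reds just happened
        streaks += 1
        streak_then_black += history[i] == BLACK

print(f"P(black | 3 reds) = {streak_then_black / streaks:.3f}")  # close to 0.5
```

Seeing the estimate sit at one half regardless of the streak is a concrete antidote to the bias the paragraph describes.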



Quote for the day:

"Effective team leaders adjust their style to provide what the group can't provide for itself." -- Kenneth Blanchard

Daily Tech Digest - February 19, 2023

2023 could be the breakthrough year for quantum computing

Despite progress on short-term applications, 2023 will not see error correction disappear. Far from it, the holy grail of quantum computing will continue to be building a machine capable of fault tolerance. 2023 may create software or hardware breakthroughs that will show how we’re closer than we think, but otherwise, this will continue to be something that is achieved far beyond 2023. Despite it being everything to some quantum companies and investors, the future corporate users of quantum computing will largely see it as too far off the time horizon to care much. The exception will be government and anyone else with a significant, long-term interest in cryptography. Despite those long time horizons, 2023 will define clearer blueprints and timelines for building fault-tolerant quantum computers for the future. Indeed, there is also an outside chance that next year will be the year when quantum rules out the possibility of short-term applications for good, and doubles down on the 7- to 10-year journey towards large-scale fault-tolerant systems.


Technical Debt is a Major Threat to Innovation

The challenge is instead of trying to keep the proverbial IT lights on during the COVID-19 era, IT teams are now being asked to innovate to advance digital business transformation initiatives, said Orlandini. A full 87% of survey respondents cited modernizing critical applications as a key success driver. As a result, many organizations are embracing platform engineering to bring more structure to their DevOps processes, he noted. The challenge, however, is striking a balance between a more centralized approach to DevOps and maintaining the ability of developers to innovate, said Orlandini. The issue, of course, is that in addition to massive technical debt, the latest generation of applications are more distributed than ever. The survey found 91% of respondents now rely on multiple public cloud providers for different workloads, with 54% of data residing on a public cloud. However, the survey also found on-premises IT environments are still relevant, with 20% planning to repatriate select public cloud workloads to an on-premises model over the next 12 months.


What’s Going Into NIST’s New Digital Identity Guidelines?

Both government and private industries have been collecting and using facial images for years. However, critics of facial recognition technology accuse it of racial, ethnic, gender and age-based biases, as it struggles to properly identify people of color and women. The algorithms in facial recognition tend to perpetuate discrimination in a technology meant to add security rather than adding risk. The updated NIST digital guidelines will directly address the struggles of facial recognition in particular, and biometrics overall. “The forthcoming draft will include biometric performance requirements designed to make sure there aren’t major discrepancies in the tech’s effectiveness across different demographic groups,” FCW reported. Rather than depend on digital photos for proof, NIST will add more options to prove identity. Lowering risk is as important to private industries as it is to federal agencies. Therefore, it would behoove enterprises to take steps to rethink their identity proofing.


The Past and Present of Serverless

As a new computing paradigm in the cloud era, the Serverless architecture is a naturally distributed architecture, and its working principle differs slightly from traditional architectures. In the traditional architecture, developers need to purchase virtual machine services, initialize the running environment, and install the required software (such as database software and server software). After preparing the environment, they upload the developed business code and start the application; users can then access the target application through network requests. However, if the number of application requests grows or shrinks, developers or O&M personnel need to scale the relevant resources according to the actual number of requests and add corresponding policies to the load balancing and reverse proxy modules to ensure the scaling operation takes effect in a timely manner. At the same time, these operations must be performed without affecting online users. Under the Serverless architecture, the entire application release process and the working principle change to some extent.
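Concretely, what the developer ships under Serverless shrinks to a handler function; the provisioning, scaling, and load balancing steps listed above become the platform's job. A provider-neutral sketch (the event shape and handler signature below are hypothetical, loosely modeled on common FaaS platforms):

```python
# Minimal function-as-a-service sketch: the developer writes only the
# per-request handler; the platform invokes it, scales it, and routes to it.

def handler(event: dict, context=None) -> dict:
    """Invoked once per request by the platform; no server to manage."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# The platform would call this on each incoming request, roughly like so:
print(handler({"name": "serverless"}))
```

Everything the traditional flow required before the first request (buying VMs, installing server software, wiring up load balancers) is absent from the developer's view here, which is the change in working principle the paragraph describes.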


Why Apache Beam is the next big thing in big data processing

It’s a programming model for writing big data processing pipelines that is portable and unified. What does that mean exactly? First, let’s understand the use cases for big data processing pipelines. Batch processing is a technique used in big data pipelines to analyze and process large volumes of data in batches or sets: data is collected over a period of time, and then the entire batch is processed together. Stream processing is a technique for processing data in real time as it is generated, rather than in batches: data is processed continuously as it flows through the pipeline. ... Beam offers multi-language pipelines: a pipeline constructed using one Beam SDK language that incorporates one or more transforms from another Beam SDK language. The transforms from the other SDK language are known as cross-language transforms.
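The batch/stream distinction can be shown with a stdlib-only sketch (this is plain Python, not actual Beam code): the same per-element transform applied to a bounded collection all at once versus an unbounded source consumed as elements arrive. Beam's "unified" claim is that one pipeline definition handles both cases.

```python
# Same transform, two processing modes: a bounded batch processed together,
# and a stream handled element by element as data is generated.
from typing import Iterator

def transform(x: int) -> int:
    return x * x

# Batch mode: collect everything first, then process the whole set together.
batch = [1, 2, 3, 4]
batch_result = [transform(x) for x in batch]

# Stream mode: process each element on arrival, never waiting for "the end".
def source_stream() -> Iterator[int]:
    for x in [1, 2, 3, 4]:            # stand-in for an unbounded source
        yield x

stream_result = []
for x in source_stream():
    stream_result.append(transform(x))  # handled immediately as it flows in

assert batch_result == stream_result == [1, 4, 9, 16]
```

In Beam terms, the batch list corresponds to a bounded PCollection and the generator to an unbounded one; the transform logic is written once either way.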


The Use of ChatGPT in the Cyber Security Industry

ChatGPT has also been useful in cyber defense. It can be asked to create a Web Application Firewall (WAF) rule to detect a specific type of attack. In a threat hunting scenario, the tool can create a machine learning model in a language such as Python to analyze the network traffic captured in a .pcap file and thereby identify possible malicious behavior: a network connection to an IP address already known to be malicious, which may indicate that a device is compromised; an unusual increase in brute-force attempts to access the network; and other possibilities. ... This is worrying to the point that schools in New York City have blocked access to ChatGPT out of concern about the negative impact this can have on students’ learning, since in most cases, depending on the question, the answer is provided without any effort or study.
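The brute-force case mentioned above is the kind of rule such a tool would generate: count failed logins per source IP over a window and flag the outliers. A toy sketch (the log format, addresses, and threshold are all hypothetical):

```python
# Toy brute-force detector: count FAILED logins per source IP and flag any
# IP that exceeds a threshold within the window of log lines examined.
from collections import Counter

log_lines = [
    "FAILED login user=admin src=203.0.113.7",
    "FAILED login user=root  src=203.0.113.7",
    "OK     login user=alice src=198.51.100.2",
    "FAILED login user=admin src=203.0.113.7",
    "FAILED login user=guest src=203.0.113.7",
]

THRESHOLD = 3
failures = Counter(
    line.split("src=")[1].strip()
    for line in log_lines
    if line.startswith("FAILED")
)
suspects = [ip for ip, count in failures.items() if count >= THRESHOLD]
print("possible brute force from:", suspects)
```

A production rule would add a time window and allow-listing, but the detection logic a WAF rule or hunting script encodes is of this shape.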


Is quantum machine learning ready for primetime?

Hopkins disagrees. “We are trying to apply [quantum ML] already,” he says, joining up with multiple clients to explore practical applications for such methods on a timescale of years and not decades, as some have ventured. ...  “You’re not going to fit that on a quantum computer with only 433 qubits,” says Hopkins – sufficient progress is being made each year to expand the possible number of quantum ML experiments that could be run. He also predicts that we will see quantum ML models become more generalisable. Schuld, too, is hopeful that the quantum ML field will directly benefit from recent and forthcoming advances on the hardware side. It’ll be at this point, she predicts, when researchers can begin testing quantum ML models on realistic problem sizes, and when we’re likely to see what she describes as a ‘smoking gun’ revealing a set of overarching principles in general quantum ML – one that reveals just how much we do and don’t know about the mysteries of applying these algorithms to complex, real-world problems.


Cyber Resilience Act: A step towards safe and secure digital products in Europe

Cybersecurity threats are global and continually evolving. They are targeting complex, interdependent systems that are hard to secure as threats can come from many places. A product that had strong security yesterday can have weak security tomorrow as new vulnerabilities and attack tactics are discovered. Even with a manufacturer appropriately mitigating risks, a product can still be compromised through supply chain attacks, the underlying digital infrastructure, an employee or many other ways. Microsoft alone analyzes 43 trillion security signals daily to better understand and protect against cyberthreats. Staying one step ahead requires speed and agility. Moreover, addressing digital threats requires a skilled cybersecurity workforce that helps organizations prepare and helps authorities ensure adequate enforcement. However, in Europe and across the world there is a shortage of skilled staff. Over 70% of businesses cannot find staff with the required digital skills. 


Microservices Architecture for Enterprise Large-Scaled Application

Microservices architecture is a good choice for complex, large-scale applications that require a high level of scalability, availability, and agility. It can also be a good fit for organizations that need to integrate with multiple third-party services or systems. However, microservices architecture is not a one-size-fits-all solution, and it may not be the best choice for all applications. It requires additional effort in terms of designing, implementing, and maintaining the services, as well as managing the communication between them. Additionally, the overhead of coordinating between services can result in increased latency and decreased performance, so it may not be the best choice for applications that require high performance or low latency. ... Microservices architecture is a good choice for organizations that require high scalability, availability, and agility, and are willing to invest in the additional effort required to design, implement, and maintain a microservices-based application.


Developing a successful cyber resilience framework

The difference between cyber security and cyber resilience is key. Cyber security focuses on protecting an organization from cyber attack. It involves things such as firewalls, VPNs, anti-malware software, and hygiene, such as patching software and firmware, and training employees about secure behavior. On the other hand, “cyber resilience focuses on what happens when cyber security measures fail, as well as when systems are disrupted by things such as human error, power outages, and weather.” Resiliency takes into account where an organization's operations are reliant on technology, where critical data is stored, and how those areas can be affected by disruption. ... Cyber resilience includes preparation for business continuity and involves not just cyber attacks or data breaches, but other adverse conditions and challenges as well. For example, if the workforce is working remotely due to a catastrophic scenario, like the COVID-19 pandemic, but is still able to perform business operations well and produce results in a cyber-secure habitat, the company is demonstrating cyber resilience.



Quote for the day:

"The art of communication is the language of leadership." -- James Humes

Daily Tech Digest - February 18, 2023

Oracle outages serve as warning for companies relying on cloud technology

“Oracle engineers identified a performance issue within the back-end infrastructure supporting the OCI Public DNS API, which prevented some incoming service requests from being processed as expected during the impact window,” the company said on its cloud infrastructure website. In an update, the company said it implemented "an adaptive mitigation approach using real-time backend optimizations and fine-tuning of DNS Load Management to handle current requests." Oracle said that the outage caused a variety of problems for customers. OCI customers using OCI Vault, API Gateway, Oracle Digital Assistant, and OCI Search with OpenSearch, for example, may have received 5xx-type error or failures (which are associated with server problems), Oracle said. Identity customers may have experienced issues when creating and modifying new domains. In addition, Oracle Management Cloud customers may have been unable to create new instances or delete existing instances, Oracle said. Oracle Analytics Cloud, Oracle Integration Cloud, Oracle Visual Builder Studio, and Oracle Content Management customers may have encountered failures when creating new instances.
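The 5xx errors described above are the transient, server-side kind that client applications can often ride out. As a minimal sketch (not Oracle's API; `fn` is a hypothetical callable returning an object with a `status` attribute), here is the classic retry-with-exponential-backoff pattern for such failures:

```python
import time

def call_with_retries(fn, max_attempts=4, base_delay=0.5):
    """Retry a cloud API call on 5xx-style server errors with exponential backoff.

    `fn` is assumed to return an object with a `status` attribute; any status
    in the 5xx range is treated as a transient server-side failure.
    """
    for attempt in range(max_attempts):
        resp = fn()
        if resp.status < 500:
            return resp
        if attempt < max_attempts - 1:
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
    return resp  # give up and return the last failing response

# Demo with a stub that fails twice, then succeeds
class FakeResponse:
    def __init__(self, status):
        self.status = status

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    return FakeResponse(503 if calls["n"] < 3 else 200)

result = call_with_retries(flaky, max_attempts=5, base_delay=0.01)
```

Backoff only softens short-lived failures; as the article's broader warning suggests, it is no substitute for multi-region or multi-provider contingency planning.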


EU parliamentary committee says 'no' to EU-US data privacy framework

In particular, the committee noted, the executive order is too vague, and leaves US courts — which would be the sole interpreters of the policy — wiggle room to approve the bulk collection of data for signals intelligence, and doesn’t apply to data accessed under US laws like the Cloud Act and the Patriot Act. The parliamentary committee's major points echoed those of many critics of the deal in the EU, as well as the criticism of the American Civil Liberties Union (ACLU), which has said that the US has failed to enact meaningful surveillance reform. ... In short, the committee said that US domestic law is simply incompatible with the GDPR framework, and that no agreement should be reached until those laws are more closely aligned. The committee’s negative response this week to the proposed data privacy framework, however, was a nonbinding draft resolution and, though it is a sticking point, does not put a formal halt to the adoption process, as its approval was not required to move the agreement along.


How edge devices and infrastructure will shape the metaverse experience

Cloud-native edge infrastructure can address these shortcomings and provide optimized service chaining. It can handle a tremendous amount of data processing while delivering cost-effective, terabit-scale performance and reduced power consumption. In doing so, edge computing can move past closed networking models to meet the demanding data processing requirements of the metaverse. “Edge computing allows data to be processed at or near the data source, implying that commands and processes will occur promptly. As the metaverse will require massive data simultaneously, processing data quickly and seamlessly depends on proximity,” Prasad Joshi, SVP and head of emerging technology solutions at Infosys, told VentureBeat. “Edge computing offers the ability to process such information on a headset or on the device, thereby making that immersive experience much more effective.” ... The power, space and cooling limitations of legacy architecture further exacerbate this data surge. While these challenges impact consumer-based metaverse applications, the stakes are much higher for enterprise use cases.


The New AI-Powered Bing Is Threatening Users. That’s No Laughing Matter

It’s not a Skynet-level supercomputer that can manipulate the real world. ... Those feats are impressive. But combined with what appears to be an unstable personality, a capacity to threaten individuals, and an ability to brush off the safety features Microsoft has attempted to constrain it with, that power could also be incredibly dangerous. Von Hagen says he hopes that his experience being threatened by Bing makes the world wake up to the risk of artificial intelligence systems that are powerful but not benevolent—and forces more attention on the urgent task of “aligning” AI to human values. “I’m scared in the long term,” he says. “I think when we get to the stage where AI could potentially harm me, I think not only I have a problem, but humanity has a problem.” Ever since OpenAI’s chatbot ChatGPT displayed the power of recent AI innovations to the general public late last year, Big Tech companies have been rushing to market with AI technologies that, until recently, they had kept behind closed doors as they worked to make them safer.


Machines Are Dreaming Instead of Learning

The question is—how much of the ‘data problem’ is about the quantity versus the quality of data? To deal with this data scarcity, people are moving away from accessing and using real data towards using synthetic data. In a nutshell, synthetic data is artificially generated data, produced mathematically or statistically, which appears close to real-world data. This also increases the amount of data available, which, in turn, can improve the accuracy of each model and help correct flaws in the data. There are many positive reasons to be attracted to synthetic data, such as preserving data privacy. ... One of the reasons that synthetic data is on the rise is to tackle the bias that is present in smaller datasets. However, even though larger datasets can have poor-quality data—which would require more fine-tuning and heavier workloads—synthetic data does not capture the quality and the amount of variability that is present within real-world data. Synthetic data is generated using algorithms that model the statistical properties of real data.
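As a minimal sketch of that statistical approach (a toy example, not any specific vendor's generator), the snippet below fits a multivariate normal to the means and covariance of a "real" dataset and samples synthetic rows from it — capturing the columns' means and pairwise correlations, but, as the article notes, not the full variability of real-world data:

```python
import numpy as np

def fit_and_sample(real_data: np.ndarray, n_samples: int, seed: int = 0) -> np.ndarray:
    """Generate synthetic rows by modeling real data as a multivariate normal.

    Preserves column means and pairwise covariances of the real data,
    while exposing no individual real record.
    """
    rng = np.random.default_rng(seed)
    mean = real_data.mean(axis=0)
    cov = np.cov(real_data, rowvar=False)
    return rng.multivariate_normal(mean, cov, size=n_samples)

# Toy "real" dataset: 500 rows with two correlated columns
rng = np.random.default_rng(42)
real = rng.multivariate_normal([10.0, 5.0], [[4.0, 1.5], [1.5, 2.0]], size=500)

# 2,000 synthetic rows with matching first- and second-order statistics
synthetic = fit_and_sample(real, n_samples=2000)
```

Real generators (GANs, copulas, differential-privacy-aware samplers) are far more sophisticated, but the principle is the same: model the statistics, then sample.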


Making Microservices Just the Right Size

By attempting to make smaller and simpler services, applications have become more complex. The smaller service size is a great benefit to the individual development team that owns that service, but the complex interconnection between services has made the overall system architecture more involved. We’ve essentially moved the complexity uphill. Rather than individual developers dealing with complexity at the code level, system architects deal with the complexity at the system level. Thus, services that are too large are difficult to build and understand at scale. Services that are too small simply move the complexity up to the system level. The goal, therefore, is to find the right size. It’s like the story of Goldilocks and the Three Bears; finding the right size for your services is challenging, and often involves trial and error. It’s easy to build them too big or too small. Finding the Goldilocks size can be challenging. How do you find the Goldilocks size for your microservices? The answer depends a lot on your organization and your application.


4 Ways To Be A Learning Leader

Constant curiosity makes learning simply part of you and your way of being. If you're motivated and hungry to improve your skills and knowledge, you'll learn more successfully. Professor and researcher Francesca Gino wrote, “When our curiosity is triggered, we think more deeply and rationally about decisions and come up with more-creative solutions.” Additionally, developing and demonstrating a genuine interest in people and their perspectives and interests enriches all your relationships. Start by asking yourself what you're curious about, then think about all the topics that extend from that. If this still feels hard, set an intention to ask one other-oriented question per meeting or interaction. We all consume and digest information and learning differently. Think about how you prefer to learn in given contexts. For example, do you like to just go for it? Do you like talking to other leaders, coaches or mentors? Maybe you like podcasts or reading books and articles. Discover what works best for your learning.


Malware authors leverage more attack techniques that enable lateral movement

"An increase in the prevalence of techniques being performed to conduct lateral movement highlights the importance of enhancing threat prevention and detection both at the security perimeter as well as inside networks," researchers from cybersecurity firm Picus said in their report. Many years ago, lateral movement was associated primarily with advanced persistent threats (APTs). These sophisticated groups of attackers are often associated with intelligence agencies and governments, whose primary goals are cyberespionage or sabotage. To achieve these goals, these groups typically take a long time to understand the network environments they infiltrate, establish deep persistence by installing implants on multiple systems, identify critical servers and sensitive data stores, and try to extract credentials that give them extensive access and privilege escalation. APTs also used to operate in a targeted manner, going after specific companies in specific industries that might hold the secrets their handlers are looking for.


The cost and sustainability of generative AI

More demand for AI means more demand for the resources these AI systems use, such as public clouds and the services they provide. This demand will most likely be met with more data centers housing power-hungry servers and networking equipment. Public cloud providers are like any other utility resource provider and will increase prices as demand rises, much like we see household power bills go up seasonally (also based on demand). As a result, we normally curtail usage, running the air conditioning at 74 degrees rather than 68 in the summer. However, higher cloud computing costs may not have the same effect on enterprises. Businesses may find that these AI systems are not optional and are needed to drive certain critical business processes. In many cases, they may try to save money within the business, perhaps by reducing the number of employees in order to offset the cost of AI systems. It’s no secret that generative AI systems will displace many information workers soon.


6 quantum computing questions IT needs to ask

The challenge is that the older systems' data format and fields may not be compatible with newer systems. In addition, the fields and tables might not contain what you'd expect. There is also the complexity of free-text fields that store keywords. Do not underestimate the challenge of making existing data available for quantum applications to work with. ... The important question in developing quantum applications is finding tools that can provide a 10-year lifespan with guaranteed software support. There are many open source tools for quantum-based application development. A company could take on one (or more) open source projects, but this can be a challenge and a costly commitment. The issue is not only keeping your software up to date (and retaining staff to develop it) but also developing quantum software that's compatible with the rest of your IT environment. When considering lifespan, also weigh the risk of relying on abandoned open source projects for quantum software applications.



Quote for the day:

"Leadership is an opportunity to serve. It is not a trumpet call to self-importance." -- J. Donald Walters

Daily Tech Digest - February 17, 2023

Bard, Bing, and the 90% problem

With search in particular, accuracy and thoroughness matter. One simple answer is fine — when it’s right. And when you can trust that it’s right. But it certainly seems like right now, that’s anything but the case with any of this technology. Hell, Microsoft's Bing-bot includes prominent disclaimers that it’s likely to provide inaccurate or incomplete information! And all novelty and cool factor aside, I just don’t see how that’ll make for an especially useful utility from a search context, for as long as that remains the case. ... It's really quite simple: If even one out of every 10 attempts at using something produces a flawed or for any reason unsatisfactory result, folks tend to lose faith in said thing pretty fast. And they then end up turning to another tool for the same purpose more often than not. That's why lots of us rely on Assistant for functional commands, which work fairly consistently — but when it comes to more complex searches, whether we've got Assistant at our beck and call on a phone or built into the core system interface on a Chromebook, we're still more likely to go to Google to get an answer.


EaaS as a Technique to Raise Productivity in Teams

EaaS can help you provide your application in a staging environment. Essentially, this environment is a copy of your production environment. EaaS tools simply assist you with duplicating the production environment and all of its elements (e.g., the code, settings, and deployment configurations). These technologies enable you to quickly create these environments for your clients, providing them with a trial version of your software. Consequently, even before the application is finished, you can present your products to clients more quickly. EaaS also allows developers to be more creative by providing sandbox-like environments in which they can experiment with new ideas without having to configure new environments or recreate current ones. The EaaS approach is scalable and cost-effective. You pay only for the resources you use and the time your server is online. So, if you need to submit a proof of concept to a stakeholder, you just need to pay for the time the environment is operational.


Fraudsters are using machine learning to help write scam emails in different languages

Scammers don't even need to speak the language of the people or organizations they're targeting: analysis of some prolific BEC campaigns by researchers at Abnormal Security suggests that email fraudsters are turning to machine learning-powered translation tools like Google Translate to help compose emails used in the attacks. This technique is enabling widespread BEC campaigns for an expanded array of cyber-criminal groups, who can cast a larger net at minimal cost. "Attacking targets across various regions and using multiple languages is nothing new. However, in the past, these attacks were perpetrated mainly by sophisticated organizations with bigger budgets and more advanced resources," said Crane Hassold, director of threat intelligence at Abnormal Security. ... The payment fraud campaigns have been distributed in at least 13 different languages, including Danish, Dutch, Estonian, French, German, Hungarian, Italian, Norwegian, Polish, Portuguese, Spanish, and Swedish.


Don’t Let a Cyberattack Destroy Your Pharmacy

One mistake that many independent pharmacies make is to use free Gmail addresses to transmit sensitive data, Mr. Gallagher added. The email service is not encrypted or secure, he stressed, which is why a better option is to use a private domain for company email. Similarly, he added, it’s important to choose HIPAA-compliant videoconferencing software, such as Microsoft Teams, for discussions with patients and internal meetings. Sloppy data disposal practices are another concern. “What we’ve learned from previous breaches that have happened at pharmacies is that whether it’s paper or whether it’s electronic, it’s really a good idea to ensure that the information is responsibly and securely disposed of,” said Lee Kim, JD, the senior principal of cybersecurity and privacy at the Healthcare Information and Management Systems Society, who wasn’t a presenter at NASP. “How many of us actually think, ‘Well, maybe I should ensure that everything is wiped from the photocopier before it gets serviced’? Probably not many, but if you don’t think about the small transactional things like that … people’s information is at risk.”


States sketch out roadmaps for zero trust ‘journey’

“Money doesn't solve every problem, and endless amounts of money would not instantly create a perfect world where every state has zero trust fully implemented in a very mature way,” Pugh said. “But it would help those states that are very budget strapped and have many competing priorities.” One way of assessing how far along states are in implementing zero trust is whether it is “top of mind in security conversations,” said Jim Richberg, public sector field CISO and vice president of information security at Fortinet. And by that measure, state leaders are paying attention. Those that have led the way on state-level zero trust said guidance already exists from the likes of the National Institute of Standards and Technology’s Authenticator Assurance Levels and Identity Assurance Levels. With those guidelines in place, said Adam Ford, Illinois’ chief information security officer during a National Governors’ Association webinar, states can establish a baseline for themselves, even though the system nationwide is set up so we are "50 experiments going on at the same time," he said.


Don't put off data minimization

From a risk-based perspective, the biggest exposure relates to cyberattack. This is a particular threat for law firms because cybercriminals now include you on a shortlist of prime targets. The ABA’s cybersecurity report in 2021 observed that ransomware, in particular, is “an increasing threat to lawyers and law firms of all sizes”. Microsoft revealed that state-sponsored Chinese hackers have been targeting “US-based universities, defense contractors, law firms and infectious disease researchers”. A lack of systematic data minimisation increases your attractiveness to such criminals because you present a larger, juicier target. Moreover, a cyberattack can be your biggest nightmare. It incurs lost productivity and may entail ransom demands. You’ll likely need to pay cybercrime expert fees, and potentially regulatory and professional fines. But that’s not all. A New York-based entertainment law firm suffered an attack in 2020 when hackers demanded a ransom payment of US$42 million to prevent the release of confidential information about the firm’s world-famous clients. News outlets subsequently reported that the firm eventually paid out US$365,000. And there’s the rub.


CIO role: 4 ways to do more with less

Even the best CIOs can fall victim to a common efficiency-robbing habit: getting lost in the weeds on a particular project. As CIO, you have a lot on your plate, and it’s easy to miss deadlines or deliver sub-par performance if you get too focused on details your team can – and should – handle. Assuming you have a competent, trustworthy team, let go of more minor details and remain laser-focused on your organization’s desired strategic outcomes. When CIOs feel compelled to control every detail, it can indicate a struggling organization. If a business’ IT arm is bogged down by legacy systems or an outpouring of manual and rote tasks that do nothing for business performance, the CIO will often be mired in dealing with organizational performance issues. That means more time managing internal fire drills and less time thinking strategically and making business-critical decisions. ... When you have the confidence and infrastructure to delegate details to your team, you’ll have much more bandwidth to focus on the big picture and drive your business forward.


Navigating the ever-changing landscape of digital security solutions

We see an increasingly fragmented geopolitical landscape with unique data residency requirements for each country which is resulting in localized hosting of solutions as well as nimbleness and increased granularity of data control. Regulations like GDPR and CCPA necessitate the need for not only safeguarding information (via encryption and tokenization) but also driving automated protection of PII. Recent regulations from the White House and guidance from CISA are aimed at driving better compliance with incident disclosure as well as offering a blueprint for zero trust. ... Most progressive organizations view cybersecurity as business critical and partner with organizations like ours to create a comprehensive cybersecurity strategy. In short, while there is increased oversight, both the consumers and providers of security solutions are more focused on: implementing a zero-trust approach, instituting automated protection of information and taking a partnership posture as opposed to a traditional vendor-buyer approach.


Cybersecurity Jobs Remain Secure Despite Recession Fears

"With reports of job cuts at organizations including Twitter, Meta, Microsoft, Amazon and Google, cybersecurity staff could benefit from proactive hiring targeted towards those recent layoffs," the report stated. "With so many tech jobs impacted by recent layoffs, it is possible that many of those individuals may find opportunity in pursuing a career in cybersecurity, where they can apply related skills and expertise." The resilience in demand for cybersecurity professionals comes as many workers burned out and resigned, part of the Great Resignation in 2022. Organizations that lost valuable specialists did so for three main reasons, Rosso says. Cybersecurity teams have traditionally not had great career advancement opportunities, so their ability to gain promotions and increased salaries at their current company are often limited. In addition, the culture surrounding many security teams has often led to burnout and mental stress, she says. "We know, for example, that at the end of 2021 and beginning of 2022, the Log4j issue was causing people to clock a lot of hours, and that led to some burnout," she says. 


Why Your Organization Needs to Embrace Data Resiliency

Enterprises should take a holistic approach to understanding their data: how it's gathered, how it's used throughout the organization, and how it's impacted by a lack of availability or corruption, Krishnamoorthy says. “This starts with creating a detailed map of business processes, applications, systems, and data,” he suggests. Schick notes that there's no industry-standard checklist for ensuring data resiliency, but advises separating critical and non-critical data, storing data in separate locations, logging transactions that change critical data, and using tools and processes to quickly recover corrupted or lost data. Enterprises should retain data only for as long as it's needed, O'Hern suggests. “We eliminate risk when we purge … which means it no longer exists to be held hostage.” Krishnamoorthy notes that it's also important to understand how applications, automated tools and systems, and IT staff interact with enterprise data from manageability, serviceability, and security perspectives. 



Quote for the day:

"Nothing is so potent as the silent influence of a good example." -- James Kent

Daily Tech Digest - February 16, 2023

Eyes on Data: The Elevated Role of Data Management and the CDO

As data becomes more prevalent for every single employee of every organization, it is imperative that organizations go beyond data governance to develop a strong data-driven culture. The importance of a data-driven culture was identified as a key factor in overall success. Data culture starts at the top. Senior executives must establish a data mindset across the firm, emphasizing the importance of a sound data management discipline. Getting the most out of an organization’s data means investing in the programs that support it and the people who are tasked with using it to ensure strong data awareness and literacy. Without a focus on data literacy, organizations are at risk of coming up short in achieving their objectives. ... Today’s data management professionals are assuming more and more responsibility for the public’s data. It is critical, therefore, that firms take responsibility for the ethical access and use of this data and do everything they can to avoid unintentional outcomes due to poor data quality, lack of data analytic model governance, or hidden data biases. 


What are the biggest challenges organizations face in their digital transformation efforts?

The leadership should give a big safety net to everyone by saying ‘Hey, we are going on this journey, we are going to learn a lot and, if you fail, if you have issues, that’s okay. We’ll cover you, we are on your side, let’s just go through this learning journey first.’ So give that safety net for everyone. At the same time, provide some kind of framework for learning. You can’t just say to a whole organization we are going to be transforming ourselves and we are going to be DevOps-enabled and just leave it at that. You should have a program, and some kind of learning mechanism, and probably some outside training if that is needed. You should have days set aside, maybe even give employees 80% of the time to do normal work but 20% to learn something new. This framework of learning and enabling is really important for people to upskill themselves. Think in a different way and basically be happy about the journey that they are on because once people are motivated and happy, then a bunch of stuff starts happening.


Operational Resilience: More than Disaster Recovery

The broader focus of operational resilience requires organisation-wide participation. You cannot simply leave it to a single department or team. Instead, everyone needs to be involved, from executives and the board of directors to individual employees in multiple departments. In today’s climate, it’s not just your own organisation that’s under threat. Your suppliers, partners, and vendors are targets, too. If a major supplier is compromised or taken down, your business might go down with them. Leadership needs to understand risk and to know the risk tolerance and risk appetite of the company. That even includes things such as procurement functions and agreements with third-party suppliers. Resilience must be built into everything down to everyday workflows, and if a single supplier is insufficient to manage risk, then diversity of supply is a must. There are many cases where a cyber event at a supplier rendered multiple organisations unable to fulfil their business outcomes. ... In the context of operational resilience, every risk management scenario and process must consider the supply chain.


Helping users and organizations build an instinctive data privacy habit

Although there’s no one-size-fits-all measurement metric for engagement, there are a few guidelines we typically use. The first is tracking the level of engagement and participation from our stakeholders, including consumers, businesses, and other non-profit organizations we partner with. We look at the number of events and activities hosted (e.g., webinars, panel discussions, etc.) and how well attended those are, as well as media/social media mentions as a bellwether to gauge the reach and impact of the campaign. Additionally, we also measure success by the extent to which we’ve achieved the objectives and goals we set at the beginning of each campaign. It’s an abstract metric, but for Data Privacy Week, our mission is to raise awareness and educate individuals and organizations about the importance of protecting personal information and championing transparency by companies relative to how they collect, store, and use consumer data.


6 things ChatGPT can't do (and another 20 it refuses to do)

It won't write about anything after 2021 - As an AI language model, I was trained on a large dataset of text that only goes up to 2021. This means that I don't have access to information or events that occurred after that date. While I have access to a lot of historical data and information, I don't have access to current events, news, or updates that have occurred since my training data cutoff date. ... It won't predict future results of sports events or political contests - As an AI language model, I can process and analyze large amounts of data, including historical data on sports events and political contests. However, making accurate predictions for these types of events is not always straightforward. While historical data can be used to train machine learning models to identify patterns and correlations, it cannot account for unforeseeable events, changes in circumstances, or new variables that may impact the outcome. Additionally, humans make decisions and can change their behavior based on many factors, making it difficult to predict the outcome of an event with complete certainty.


8 signs you’ve mistimed a major IT initiative

Indecision and a poor risk appetite are two common reasons why CIOs frequently delay an important IT initiative. “Nimble, forward-looking leaders put execution speed at the center of performance discussions; indecisive IT leaders tend to rely on consensus decisions and endless risk evaluations,” says Colm Sparks-Austin, president and managing director of business advisory firm Capgemini Canada. To correctly time a major IT initiative, the decision-maker should align the initiative with business goals. “If the business isn’t spearheading the initiative, or is not aware of it, it’s clear that something is wrong,” Sparks-Austin says. CIOs should also ensure they’re analyzing all IT spend through a business goals lens, Sparks-Austin advises.  ... Unrealistic funding almost always plays an important role in initiative timing, observes Ravi Malick, CIO at cloud-based content management, collaboration, and file-sharing tool provider Box. Overly optimistic funding is almost always a main part of the equation when an initiative fails, he notes. 


How to make progress on managing unstructured data

“As the CIO, your job is to be able to provide the information a business needs in order to make decisions,” Minetola said. “The ability to now see into that 80 per cent of the data and make decisions based off that . . . is significant.” ... When thinking about all the data sources an organisation needs to grapple with as part of its transformation, it makes sense. For instance, consider a bank with thousands of computer systems in over a hundred countries. “You need technologies that close silos,” Evelson said. “Whenever we talk about digital transformation, data and analytics platforms that unify everything that I just talked about, like search-powered technologies, are at the top of everyone’s mind.” ... Search-powered technology should bring two critical capabilities to the table: a visualisation layer and machine learning. Visualisation improves the ability to extract insights from large volumes of data. “It’s one thing to be able to have data,” Minetola said. “It’s another thing to understand it.” Furthermore, machine learning such as natural language processing or vector search can help join data sources to create more relevance and context.
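To make the vector-search idea concrete, here is a minimal sketch using hypothetical embeddings and plain cosine similarity in NumPy (a real deployment would use learned text embeddings and an approximate-nearest-neighbor index rather than brute force):

```python
import numpy as np

def cosine_match(queries: np.ndarray, corpus: np.ndarray) -> np.ndarray:
    """For each query embedding, return the index of the closest corpus embedding.

    Rows are L2-normalized so the dot product equals cosine similarity.
    """
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    c = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    return np.argmax(q @ c.T, axis=1)

# Hypothetical 3-d embeddings for records living in two separate silos
crm_records = np.array([[0.9, 0.1, 0.0],
                        [0.0, 1.0, 0.2]])
support_tickets = np.array([[0.0, 0.9, 0.3],
                            [1.0, 0.2, 0.1],
                            [0.1, 0.1, 0.9]])

# Link each CRM record to its semantically closest support ticket
matches = cosine_match(crm_records, support_tickets)  # → [1, 0]
```

Because the match is on meaning rather than exact keys, this is how vector search can join sources that share no common identifier — the "closing silos" the excerpt describes.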


What Ukraine's IT Industry Can Teach CIOs About Resilience

The agile, remote structure refined during the pandemic has served Ukrainian IT companies well as they operate with a hybrid workforce -- some employees live abroad, some are on the move due to Russian attacks, and others serve in the military. Unlike traditional industries, many IT jobs are service-oriented. “All you need is a computer, Internet, and electricity. You can literally work from anywhere,” Kavetskyi says. Both companies and individuals have engaged in a sustained process of business continuity planning. Now, most organizations have it down to a science. “They have power generators in their offices and Starlinks,” Kavetskyi says. He emphasizes the power of knowledge sharing: “The IT clusters helped small and medium-sized companies implement basic continuity plans. Everyone working in this industry had a chance to see what others were doing.” “Of course, there was data that couldn't be shared,” he adds. “But in general, big companies were willing to [share their strategies]. Mainly, we had to find time to organize those meetings, considering the logistical challenges.”


Soft skills: How well-rounded IT pros can push your business forward

With organizational spend under greater scrutiny, it’s critical for every new hire to add value to the business. Productivity and technical skills are paramount in demonstrating resource value. But when you have two candidates with comparable technical skills, you need to consider the value each person’s soft skills bring to the table. ... Soft skills shape how teams communicate, collaborate, and solve problems, and these capabilities determine the success of your IT projects and client relationships -- and, ultimately, your organizational culture. Company culture also plays a crucial role in your brand reputation: You want clients and job candidates to view your team as pragmatic, business-minded problem solvers and communicators. So as non-technical skills continue to play a critical role in the IT arena, it’s time to reconsider the qualities you search for and foster in employees. Skills tests like coding problems and design scenarios make it relatively easy to gauge an applicant’s technical skills. 


Evolving cyberattacks, alert fatigue creating DFIR burnout, regulatory risk

Magnet Forensics’ respondents generally agreed that addressing the burnout and alert fatigue facing DFIR professionals is hampered by recruiting and hiring challenges, onboarding difficulties, and a lack of automation. Half of respondents said increased investment in automation would be “highly” or “extremely” valuable for a range of DFIR functions, including the remote acquisition of target endpoints and the processing of digital evidence. However, while automation such as security orchestration, automation, and response (SOAR) is already in place in many SOCs, those solutions orchestrate and automate cybersecurity runbooks by ingesting telemetry, enforcing actions, and driving other tools, the report noted. “While important for threat containment and remediation, these runbook-related activities are distinct from those performed by digital forensics automation solutions, which execute a data transformation pipeline by orchestrating, automating, performing, and monitoring forensic workflows,” it added.
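The distinction the report draws -- SOAR runbooks versus a forensic data-transformation pipeline -- can be illustrated with a minimal sketch: ordered stages that each transform the evidence record, run under an orchestrator that logs status so the workflow can be monitored. The stage names, payloads, and artifact types below are hypothetical, not drawn from any real forensics product.

```python
# Hypothetical forensic pipeline stages; each takes the evidence record,
# transforms it, and returns it for the next stage.

def acquire(evidence):
    """Remotely acquire an image of the target endpoint (simulated)."""
    evidence["image"] = f"raw-image-of-{evidence['endpoint']}"
    return evidence

def process(evidence):
    """Parse the acquired image into forensic artifacts (simulated)."""
    evidence["artifacts"] = ["browser-history", "event-logs"]
    return evidence

def report(evidence):
    """Summarize processed artifacts for the examiner (simulated)."""
    evidence["report"] = (
        f"{len(evidence['artifacts'])} artifact types from {evidence['endpoint']}"
    )
    return evidence

PIPELINE = [acquire, process, report]

def run_pipeline(endpoint):
    """Orchestrate the stages in order, logging status for monitoring."""
    evidence, log = {"endpoint": endpoint}, []
    for stage in PIPELINE:
        try:
            evidence = stage(evidence)
            log.append((stage.__name__, "ok"))
        except Exception as exc:
            log.append((stage.__name__, f"failed: {exc}"))
            break  # later stages depend on earlier transformations
    return evidence, log
```

Unlike a SOAR runbook, which reacts to telemetry by triggering containment actions in other tools, this pipeline is a sequential data transformation: each stage consumes the output of the previous one, so orchestration here means enforcing order and surfacing per-stage status.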



Quote for the day:

"Take time to deliberate; but when the time for action arrives, stop thinking and go in." -- Andrew Jackson