Daily Tech Digest - January 28, 2023

6 tips for making the most of a tight IT budget

Having a clear vision of where you are and where you are going helps put everything into perspective. As Madhumita Mazumdar, GM-ICT at Australian tourism company Journey Beyond, says, “If we have a proper strategic plan for the IT department that is aligned with the organization’s vision, we can achieve things within the budget and deal with half the problems that could arise six months or a year down the line.” Giving an example of this approach, Mazumdar says, “We have got absolute clarity on pursuing a cloud-first strategy. The vision of having 100% cloud infrastructure enabled us to significantly reduce our third-party data center costs as we migrated it into our cloud environment.” Similarly, Mazumdar is clear on the outsourcing versus insourcing debate. “I am a big fan of insourcing and support developing a team to take things in-house. For instance, having an in-house team ensures 100% patching of all my network devices on time. Patching happens at odd hours when the business isn’t operating.”


The developer role is changing radically, and these figures show how

"Today, developers are no longer just people building software for technology companies. They're an increasingly diverse and global group of people working across industries, tinkering with code, design, and docs in their free time, contributing to open source projects, conducting scientific research, and more," writes Dohmke. Also, the world's developers are no longer so highly concentrated in the US. GitHub has about 17 million users in the US, which is still its largest user base, but the service predicts India -- whose GitHub developer population stands at 10 million today -- will surpass the US by 2025. "They're people working around the world to build software for hospitals, filmmaking, NASA, and the PyTorch project, which powers AI and machine learning applications. They're also people who want to help a loved one communicate and family members overcome illnesses," Dohmke notes. On top of this, Microsoft's multi-billion dollar investment in OpenAI is helping to attract new developers via services such as its pair-programming coding assistant GitHub Copilot, which uses OpenAI's Codex to suggest coding solutions.


Attackers move away from Office macros to LNK files for malware delivery

LNK abuse has been growing since last year, according to researchers from Cisco Talos, who have seen several attacker groups pivoting to it. One of those groups is behind the long-running Qakbot (also known as Qbot or Pinkslipbot) malware family. "Qakbot is known to evolve and adapt their operation according to the current popular delivery methods and defense techniques," the researchers said in a new report. "As recently as May 2022, their preferred method of distribution was to hijack email threads gathered from compromised machines, and insert attachments containing Office XLSB documents embedded with malicious macros. However, after Microsoft announced changes to how macros were executed by default on internet downloaded content, Talos found Qakbot increasingly moving away from the XLSB files in favor of ISO files containing a LNK file, which would download and execute the payload." However, LNK files have many sections and carry a lot of metadata about the machines that generated them, leaving unique traces that can be associated with certain attack campaigns or attacker groups.
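That machine metadata is what gives defenders their forensic toehold. As a minimal sketch (not a full parser; the function name is mine, offsets follow Microsoft's published MS-SHLLINK shell-link format), here is how an analyst might pull the originating machine's NetBIOS name out of a LNK file's TrackerDataBlock:

```python
import struct

LNK_MAGIC = b"\x4c\x00\x00\x00"              # HeaderSize field that opens every .lnk file
TRACKER_SIG = struct.pack("<I", 0xA0000003)  # TrackerDataBlock signature

def extract_lnk_machine_id(data: bytes):
    """Return the NetBIOS machine name embedded in a LNK file's
    TrackerDataBlock, or None if the data is not a LNK or lacks the block."""
    if not data.startswith(LNK_MAGIC):
        return None
    idx = data.find(TRACKER_SIG)
    if idx == -1:
        return None
    # Skip the 4-byte signature plus the block's Length and Version fields;
    # MachineID is a NUL-padded ANSI string up to 16 bytes long.
    start = idx + 12
    raw = data[start:start + 16]
    return raw.split(b"\x00", 1)[0].decode("ascii", "replace")
```

Clustering these extracted names (along with the volume GUIDs in the same block) is one way traces get tied back to a specific campaign.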


What developers should do during a downturn

Particularly if you have worked at only one company for a long while, it may be time to “upskill” yourself. There are more ways than ever to do this. While there are a lot of expensive so-called boot camps, these are lean times, and frankly, some of them are predatory. Consider self-study on MOOC platforms like Coursera, Udemy, Saylor, and edX, which offer university-style courses that are free or low cost. If you are early in your career, you can now get a certification or even a bachelor’s degree in computer science entirely online. Both the University of London and BITS Pilani offer bachelor’s programs on Coursera. (A number of other schools offer master’s programs.) MOOCs are not the only game in town, though: your local university may also offer fully online courses. Having done this recently, my advice is not to bother with a formal degree if you are already a seasoned professional, unless you are switching fields. Universities have a lot of “money” courses they make you take in which you fulfill “requirements” but learn nothing of any value whatsoever.


How Smarter Data Management Can Relieve Tech Supply Chain Woes

As data volumes have exploded in the last few years – primarily from growing unstructured data volumes such as user files, video, sensor data and images – it is no longer viable to have a one-size-fits-all data management strategy. Data needs will always change unpredictably over time. Organizations need a way to continually monitor data usage and move data sets to where they need to be at the right time. This is not just a metrics discussion but requires regular communication and collaboration with departments. What are the new requirements for compliance, security and auditing? What about analytics needs in the coming quarter and year? This information helps all IT departments optimize decisions for ongoing data management while still keeping costs and capacity in mind. For instance, by knowing that the R&D team always wants their data available for retesting for up to three months after a project, IT can keep it on the NAS for 90 days and then move it to a cloud data lake for long-term storage and potential cloud machine learning or artificial intelligence analysis. 
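The NAS-to-data-lake example above amounts to an age-based tiering rule. A minimal sketch (the tier names and 90-day window come from the example; the function and rule table are illustrative assumptions, not any vendor's API):

```python
from datetime import datetime, timedelta, timezone

# Age-based tiering rule mirroring the R&D example: keep project data on
# the NAS for 90 days after the project closes, then move it to a cloud
# data lake for long-term storage and potential ML/AI analysis.
TIER_RULES = [
    (timedelta(days=90), "nas"),          # hot: recent project data, kept for retesting
    (timedelta.max, "cloud_data_lake"),   # cold: everything older
]

def target_tier(project_closed_at: datetime, now: datetime) -> str:
    """Return the storage tier a data set should live on, given its age."""
    age = now - project_closed_at
    for max_age, tier in TIER_RULES:
        if age <= max_age:
            return tier
    return TIER_RULES[-1][1]
```

In practice the rule table would be populated from exactly the kind of departmental conversations the article describes, rather than hard-coded.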


Best practices for migrating unstructured data

The quality of your data will have a direct impact on how successful your migration is. To assess data quality, you should first examine the data’s structure. You’ll need to ensure that all data is properly organized, labeled and formatted. You should also examine any external factors that affect data quality, such as errors in source files or duplicate entries. Once you have evaluated the data’s structure, you should look for any possible inconsistencies. Check for incorrect spelling, typos and any other errors that could affect the accuracy of your migration. You should also ensure that all data is up-to-date and accurate. It is critical to ensure that your unstructured data complies with applicable laws, regulations and industry standards such as HIPAA, GDPR, GxP, PCI-DSS and SOX. To maintain compliance at all stages, you’ll need to ensure that the data migration process meets all relevant requirements for the laws that apply to your business. To begin, make sure you have the right security measures to protect data in transit and storage. This might include encryption at rest and in transit as well as other technical safeguards.
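One of the checks above, spotting duplicate entries before migrating, can be automated with a content-hash pass over the source files. A minimal sketch (the function name is mine):

```python
import hashlib
from pathlib import Path

def find_duplicates(root):
    """Group files under `root` by SHA-256 of their contents and return
    only the groups with more than one member: candidate duplicate
    entries to resolve before migration."""
    by_hash = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash.setdefault(digest, []).append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

For very large unstructured stores you would hash in chunks and pre-filter by file size, but the principle is the same.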


5 technical skills aspiring IT architects need to learn

Much like code, IT architecture design is prone to technical debt: the cost of additional work required later as a consequence of choosing an easy solution over a better approach. Teams typically take on technical debt to prioritize delivery over proper design principles. In general, technical debt is bad and should be avoided. However, an experienced architect knows when to use it deliberately to speed up delivery, reduce time to market, and achieve better results. Part of an architect's role is to draw a roadmap for technical debt and then document, track, and address it promptly. Over-architected systems can also exhibit unintended consequences, such as performance impacts and a suboptimal user experience due to network latency and other factors. Much as database developers compromise on normalization rules and introduce data redundancy to improve query performance, architects sometimes need to compromise on architecture and design principles to meet specific objectives, such as time to market, system performance, or customer experience.


ChatGPT is 'not particularly innovative,' and 'nothing revolutionary'

"OpenAI is not particularly an advance compared to the other labs, at all," said LeCun. "It's not only just Google and Meta, but there are half a dozen startups that basically have very similar technology to it," added LeCun. "I don't want to say it's not rocket science, but it's really shared, there's no secret behind it, if you will." LeCun noted the many ways in which ChatGPT, and the program upon which it builds, OpenAI's GPT-3, is composed of multiple pieces of technology developed over many years by many parties. "You have to realize, ChatGPT uses Transformer architectures that are pre-trained in this self-supervised manner," observed LeCun. "Self-supervised-learning is something I've been advocating for a long time, even before OpenAI existed," he said. "Transformers is a Google invention," noted LeCun, referring to the language neural net unveiled by Google in 2017, which has become the basis for a vast array of language programs, including GPT-3. The work on such language programs goes back decades, said LeCun.


Delegation: The biggest test for transformational CIOs

CIOs can minimize the risk that delegated decisions go wrong by ensuring that the people to whom the IT organization delegates have the right skills and expertise, as well as an understanding of overall business goals and the architectural frameworks into which their decisions must fit. Perhaps the biggest concern is around cybersecurity, says Atkinson. “When you distribute decision making for the launch of technology environments, you risk having under-managed environments for cyber security purposes,” he says. CIOs can address this by establishing standards and encouraging more collaborative decision making. Royal Caribbean’s Poulter sees teamwork as an essential component of risk reduction. The security team is just one participant in a decision-making team that should include application, architecture, infrastructure, and other experts, she says. Giving teams the autonomy to come together to make cross-domain decisions is hugely important.


3 Data Management Rules to Live By

As companies continue to build out dedicated data teams and full-fledged data-centric organizations, look for a higher level of specialization to make its way to the management of the data stack. Here are just a few of the roles I expect to play a major part in managing the data stack in the future. The data product manager is responsible for managing the life cycle of a given data product and is often responsible for managing cross-functional stakeholders, product road maps, and other strategic tasks. The analytics engineer, a term made popular by dbt Labs, sits between a data engineer and analysts and is responsible for transforming and modeling the data such that stakeholders are empowered to trust and use that data. Analytics engineers are simultaneously specialists and generalists, often owning several tools in the stack and juggling many technical and less technical tasks. The data reliability engineer is dedicated to building more resilient data stacks, primarily via data observability, testing, and other common approaches. Data reliability engineers often possess DevOps skills and experience that can be directly applied to their new roles.



Quote for the day:

"Leadership, on the other hand, is about creating change you believe in." -- Seth Godin

Daily Tech Digest - January 27, 2023

The Evolution Of Internet Of Things

The IoT ecosystem is more than connected devices, just as IIoT is more than connecting plants and machinery to the edge or to cloud storage. A whole range of technologies is involved, from the chips and sensors that capture data from physical assets to communication networks; advanced analytics, including machine learning and artificial intelligence; simulation and collaborative tools, including digital twins; machine vision and human-machine interfaces; and security systems and protocols. Among the major players in the IoT/IIoT space are ABB, Amazon Web Services, Cisco Systems, General Electric, IBM Corporation, Intel Corporation, Microsoft, Oracle Corporation, Robert Bosch GmbH, Rockwell Automation, and Siemens, along with several others. Industrial automation, as exemplified best by the progress made by the automobile industry over the last 100 years, increased productivity dramatically and reduced the cost of production.


Why Founders Are Hiring These Two Coaches to Supercharge Their Business

What I want to encourage founders to do is invest in help: join a community, hire a coach, or get an advisor who can really be there in a more effective capacity, someone you can bounce ideas off of, someone you can be extremely honest with when things are going wrong. You need that support. You can't build the business alone. And it really takes a combination of mindset and strategic work. So when we work with founders, we build a strategic roadmap while we also work on mindset and their professional growth. We believe that you cannot have a successful company without both pieces of the puzzle. ... Here's what I say about juggling it all: think of it as juggling balls, where some of the balls are glass and some are rubber. Your clean house is a rubber ball and your health is a glass ball. So make sure that the balls you're dropping are rubber, not glass. You'll always be dropping balls. And the other thing is, everyone needs help.


Difference Between Conversational AI and Generative AI

Conversational AI is artificial intelligence that can engage in conversation; it refers to tools that allow users to communicate with virtual assistants or chatbots. These mimic human interactions by recognizing speech and text inputs and translating their contents into different languages, using massive amounts of data, machine learning, and natural language processing. Generative AI, by contrast, often uses deep learning techniques, such as generative adversarial networks (GANs), to identify patterns and features in a given dataset and then create new data from that input. Now that we have a fair idea of conversational AI and generative AI, let's dive deeper into how they work and differ. Conversational AI has two major components: natural language processing (NLP) and machine learning. To keep the AI algorithms up to date, the NLP operations interact with the machine learning processes in a continual feedback loop. These fundamental elements enable conversational AI to process, comprehend, and produce responses naturally.


3 business application security risks businesses need to prepare for in 2023

As organizations ramp up their digital transformation efforts and transition between on-premises and cloud instances, they’re also increasingly bringing in web-facing applications. Applications that used to be kept behind enterprise “walls” in the days of on-premises-only environments are now fully exposed online, and cybercriminals have taken advantage. Given the myriad sensitive information kept within these applications, enterprises must ensure internet-facing vulnerabilities have the highest priority. ... While zero-day vulnerabilities are common entry points for threat actors, they also tend to pay close attention to patch release dates, as they know many enterprises fall behind in patching their vulnerabilities. Many patch management processes fail because security teams use manual methods to install security fixes, which takes up a significant portion of their already-limited time. As the number of patches piles up, it can be difficult to determine which patches must be applied first and which can be left for later.
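The triage problem described here, deciding which patch to apply first, is often reduced to a ranking over severity and exposure. A hypothetical sketch of that ordering (the field names and weighting are my own assumptions, not any specific tool's schema):

```python
def triage(patches):
    """Order pending patches so that fixes for internet-facing assets come
    first and, within each group, higher-severity (CVSS) flaws are applied
    sooner. `patches` is a list of dicts with 'name', 'cvss' (0-10), and
    'internet_facing' (bool)."""
    return sorted(
        patches,
        key=lambda p: (p["internet_facing"], p["cvss"]),  # exposure, then severity
        reverse=True,
    )
```

Automating even this crude ranking removes one of the manual steps that lets patch queues pile up in the first place.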


Uncertainty persists, but enterprises rush to adopt network as a service

“Although enterprises can see the operational value NaaS could bring, they worry about the potentially higher total cost of ownership (TCO), day-to-day management challenges and risk of significant fluctuations in monthly bills,” Hayden added. “This leaves a massive challenge for communications service providers (CSPs).” The report acknowledged that CSPs have made large strides over the past few years as they look to leverage their underlying infrastructure to climb the digital value chain by delivering cloud-enabled integrated network services. It emphasises that accelerating NaaS adoption should be a top priority for CSPs as it offers a clear avenue towards network monetisation through over-the-top (OTT) service delivery. “CSPs must first invest heavily in their NaaS solution looking to integrate automation and drive platform openness,” Hayden recommended. “On top of this, they must look to develop a partnership ecosystem comprised of systems integrators and network service partners.”


What the FBI’s Hive takedown means for the ransomware economy

“Today’s disruption of the Russian Hive ransomware infrastructure underscores the historic international cooperation between law enforcement agencies. The International Ransomware Taskforce is having an impact,” said Tom Kellermann, CISM, senior VP of cyberstrategy at Contrast Security. However, Kellermann warns that there’s still more to be done to address the impunity of Russian state-backed cybergangs, who are free to engage in criminal activity internationally with little threat of prosecution. ... “Disrupting Hive is no doubt a victory, but the war is far from over,” said Kev Breen, director of cyber threat research at Immersive Labs. “While this action will have a short-term effect on the proliferation of ransomware, Hive operates under a ransomware-as-a-service (RaaS) model, meaning they use affiliates that are responsible for gaining the initial foothold and then dropping the ransomware payload. “With the proverbial head of this snake cut off, those affiliates will turn to other ransomware operators and pick up where they left off,” Breen said.


Data Lake Security: Dive into the Best Practices

The three key security risks facing data lakes are:

Access control: With no database tables and more fluid permissions, access control is more challenging in a data lake. Moreover, permissions are difficult to set up and must be based on specific objects or metadata definitions. Commonly, employees across the company also have access to the lake, which contains personal data or data that falls under compliance regulations. With 58% of security incidents caused by insider threats, according to a commissioned Forrester Consulting study, employee access to sensitive data is a security nightmare if left unchecked.

Data protection: Data lakes often serve as a singular repository for an organization’s information, making them a valuable target to attack. Without proper access controls in place, bad actors can gain access and obtain sensitive data from across the company.

Governance, privacy, and compliance: Because employees from across the company can feed data into the data lake without inspection, some data may contain privacy and regulatory requirements that other data doesn’t.


Why Securing Software Should Go Far Beyond Trusting Your Vendors

Securing a software supply chain against attacks takes knowing what elements in your system have the potential to be attacked. More than three-quarters (77%) of those BlackBerry surveyed said that, in the last 12 months, they discovered previously unknown participants within their software supply chain — entities they had not been monitoring for adherence to critical security standards. That’s even when these companies were already rigorously using data encryption, identity and access management (IAM), and privileged access management (PAM) frameworks. As a result, malicious lines of code can sit in blind spots for years, ready to be exploited when the attacker chooses. The Cybersecurity and Infrastructure Security Agency (CISA), the National Security Agency (NSA), and the Office of the Director of National Intelligence (ODNI) recently issued a recommended practices guide for customers on securing the software supply chain.
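Discovering unknown participants starts with inventorying what you actually depend on. As a minimal illustration for a Python project (a sketch, not a substitute for a real software-composition-analysis tool): flag requirements that carry no exact version pin, since those resolve to whatever version the registry serves next.

```python
def unpinned(requirements: str):
    """Return requirement lines from a requirements.txt-style string that
    have no exact version pin ('=='). Unpinned packages are supply-chain
    inputs that can silently change between builds."""
    flagged = []
    for line in requirements.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line and "==" not in line:
            flagged.append(line)
    return flagged
```

The same inventory-first logic scales up to SBOM tooling across languages and vendors.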


7 Insights From a Ransomware Negotiator

"We really like to focus on emphasizing communication when it comes to threat intelligence — whether it's threat intelligence talking with the SOC or the incident response team, or even vulnerability management," he says. "Getting an idea of what these trends look like, what the threat actors are focusing on, how much they pop up and go away, all of that is very valuable for the defenders to know." Even though the underlying TTPs of full-time groups make a lot of ransomware detection and response a little easier, there are still some big variables out there. For example, as many groups have adopted the ransomware-as-a-service (RaaS) model, they employ a lot more affiliates, which means negotiators are always dealing with different people. "In the early days of ransomware, when you started negotiations, there was a good chance you were dealing with the same person if you were dealing with the same ransomware," Schmitt says. "But in today's ecosystem, there are just so many different groups, and so many different affiliates that are participating as part of these groups, that a lot of times you're almost starting from scratch."


The downsides of cloud-native solutions

One of the main issues with cloud-native development and deployment is that it can lead to vendor lock-in. When an application is built and deployed to a specific cloud provider, you typically use the native capabilities of that provider, and it can be difficult and costly to move to a different provider or an on-premises platform. This can limit the organization’s flexibility in where it chooses to run its applications, and it flies in the face of what many believe to be a core capability of cloud-native development: portability. ... Another downside is that cloud-native development can be complex and requires a different set of skills and tools than traditional on-premises and public cloud development. This can be a challenge for organizations that are not familiar with cloud-native practices and may require additional training and resources. I often see poorly designed cloud-native deployments because of this issue. If you’re not skilled in building and deploying these types of systems, the likely outcome is poorly designed, overly complex applications.



Quote for the day:

"Who we are cannot be separated from where we're from." -- Malcolm Gladwell

Daily Tech Digest - January 26, 2023

Bringing IT and Business Closer Together: Aiming for Business Intimacy

“Businesses today are looking to drive new value from software, to increase competitiveness, open new revenue streams, and increase efficiencies,” he explains. “As part of this, the business often drives the software decisions, proof-of-concepts, vendor selection, and more.” It’s not until the end of the process that IT is brought in to “sign off and deploy”, and this siloed approach results in teams working separately, often producing poor results and driving animosity between the groups. “Instead, if the business and IT teams work together for the entire project, requirements are surfaced and expertise from across the organization is brought in to make the best possible decisions,” Maxey says. From his perspective, there are several best practices that can ensure closer alignment between IT and businesses. “Embed IT into the business unit, versus in a separate department and ask IT to project manage business software projects so they are always in discussions and aware of the process,” he says. 


IT leadership: Seven spectrums of choice for CIOs in 2023

Purpose is the first thing that we want people to be thinking about in light of the office shock that they have been going through. It’s a question for organizational leaders - what is the purpose of your organization? On the spectrum, we say that a purpose ranges from the individual to the collective. And it’s important to think about that because for an individual first starting out in the workplace, their purpose may be very straightforward in terms of supporting themselves and their family. But as they get further into their career, they can enlarge their thinking about a purpose that actually can make the world better. And the same thing is true for organizations – they may start out very focused on getting their business going, but later can think about how they can contribute to the world. And in that sense, another spectrum – outcomes – is very closely related. You may start out with your primary outcome being profit, but then once you’re established and comfortable, you can think much larger, like bringing prosperity to the world, whether that world is local or much larger.


The risks of 5G security

With 5G-enabled automated communications, machines and devices in homes, factories and on-the-go will communicate vast amounts of data with no human intervention, creating greater risk. Kayne McGladrey, field CISO at HyperProof and a member of IEEE, explained the dangers of such an approach. “Low-cost, high-speed and generally unmonitored networking devices provide threat actors a reliable and robust infrastructure for launching attacks or running command and control infrastructure that will take longer to detect and evict,” he said. McGladrey also pointed out that as organizations deploy 5G as a replacement for Wi-Fi, they may not correctly configure or manage the optional but recommended security controls. “While telecommunications providers will have adequate budget and staffing to ensure the security of their networks, private 5G networks may not and thus become an ideal target for a threat actor,” he said. 5G virtualized network architecture opens every door and window in the house to hackers because it creates — in fact, requires — an extraneous supply chain for software, hardware and services. 


Fujitsu: Quantum computers no threat to encryption just yet

Fujitsu said its researchers also estimate that it would be necessary for such a fault-tolerant quantum computer to work on the problem for about 104 days to successfully crack RSA. However, before anyone gets too complacent, it should be noted IBM's Osprey has three times the number of qubits that featured in its Eagle processor from the previous year, and the company is aiming to have a 4,158-qubit system by 2025. If it continues to advance at this pace, it may well surpass 10,000 qubits before the end of this decade. And we'd bet our bottom dollar intelligence agencies, such as America's NSA, are or will be all over quantum in case the tech manages to crack encryption. Quantum-resistant algorithms are therefore still worth the effort, even if the NSA is ostensibly skeptical of quantum computing's crypto-smashing powers. Fujitsu said that although its research indicates the limitations of quantum computing technology preclude the possibility of it beating current encryption algorithms in the short term, the IT giant will continue to evaluate the potential impact of increasingly powerful quantum systems on cryptography security.


State of DevOps: Success happens through platform engineering

The platform engineering team takes responsibility for designing and building self-service capabilities to minimise the amount of work developers need to do themselves. This, according to the report’s authors, enables fast-flow software delivery. Platform teams deliver shared infrastructure platforms to internal software development teams, and the team responsible for the platform treats it as a product for its users, not just an IT project. ... Ronan Keenan, research director at Perforce, said the concepts behind platform engineering have been used on a small scale at large technology organisations for many years, but platform engineering gives them a more concrete focus. “The concept is about building self-service capabilities which engineers and developers can use. This reduces their workload as they do not have to build these capabilities themselves,” he said, adding that the platform team builds and maintains shared IT infrastructure. By having such a shared infrastructure: “The software development process can run quicker since you are lightening the load on the developers and engineers. Platform engineering also offers a more consistent process.”


How Can Big Tech Layoffs be a Boon for the Quantum Computing Cloud?

The good news is that a skilled classical engineer can obtain the necessary knowledge from a variety of places, including online and short courses, to collaborate effectively with quantum physicists. Therefore, consider the possibility of recruiting someone with experience in conventional computing for those quantum organizations that are in desperate need of personnel to aid them in carrying out their goals. Not only might you discover that it’s simpler than you thought for these people to become productive in your organization, but they might also be able to use their prior experience working for traditional computing companies to their advantage and offer original solutions to any technical issues that arise there. However, the cloud may have a bright spot. The issue for quantum enterprises in finding appropriate people has frequently come up at conferences for the industry. Some of that was brought on in recent years by the fierce competition from the traditional computer companies, who increased their development efforts during the Covid years and also implemented work-from-home policies to make it simpler for someone to join an organization with its headquarters in a different city.


Attackers use portable executables of remote management software to great effect

The phishing emails are help desk-themed – e.g., impersonate the Geek Squad or GeekSupport – and “threaten” the recipient with the renewal of a pricy service/subscription. The goal is to get the recipient to call a specific phone number manned by the attackers, who then try to convince the target to install the remote management software. “CISA noted that the actors did not install downloaded RMM clients on the compromised host. Instead, the actors downloaded AnyDesk and ScreenConnect as self-contained, portable executables configured to connect to the actor’s RMM server,” the agency explained. “Portable executables launch within the user’s context without installation. Because portable executables do not require administrator privileges, they can allow execution of unapproved software even if a risk management control may be in place to audit or block the same software’s installation on the network. Threat actors can leverage a portable executable with local user rights to attack other vulnerable machines within the local intranet or establish long term persistent access as a local user service.”
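Because portable executables run in the user's context from user-writable locations, a simple detection heuristic follows directly from CISA's description. A sketch (the binary names AnyDesk and ScreenConnect come from the advisory; the exact filenames and the path list are illustrative assumptions):

```python
# Heuristic: a known RMM client binary launching from a user-writable path
# (rather than an installed Program Files location) is worth an alert,
# since portable executables need no installation or admin privileges.
RMM_NAMES = {"anydesk.exe", "screenconnect.client.exe"}
USER_WRITABLE = ("\\users\\", "\\temp\\", "\\downloads\\")

def is_suspicious_launch(image_path: str) -> bool:
    """True if a process image matches a known RMM client name and runs
    from a user-writable directory."""
    p = image_path.lower()
    name = p.rsplit("\\", 1)[-1]
    return name in RMM_NAMES and any(seg in p for seg in USER_WRITABLE)
```

A real deployment would feed this from EDR process-creation telemetry and maintain the name list centrally, but the shape of the rule is the same.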


The Anticipation Game: Spotlight on Data Backups

Regardless of how reliable a storage platform is, keeping all critical data stored in one place is a disaster waiting to happen for any organisation. To avoid the pains of security breaches, ransom payments, and data leaks, companies should aim to create and distribute backup copies across multiple onsite and offsite storage destinations. Another way to truly keep ransomware at bay is to apply immutability for backup data. Immutability means data is stored in such a way that it cannot be altered, deleted, or encrypted by ransomware. The ideal data backup solution should have a well-rounded set of ransomware protection and recovery features, allowing customers to achieve near-zero downtime and avoid paying ransom in return for access to the data. For example, the capability to store backups in ransomware-resilient Amazon S3 buckets and hardened Linux-based local repositories to prevent data deletion or encryption by ransomware. Ideally, IT admin teams would be able to leverage a backup to tape functionality to create air-gapped backups on tape to reduce the chance of ransomware encryption.
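Immutability of the kind described for Amazon S3 backups is typically enforced with S3 Object Lock. A hedged sketch of the retention configuration (the bucket name and 30-day window are placeholders; the boto3 call is shown commented out because it requires live credentials and a bucket created with Object Lock enabled):

```python
def object_lock_config(days: int) -> dict:
    """Build an S3 Object Lock configuration that prevents backup objects
    from being altered or deleted for `days` days, even by administrators
    (COMPLIANCE mode), which is what blocks ransomware encryption of
    existing backups."""
    return {
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": days}},
    }

# Applying it with boto3:
# import boto3
# boto3.client("s3").put_object_lock_configuration(
#     Bucket="backup-bucket",
#     ObjectLockConfiguration=object_lock_config(30),
# )
```

Hardened local repositories achieve the same effect with filesystem immutability flags, and tape achieves it physically via the air gap.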


B2B integration is the backbone of a resilient supply chain: OpenText study

Advanced supply chain integration capabilities can help support more efficient and effective current approaches as well as new models that translate directly to business performance. ... B2B integration capabilities and processing align with top business priorities: reducing operational and logistical costs, speeding time to market, improving data quality/accuracy, and improving visibility. Recognizing the need for seamless B2B integration and a future-proof supply chain, OpenText offers a portfolio of end-to-end solutions through the OpenText Business Network Cloud. This network provides users with the ability to automate business processes and facilitate efficient, secure, and compliant collaboration between people, systems, and things – providing a true foundation for establishing an advanced digital backbone to help support business growth and transformation initiatives. By connecting to OpenText’s powerful suite of cloud applications via our secure, scalable and highly reliable OpenText Trading Grid platform, users can allow internal and external stakeholders to collaborate seamlessly across this single and central network to exchange transactions such as purchase orders, shipment notices and payment instructions.


Five steps to build a business case for data and analytics governance

The causal relationship between poor data and analytics and poor business performance must be highlighted if a compelling business case for governance is to be made. Initially, look to identify the business processes and process owners that are critical in addressing the problem statement. These will often span multiple business areas, so look to focus on key processes rather than on lines of business. This will help break down the silos that have led to the insular and disconnected governance of data and analytics. Determine the most impactful key performance indicators (KPIs) and key risk indicators (KRIs) for business success, and then identify the specific data and analytics assets that are used in the KPIs and KRIs. These assets are the ones that must fall within the scope of the data and analytics governance proposal. A key characteristic of highly successful D&A governance initiatives is their ability to effectively define and manage scope. Be clear on what is in scope and what is out of scope for governance while identifying the key stakeholders needed in the D&A governance steering group. 



Quote for the day:

"The litmus test for our success as Leaders is not how many people we are leading, but how many we are transforming into leaders" -- Kayode Fayemi

Daily Tech Digest - January 25, 2023

How Quantum Metric is using data analytics to optimize digital teams

Quantum Metric was Ciabarra’s attempt to solve problems he personally faced while running his online app store, Intelliborn. As the company grew to over one million active users per day, he uncovered how difficult it was to see and understand all of his customers at scale, and in real time. “I had used Google Analytics, which was great to see how traffic was growing, but it couldn’t tell me where my customers were struggling, and why. I would fix something that someone on Twitter was ‘yelling’ at me about, but it sometimes would impact my business, and sometimes it wouldn’t,” Ciabarra told VentureBeat. “I thought — why is this so hard? Maybe addressing the squeaky wheel didn’t make sense from a business perspective.” That sparked the idea for Quantum Metric. So, with his cofounding engineer, David Wang, alongside his cat, Indy, Ciabarra went on to develop the first version of the Quantum Metric platform. It focused on surfacing customer frustrations and helping organizations see their customer experience through session replays.


Creating a competitive edge with a cloud maturity strategy

Companies cannot become cloud mature overnight. Cloud maturity involves a strategic effort from all levels of the business to look carefully at cloud spend, mitigate cloud-related risks, and upskill workers in cloud technologies. Those that manage to achieve a high level of cloud maturity remain much more competitive than firms that stop at merely adopting cloud technologies. According to McKinsey, Fortune 500 companies could earn more than $1 trillion by 2030 as a result of cloud adoption and optimisation. Deutsche Bank recognised that in order to keep up with the future of banking and remain competitive, it needed to become more cloud mature. ... Cloud maturity is essential to a company’s success – but first leaders need to make sure their employees are equipped with the skills required to solve security issues. Only then will businesses be ready to implement the right strategies to maximise their return on investment and realise the full potential of cloud computing.


CNET Is Testing an AI Engine. Here's What We've Learned, Mistakes and All

Over the past 25 years, CNET built its expertise in testing and assessing new technology to separate the hype from reality and help drive conversations about how those advancements can solve real-world problems. That same approach applies to how we do our work, which is guided by two key principles: We stand by the integrity and quality of the information we provide our readers, and we believe you can create a better future when you embrace new ideas. The case for AI-drafted stories and next-generation storytelling tools is compelling, especially as the tech evolves with new tools like ChatGPT. These tools can help media companies like ours create useful stories that offer readers the expert advice they need, deliver more personalized content and give writers and editors more time to test, evaluate, research and report in their areas of expertise. In November, one of our editorial teams, CNET Money, launched a test using an internally designed AI engine – not ChatGPT – to help editors create a set of basic explainers around financial services topics. 


Common Misconceptions About Modern Ransomware

Not too long ago, if someone decided to pay a ransom, they might not receive the decryption keys after doing so. However, today, ransom payers usually do receive the keys. This was a quiet shift that took place over several years. Before this shift took place, the unsophisticated encryption process could be considered hit or miss. Today, ransomware and threat actors hit more than they miss. Often, they can encrypt most of the data—and do so quickly. Just several years ago, a threat group would take many months to move around in a network, find data sources, monitor traffic and begin an encryption process. Fast forward to today, and the average attack-to-encryption time is 4.5 days. During the early days of ransomware attacks, threat groups would occasionally move to a domain controller and gain access to Active Directory. This granted them the keys to the kingdom and had a detrimental effect on the victim organization. Today, because of poor Active Directory security and configurations, threat groups can often elevate their privileges and compromise Active Directory rapidly.


Can AI replace cloud architects?

The most likely path is that tactical AI tools will continue to appear. These tools will focus on specific architectural areas, such as network design, database design, platform selection, cloud-native design, security, governance, use of containers, etc. The output should be as good as, if not better than what we see today because these tools will leverage almost perfect data and won’t have those pesky human frailties that drive some architecture designs—emotions and feelings. Of course, some of these AI tools exist today (don’t tell me about your tool) and are progressing toward this ideal. However, their usefulness varies depending on the task. Tactical AI tools must still be operated by knowledgeable people who understand how to ask the right questions and validate the designs and recommendations the tool produces. Although it may take fewer people to pull off the tactical component design of a large cloud architecture, the process will not likely eliminate all humans. Remember, many of these mistakes occur because enterprises have difficulty finding skilled cloud pros. 


Chinese threat actor DragonSpark targets East Asian businesses

SparkRAT uses the WebSocket protocol to communicate with the C2 server and features an upgrade system. This allows the RAT to automatically upgrade itself to the latest version available on the C2 server upon start-up by issuing an upgrade request. “This is an HTTP POST request, with the commit query parameter storing the current version of the tool,” researchers noted. In the attacks analyzed by the researchers, the SparkRAT version used was built on November 1, 2022, and deployed 26 commands. “Since SparkRAT is a multi-platform and feature-rich tool, and is regularly updated with new features, we estimate that the RAT will remain attractive to cybercriminals and other threat actors in the future,” researchers said. DragonSpark also uses the Golang-based m6699.exe to interpret runtime-encoded source code and launch a shellcode loader. This initial shellcode loader contacts the C2 server and executes the next-stage shellcode loader.
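The upgrade handshake described above – the current version travelling in a `commit` query parameter so the server can decide whether a newer build exists – can be sketched in a few lines. The endpoint path and commit values below are hypothetical, for illustration only; this is the same version-check pattern used by any self-updating software:

```python
from urllib.parse import parse_qs, urlencode, urlparse


def build_upgrade_request(c2_base: str, current_commit: str) -> str:
    """Client side: build the upgrade-check URL, with the current
    version stored in the 'commit' query parameter."""
    return f"{c2_base}/api/update?{urlencode({'commit': current_commit})}"


def needs_upgrade(request_url: str, latest_commit: str) -> bool:
    """Server side: compare the reported commit against the latest build
    and decide whether to serve a new version."""
    reported = parse_qs(urlparse(request_url).query)["commit"][0]
    return reported != latest_commit
```

On start-up the client issues this request; a mismatch between the reported and latest commit triggers delivery of the newer build.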


Microsoft to Block Excel Add-ins to Stop Office Exploits

Excel add-in files are designated with the XLL file extension. They provide a way to use third-party tools and functions in Microsoft Excel that aren't natively part of the software; they're similar to dynamic link libraries (DLLs) but with specific features for Excel spreadsheets. For cyberattackers, they offer a way to read and write data within spreadsheets, add custom functions, and interact with Excel objects across platforms, Vanja Svajcer, a researcher with Cisco's Talos group, said in a December analysis. And indeed, attackers started experimenting with XLLs in 2017, with more widespread usage coming after the technique became part of common malware frameworks, such as Dridex. ... One reason for that is that Microsoft Office does not block the feature but raises a dialog box instead, a common approach that Microsoft has taken in the past, Svajcer wrote: "Before an XLL file is loaded, Excel displays a warning about the possibility of malicious code being included. Unfortunately, this protection technique is often ineffective as a protection against the malicious code, as many users tend to disregard the warning."


The Intersection of Trust and Employee Productivity

Unfortunately, many companies adopt a "block first, ask questions later" approach to security, which can erode employee trust and undermine the benefits of empowering employees to choose their own applications. In our previous research at Cerby, we found that 19% of employees ignore application blocks and continue to use the apps they prefer, despite such restrictions. This suggests that organizations should seek to balance high levels of trust in employees with zero trust principles for data, applications, assets and services (DAAS). A more effective approach may be to adopt an enrollment-based approach to security that balances trust-positive initiatives like employee choice of applications with cybersecurity and compliance requirements. By adopting this approach, organizations can build digital trust with employees by giving them more control over the tools and technologies they use while still ensuring the security and reliability of their systems and processes for consumers. But the benefits of building high levels of employee trust go beyond improved job performance and satisfaction. 


Examining the CIO time management dilemma

The skill profile and expectations of the CIO have, therefore, shifted to balance business management with technology, so, where necessary, CIOs need to bolster those skills accordingly to deliver the right solutions for the business. “What makes a strong CIO is being able to recognize where the blind spots in their skill sets are and bring supplemental skills in with other leaders in the organization,” she adds. So the CIO role has evolved into this business manager position to understand how technology delivers value to the business. “And because technology is becoming the way we do business, it becomes imperative for the CIO to have that business acumen in addition to the technology,” she says, adding that such acumen is necessary to justify investment in technology that enables organizational growth. In addition, as CEOs have increased their investment into digital advances in security, AI, and data analytics, their demand for results has grown, according to Gartner VP analyst Daniel Sanchez-Reina.


Cloud egress costs: What they are and how to dodge them

Egress charges work the other way, by discouraging firms from transferring data out, either to other cloud providers, or to on-premise systems. “They’ve made the commercial decision that ingress should be effectively absorbed within the consolidated cost of service represented in the unit prices of cloud components, but egress charges are separated out,” says Adrian Bradley, head of cloud transformation at consulting firm KPMG. “At the heart of that, it is a real cost. The more a client consumes of it, the more it costs the cloud providers.” Firms have seen egress charges rise as they look to do more with their data, such as mining archives for business intelligence purposes, or to train artificial intelligence (AI) engines. Data transfers can also increase where organisations have a formalised hybrid or multi-cloud strategy. “Either there’s a need to do a lot more data egress, or perhaps there’s just simply the positive use of cloud to develop new products and services that intrinsically use more data,” says Bradley. The result is that firms are moving more data from cloud storage, and are being hit by increasing costs.
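Because egress is typically billed per gigabyte in volume tiers, the cost of moving data out scales directly with how much is transferred. A minimal sketch of that billing model follows; the tier table below uses hypothetical rates for illustration, not any provider's published price list:

```python
def egress_cost(gb: float, tiers: list[tuple[float, float]]) -> float:
    """Compute a tiered egress bill.

    Each tier is (size_of_tier_in_gb, usd_per_gb); the final tier can
    use float('inf') to cover all remaining traffic.
    """
    cost, remaining = 0.0, gb
    for tier_gb, rate in tiers:
        billed = min(remaining, tier_gb)  # fill this tier before moving on
        cost += billed * rate
        remaining -= billed
        if remaining <= 0:
            break
    return round(cost, 2)


# Hypothetical tiers: first 10 TB at $0.09/GB, next 40 TB at $0.085/GB,
# everything beyond that at $0.07/GB.
TIERS = [(10_240, 0.09), (40_960, 0.085), (float("inf"), 0.07)]
```

Under these assumed rates, pulling 1 TB out costs about $90, while 20 TB costs roughly $1,750 – which is why archive-mining and multi-cloud data movement can surprise finance teams.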



Quote for the day:

"Leadership does not depend on being right." -- Ivan Illich

Daily Tech Digest - January 24, 2023

Status of Ethical Standards in Emerging Tech

This chasm between invention and accountability is the source of much of the angst, dismay, and danger. “It is much better to design a system for transparency and explainability from the beginning rather than to deal with unexplainable outcomes that are causing harm once the system is already deployed,” says Jeanna Matthews, professor of computer science at Clarkson University and co-chair of the ACM US Technology Committee’s Subcommittee on AI & Algorithms. To that end, the Association for Computing Machinery’s global Technology Policy Council (TPC) released a new Statement on Principles for Responsible Algorithmic Systems authored jointly by its US and Europe Technology Policy Committees in October 2022. The statement includes nine instrumental principles: Legitimacy and Competency; Minimizing Harm; Security and Privacy; Transparency; Interpretability and Explainability; Maintainability; Contestability and Auditability; Accountability and Responsibility; and Limiting Environmental Impacts, according to Matthews.


Best practices for devops observability

A selected group of engineers may have the lead responsibilities around software quality, but they will need the full dev team to drive continuous improvements. David Ben Shabat, vice president of R&D at Quali, recommends, “Organizations should strive to create what I would call ‘visibility as a standard.’ This allows your team to embrace a culture of end-to-end responsibility and maintain a focus on continuous improvements to your product.” One way to address responsibility is by creating and following a standardized taxonomy and message format for logs and other observability data. Agile development teams should assign a teammate to review logs every sprint and add alerts for new error conditions. Ben Shabat adds, “Also, automate as many processes as possible while using logs and metrics as a gauge for successful performance.” Ashwin Rajeev, cofounder and CTO of Acceldata, agrees that automation is key to driving observable applications and services. He says, “Modern devops observability solutions integrate with CI/CD tools, analyze all relevant data sources, use automation to provide actionable insights, and provide real-time recommendations.”
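A standardized log taxonomy is easiest to enforce in code. The sketch below uses Python's standard `logging` module to emit every record as JSON with one agreed-upon set of field names; the field names themselves (`ts`, `level`, `service`, `event`) are an illustrative taxonomy, not an industry standard:

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Emit every record in one agreed-upon JSON shape so alerting
    rules and dashboards can rely on consistent field names."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "ts": self.formatTime(record),
            "level": record.levelname,
            # Teams attach a 'service' attribute via a logging adapter
            # or 'extra='; default to 'unknown' if absent.
            "service": getattr(record, "service", "unknown"),
            "event": record.getMessage(),
        })
```

Attach the formatter to a handler once (e.g. `handler.setFormatter(JsonFormatter())`) and every team's logs become machine-parseable in the same shape, which is what makes the per-sprint log review and automated alerting described above practical.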


Why leveraging privacy-enhancing tech advances consumer data privacy and protection

Historically, proprietary privacy-enhancing technologies have been developed by location technology companies and used internally. However, it’s my firm belief that for organizations of all types to truly progress toward the level of consumer data privacy people want and expect, privacy-enhancing technologies created by location technology companies should be made available to all companies that could benefit from these advancements. ... These tools help add industry-leading privacy controls to a company’s own systems and work with any kind of location data, no matter how it is generated. This helps ensure that a company is meeting privacy requirements and protecting consumer data. If more technology companies made the privacy-enhancing features used in their own systems available to other companies, organizations across industries could better protect the data stored in their systems, and in turn, consumer data privacy and protection is likely to progress and improve more quickly. A crucial starting point is democratizing access to these technologies.


What’s To Come In 2023? Modern Frameworks, CISO Elevation & Leaner Security Stacks

The past year has shown the effects that whistleblowing (Twitter) can have when an organization ignores its employees flagging activity they consider fraudulent, unsafe, or illegal. But over the past year, we have also seen the consequences when CISOs actively ignore or hide security issues. For example, in the Uber situation, we saw for the first time criminal charges filed and then later a conviction. These contrasting stories create a potential no-win situation for CISOs, who may be ignored for calling out issues on the one hand, or face jail time on the other if they actively turn a blind eye to (and/or hide) them. ... With the beginning of 2023 fraught with enormous economic and regulatory uncertainty, we will likely see a consolidation of tools and a greater focus on which tools are necessary. The nature of tech is that many organizations adopt tools to fix immediate problems, and often these tools have overlapping functionality and use cases. Although security budgets are likely to be a bit safer than other departments in a business, security teams will still need to consider what they must have to be successful with fewer resources. 


The Benefits of an API-First Approach to Building Microservices

APIs have been around for decades. But they are no longer simply “application programming interfaces”. At their heart, APIs are developer interfaces. Like any user interface, APIs need planning, design, and testing. API‑first is about acknowledging and prioritizing the importance of connectivity and simplicity across all the teams operating and using APIs. It prioritizes communication, reusability, and functionality for API consumers, who are almost always developers. There are many paths to API‑first, but a design‑led approach to software development is the end goal for most companies embarking on an API‑first journey. In practice, this approach means APIs are completely defined before implementation. Work begins with designing and documenting how the API will function. ... In the typical enterprise microservice and API landscape, there are more components in play than a Platform Ops team can keep track of day to day. Embracing and adopting a standard, machine‑readable API specification helps teams understand, monitor, and make decisions about the APIs currently operating in their environments.
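Design-first can be shown in miniature: the contract is written down before any implementation exists, and requests are then validated against it. The toy spec format below is illustrative only; in practice teams use a machine-readable standard such as OpenAPI rather than an ad hoc dictionary:

```python
# The contract comes first -- agreed and documented before any code
# implements the endpoint. (Hypothetical endpoint for illustration.)
ORDER_API_SPEC = {
    "path": "/orders",
    "method": "POST",
    "required_fields": {"customer_id", "items"},
}


def validate_request(spec: dict, method: str, path: str, body: dict):
    """Reject anything that does not match the agreed contract."""
    if (method, path) != (spec["method"], spec["path"]):
        return False, "unknown endpoint"
    missing = spec["required_fields"] - body.keys()
    if missing:
        return False, f"missing fields: {sorted(missing)}"
    return True, "ok"
```

Because the spec is data rather than code, the same definition can drive documentation, mock servers, and client generation – the reuse that makes the API-first investment pay off.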


MITRE ATT&CK Framework: Discerning a Threat Actor’s Mindset

Many security solutions offer a wide range of features to detect and track malicious behavior in containers. Defense evasion techniques are meant to obfuscate these tools so that everything the bad actor is doing seems legitimate. One example of defense evasion includes building the container image directly on the host instead of pulling from public or private registries. There are also evasion techniques that are harder to identify, such as those based on reverse forensics. Attackers use these techniques to delete all logs and events related to their malicious activities so that the administrator of a security, security information and event management (SIEM), or observability tool has no idea that an unauthorized event or process has occurred. To protect against defense evasion, you’ll need a container security solution that detects malware during runtime and provides threat detection and blocking capabilities. Two examples of this would be runtime threat defense to protect against malware and honeypots to capture malicious actors and activity.


CIO role: 5 strategies for success in 2023

CIOs must adapt to the changing business landscape brought on by the pandemic. With many organizations embracing hybrid work, the internet plays a more prominent role in the overall network strategy. Ensure that your systems and processes are optimized for this new reality. This includes prioritizing the user experience of remote workers and implementing better end-user experience monitoring to ensure that they can be productive and collaborate effectively.  ... As organizations increasingly adopt multi-cloud systems to manage their IT infrastructure, CIOs must be able to navigate the complexity of these environments effectively. One approach is implementing a seamless strategy across all major clouds to streamline management and reduce complexity. Consider how you can optimize performance and apply security uniformly across your multi-cloud estate. Also, be mindful of the changing regulatory and compliance landscape and look for cloud services with built-in compliance features to minimize the burden on your teams.


How passkeys are changing authentication

The latest FIDO passkey specs are multi-device. Once a passkey is established for a given service, the same device can be used to securely share it with another device. The devices must be in close proximity, within wireless connection range, and the user takes an active role in verifying the device sync. The remote cloud service for the given device also plays a role. That means that an iPhone uses Apple's cloud, an Android device uses Google Cloud Platform (GCP), and Windows uses Microsoft Azure. Efforts are underway to make sharing passkeys across providers simpler. It's a rather manual process to share across providers, for example, to go from an Android device to a macOS laptop. Passkeys are cryptographic keys, so gone is the possibility of weak passwords. They do not share vulnerable information, so many password attack vectors are eliminated. Passkeys are resistant to phishing and other social engineering attacks: the passkey infrastructure itself negotiates the verification process and isn’t fooled by a good fake website -- no more accidentally typing a password into the wrong form.


CIOs sharpen tech strategies to support hybrid work

With competition for talent still tight and pressure on organizations to maximize employee productivity, Anthony Abbatiello, workforce transformation practice leader at professional services firm PwC, says CIOs should focus on what and how they can improve the hybrid experience for users. He advises CIOs to partner with their counterparts in HR to identify the worker archetypes that exist in their organizations to understand how they work and what they need to succeed. “CIOs should be asking how to create the right experience that each worker needs and what do they need to be productive in their job,” Abbatiello says. “Even if you’ve done that before, the requirements of people in a hybrid environment have changed.” Hybrid workers today are looking for digital workplace experiences that are seamless as they move between home and office, Abbatiello says. This includes technologies that enable them to replicate in cyberspace the personal connections and spontaneous collegiality that more easily happen in person, as they seek experiences that are consistent regardless of where they’re working on any given day.


Platform Engineering 101: What You Need to Know About This Hot New Trend

Before platform teams can start building their product, they need to define a clear mission statement to guide the process. This mission statement should fit the overall goals of the organization and proactively define the role of the platform team within the organization. It should also inspire your engineers. HashiCorp’s Director of Platform Engineering Infrastructure Michael Galloway summarizes this well: “it should be emotional and inspiring. … It should be simple but meaningful.” You can start by defining your goals. This could encompass things like enabling the required degree of developer self-service without adding cognitive load or achieving the desired reduction of tickets that go to ops without forcing developers to learn infrastructure-centric technologies end-to-end. After this, you’ll probably wind up with something like: “Our mission is to standardize workflows to improve the developer experience, speed up innovation cycles, and shorten time to market for the engineering organization.” It’s descriptive but not inspiring. Refine your mission statement to strike a good balance. For example: “Our mission is to build a platform developers love because it lets them innovate faster.”



Quote for the day:

"Leadership does not always wear the harness of compromise." -- Woodrow Wilson

Daily Tech Digest - January 23, 2023

CIOs step up to fill plus-size leadership roles

While it’s common for others in the C-suite to take on so-called plus-roles, the CIO has historically worn multiple hats without the benefit or glory of a formal title marking their expanded terrain. The successful shift to remote work and accelerated digital transformation during the pandemic years elevated CIOs’ stature, earning trust and recognition from business leaders that was absent in the past. With technology now a pervasive enabler of all aspects of the business, CIOs have the end-to-end visibility and cross-functional insights that other executives lack, not to mention oversight of key enterprise assets such as data and analytics. “We always had business leaders take on other roles, but the CIO hasn’t been considered for one of those pluses — they were always rolled up underneath another executive that was,” says John Hill, senior vice president and chief digital information officer at MSC Industrial Supply, where he heads up digital strategy, ecommerce, marketing, vending solutions, metalworking innovation, and advanced analytics. 


ChatGPT Prompts for Agile Practitioners

“One important thing to keep in mind is that conflict is a natural part of any team and should be viewed as an opportunity for growth and improvement. As a Scrum Master, it's important to create an environment where team members feel comfortable discussing and resolving conflicts in a constructive manner. It's also important to be aware that conflicts can often be the result of deeper underlying issues, such as lack of trust, poor communication, or unmet needs. As a Scrum Master, it's important to be aware of these underlying issues and work with the team to address them. Additionally, it's important to be aware of the team's dynamics and to be mindful of the impact that conflicts can have on individuals and the team as a whole. As a Scrum Master, it's important to be aware of any team members who may be more affected by the conflict and to provide them with additional support as needed. Finally, it is important to be aware of the impact of external factors on the team, such as changes in the organizational structure or market conditions, and to take them into account when addressing conflicts and underlying issues.”


3 predictions for open source in confidential computing

Confidential computing is the practice of isolating sensitive data and the techniques used to process it. This is as important on your laptop, where your data must be isolated from other applications, as it is on the cloud, where your data must be isolated from thousands of other containers and user accounts. As you can imagine, open source is a significant component for ensuring that what you believe is confidential is actually confidential. This is because security teams can audit the code of an open source project. ... In the past year, a lot of discovery and educational activities were developed. Confidential computing is now better known, but it has yet to become a mainstream technology. The security and developer communities are gaining a better understanding of confidential computing and its benefits. If this discovery trend continues this year, it can influence more outlets, like conferences, magazines, and publications. This shows that these entities recognize the value of confidential computing. In time, they may start to offer more airtime for talks and articles on the subject.


Exploring Cloud-Native Acceleration of Data Governance

Any seasoned data governance practitioner knows that implementing data management is only for those with extraordinary perseverance. Identifying and documenting data applications and flows, measuring and remediating data quality, and managing metadata — despite a flood of tools that attempt to automate many data management activities, these typically remain manual and time-consuming. They are also expensive. At leading organizations, data governance and remediation programs can top $10 million per year; in some cases, even $100 million. Data governance can be seen as a cost to the organization and a blocker for business leaders with a transformation mandate. So, here appears our “so-what.” If data governance could be incorporated directly into the fibers of the data infrastructure while the architectural blocks are being built and connected, it would dramatically reduce the need for costly data governance programs later. We will review three essential design features that, once incorporated, ensure that data governance is embedded “by design” rather than by brute manual force.


Crossplane: A Package-Based Approach to Platform Building

While Crossplane allows users to install many packages alongside one another, a common pattern has emerged of defining a single “root” package, which is an ancestor node of every other package that gets installed in a control plane. Doing so makes the entire API surface area reproducible by installing that one package. As an organization grows and evolves over time, that package can be expanded by defining new APIs or establishing new dependencies. ... Furthermore, because using a root package is a convention rather than a technical constraint, the property of composability is not violated. Taking the previously mentioned MySQL Configuration package for example, a user may install it as the root package of their MySQL database control plane, while another user may depend on it as only one component of a much larger API surface for use cases like internal cloud platforms.

Declaring a Vendor Dependency

While the attributes of Crossplane packages enable platform builders to quickly add functionality to their control planes, they also present a unique distribution mechanism for vendors.
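The root-package pattern amounts to dependency resolution from a single ancestor node: installing the root pulls in every transitive dependency, dependencies first. A minimal resolver sketch follows; the package names are hypothetical, chosen to echo the MySQL Configuration example above:

```python
def install_order(root: str, deps: dict[str, list[str]]) -> list[str]:
    """Return packages in install order (dependencies first) via a
    depth-first walk from the root package."""
    order: list[str] = []
    seen: set[str] = set()

    def visit(pkg: str) -> None:
        if pkg in seen:
            return
        seen.add(pkg)
        for dep in deps.get(pkg, []):
            visit(dep)  # install what this package depends on first
        order.append(pkg)

    visit(root)
    return order


# Hypothetical control plane: the root package is an ancestor of
# every other package installed in the control plane.
DEPS = {
    "platform-root": ["mysql-config", "network-config"],
    "mysql-config": ["provider-sql"],
    "network-config": [],
}
```

Installing `platform-root` reproduces the whole API surface in one step, which is exactly the reproducibility property the pattern is after; growing the platform means adding edges to this graph rather than hand-installing packages.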


The loneliness of leading a cybersecurity startup

When building something unprecedented and game-changing, the course and rules are steeped in darkness and uncertainty, with naysayers, critics, board members, competitors and time itself hurling criticisms every step of the way. Why on earth would anyone subject themselves to that? Perhaps it should be made clear that, for most of the aspiring CEOs I meet, their career path is hardly a choice. Entrepreneurship often runs in their blood, serving as a defining force for the people who pursue it. This all-consuming nature has become a necessary qualifier for most investors; it’s an important tool for survival, as well as for success. This is not to discount the absolute necessity of vision, inspiration and faith in a good idea. But on the long road to entrepreneurial success, these are often subject to so much scrutiny and judgement that a different strength is necessary to stay the course. These days, tightening budgets and dreaded potential layoffs only add to the pressure they feel. However, during the toughest times, an entrepreneur’s hunger to build can provide the critical momentum they need to move forward.


IT hiring: How to find the right match

A strong company mission is not only crucial to attracting talent, but it will also motivate existing employees to do their work well. I am motivated to push through the complex problems I face in my work, for example, because I know that my company is solving some of the most significant issues in the healthcare industry. My work ultimately brings a better experience to hospital staff and patients, which leads to improved patient outcomes. In every organization, it is crucial for executives and board members to focus on agreed-upon goals and to regularly share with employees how the company is meeting them. Furthermore, employees seek a collaborative environment where they work together toward a common goal. ... Intelligent, scrappy workers are often attracted to startups, where ambitious employees can wear many hats and build new skills quickly. I am fascinated by the technical and intellectual aspects of my job and find them highly rewarding. My teammates and I don't know all the answers as we work with new models, so we must test our hypotheses and rethink our approach.


Pentagon must act now on quantum computing or be eclipsed by rivals

Many experts, including Spirk, believe that military applications for quantum computing could be less than 10 years away. Case in point: according to the Pentagon's annual report on Chinese military power, China recently designed and fabricated a quantum computer capable of outperforming a classical high-performance computer on a specific problem. This is also why DARPA announced the 'Underexplored Systems for Utility-Scale Quantum Computing' (US2QC) program to explore potentially overlooked methods by which quantum computers could achieve practical levels of utility much faster than current predictions suggest. The Quantum Computing Cybersecurity Preparedness Act was recently signed into law, signaling that the White House regards quantum as a serious issue. The act addresses the migration of executive agencies' IT systems to post-quantum cryptography (PQC): encryption designed to withstand attacks by quantum computers because of the advanced mathematics underpinning it.


How Will the AI Bill of Rights Affect AI Development?

The AIBoR states that AI systems should be transparent and explainable, and should not discriminate against individuals based on various protected characteristics. “This would require AI developers to design and build transparent and fair systems and carefully consider their systems' potential impacts on individuals and society,” explains Bassel Haidar, artificial intelligence and machine learning practice lead at Guidehouse, a business consulting firm. Creating transparent and fair AI systems could involve using techniques such as feature attribution, which can help identify the factors that influenced a particular AI-driven decision or prediction, Haidar says. “It could also involve using techniques such as local interpretable model-agnostic explanations (LIME), which can help to explain the decision-making process of a black-box AI model.” AI developers will have to thoroughly test and validate AI systems to ensure that they function properly and make accurate predictions, Haidar says. “Additionally, they will need to employ bias detection and mitigation techniques to help identify and reduce potential biases in the AI system.”
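The feature-attribution idea Haidar describes can be illustrated with a toy perturbation-based sketch (all feature names and model weights below are hypothetical, and this simplified stand-in is not how LIME itself works internally; production systems would use dedicated libraries such as LIME or SHAP):

```python
# Toy perturbation-based ("occlusion") feature attribution: replace one
# feature at a time with a baseline value and measure how much the
# model's prediction shifts. Larger shifts mean larger influence.

def model(features):
    """A stand-in 'black box': a simple linear scoring model."""
    weights = {"income": 0.5, "debt": -0.25, "age": 0.125}  # hypothetical
    return sum(weights[name] * value for name, value in features.items())

def attribute(predict, features, baseline):
    """Score each feature by the prediction change when that feature
    is replaced with its baseline value."""
    base_pred = predict(features)
    scores = {}
    for name in features:
        perturbed = dict(features)
        perturbed[name] = baseline[name]       # occlude one feature
        scores[name] = base_pred - predict(perturbed)
    return scores

example = {"income": 80.0, "debt": 20.0, "age": 40.0}
baseline = {name: 0.0 for name in example}
print(attribute(model, example, baseline))
# income drives the score up (+40.0), debt down (-5.0), age up (+5.0)
```

An attribution report like this, attached to each automated decision, is one concrete way a developer could meet the AIBoR's transparency expectation.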


The metaverse brings a new breed of threats to challenge privacy and security gatekeepers

While security experts point to authentication and access controls to protect against metaverse-based scams and attacks, the growing number of platforms providing access to the metaverse may or may not have secure mechanisms for recognizing fraud, says Paul Carlisle Kletchka, governance, risk, and compliance (GRC) analyst with Lynx Technology Partners, a provider of GRC services. “One of the major vulnerabilities is the lack of standardized security protocols or mechanisms in place across the platforms,” he says. “As a result, cybercriminals can use the metaverse for a variety of purposes such as identity theft, fraud, or malicious attacks on other users. Since people can download programs and files from within the metaverse, there is also a risk that these files could contain malware that could infect a user's computer or device and spread back into the organization's systems. Another threat is piracy: since the metaverse is still in its early stages of development, there are no laws or regulations written specifically for the metaverse to protect intellectual property within this digital environment.”



Quote for the day:

"A leader is judged not by the length of his reign but by the decisions he makes." -- Klingon Proverb