Daily Tech Digest - December 26, 2022

Nvidia still crushing the data center market

EVGA CEO Andy Han cited several grievances with Nvidia, not the least of which was that Nvidia competes with its own board partners. Nvidia makes graphics cards and sells them to consumers under the Founders Edition brand, something AMD and Intel do rarely or not at all. In addition, Nvidia’s own cards were being sold for less than what licensees were charging for theirs. So not only was Nvidia competing with its licensees, it was also undercutting them. Nvidia does the same on the enterprise side, selling DGX server units (rack-mounted servers packed with eight A100 GPUs) in competition with OEM partners like HPE and Supermicro. Das defends this practice. “DGX for us has always been sort of the AI innovation vehicle where we do a lot of item testing,” he says, adding that building the DGX servers gives Nvidia the chance to shake out the bugs in the system, knowledge it passes on to OEMs. “Our work with DGX gives the OEMs a big head-start in getting their systems ready and out there. So it's actually an enabler for them.” But both Snell and Sag think Nvidia should not be competing against its partners. “I'm highly skeptical of that strategy,” Snell says.


A Look Ahead: Cybersecurity Trends to Watch in 2023

Multifactor authentication was once considered the gold standard of identity management, providing a crucial backstop for passwords. All that changed this year with a series of highly successful attacks using MFA bypass and MFA fatigue tactics, combined with tried-and-true phishing and social engineering. That success won’t go unnoticed. Attackers will almost certainly increase multifactor authentication exploits. "Headline news attracts the next wave of also-rans and other bad actors that want to jump on the newest methods to exploit an attack," Bird says. "We're going to see a lot of situations where MFA strong authentication is exploited and bypassed, but it's just unfortunately a reminder to us all that tech is only a certain percentage of the solution." Ransomware attacks have proliferated across public and private sectors, and tactics to pressure victims into paying ransoms have expanded to double and even triple extortion. Because of the reluctance of many victims to report the crime, no one really knows whether things are getting better or worse. 


Why zero knowledge matters

In a sense, zero knowledge proofs are a natural elaboration on trends in complexity theory and cryptography. Much of modern cryptography (of the asymmetric kind) is dependent on complexity theory because asymmetric security relies on functions that are feasible to compute in one direction but infeasible to invert. It follows that the great barrier to understanding ZKP is the math. Fortunately, it is possible to understand conceptually how zero knowledge proofs work without necessarily knowing what a quadratic residue is. For those of us who do care: a value z is a quadratic residue of y if there exists some x such that x² ≡ z (mod y). This rather esoteric concept was used in one of the original zero knowledge papers. Much of cryptography is built on exploring the fringes of math (especially factorization and modular arithmetic) for useful properties. Encapsulating ZKP's complex mathematical computations in libraries that are easy to use will be key to widespread adoption. We can do a myriad of interesting things with such one-way functions. In particular, we can establish shared secrets on open networks, a capability that modern secure communications are built upon.
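
To make that concrete, below is a minimal sketch (ours, not from the article) of the Fiat–Shamir identification protocol, a classic zero knowledge proof built directly on the quadratic residue machinery above: the prover convinces the verifier it knows a square root s of a public value v modulo n, without ever revealing s. The modulus and secret are toy values invented for illustration.

```python
import random

# Toy Fiat-Shamir identification: prove knowledge of s with v = s^2 mod n
# without revealing s. Illustrative only -- n is tiny here; real systems
# use moduli hundreds of digits long whose factorization is kept secret.
n = 3233                      # toy modulus (61 * 53)
s = 123                       # prover's secret square root
v = pow(s, 2, n)              # public value: a quadratic residue mod n

def one_round():
    r = random.randrange(1, n)
    a = pow(r, 2, n)              # prover commits to a random square
    b = random.randrange(2)       # verifier's random challenge bit
    y = (r * pow(s, b, n)) % n    # prover answers with r or r*s
    # Verifier accepts iff y^2 == a * v^b (mod n); a cheater who cannot
    # answer both possible challenges fails with probability 1/2 per round.
    return pow(y, 2, n) == (a * pow(v, b, n)) % n

assert all(one_round() for _ in range(20))   # soundness error ~ 2^-20
print("verifier convinced without learning s")
```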


Rust Microservices in Server-side WebAssembly

Rust enables developers to write correct and memory-safe programs that are as fast and as small as C programs. It is ideally suited for infrastructure software, including server-side applications, that require high reliability and performance. However, for server-side applications, Rust also presents some challenges. Rust programs are compiled into native machine code, which is not portable and is unsafe in multi-tenancy cloud environments. We also lack tools to manage and orchestrate native applications in the cloud. Hence, server-side Rust applications commonly run inside VMs or Linux containers, which bring significant memory and CPU overhead. This diminishes Rust’s advantages in efficiency and makes it hard to deploy services in resource-constrained environments, such as edge data centers and edge clouds. The solution to this problem is WebAssembly (Wasm). WebAssembly started as a secure runtime inside web browsers, and Wasm programs can be securely isolated in their own sandboxes. With a new generation of Wasm runtimes, such as the Cloud Native Computing Foundation’s WasmEdge Runtime, you can now run Wasm applications on the server.
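
As a sketch of that sandboxing model, the snippet below embeds a Wasm module in a host program. We use the wasmtime runtime's Python bindings purely to keep the example short and self-contained (WasmEdge, named in the article, offers its own embedding SDKs); the module file and function name are our own assumptions.

```python
# pip install wasmtime
# Assume add.wasm was compiled from a Rust function such as:
#   #[no_mangle]
#   pub extern "C" fn add(a: i32, b: i32) -> i32 { a + b }
# e.g. with: cargo build --target wasm32-unknown-unknown --release
from wasmtime import Store, Module, Instance

store = Store()
module = Module.from_file(store.engine, "add.wasm")  # hypothetical module
instance = Instance(store, module, [])               # no host imports granted
add = instance.exports(store)["add"]

# The call runs inside the Wasm sandbox: the module gets no file, network,
# or memory access beyond what the host explicitly wires in.
print(add(store, 2, 3))  # -> 5
```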


How to automate data migration testing

Testing with plenty of time before the official cutover deadline is usually the bulk of the hard work involved in data migration. The testing might be brief or extended, but it should be thoroughly conducted and confirmed before the process is moved forward into the “live” phase. An automated data migration approach is a key element here. You want this process to work seamlessly while also operating in the background with minimal human intervention. This is why I favor continuous or frequent replication to keep things in sync. One common strategy is to run automated data synchronizations in the background via a scheduler or cron job, which only syncs new data. Each time the process runs, the amount of information transferred will become less and less. ... Identify the automatic techniques and principles that will ensure the data migration runs on its own. These should be applied across the board, regardless of the data sources and/or criticality, for consistency and simplicity’s sake. Monitoring and alerts that notify your team of data migration progress are key elements to consider now. 
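
One common shape for that background synchronization is a watermark-based job: persist the high-water mark of the last successful run and copy only rows changed since then. Below is a minimal sketch; the table, columns, and in-memory SQLite databases are hypothetical stand-ins for real source and target systems.

```python
import sqlite3

# Watermark-based incremental sync. "orders" and "updated_at" are invented
# names; a real job would point at the production source and target.

def last_watermark(target):
    return target.execute("SELECT last_sync FROM sync_state").fetchone()[0]

def sync_once(source, target):
    wm = last_watermark(target)
    rows = source.execute(
        "SELECT id, total, updated_at FROM orders WHERE updated_at > ?",
        (wm,),
    ).fetchall()
    if rows:
        target.executemany(
            "INSERT OR REPLACE INTO orders (id, total, updated_at) "
            "VALUES (?, ?, ?)", rows)
        # Advance the watermark to the newest row actually copied,
        # so nothing written mid-run is ever skipped.
        target.execute("UPDATE sync_state SET last_sync = ?",
                       (max(r[2] for r in rows),))
    target.commit()
    return len(rows)   # shrinks run over run as the backlog drains

# Demo with in-memory databases standing in for real systems.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
               "total REAL, updated_at TEXT)")
target.execute("CREATE TABLE sync_state (last_sync TEXT)")
target.execute("INSERT INTO sync_state VALUES ('1970-01-01T00:00:00')")
source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, 9.99, "2022-12-20T10:00:00"),
                    (2, 25.00, "2022-12-21T08:30:00")])

# Scheduled via cron, e.g.:  */15 * * * *  python sync_once.py
print("rows copied:", sync_once(source, target))   # 2
print("rows copied:", sync_once(source, target))   # 0 -- nothing new
```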


Clean Code: Writing maintainable, readable and testable code

Clean code makes it easier for developers to understand, modify, and maintain a software system. When code is clean, it is easier to find and fix bugs, and it is less likely to break when changes are made. One of the key principles of clean code is readability, which means that code should be easy to understand, even for someone who is not familiar with the system. To achieve this, developers should, for example, use meaningful names for variables, functions, and classes. Another important principle of clean code is simplicity, which means that code should be as simple as possible, without unnecessary complexity. To achieve this, developers should avoid using complex data structures or algorithms unless they are necessary, and should avoid adding unnecessary features or functionality. In addition to readability and simplicity, clean code should also be maintainable, which means that it should be easy to modify and update the code without breaking it. To achieve this, developers should write modular code that is organized into small, focused functions, and should avoid duplication of code. Finally, clean code should be well-documented.
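
A small before/after illustration of those naming and simplicity principles (the example itself is ours, not from the article):

```python
from dataclasses import dataclass

# Before: cryptic names and one function doing everything.
def proc(d):
    t = 0
    for i in d:
        if i[1] > 0:
            t += i[1] * i[2]
    return t

# After: meaningful names and small, focused, testable functions.
@dataclass
class LineItem:
    name: str
    quantity: int
    unit_price: float

def line_total(item: LineItem) -> float:
    return item.quantity * item.unit_price

def order_total(items: list[LineItem]) -> float:
    """Sum the totals of all items with a positive quantity."""
    return sum(line_total(i) for i in items if i.quantity > 0)
```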


Artificial intelligence predictions 2023

Synthetic data – data artificially generated by a computer simulation – will grow exponentially in 2023, says Steve Harris, CEO of Mindtech. “Big companies that have already adopted synthetic data will continue to expand and invest as they know it is the future,” says Harris. Harris gives the example of car crash testing in the automotive industry. It would be unfeasible to keep rehearsing the same car crash again and again using crash test dummies. But with synthetic data, you can do just that. The virtual world is not limited in the same way, which has led to heavy adoption of synthetic data for AI road safety testing. Harris says synthetic data is now being used in industries he never expected in order to improve development, services and innovation. ... Banks will use AI more heavily to gain a competitive advantage in analysing the capital markets and spotting opportunities. “2023 is going to be the year the rubber meets the road for AI in capital markets,” says Matthew Hodgson, founder and CEO of Mosaic Smart Data. “Amidst the backdrop of volatility and economic uncertainty across the globe, the most precious resource for a bank is its transaction records – and within this is its guide to where opportunity resides.”


Group Coaching - Extending Growth Opportunity Beyond Individual Coaching

First, because our focus as coaches is on the relationship and interactions between the individuals, we don’t coach individuals in separate sessions. Instead, we bring them together as the group/team that they are part of and coach the entire group. Anything said by one member of the team is heard by everyone right there and then. The second building block is holding the mirror to the intangible entity mentioned above. To be accurate, holding the mirror is not a new skill for proponents of individual coaching, but it takes a significantly different approach in group coaching and has a more pronounced impact here. Holding the mirror means picking up the intangibles and making the implicit explicit: sensing the mood in the room, reading body language, a drop or increase in energy, head nods, smiles, a drop in shoulders, emotions, and so on, and playing back your observation to the room (sans judgement, obviously). Making the intangibles explicit is an important step in group coaching - name it to tame it, if you will. The third building block is believing and trusting that the group system is intelligent and self-healing.


Hybrid cloud in 2023: 5 predictions from IT leaders

Hood says this trend is fundamentally about operators accelerating their 5G network deployments while simultaneously delivering innovative edge services to their enterprise customers, especially in key verticals like retail, manufacturing, and energy. He also expects growing use of AI/ML at the edge to help optimize telco networks and hybrid edge clouds. “Many operators have been consuming services from multiple hyperscalers while building out their on-premise deployment to support their different lines of business,” Hood says. “The ability to securely distribute applications with access to data acceleration and AI/ML GPU resources while meeting data sovereignty regulations is opening up a new era in building application clouds independent of the underlying network infrastructure.” ... “Given a background of low margins, limited budgets, and the complexity of IT systems required to keep their businesses operating, many retailers now understandably rely on a hybrid cloud approach to help reduce costs whilst delivering value to their customers,” says Ian Boyle, Red Hat chief architect for retail.


Looking ahead to the network technologies of 2023

The growth in Internet dependence is really what’s been driving the cloud, because high-quality, interactive user interfaces are critical, and the cloud’s technology is far better for those things, not to mention easier to employ than changing a data center application would be. A lot of cloud interactivity, though, adds to latency and further validates the need for improvement in Internet latency. Interactivity and latency sensitivity tend to drive two cloud impacts that then become network impacts. The first is that as you move interactive components to the cloud via the Internet, you’re creating a new network in and to the cloud that’s paralleling traditional MPLS VPNs. The second is that you’re encouraging cloud hosting to move closer to the edge to reduce application latency. ... What about security? The Internet and cloud combination changes that too. You can’t rely on fixed security devices inside the cloud, so more and more applications will use cloud-hosted instances of security tools. Today, only about 7% of security is handled that way, but that will triple by the end of 2023 as SASE, SSE, and other cloud-hosted security elements explode.



Quote for the day:

"Leadership is unlocking people's potential to become better." -- Bill Bradley

Daily Tech Digest - December 25, 2022

How Value Stream Management is Fueling Digital Transformation

One of the world’s largest aerospace companies, The Boeing Company has been employing VSM for several years now. Through VSM, they optimized resource utilization and reduced waste. “We always thought we were doing a good job of producing value until we started to work through this,” explained Lynda Van Vleet, Boeing’s portfolio management systems product manager. “In our first two years, we saved hundreds of millions of dollars. But that wasn’t our goal. I think a lot of organizations look at this as a way of saving money because you usually do, but if you start out looking at it as a way of creating value, that just comes along with it.” The organization changed legacy approaches to product management and project investment. This enabled them to speed up their ability to innovate and pursue digital transformation. ... By establishing cross-team visibility, leaders were able to spot redundancies. For example, they saw how different IT organizations had their own analytics teams. “We had people in every organization doing the same thing,” explained Van Vleet. Boeing’s executives established a single analytics team to realign the work more efficiently and improve consistency.


Rethinking Risk After the FTX Debacle

The threat surface for FTX clients wasn't just about protecting their FTX passwords or hoping the exchange wouldn't get hacked like the Mt. Gox bitcoin exchange and so many others did. Instead, their portfolios were at risk of implosions over assets and investments they had never heard of. That is the definition of risk: having your hard-earned money and investments merged with a toxic mix of super-risky sludge. That’s a helpless place to be. After more than 20 years in cybersecurity, it is difficult not to think about risk exposure and threat management in a case like this. Security teams are dealing with something much more akin to SBF than Madoff. There is no singular threat facing an enterprise today. Instead, it is a constellation of assets, devices, data, clouds, applications, vulnerabilities, attacks, and defenses. Security teams' biggest weakness is that they are being asked to secure what they can neither see nor control. Where is our critical data? Who is accessing it, and who needs access? Every day in cybersecurity, the landscape of what needs to be protected changes. Applications are updated. Data is stored or in transit among multiple clouds. Users change. Every day represents new challenges.


Quantum Machine Learning: A Beginner’s Guide

Welcome to the world of quantum machine learning! In this tutorial, we will walk you through a beginner-level project using a sample dataset and provide step-by-step directions with code. By the end of this tutorial, you will have a solid understanding of how to use quantum computers to perform machine learning tasks and will have built your first quantum model. But before we dive into the tutorial, let’s take a moment to understand what quantum machine learning is and why it is so exciting. Quantum machine learning is a field at the intersection of quantum computing and machine learning. It involves using quantum computers to perform machine learning tasks, such as classification, regression, and clustering. Quantum computers are powerful machines that use quantum bits (qubits) instead of classical bits to store and process information. This allows them to perform certain tasks much faster than classical computers, making them particularly well-suited for machine learning tasks that involve large amounts of data.
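
As a taste of what such a project looks like, here is a self-contained toy of our own: a one-qubit "variational classifier" simulated in plain NumPy rather than on real quantum hardware (the dataset, circuit, and hyperparameters are illustrative choices, not the tutorial's). The state RY(theta) RY(x) |0> has Pauli-Z expectation cos(x + theta), and we train theta with the parameter-shift rule that is also used on actual quantum devices.

```python
import numpy as np

def expect_z(x, theta):
    # <Z> of RY(theta) RY(x) |0>, simulated classically
    return np.cos(x + theta)

def grad(theta, X, y):
    # Parameter-shift rule: d<Z>/dtheta = (f(t + pi/2) - f(t - pi/2)) / 2
    shift = (expect_z(X, theta + np.pi/2) - expect_z(X, theta - np.pi/2)) / 2
    return np.mean(2 * (expect_z(X, theta) - y) * shift)  # d(MSE)/dtheta

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, 200)                  # toy 1-D features
y = np.where(np.cos(X + 0.8) > 0, 1.0, -1.0)     # labels from a hidden angle

theta = 0.0
for _ in range(200):                             # plain gradient descent
    theta -= 0.5 * grad(theta, X, y)

pred = np.where(expect_z(X, theta) > 0, 1.0, -1.0)
print(f"accuracy: {(pred == y).mean():.0%}, learned theta: {theta:.2f}")
```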


Importance of anti-money laundering regulations among prosumers for a cybersecure decentralized finance

To the best of our knowledge, this is the first study to assess this possibility with supportive evidence from a game theoretical perspective. In addition, our study examines and sheds light on the importance of AML regulations among prosumers in fulfilling the institutional role of preventing cyberattacks by the decentralized governance in a blockchain-based sharing economy. This paper focuses on prosumers as they undertake institutional roles in blockchain-based sharing economy models (Tan & Salo, 2021). In fact, most hackers are prosumers and may serve as end-users as well as developers. Therefore, their impact can be significant in setting the tone for safety and security of a blockchain-based sharing economy. Last but not least, our paper provides policy suggestions for creating effective cybersecurity efforts in permissionless DeFi without relinquishing its decentralized nature. Our first policy suggestion is the integration of artificial intelligence (AI) employing machine learning (ML) techniques to promptly flag, track, and recover stolen tokens from offenders.


Conscious Machines May Never Be Possible

Pondering this question, it’s important to recognize that intelligence and consciousness are not the same thing. While we humans tend to assume the two go together, intelligence is neither necessary nor sufficient for consciousness. Many nonhuman animals likely have conscious experiences without being particularly smart, at least by our questionable human standards. If the great-granddaughter of LaMDA does reach or exceed human-level intelligence, this does not necessarily mean it is also sentient. My intuition is that consciousness is not something that computers (as we know them) can have, but that it is deeply rooted in our nature as living creatures. Conscious machines are not coming in 2023. Indeed, they might not be possible at all. However, what the future may hold in store are machines that give the convincing impression of being conscious, even if we have no good reason to believe they actually are conscious. They will be like the Müller-Lyer optical illusion: Even when we know two lines are the same length, we cannot help seeing them as different.


Six Ways To Pivot Hiring Strategies To Attract Cybersecurity Talent

To recruit and retain cybersecurity talent, you should change your approach with these six strategies. Learn from past hires, whether successful or not: Not every hire will turn out as expected, but you can learn from these previous decisions. Remember, an interview is a conversation: You and the candidate have a lot to learn about each other. You could lose a good hire if interviews are tightly controlled and formal. In the “real world” of cybersecurity, communication and collaboration are critical, so that’s the type of environment you should create in the hiring process. Don’t rush to hire: Even if you are understaffed and have had vacancies open for some time, you’ll lose more time and money by hiring the wrong people. Be patient in the process. Find someone who matches your culture: Someone can be a brilliant technical candidate but still be wrong for your organization. In many circumstances, culture fit means someone who has soft skills and wants to grow and evolve. Keep in mind that a highly motivated individual is teachable: They can develop their soft and technical skills under you.


DataOps as a holistic approach to data management

The DataOps approach, which takes its cue from the DevOps paradigm shift, is focused on increasing the rate at which software is developed for use with large data processing frameworks. DataOps also encourages line-of-business stakeholders to collaborate with data engineering, data science, and analytics teams in an effort to reduce silos between IT operations and software development teams. This ensures that the organization’s data may be utilized in the most adaptable and efficient manner to provide desirable results for business operations. DataOps integrates many facets of IT, such as data development, data transformation, data extraction, data quality, data governance, data access control, data center capacity planning, and system operations, because it encompasses so much of the data lifecycle. Typically, a company’s chief data scientist or chief analytics officer leads a DataOps team comprised of specialists like data engineers and analysts. Frameworks and related toolsets exist to support a DataOps approach to collaboration and greater agility, but unlike DevOps, there are no software solutions dedicated to “DataOps.”


How edge-to-cloud is driving the next stage of digital transformation

The thing about computing at the edge is that it needs to run at the speed of life. A self-driving car can't take the time to send off a query and await a response when a truck swerves in front of it. It has to have all the necessary intelligence in the vehicle to decide what action to take. While this is an extreme example, the same is true of factory processes and even retail sales. Intelligence, data analysis, and decision making must be available without a propagation delay, and therefore must live at the edge. Of course, all of this adds to the management overhead. Now you have management consoles from a large number of vendors to contend with, plus those for your services on-premises, and then all the stuff up in the cloud. This is where integration is necessary, where it becomes absolutely essential that all your IT resources – from the edge all the way up to the cloud – need to be managed from a single, coherent, manageable interface. It's not just about ease of use. It's about preventing mistakes and being able to keep track of and mitigate threats. 


Cloud to edge: NTT multicloud platform fuels digital transformation

The platform is the heart of our Multicloud as a Service offering because it provides visibility, control and governance across all clouds and for all workloads. It enhances the cloud providers’ native control planes with AI-backed insights for anomaly detection, correlation forecasting, automated operations, agile deployments and more, without limiting direct access to the cloud. These elements give organizations more comfort in consuming these services in a way that is closely aligned with their needs. ... This can be difficult for many clients to do themselves because most have managed their technology in a particular way for years and now have to make a step change into the cloud paradigm. But NTT has operated cloud platforms and delivered managed services across multiple industries and technologies for more than two decades, so we’re perfectly placed to help them make the leap. Some of the components of our platform may be familiar, but how we bring them together is unique. Our many years of operating experience have been baked into this platform to make it a true differentiator.


Top Decentralized Finance (DeFi) Trends in 2023

Governance tokens give individuals the authority to vote on blockchain project development and management-related matters. By giving token holders a say in how a blockchain project operates, it becomes possible to align their goals and interests. For example, a DeFi project like Compound lets users use native tokens for various farm or rent income schemes. It has its own token (COMP) that governs the Compound DeFi protocol's growth. ... We will soon see creators and followers building new social networks. A new immersive fan economy fueled by social tokens in the metaverse could revolutionize digital monetization. Communities and celebrities can further monetize their brands using social tokens, creating bidirectional relationships between artists and customers with reciprocal benefits. Individuals, rather than organizations, become the agents of creativity in a dispersed collaborative paradigm. It is a unified and linked metaverse where tokenized NFTs may contain digital data rights while storing, tracking, and enforcing those rights.
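
The core mechanic of a governance token like COMP is token-weighted voting, which can be sketched in a few lines (the balances, quorum rule, and proposal outcome are invented for illustration; on-chain implementations live in smart contracts, not Python):

```python
# Token-weighted governance vote: voting power equals token balance.
balances = {"alice": 120_000, "bob": 45_000, "carol": 5_000}  # invented
votes = {"alice": "yes", "bob": "no", "carol": "yes"}

tally = {"yes": 0, "no": 0}
for holder, choice in votes.items():
    tally[choice] += balances[holder]

quorum = 0.25 * sum(balances.values())   # e.g. 25% of supply must vote
passed = sum(tally.values()) >= quorum and tally["yes"] > tally["no"]
print(tally, "passed:", passed)
```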



Quote for the day:

"Coaching isn't an addition to a leader's job, it's an integral part of it." -- George S. Odiorne

Daily Tech Digest - December 24, 2022

3 takeaways to boost your enterprise architect career in 2023

Many people confuse business architecture and IT systems architecture, but they are different practices. Business architects transform business ideas and potential projects that align with an organization's strategies and influence a company's performance as a leader in their market, says Renee Biggs. IT architects contribute to the technology components of business architecture or enterprise architecture. ... Finding the right balance between documentation and implementation is part of an architect's job. But what do you do if your culture values performance a little too much, asks Evan Stoner, a senior specialist solutions architect. His advice is to "deliver what is needed." This means uncovering the current needs and designing for them. Change doesn't happen overnight. The future is uncertain, so architectures are constantly evolving. Software development is a key example, say Neal Fishman and Paul Homan; you must continually redevelop business applications to address new opportunities. But constant change can produce substandard solutions that require frequent revision.


Risk and resilience: compliance in 2023

By starting to prepare early, David Tattam, Chief Research & Content Officer and co-founder of Protecht expects that “businesses will start to realise the tangible benefits a holistic actionable view of risk provides. Smart businesses will track a measurable baseline of risk efficiencies over time, in line with their profit and returns strategy, to demonstrate the ROI of their risk management program”. Some legislation may have complicated and far-reaching impacts. Lee Biggenden, COO and Co-Founder of Nephos Technologies has noticed that “there are ongoing discussions in the European Union about open data platforms which, if passed, could revolutionise how data is used, shared and owned. Anticipated to come into force in 2023, it will have a huge impact on businesses who will need to put the controls and visibility in place over their data regardless of what industry they're in. Although on paper this may seem like a step in the right direction, it does raise concerns about personal privacy as third-party data sharing is a key part of the proposed act. We are all guilty of clicking privacy boxes without reading the full terms and conditions”. 


The Metaverse Doesn’t Have a Leg to Stand On

Zuckerberg’s not the only mark here. Microsoft also placed a bet on the metaverse (the avatars in its iteration, Mesh, also lack legs). In the past few years, a comically wide variety of companies have hired their own “chief metaverse officer,” from Disney and Procter & Gamble to the Creative Artists Agency and the accounting firm Prager Metis. Meta placed its bet on the metaverse in the flashiest way, changing its name, spending all that dough, et cetera, but it’s not alone in its conviction that these virtual worlds are the inevitable future. Even writer Neal Stephenson, who coined the word metaverse in his 1992 novel Snow Crash, founded an actual metaverse company in 2022. In the past few years, metaverse startups like Decentraland and the Sandbox grabbed venture capital interest by hyping themselves as hubs for a new NFT-fueled economy. Despite these companies’ hefty valuations, they have remained decidedly niche. (Refuting a third-party report that it had only 38 active users one day, Decentraland said it had an average of 8,000 daily active users—which is still tiny.) Why did Zuckerberg gamble his business on something so wobbly, so literally legless?


The year 2022 for Women in Tech

The results of the Toptal survey do not clearly indicate that progress has been made in the last year. These facts reveal that a bigger change still needs to be achieved. There are certain steps and actions that should be taken by everyone involved in the ecosystem in order to overcome the existing gender inequality in the world and in the domain of technology and STEM specifically. The first step is to break the stereotypes. Stereotypical thinking is at the root of the misconception of the role of women in society and of the jobs and activities that are ‘appropriate’ for them. The second step, bridging the gender gap, presents the next big shift that must be called for. The gender gap exists notably in career opportunities, as is easily noticed when comparing the number of women studying and graduating in the STEM field with the number of women who manage to land jobs in the tech field and achieve real-life professional realization in the tech area. A significant part of the existing gender gap is the pay gap, which demonstrably persists even in the most advanced countries.


The Tech:Forward recipe for a successful technology transformation

Having an approach that is both this comprehensive and detailed was instrumental in aligning one large OEM’s tech-transformation goals. Previous efforts had stalled, often because of competing priorities across various business units, which frequently led to a narrow focus on each unit’s needs. One might want to push hard for cloud, for example, while another wanted to cut costs. Each unit would develop its own KPIs and system diagnostics, which made it virtually impossible to make thoughtful decisions across units, and technical dependencies between units would often grind progress to a halt. The company was determined to avoid making that mistake again. So it invested the time to educate stakeholders on the Tech:Forward framework, detail the dependencies within each part of the framework, and review exactly how different sequencing models would impact outcomes. In this way, each function developed confidence that the approach was both comprehensive and responsive to its needs. Meetings with the CFO


3 cloud architecture best practices for industry clouds

Make no assumptions about the security of industry-specific clouds. Those sold by the larger cloud providers may be secure as stand-alone services; however, they could become a security vulnerability when integrated and operated directly within your solution. The best practice here is to build and design security into your custom applications that leverage industry clouds. Also, do so with integration in mind so no new vulnerabilities are opened. You can take two things that are assumed to be secure independently, and then add dependencies that entirely change the security profile. ... However, you’ll often find the best-of-breed option is on another cloud or perhaps from an independent industry cloud provider that decided to go it alone. The best practice here is to not limit the industry-specific services under consideration. As time goes on, there will be dozens of services to do tasks such as risk analytics for investment banking, for example. Picking the less optimized choice means you’ll lower the value that’s returned to the business. In other words, you make less ROI when you make less optimized decisions.


Benefits of A Technology-Enabled Risk Assessment Process

Organizations need effective risk assessments to manage resources and make informed business decisions that enable growth. Performing an effective risk assessment means going beyond an annual, check-the-box activity by implementing a risk assessment process that can yield actionable results and findings and serve as a business intelligence tool to inform risk management strategies. In today’s rapidly evolving market, banks, financial services companies, payment services providers, and fintechs are focused on improving their processes to create dynamic and efficient digital experiences for their customers. But the improvements do not extend just to customers. ... Technology-enabled solutions – which support the automated or semi-automated collection of data, scoring of inherent risk, mapping of controls, and scoring of residual risk – can help organizations streamline and add efficiencies to their risk assessments and provide a better understanding of real-time risk than the frequently outdated, once-a-year process can. Organizations then can use the valuable business intelligence obtained through the risk assessment process to increase revenue and identify new business opportunities for further growth.


Making the case for an Enterprise Architect in Digital Transformation programs

Large transformation work needs to be addressed in 3 buckets – Plan, Build & Run. While “Build” is the largest portion of investment in any transformation program, organizations are obviously required to safeguard whatever has been built with minimal “Run” budgets. “Build” and “Run” are cyclical in nature. For example, you build, then you maintain (run), then you either build more or build something new, and then maintain (run) that in turn. So it is pretty obvious that leaders responsible for large transformations think of Build as the starting point and the transition to Run as the end. It is senior leaders, however, who need to see the things being built (and run) as building blocks of a vision or journey. This is the Planning function (which is strategic). The “Plan” function needs to be executed by someone who understands the business vision and maps the journey to that vision. S/he does that by identifying business capabilities and the value chain (not business process), and then developing the strategy as building blocks. The skill required to do this is called “Enterprise Architecture”.


EU Cyber Resilience Act: Good for Software Supply Chain Security, Bad for Open Source?

With all of the good that the CRA brings in evolving the regulatory conversations past SBOMs, the current draft has some problematic language that could actually hurt the future of open source. But first, what it gets right about open source. Page 15, Paragraph 10 attempts to exempt, or carve out, open source software (OSS) from the regulations, saying: In order not to hamper innovation or research, free and open-source software developed or supplied outside the course of a commercial activity should not be covered by this Regulation. This is in particular the case for software, including its source code and modified versions, that is openly shared and freely accessible, usable, modifiable and redistributable. This is good, even great. OSS and project maintainers should be exempt from these regulations that apply liability, as this will have the effect of quashing innovation and sharing of ideas via code. However, in the same paragraph, the CRA attempts to draw a line between commercial and non-commercial use of open source software:


Finding the Right Data Governance Model

It is critical to distinguish the term “governance” from the term “management” in the context of Data Governance. It should be noted that the principal difference between “governance” and “management” is that governance refers to the decisions that must be made and who must make them. This is to ensure effective resource allocation and management of data operations. On the other hand, Data Management involves implementing those decisions that arise from assessing and monitoring either existing controls or the environment that includes advancements in technology and the market. The activities required for Data Governance can, therefore, be distinguished from those needed for Data Management since management is influenced by governance. Data Governance is oversight of Data Management activities to ensure that policy and ownership of data are enforced in the organization. The emphasis is on formalizing the Data Management function and associated data ownership roles and responsibilities.



Quote for the day:

"Give whatever you are doing and whoever you are with the gift of your attention." -- Jim Rohn

Daily Tech Digest - December 23, 2022

Why the industrial metaverse will eclipse the consumer one

The industrial metaverse is further ahead on the 3D front, with simulations and digital twins. The industrial metaverse is ahead on the standards front, with companies like Nvidia pushing potential standards such as Universal Scene Description (USD) through its Omniverse platform. USD has been characterized as doing for the metaverse what HTML did for the internet. In this regard, USD can lead to greater interoperability, [connecting] formerly disparate applications or ecosystems … to make workflows more seamless. ... Digital assets, similarly, are typically locked to a particular ecosystem, servicer or game. Many of the most transformative opportunities in the consumer space will also come with mainstream smart glasses, which are still years away before we see a stronger impact. The enterprise and industrial metaverses are also better grounded in ROI, meaning more trials and initial deployments have a higher potential to succeed or lead to more adoption compared to consumer efforts, which have seen more pushback, such as the addition of NFTs in games in Western markets [gaining] limited traction.
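
For a flavor of what USD's HTML-like role means in practice, here is the canonical first scene from Pixar's USD tutorial, written with the pxr Python bindings (assuming the usd-core package; the file and prim paths are arbitrary). Any USD-aware tool can open the resulting file:

```python
# pip install usd-core  (Pixar's USD Python bindings)
from pxr import Usd, UsdGeom

# A USD "stage" is a composed scene that Omniverse, Houdini, Blender,
# and other USD-aware tools can all open and layer edits onto.
stage = Usd.Stage.CreateNew("hello_world.usda")
UsdGeom.Xform.Define(stage, "/hello")                   # a transform prim
sphere = UsdGeom.Sphere.Define(stage, "/hello/world")   # a sphere beneath it
sphere.GetRadiusAttr().Set(2.0)

stage.GetRootLayer().Save()
print(stage.GetRootLayer().ExportToString())  # human-readable .usda text
```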


Surviving the Incident

The next step to the IR playbook is to identify the "crown jewels" of the organization — the critical systems, services, and operations that, if impacted by a cyber event, would disrupt business operations and cause a loss of revenue. Similarly, understanding the collected data type, how it is transmitted and stored, and who should access it must be mapped to ensure data security. Identifying and mapping critical systems can be accomplished through penetration tests, risk assessments, and threat modeling. A risk assessment is often the first tool to identify potential attack vectors and prioritize security events. However, to achieve a proactive stance, organizations are increasingly leveraging threat intelligence and modeling to identify and address vulnerabilities and security gaps early on before a known attack occurs. The primary goal is to identify weaknesses or vulnerabilities with assets to reduce the attack surface and close all the security gaps. This guide will focus on web application security as our attack scenario. Why web application security? 


Not everything we call AI is actually 'artificial intelligence'. Here's what you need to know

Most of what we know as AI today has narrow intelligence – where a particular system addresses a particular problem. Unlike human intelligence, such narrow AI intelligence is effective only in the area in which it has been trained: fraud detection, facial recognition or social recommendations, for example. AGI, however, would function as humans do. For now, the most notable example of trying to achieve this is the use of neural networks and “deep learning” trained on vast amounts of data. Neural networks are inspired by the way human brains work. Unlike most machine learning models that run calculations on the training data, neural networks work by feeding each data point one by one through an interconnected network, each time adjusting the parameters. As more and more data are fed through the network, the parameters stabilise; the final outcome is the “trained” neural network, which can then produce the desired output on new data – for example, recognising whether an image contains a cat or a dog. The significant leap forward in AI today is driven by technological improvements in the way we can train large neural networks, readjusting vast numbers of parameters in each run thanks to the capabilities of large cloud-computing infrastructures.
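
The training loop the article describes fits in a few dozen lines. Below is a minimal sketch of our own: a one-hidden-layer network learning a toy task (is a 2-D point inside the unit circle?), with the parameters adjusted by backpropagation on each pass over the data (full-batch here for brevity, where the article describes point-by-point updates):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy binary task: is a 2-D point inside the unit circle?
X = rng.uniform(-1.5, 1.5, size=(500, 2))
y = (np.sum(X**2, axis=1) < 1.0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units; weights are the adjustable parameters.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
lr = 0.5

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    p = sigmoid(h @ W2 + b2)          # predicted probability
    # Backpropagate the cross-entropy loss and nudge every parameter.
    dz2 = (p - y) / len(X)
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = dz2 @ W2.T * (1 - h**2)
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"training accuracy: {((p > 0.5) == y).mean():.2%}")
```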


Metaverse Security Concerns Coming Into Focus as Businesses Plan For “Virtual Reality” Futures

Organizations smell potential here, with 23% responding that they are already developing initiatives even as basic specifications are still firming up. Of the respondents that expressed a desire to do business in the metaverse, the leading interest (44%) was customer engagement opportunities. Other popular areas are learning/training measures and workplace collaboration. But when asked about their concerns about expanding into this new area, respondents said that metaverse security was item #1 on the list. By and large, today’s security solutions have not yet considered the prospect of metaverse integration. Nevertheless, 86% of the respondents said that they would feel comfortable sharing user personal information between different metaverse services. Security providers may be waiting to see what users settle on in the metaverse before tailoring their products accordingly. Of the products available thus far, online games are the only ones drawing mass amounts of users (particularly the pre-existing Roblox and Fortnite) along with simple 3D world chat apps that allow users to appear as an avatar.


What’s next for AI

The big companies that have historically dominated AI research are implementing massive layoffs and hiring freezes as the global economic outlook darkens. AI research is expensive, and as purse strings are tightened, companies will have to be very careful about picking which projects they invest in—and are likely to choose whichever have the potential to make them the most money, rather than the most innovative, interesting, or experimental ones, says Oren Etzioni, the CEO of the Allen Institute for AI, a research organization. That bottom-line focus is already taking effect at Meta, which has reorganized its AI research teams and moved many of them to work within teams that build products. But while Big Tech is tightening its belt, flashy new upstarts working on generative AI are seeing a surge in interest from venture capital funds. Next year could be a boon for AI startups, Etzioni says. There is a lot of talent floating around, and often in recessions people tend to rethink their lives—going back into academia or leaving a big corporation for a startup, for example.


How to Innovate by Introducing Product Management in SMB and Non-Tech Companies

It’s common to find product managers and product owners in SaaS, technology, ecommerce, retail, and other B2C companies. Leadership in these companies long realized that understanding markets, determining product-market fits, defining customer personas, and understanding value propositions are all key to developing minimally viable solutions and delivering ongoing product enhancements. But identifying product managers and owners in non-tech companies, B2B businesses, SMBs, and the government remains a long-running work in progress. To start innovating, it comes down to transforming from stakeholder-led backlogs to product-managed, market-driven roadmaps. Tech, media, and ecommerce companies figure this out right away because chasing stakeholder-driven features often yields subpar results. More traditional businesses are likely to misdiagnose the problems with stakeholder-driven backlogs as a technology execution or platform issue. But there are a few secrets to making product management work even in the most traditional businesses.


IT Job Market: 2022's Wild Ride and What to Expect for 2023

Even as those layoff announcements were rolling in, the US Bureau of Labor Statistics job report for October showed a strong job market for tech pros and continued growth for remote jobs. In November that growth continued with IT industry association CompTIA reporting that US tech companies added 14,400 workers during the month, marking two consecutive years of monthly job growth in the sector. Tech jobs in all industry sectors increased by 137,000 positions. And while job postings for future hiring slipped in November, they still totaled nearly 270,000. As the tech sector heads into a changed 2023 employment market, it’s unclear how all these mixed signals will play out, although experts are starting to weigh in on best practices. Employers are likely looking carefully at budgets and head counts. But it will be a challenging line to walk. Employers have spent the past few years investing in employee experience programs and focusing on retaining their valuable talent. An abrupt change in direction such as mass layoffs will likely sour companies’ reputations as employers.


Inside the Next-Level Fraud Ring Scamming Billions Off Holiday Retailers

Besides the operation being stacked with technology know-how, Michael Pezely, Signifyd's director of risk intelligence, tells Dark Reading that the e-commerce threat group has sheer speed and volume of scam transactions on its side. "E-commerce orders — particularly at the enterprise level — arrive at dizzying speed," Pezely says. "Signifyd, for instance, processed as much as $42 million an hour in orders during Cyber Week. It would be virtually impossible for a human team to review that volume of orders for signs of fraud." Pezely added that merchants are on the lookout for goods being shipped to a foreign country, but this group of scammers places orders that appear to originate from the US and ship to US addresses. "Furthermore, if a merchant is relying on only its own transaction data, there likely will be a lag between the time a fraud attack begins and when it is recognized," Pezely explains. "Without having the benefit of seeing millions of transactions across thousands of merchants, a novel fraud attack might not be in plain sight for some time."


Protecting your organization from rising software supply chain attacks

The reason for the continued bombardment, said Moore, is increasing reliance on third-party code (including Log4j). This makes distributors and suppliers ever more vulnerable, and vulnerability is often equated with a higher payout, he explained. Also, “ransomware actors are increasingly thorough and use non-conventional methods to reach their targets,” said Moore. For example, using proper segmentation protocols, ransomware agents target IT management software systems and parent companies. Then, after breaching, they leverage this relationship to infiltrate the infrastructure of that organization’s subsidiaries and trusted partners. “Supply chain attacks are unfortunately common right now in part because there are higher stakes,” said Moore. “Extended supply chain disruptions have placed the industry at a fragile crossroads.” Supply chain attacks are low cost and can be minimal effort and have potential for high reward, said Crystal Morin, threat research engineer at Sysdig. And, tools and techniques are often readily shared online, as well as disclosed by security companies, who frequently post detailed findings.


Why User Journeys Are Critical to Application Detection

The first generation of cybersecurity detection technology is rules, but rules only detect known patterns. Individualized rules require expensive experts to maintain: each application is unique, and one must be extremely familiar with its business logic, log formats, how it is used, etc., in order to write and manage rules for detecting application breaches. ... Over a decade ago, the security market adopted statistical analysis to augment rule-based solutions in an attempt to provide more accurate detection for the infrastructure and access layers. However, UEBA failed to deliver as promised to dramatically increase accuracy and reduce false positive alerts due to a fundamentally mistaken assumption – that user behavior can be characterized by statistical quantities, such as the average daily number of activities. ... The main criteria for success in a detection solution is accuracy, which is dictated by the number of false positives, and the number of false negatives. The evolution of detection solutions led to the third generation of solutions analyzing Sequences of Activity, i.e. Journeys, to contextualize activity and improve detection accuracy.
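
To see why sequences carry more signal than averages, compare the two toy detectors below: a threshold on activity volume misses an attacker who stays under the statistical baseline, while a bigram model over action sequences flags a never-before-seen ordering. The events, sessions, and thresholds are invented for illustration, not drawn from any real product.

```python
from collections import Counter

# Baseline of normal user journeys (invented events for illustration).
normal_sessions = [
    ["login", "view_account", "download_report", "logout"],
    ["login", "view_account", "update_profile", "logout"],
    ["login", "search", "view_account", "logout"],
] * 50

# Generation 1/2 style: alert on activity volume vs. a statistical baseline.
avg_len = sum(map(len, normal_sessions)) / len(normal_sessions)
def volume_alert(session, factor=2.0):
    return len(session) > factor * avg_len

# Journey style: alert on action *pairs* (bigrams) never seen in the baseline.
known_bigrams = Counter(
    (a, b) for s in normal_sessions for a, b in zip(s, s[1:])
)
def journey_alert(session):
    return any((a, b) not in known_bigrams
               for a, b in zip(session, session[1:]))

# A quiet attacker: normal volume, abnormal ordering.
attack = ["login", "download_report", "update_profile", "logout"]
print("volume detector fired: ", volume_alert(attack))   # False
print("journey detector fired:", journey_alert(attack))  # True
```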



Quote for the day:

"Before you revel in the anticipation of tomorrow, toil in the preparation of today." -- Tim Fargo

Daily Tech Digest - December 22, 2022

Data forecast for 2023: Time to extract more value

Using data effectively relies in large part on being able to properly manage and control how data is used. That's where data governance comes into play, with tools and technologies that help organizations govern the data they use. Data governance will have an expanded role in 2023, according to Eckerson Research analyst Kevin Petrie. There will be a growing use of ML technologies to improve data governance technology by helping to automate processes and policies for data. Petrie said he also expects a rising number of data governance platforms to help organize, document and apply policies to ML models alongside other data assets in 2023. Benefitting from data to improve business outcomes entails collecting product and service data. That's where the concept of data as a product -- also referred to as data product -- will have growing relevance in 2023. Barr Moses, CEO of data observability vendor Monte Carlo, predicted that nearly every product will become a data product as organizations seek to optimize operations. "In 2023, more and more companies will seek to integrate ways to track and monetize data generated by their products as part of their core offerings to drive competitive advantage," Moses said.


The Future of Skills: Preparing for Industry 4.0 and Beyond

Industry 4.0—Industrial Internet of Things or the 4th Industrial revolution, as it is popularly addressed—has arrived with lots of opportunities and challenges that have the potential to transform the marketplace completely. Industry 4.0 refers to the “smart” and connected production systems that are designed to sense, predict and interact with the physical world so as to make decisions that support production in real-time, increasing productivity, energy efficiency and sustainability. McKinsey estimates that IoT has the potential to unlock an economic value somewhere between US$5.5 trillion and $12.6 trillion by 2030. Therefore, with so many changes happening so quickly, neither employers nor employees (both employed and yet to be employed) can afford to ignore them or to stay in their comfort zone following the same old practices or skills. A report by the World Economic Forum states that 84 percent of employers are set to rapidly digitalize working processes with the potential to move 44 percent of their workforce to operate remotely, and the top skills needed as we lead up to 2025 are critical thinking and analysis, problem solving, active learning, resilience, stress tolerance and flexibility.


What is DataOps? Collaborative, cross-functional analytics

Enterprises today are increasingly injecting machine learning into a vast array of products and services and DataOps is an approach geared toward supporting the end-to-end needs of machine learning. “For example, this style makes it more feasible for data scientists to have the support of software engineering to provide what is needed when models are handed over to operations during deployment,” Ted Dunning and Ellen Friedman write in their book, Machine Learning Logistics. “The DataOps approach is not limited to machine learning,” they add. “This style of organization is useful for any data-oriented work, making it easier to take advantage of the benefits offered by building a global data fabric.” ... Because DataOps builds on DevOps, cross-functional teams that cut across “skill guilds” such as operations, software engineering, architecture and planning, product management, data analysis, data development, and data engineering are essential, and DataOps teams should be managed in ways that ensure increased collaboration and communication among developers, operations professionals, and data experts.


Amplified security trends to watch out for in 2023

Cybercriminals target employees across different industries to surreptitiously recruit them as insiders, offering them financial enticements to hand over company credentials and access to systems where sensitive information is stored. This approach isn’t new, but it is gaining popularity. A decentralized work environment makes it easier for criminals to target employees through private social channels, as the employee does not feel that they are being watched as closely as they would in a busy office setting. Aside from monitoring user behavior and threat patterns, it’s important to be aware of and be sensitive about the conditions that could make employees vulnerable to this kind of outreach – for example, the announcement of a massive corporate restructuring or a round of layoffs. Not every employee affected by a restructuring suddenly becomes a bad guy, but security leaders should work with Human Resources or People Operations and people managers to make them aware of this type of criminal scheme, so that they can take the necessary steps to offer support to employees who could be affected by such organizational or personal matters.


How deep learning will ignite the metaverse in 2023 and beyond

Currently, the digital realities being developed by different companies have their own attributes and integrated functionalities, and are at different development levels. Many of these multiverse platforms are expected to converge, and this junction is where AI and data science domains, such as deep learning, will be critical in taking users to a new stage in their metaverse journey. Success in these endeavors will be contingent upon understanding vital elements of the algorithmic models and their metrics. Deep learning-based software is already being integrated into virtual worlds; some examples include autonomous chatbots and other forms of natural language processing to ensure seamless interactions. For another example, in AR technology, deep learning-enabled AI is used in camera pose estimation, immersive rendering, real-world object detection and 3D object reconstruction, helping to guarantee the variety and usability of AR applications. ... “Companies have an interesting opportunity for their customers and community to interact with their brand(s) in new and exciting ways, and deep learning-based artificial intelligence plays a major role in facilitating those experiences,” said Stephenson.


Introducing Cadl: Microsoft’s concise API design language

Microsoft has begun to move much of its API development to a language called Cadl, which helps you define API structures programmatically before compiling to OpenAPI definitions. The intent is to do for APIs what Bicep does for infrastructure, providing a way to repeatably deliver API definitions. By abstracting design away from definition, Cadl can deliver much more concise outputs, ensuring that the OpenAPI tool in platforms like Visual Studio can parse it quickly and efficiently. What is Cadl? At first glance it’s a JavaScript-like language with some similarities to .NET languages. Microsoft describes it as “TypeScript for APIs,” intending it to be easy to use for anyone familiar with C#. Like Microsoft’s other domain-specific languages, Cadl benefits from Microsoft’s long history as a development tools company, fitting neatly into existing toolchains. You can even add Cadl extensions to the language server in Visual Studio and Visual Studio Code, ensuring that you get support from built-in syntax highlighting, code completion, and linting. Making Cadl a language makes a lot of sense; it allows you to encapsulate architectural constraints into rules and wrap common constructs in libraries. 


CIOs in 2023: Guiding Business Strategies Through Data-Driven Decisions

“CIOs need to take on a data mindset by first understanding the data, and then determining how critical the data architecture and data governance is,” he says. For understanding the business process, they need to think about how they can move the needle for the company, prioritize the projects that drive business, and implement or evolve the systems they already have. “The third important thing is building business partnerships across the organization,” Kancharla adds. “Having all levels of relationships will go a long way for the CIOs to be successful. The last thing is really thinking of what optimizations they can bring to the company, especially next year.” He points out that next year, every company will have to bring down costs, which means streamlining and optimizing the software within the company and deploying the tools they already have to the full potential. Segovia adds effective CIOs must also be able to understand the tech and recommendations their teams are executing on. “They need to understand areas in a reasonably deep manner in order to lead teams of wide technical and digital acumen,” he says.


Social media use can put companies at risk: Here are some ways to mitigate the danger

The concern is that foreign-owned applications might share the information they collect with government intelligence agencies. That information includes personally identifiable information (PII), keystroke patterns, location information based on SIM card or IP address, app activity, browser and search history, and biometric information. Personal use of social media by employees can impact the company’s brand as well as endanger the firm or employees themselves—bad actors could use social media to identify where a person works, the division in which they work, and possibly their physical location. The potential harm is higher for high-risk employees such as senior executives or those with authority to execute financial transactions. Of course, there are plenty of good reasons for employees to use social media. It can enhance marketing campaigns, announce news or critical information, and otherwise raise the profile of an organization. Social media channels can be used to monitor risks and threats against a government or critical infrastructure.


The power of generosity in ecosystems

A traditional approach to competition, rooted in the business mindset of one company gaining an advantage over another, can make it difficult to play in an ecosystem as a participant. For example, one of the risks of being part of an ecosystem is the dependency on its orchestrator. Increased reliance on Big Tech and the consolidation of many industries have created an increased risk of a few powerful cash-generator businesses that need to reward shareholders with consistent, attractive margins and will not think twice about burdening their partners to keep those margins—for example, by asking for discounts in exchange for participating in the ecosystem. But what if there was more of a sense of mutual collaboration? Benjamin Gomes-Casseres of Brandeis University has published research with Harvard Business Review Press on different business combinations (his term for business ecosystems). He states that for an ecosystem to logically exist, the players within an ecosystem must fairly share the benefits, creating added value for the entire ecosystem that exceeds the level of value each company could create independently.


6 BI challenges IT teams must address

There can be obstacles, however, to taking the self-service approach. Having too much access across many departments, for example, can result in a kitchen full of inexperienced cooks running up costs and exposing the company to data security problems. And do you want your sales team making decisions based on whatever data it gets, and having the autonomy to mix and match to see what works best? Central, standardized control over tool rollout is key. And to do it correctly, IT needs to govern the data well. Because of these tradeoffs, organizations must ensure they select the BI approach best-suited for the business application at hand. “We have more than 100,000 associates in addition to externals working for us, and that’s quite a large user group to serve,” says Axel Goris, global visual analytics lead at Novartis, the multinational pharmaceutical corporation based in Basel, Switzerland. “A key challenge was organization around delivery — how do you organize delivery, because a pharmaceutical company is highly regulated.” An IT-managed BI delivery model, Goris explains, requires a lot of effort and process, which wouldn’t work for some parts of the business.



Quote for the day:

"Nothing so conclusively proves a man's ability to lead others as what he does from day to day to lead himself." -- Thomas J. Watson