Daily Tech Digest - December 25, 2022

How Value Stream Management is Fueling Digital Transformation

The Boeing Company, one of the world’s largest aerospace companies, has been employing VSM for several years now. Through VSM, it has optimized resource utilization and reduced waste. “We always thought we were doing a good job of producing value until we started to work through this,” explained Lynda Van Vleet, Boeing’s portfolio management systems product manager. “In our first two years, we saved hundreds of millions of dollars. But that wasn’t our goal. I think a lot of organizations look at this as a way of saving money because you usually do, but if you start out looking at it as a way of creating value, that just comes along with it.” The organization changed its legacy approaches to product management and project investment, which let it innovate faster and pursue digital transformation. ... By establishing cross-team visibility, leaders were able to spot redundancies. For example, they saw that different IT organizations each had their own analytics teams. “We had people in every organization doing the same thing,” explained Van Vleet. Boeing’s executives established a single analytics team to realign the work more efficiently and improve consistency.


Rethinking Risk After the FTX Debacle

The threat surface for FTX clients wasn't just about protecting their FTX passwords or hoping the exchange wouldn't get hacked like the Mt. Gox bitcoin exchange and so many others did. Instead, their portfolios were at risk of implosions over assets and investments they had never heard of. That is the definition of risk: having your hard-earned money and investments merged with a toxic mix of super-risky sludge. That’s a helpless place to be. After more than 20 years in cybersecurity, I find it difficult not to think about risk exposure and threat management in a case like this. Security teams are dealing with something much more akin to SBF than Madoff. There is no singular threat facing an enterprise today. Instead, enterprises face a constellation of assets, devices, data, clouds, applications, vulnerabilities, attacks, and defenses. Security teams' biggest weakness is that they are being asked to secure what they can neither see nor control. Where is our critical data? Who is accessing it, and who needs access? Every day in cybersecurity, the landscape of what needs to be protected changes. Applications are updated. Data is stored or in transit among multiple clouds. Users change. Every day represents new challenges.


Quantum Machine Learning: A Beginner’s Guide

Welcome to the world of quantum machine learning! In this tutorial, we will walk you through a beginner-level project using a sample dataset and provide step-by-step directions with code. By the end of this tutorial, you will have a solid understanding of how to use quantum computers to perform machine learning tasks and will have built your first quantum model. But before we dive into the tutorial, let’s take a moment to understand what quantum machine learning is and why it is so exciting. Quantum machine learning is a field at the intersection of quantum computing and machine learning. It involves using quantum computers to perform machine learning tasks, such as classification, regression, and clustering. Quantum computers are powerful machines that use quantum bits (qubits) instead of classical bits to store and process information. This allows them, in principle, to perform certain tasks much faster than classical computers, making them particularly promising for machine learning tasks that involve large amounts of data.
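
To make that concrete before the full tutorial, here is a minimal sketch of a variational quantum classifier. It assumes the PennyLane library and its built-in "default.qubit" simulator; the toy dataset, circuit depth, and optimizer settings are illustrative choices rather than code from the tutorial itself.

```python
# Minimal variational quantum classifier sketch (assumes PennyLane is installed).
import pennylane as qml
from pennylane import numpy as np  # autograd-aware NumPy shipped with PennyLane

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)  # local simulator

@qml.qnode(dev)
def circuit(weights, x):
    qml.AngleEmbedding(x, wires=range(n_qubits))                  # encode features as rotations
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))  # trainable layers
    return qml.expval(qml.PauliZ(0))                              # prediction score in [-1, 1]

def cost(weights, X, y):
    preds = [circuit(weights, xi) for xi in X]
    return sum((p - t) ** 2 for p, t in zip(preds, y)) / len(y)   # mean squared error

# Toy dataset: two clusters labelled -1 and +1 (purely illustrative).
X = np.array([[0.1, 0.2], [0.2, 0.1], [2.9, 3.0], [3.0, 2.8]], requires_grad=False)
y = np.array([-1.0, -1.0, 1.0, 1.0], requires_grad=False)

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = np.array(np.random.random(size=shape), requires_grad=True)

opt = qml.GradientDescentOptimizer(stepsize=0.2)
for _ in range(50):
    weights = opt.step(lambda w: cost(w, X, y), weights)

print("final cost:", cost(weights, X, y))
```

The circuit encodes each two-feature sample as qubit rotations, applies trainable entangling layers, and reads out a Pauli-Z expectation value that a classical optimizer tunes against the labels, which is the basic loop most beginner quantum machine learning tutorials follow.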


Importance of anti-money laundering regulations among prosumers for a cybersecure decentralized finance

To the best of our knowledge, this is the first study to assess this possibility with supportive evidence from a game-theoretic perspective. In addition, our study examines and sheds light on the importance of AML regulations among prosumers in fulfilling decentralized governance’s institutional role of preventing cyberattacks in a blockchain-based sharing economy. This paper focuses on prosumers because they undertake institutional roles in blockchain-based sharing economy models (Tan & Salo, 2021). In fact, most hackers are prosumers and may serve as end-users as well as developers. Therefore, their impact can be significant in setting the tone for the safety and security of a blockchain-based sharing economy. Last but not least, our paper provides policy suggestions for creating effective cybersecurity efforts in permissionless DeFi without relinquishing its decentralized nature. Our first policy suggestion is the integration of artificial intelligence (AI) employing machine learning (ML) techniques to promptly flag, track, and recover stolen tokens from offenders.
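
The excerpt does not specify which ML techniques would do that flagging, so the following is only a hedged sketch: it assumes scikit-learn's IsolationForest run over invented transaction features (transfer amount, hops from a previously flagged wallet, wallet age) to surface transfers an AML reviewer might examine.

```python
# Illustrative anomaly-detection sketch; the features and thresholds are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Hypothetical features per transfer: [amount, hops_from_flagged_wallet, wallet_age_days]
normal = rng.normal(loc=[1_000, 6, 400], scale=[300, 2, 120], size=(500, 3))
suspicious = rng.normal(loc=[250_000, 1, 3], scale=[50_000, 0.5, 2], size=(5, 3))
X = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)  # -1 marks outliers to route to an AML review queue

print("transfers flagged for review:", int((flags == -1).sum()))
```

Tracking and recovering stolen tokens would additionally require on-chain tracing and governance action; a model like this only triggers the review, it does not perform the recovery.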


Conscious Machines May Never Be Possible

Pondering this question, it’s important to recognize that intelligence and consciousness are not the same thing. While we humans tend to assume the two go together, intelligence is neither necessary nor sufficient for consciousness. Many nonhuman animals likely have conscious experiences without being particularly smart, at least by our questionable human standards. If the great-granddaughter of LaMDA does reach or exceed human-level intelligence, this does not necessarily mean it is also sentient. My intuition is that consciousness is not something that computers (as we know them) can have, but that it is deeply rooted in our nature as living creatures. Conscious machines are not coming in 2023. Indeed, they might not be possible at all. However, what the future may hold in store are machines that give the convincing impression of being conscious, even if we have no good reason to believe they actually are conscious. They will be like the Müller-Lyer optical illusion: Even when we know two lines are the same length, we cannot help seeing them as different.


Six Ways To Pivot Hiring Strategies To Attract Cybersecurity Talent

To recruit and retain cybersecurity talent, you should change your approach with these six strategies. Learn from past hires, whether successful or not: Not every hire will turn out as expected, but you can learn from those previous decisions. Remember, an interview is a conversation: You and the candidate have a lot to learn about each other. You could lose a good hire if interviews are tightly controlled and formal. In the “real world” of cybersecurity, communication and collaboration are critical, so that’s the type of environment you should create in the hiring process. Don’t rush to hire: Even if you are understaffed and have had vacancies open for some time, you’ll lose more time and money by hiring the wrong people. Be patient in the process. Find someone who matches your culture: Someone can be a brilliant technical candidate but still be wrong for your organization. In many circumstances, culture fit means someone who has soft skills and wants to grow and evolve. Keep in mind that a highly motivated individual is teachable: They can develop their soft and technical skills under your guidance.


DataOps as a holistic approach to data management

The DataOps approach, which takes its cue from the DevOps paradigm shift, focuses on increasing the rate at which software is developed for use with large data processing frameworks. DataOps also encourages line-of-business stakeholders to collaborate with data engineering, data science, and analytics teams in an effort to break down silos between IT operations and software development teams. This ensures that the organization’s data can be used as flexibly and efficiently as possible to deliver the outcomes the business needs. Because it encompasses so much of the data lifecycle, DataOps integrates many facets of IT, such as data development, data transformation, data extraction, data quality, data governance, data access control, data center capacity planning, and system operations. Typically, a company’s chief data scientist or chief analytics officer leads a DataOps team made up of specialists such as data engineers and analysts. Frameworks and related toolsets exist to support a DataOps approach to collaboration and greater agility, but unlike DevOps, there are no software solutions dedicated to “DataOps.”
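
Since data quality is one of the facets listed above, here is a small, hypothetical sketch of the kind of automated quality gate a DataOps pipeline might run before publishing a batch; the column names and thresholds are invented for illustration.

```python
# Hypothetical data-quality gate a DataOps pipeline might run (pandas assumed).
import pandas as pd

def quality_gate(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures; an empty list means the batch can be published."""
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values found")
    if df["amount"].lt(0).any():
        failures.append("negative amounts found")
    null_rate = df["customer_id"].isna().mean()
    if null_rate > 0.01:
        failures.append(f"customer_id null rate {null_rate:.1%} exceeds the 1% threshold")
    return failures

batch = pd.DataFrame({
    "order_id": [1, 2, 2],
    "customer_id": ["a", None, "c"],
    "amount": [19.99, 5.00, -3.50],
})
for problem in quality_gate(batch):
    print("BLOCKED:", problem)
```

In a DataOps pipeline, checks like these would run automatically on every batch and block promotion of the dataset until the failures are resolved.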


How edge-to-cloud is driving the next stage of digital transformation

The thing about computing at the edge is that it needs to run at the speed of life. A self-driving car can't take the time to send off a query and await a response when a truck swerves in front of it. It has to have all the necessary intelligence in the vehicle to decide what action to take. While this is an extreme example, the same is true of factory processes and even retail sales. Intelligence, data analysis, and decision making must be available without a propagation delay, and therefore must live at the edge. Of course, all of this adds to the management overhead. Now you have management consoles from a large number of vendors to contend with, plus those for your on-premises services, and then all the stuff up in the cloud. This is where integration becomes essential: all your IT resources – from the edge all the way up to the cloud – need to be managed from a single, coherent interface. It's not just about ease of use. It's about preventing mistakes and being able to keep track of and mitigate threats.
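
As a rough illustration of that reasoning (not any vendor's actual design), the sketch below routes an inference request to an on-device model whenever the response deadline is tighter than an assumed cloud round trip; all latency figures are invented.

```python
# Hypothetical edge-vs-cloud routing decision; latency numbers are invented.
EDGE_INFERENCE_MS = 15       # assumed latency of a model running on the device
CLOUD_ROUND_TRIP_MS = 120    # assumed network transit plus cloud inference time

def route(deadline_ms: float) -> str:
    """Pick where to run inference so the answer arrives within the deadline."""
    if deadline_ms < CLOUD_ROUND_TRIP_MS:
        return "edge"    # e.g. a swerving truck: no time for a cloud round trip
    return "cloud"       # e.g. overnight analytics: the round trip is acceptable

print(route(50))    # -> edge
print(route(5000))  # -> cloud
```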


Cloud to edge: NTT multicloud platform fuels digital transformation

The platform is the heart of our Multicloud as a Service offering because it provides visibility, control and governance across all clouds and for all workloads. It enhances the cloud providers’ native control planes with AI-backed insights for anomaly detection, correlation forecasting, automated operations, agile deployments and more, without limiting direct access to the cloud. These elements give organizations more comfort in consuming these services in a way that is closely aligned with their needs. ... This can be difficult for many clients to do themselves because most have managed their technology in a particular way for years and now have to make a step change into the cloud paradigm. But NTT has operated cloud platforms and delivered managed services across multiple industries and technologies for more than two decades, so we’re perfectly placed to help them make the leap. Some of the components of our platform may be familiar, but how we bring them together is unique. Our many years of operating experience have been baked into this platform to make it a true differentiator.


Top Decentralized Finance (DeFi) Trends in 2023

Governance tokens give individuals the authority to vote on matters related to a blockchain project's development and management. Because holders have a say in how the project operates, the goals and interests of token holders can be kept aligned. For example, the DeFi project Compound lets users put native tokens to work in various farming or rental income schemes, and its own token (COMP) governs the growth of the Compound protocol. ... We will soon see creators and their followers build new social networks. A new immersive fan economy fueled by social tokens in the metaverse could revolutionize digital monetization. Communities and celebrities can monetize their brands further by using social tokens, creating bidirectional relationships between artists and customers with reciprocal benefits. Individuals, rather than organizations, become the agents of creativity in a dispersed, collaborative paradigm: a unified and linked metaverse where tokenized NFTs may carry digital data rights while storing, tracking, and enforcing those rights.
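
As a hedged illustration of the token-weighted voting behind COMP-style governance, the sketch below tallies votes by each holder's balance; the balances, choices, and quorum rule are invented, and real protocols add delegation, timelocks, and other on-chain rules not shown here.

```python
# Simplified off-chain model of token-weighted governance voting (values invented).
from collections import defaultdict

balances = {"alice": 1_200, "bob": 300, "carol": 2_500}   # governance-token holdings
votes = {"alice": "yes", "bob": "no", "carol": "yes"}     # each holder's choice

tally = defaultdict(int)
for holder, choice in votes.items():
    tally[choice] += balances[holder]                     # votes are weighted by balance

total_supply = sum(balances.values())
quorum_met = sum(tally.values()) >= 0.5 * total_supply    # hypothetical 50% quorum
passed = quorum_met and tally["yes"] > tally["no"]
print(dict(tally), "passed:", passed)
```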



Quote for the day:

"Coaching isn't an addition to a leader's job, it's an integral part of it." -- George S. Odiorne
