Daily Tech Digest - January 05, 2022

5 Cloud Native Trends to Watch out for in 2022

With the emphasis on cybersecurity, I expect to see open source projects and commercial offerings squarely focused on cloud native security. Two areas will get attention: the software supply chain and eBPF. The software supply chain closely mimics the supply chain of real-world commerce, where resources are consumed, transformed through a series of steps and processes, and finally supplied to the customer. Modern software development is about assembling and integrating various components available in the public domain as open source projects. In the complex supply chain of software, a single compromised component can cause severe damage across many deployments. Recent incidents involving Codecov, SolarWinds, Kaseya, and the ua-parser-js NPM package highlight the need to secure the software supply chain. In 2022, there will be new initiatives, projects, and even new startups focused on secure software supply chain management. The other exciting trend is eBPF, which enables cloud native developers to build secure networking, service mesh, and observability components.
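The integrity half of the supply-chain problem can be illustrated with a short sketch: verifying a downloaded dependency against a pinned cryptographic digest, the core idea behind lockfile hashes and artifact signing. The `verify_artifact` helper and sample payload below are invented for illustration, not taken from any particular tool:

```python
import hashlib

def verify_artifact(data: bytes, pinned_sha256: str) -> bool:
    """Compare an artifact's SHA-256 digest against a pinned, trusted value."""
    return hashlib.sha256(data).hexdigest() == pinned_sha256

package = b"console.log('hello');"                # stand-in for a downloaded dependency
pinned = hashlib.sha256(package).hexdigest()      # digest recorded at publish time

ok = verify_artifact(package, pinned)             # untampered artifact passes
tampered = verify_artifact(package + b"x", pinned)  # any modification fails
```

A build that refuses to proceed unless `ok` is true is the simplest form of supply-chain enforcement; real systems layer signatures and provenance attestations on top of the same digest check.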


Second-generation AI-powered digital pills are changing the future of healthcare

Many chronic diseases follow a dynamic trajectory, which makes their progression unpredictable. First-generation AI often disregards this, as it requires constant adaptation of therapeutic regimens. In addition, many therapies do not show loss of response for several months. Second-generation AI systems are designed to improve response to therapies and to facilitate analysing inter-subject and intra-subject variabilities in response to therapies over time. Most first-generation AI systems extract data from large databases and artificially impose a rigid “one for all” algorithm on all subjects. Attempts to constantly amend treatment regimens based on big data analysis might be irrelevant for an individual patient. Imposing a “close to optimal” fit on all subjects does not resolve the difficulties associated with dynamicity and the inherent variability of biological systems. Second-generation AI systems instead place the single patient at the epicentre of the algorithm and adapt their output in a timely manner.


How Walmart Canada Uses Blockchain to Solve Supply-Chain Challenges

A public blockchain network — one that anyone can join without asking for permission — allows unlimited viewing of information stored on it, eliminates intermediaries, and operates independently of any governing party. It is well-suited for digital consumer offerings (like NFTs), cryptocurrencies, and certifying information such as individuals’ degrees or certificates. But private networks — those that require a party to be granted permission to join — are often far better suited for businesses because access is restricted to verified members and only parties directly working together can see the specific information they exchange. This better satisfies industrial-grade security requirements. For these reasons, Walmart decided to go with a private network built on Hyperledger Fabric, an open-source platform. ... For Walmart and its carriers, this meant working with each carrier’s unique data (vendor name, payment terms, contract duration, and general terms and conditions), which is combined with governing master tables of information such as fuel rates and tax rates. The parties then jointly agree on the formulas that the blockchain will use to calculate each invoice.
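The jointly agreed invoice formula described above can be sketched in a few lines. The field names (`base_rate_per_km`, `fuel_surcharge_pct`, and so on) are hypothetical stand-ins for a carrier's terms and the governing master tables, not Walmart's actual schema:

```python
# Hypothetical carrier terms and governing master tables; all names illustrative.
carrier_terms = {"base_rate_per_km": 1.20, "payment_terms_days": 30}
master_tables = {"fuel_surcharge_pct": 0.08, "tax_rate_pct": 0.05}

def calculate_invoice(distance_km: float, terms: dict, masters: dict) -> float:
    """Apply the agreed formula: base freight + fuel surcharge, then tax."""
    base = distance_km * terms["base_rate_per_km"]
    fuel = base * masters["fuel_surcharge_pct"]
    subtotal = base + fuel
    tax = subtotal * masters["tax_rate_pct"]
    return round(subtotal + tax, 2)

invoice = calculate_invoice(500, carrier_terms, master_tables)
```

Because every party runs the same deterministic formula against the same shared data, each side can recompute and verify the invoice amount independently, which is what removes the reconciliation disputes.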


Property Graphs vs. Knowledge Graphs

A property graph uses nodes, relationships, labels, and “properties.” Both the relationships and the nodes they connect are named and capable of storing properties. Nodes can be labeled to mark them as part of a group. Property graphs use “directed edges”: each relationship has a start node and an end node. Relationships can also be assigned properties, which is useful for attaching additional metadata to the relationships between nodes. ... Knowledge graphs are very useful in building a data fabric. The semantics feature (and the use of graphs) supports discovery layers and data orchestration in a data fabric. Combining the two makes the data fabric easier to build out incrementally and more flexible, which lowers risk and speeds up deployment. The process allows an organization to develop the fabric in stages: it can start with a single domain or a high-value use case and gradually expand with more data, users, and use cases. A data fabric architecture, combined with a knowledge graph, supports useful capabilities in many key areas.
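As a minimal illustration of these concepts, here is a toy property graph in Python with labeled nodes, directed relationships, and properties on both. The class and identifiers are invented for this sketch and do not reflect any specific graph database's API:

```python
class PropertyGraph:
    """Toy property graph: labeled nodes and directed, property-bearing edges."""

    def __init__(self):
        self.nodes = {}   # node_id -> {"labels": set, "props": dict}
        self.edges = []   # directed: (start_id, rel_type, end_id, props)

    def add_node(self, node_id, labels=(), **props):
        self.nodes[node_id] = {"labels": set(labels), "props": props}

    def add_edge(self, start, rel_type, end, **props):
        self.edges.append((start, rel_type, end, props))

g = PropertyGraph()
g.add_node("alice", labels=["Person"], name="Alice")
g.add_node("acme", labels=["Company"], industry="Retail")
# A named, directed relationship carrying its own metadata:
g.add_edge("alice", "WORKS_AT", "acme", since=2020)
```

The `since=2020` property on the `WORKS_AT` edge is exactly the "metadata on relationships" feature the paragraph describes; a knowledge graph layers shared semantics (ontologies, meaning) on top of this structural model.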


Executive Q&A: Getting the Most from Unstructured Data

The world of ten years ago was dominated by structured data. After 2012, though, sensors became cheaper, cell phones gradually became smartphones, and cameras were installed everywhere, making image capture easy. With this, a large amount of unstructured data was generated, and enterprises entered uncharted territory, making progress slow. Some of the inhibitors to progress in this area include: Complexity: Unlike structured data, which can be analyzed directly, unstructured data needs to be further processed before analysis, usually best done through artificial intelligence: machine learning algorithms classify and label its content. However, it is not easy to identify high-quality data in the data set due to the volume and complexity of unstructured data -- this has been painful for developer teams and a key challenge for data architectures that are already complex. Cost: Although enterprises recognize the value of unstructured data, cost can be a potential obstacle to making use of it. The cost of infrastructure, human resources, and time can hinder the implementation and development of AI and the data it analyzes.


Google Cloud Attacks Supply Chain Crisis with Digital Twin

“It’s a digital representation of the physical supply chain,” said Hans Thalbauer, the managing director for supply chain and logistics for Google Cloud. “You model all the different locations of your enterprise. Then you model all your suppliers, not just the tier one but tier two, three, and four. You bring in the logistic service providers. You bring in manufacturing partners. You bring in customers and consumers so that you have really the full view.” Once a network of supply chain players has been built out, the customer then starts loading data into their digital twin. The customer starts with their private enterprise data, which typically includes past orders, pricing, costs, and supply and demand forecasts, Thalbauer said. “Then you also want to get information from your business partners,” Thalbauer told Datanami last year. “You share your demands with your suppliers. And they actually loop back to you what is the supply situation. You share the information with the logistics service providers. You share sustainability information with the service provider.”


Fighting fraud in the supply chain with blockchain

Private blockchain platforms are particularly suited to supply chain management because they provide traceability, transparency, real-time logistics tracking, electronic funds transfer and smart contract management. Processes including negotiation support and procurement can also be connected via blockchain to build trust and confidence with new suppliers, partners and colleagues. While private blockchains adhere to the original principles of blockchain and offer all the distributed benefits, they also retain some of the characteristics of more centralised, controlled networks. This affords greater privacy and eliminates many of the illicit activities often associated with public blockchains and cryptocurrencies. No one can enter this type of ‘permissioned’ network without proper authentication, making it ideal where it does not suit an enterprise to allow every participant full access to the entire contents of the database.


Carbon Neutrality Requires Good Data – and Blockchain

Blockchain technology is effective in remedying these problems. As a decentralized, immutable ledger where data can be inputted and shared at every point of action, blockchain works by storing information in interconnected blocks and provides a value-add for insuring carbon offsets. This creates a chain of information that cannot be tampered with undetected and can be transmitted between all relevant parties throughout the supply chain. Key players can enter, view, and analyze the same data points securely and with assurance of the data’s accuracy. In addition, the technology can identify patterns of error, giving actionable insights into where systems or humans may be contributing to the problem. Data needs to move with products throughout the supply chain to create an overall number for carbon emissions. Blockchain’s decentralization offers value to organizations and their respective industries so that more reliable data can be shared between all parties to shine a light on the areas they need to work on, such as manufacturing operations and the offsets of buildings.
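The "chain of interconnected blocks" idea reduces to each block storing the hash of its predecessor, so any alteration to earlier data is detectable. A minimal sketch, with invented emissions fields purely for illustration:

```python
import hashlib
import json

def make_block(data: dict, prev_hash: str) -> dict:
    """Build a block whose hash commits to both its data and its predecessor."""
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

genesis = make_block({"co2_kg": 120, "stage": "manufacturing"}, prev_hash="0" * 64)
block2 = make_block({"co2_kg": 45, "stage": "shipping"}, genesis["hash"])

def chain_is_valid(chain: list) -> bool:
    """Recompute every block's hash and check each link to its predecessor."""
    for i, block in enumerate(chain):
        if make_block(block["data"], block["prev"])["hash"] != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True
```

Editing the emissions figure in `genesis` after the fact changes its recomputed hash and breaks validation, which is the tamper-evidence the article is describing; the decentralization comes from every party independently running this same check.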


Developing Deep Learning Systems Using Institutional Incremental Learning

Institutional Incremental Learning is one of the promising ways of addressing data-sharing concerns. Using this approach, organizations can train a model in a secure environment and share the model without having to share precious data. Institutional Incremental Learning differs from federated learning: in federated learning, all the participants train simultaneously, which is challenging because a centralized server needs to update and maintain the models, resulting in complex technology and communication requirements. ... After training a model locally, the model, along with its metrics, is shared with the participating entities. In this way, the decision to use a particular model lies with the organization that is going to use it, rather than being forced by anyone. This truly enables decentralized machine learning, where a model is not only trained but also used at the user's discretion. Institutional Incremental Learning also helps to address catastrophic forgetting.
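A minimal sketch of the sequential hand-off might look like the following: the model weights move between institutions while each site's training data stays local. The toy one-feature regression and the hospital data sets are invented for illustration:

```python
# Each institution updates the shared model on its own local data, then passes
# the weights (never the data) to the next institution in the sequence.

def train_increment(weights, local_data, lr=0.05, epochs=50):
    """One institution's turn: SGD on a one-feature linear model y = w*x + b."""
    w, b = weights
    for _ in range(epochs):
        for x, y in local_data:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return (w, b)

model = (0.0, 0.0)                     # shared starting model
hospital_a = [(1.0, 2.0), (2.0, 4.0)]  # local data, never leaves site A
hospital_b = [(3.0, 6.0), (4.0, 8.0)]  # local data, never leaves site B

for site_data in (hospital_a, hospital_b):  # sequential, not simultaneous
    model = train_increment(model, site_data)
```

Contrast this with federated learning, where a central server would aggregate updates from both hospitals in every round; here each site simply receives the current weights, improves them, and forwards the result together with its evaluation metrics.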


Understanding AI’s Limitations Is Key to Unlocking Its Potential

To discern where AI can improve business processes and where it cannot, it’s important to take into account its legal and ethical considerations, its biases and its transparency. Asking critical questions about certain AI applications is essential to setting a project up for success and avoiding risk down the line. From a legal perspective, we must decipher who carries responsibility for a bad judgment call (e.g., when a self-driving car hits a pedestrian). We also must recognize that there is bias when working with cognitive-based technology. AI learns from the data it gets; however, it doesn’t have the means to question this data, which means that data sets can easily skew in one direction and leave AI to adopt bias. This can lead to things like discrimination in recruiting processes or racial bias in healthcare management. Businesses that work with AI will also find themselves walking a fine line between trust and transparency. While the intention of advanced AI is to make more independent decisions over time, engineers can run into a “black box” scenario where it’s unclear how the application came to its decision.



Quote for the day:

"To be the improver of improvements you must challenge assumptions, starting with your own." -- Vala Afshar

Daily Tech Digest - January 04, 2022

Team Topology for Machine Learning

Stream-aligned ML teams are teams that develop and/or manage ML solutions for end-users, i.e., domain experts or customers in an organization. For example, in a retail company, such a team could be a markdown/discount-pricing team that delivers prices for seasons throughout the year. The scope of the team can vary but should be determined by the team's cognitive load. For example, if the data sources and regression mechanism of the solution do not vary much between the in-season and sale-season periods, the cognitive load to support both does not double and, hence, a slightly bigger team can develop, operate, and manage the solutions for its stakeholders. On the other hand, in the same industry, how the online and store channels operate can vary a lot. Therefore, the markdown solution for the online channel may be operated by one team, whereas the same type of solution for the store channel may be operated by a different team. Should such a team develop its own platform or data/infrastructure subsystems?


Connecting the dots on diversity in cybersecurity recruitment

Put simply, a more diverse cybersecurity team is a better cybersecurity team. In a multidisciplinary field like this, different perspectives are critical. When threats and tactics change around us daily, the diverse viewpoints on my team help counter complacency by bringing new thinking to situations. Our adversaries, after all, are continuously trying new tactics, finding new ways to bypass controls and identify vulnerabilities. My team’s different perspectives bring a more disruptive “hacker mindset” to our work in countering attacks. Our industry’s overreliance on specialists with the “right” qualifications and educational backgrounds might actually be a weakness — a point of view reinforced for me by David Epstein’s 2019 book, “Range: Why Generalists Triumph in a Specialized World.” Epstein argues that generalists with wide-ranging interests are more creative, more agile and able to make connections that their more specialized peers can’t see, especially in complex and unpredictable fields — a description that is a good fit for cybersecurity. 


Workplace Trends 2022: The age of employees is here

2022 will see more Gen Z in employment than ever before. Organizational culture will have to create space to address the needs of Gen Z. Multiple research studies show that this generation has higher conviction in its own strengths and a profound belief in dialogue. That means organizations will have to create accessible and better platforms for frequent and candid dialogue and train their Gen X and Gen Y leaders to be open to diverse views. McKinsey’s survey reveals this generation’s quest for truth. As per McKinsey, Gen Z is the “True Gen”, in contrast to Gen Y - the millennials, sometimes called the “Me Generation”. To attract and retain Gen Z, HR leaders will have to catalyze a genuine culture of greatness at the workplace rather than just labels and brands. They will have to ensure this experience of truth extends even to interns, as this new generation is more likely to rely on the experience of their peers than on labels and brands. When it comes to technology and business models, we're in the midst of a revolution in which the two can't be separated.


Seven imperatives for moving beyond digital

Many of today’s problems are so massive that no single entity can solve them on its own. These problems can be tackled only by networks of companies and institutions that work together toward a common purpose. For example, think about people’s need for mobility—which requires dealing with public, shared, and privately owned methods of transportation; infrastructure; public 5G networks; energy supply; financing; regulation; and many more factors. The only way for companies to thrive in this disruptive age is to work with ecosystems and harness the capabilities that others have built in order to deliver their own value propositions—and do so at speed, at scale, and flexibly. When a labor shortage loomed in Japan’s construction industry in 2013, Komatsu tried to address the problem by introducing ICT (information and communications technology) construction machinery that used GPS, digital mapping, sensors, and internet-of-things connections to enhance efficiency. But leaders quickly saw that the new machines were not resulting in the expected increase in productivity. The reason? Bottlenecks in processes at the construction site.


Top 5 trends for endpoint security in 2022

Under budget pressure to deliver more with less, CISOs want to consolidate their tech stacks and save budget for new technologies. Unified Endpoint Management (UEM) proves its value by unifying identity, security, and remote access within Zero Trust Security or ZTNA frameworks, now considered essential for securing an anywhere workforce. Like ZTNA, UEM has seen rapid innovation over the last twelve months, with reduced security and compliance risk as the goal. UEM’s benefits include streamlining continuous OS updates across multiple mobile devices and platforms, enabling device management, and providing an architecture capable of supporting a wide range of devices and operating systems. Another benefit enterprises mention is automating internet-based patching, policy, and configuration management. UEM leaders include Ivanti, whose platform offers advanced unified endpoint management capabilities.


DataOps will play a pivotal role in financial services growth

Important lessons were learned during the pandemic, not least that banking and other financial institutions’ business models, created pre-COVID, were not suitable for weathering a major crisis. As noted by McKinsey, this could be because most business models rely on historical data, “without access to high-frequency data that would enable recalibration”, as well as on infrastructure that lacks the agility for effective risk management. In the current landscape of economic recovery, plus the need to navigate the effects of the UK leaving the EU, evolving regulations, intensified competition and more, it’s crucial for financial organisations to rethink their models and data strategies now to strengthen future resilience. Therefore, attention will turn to implementing DataOps practices to make themselves nimble enough to identify and react to sudden micro and macro issues, integrate robust risk assessment and mitigation, and capitalise on newly emerging market opportunities. Further, the digital economy in which we live necessitates an elevated approach to engaging with consumers, who have become accustomed to instantaneous, always-on digital and omnichannel communication and to personalisation of products and services on convenient platforms.


Death to Tribal Knowledge

If you have existing documentation and people know about it, you’re doing great! The last hurdle to overcome is making sure that your documentation stays up to date. As time passes, processes change and wikis naturally get out of date. Stale documentation with misleading info is the worst, so finding a good way to keep track of existing documentation and showing ownership in updating it when things change is the problem to solve here. New hires in this instance are again one of the best resources you have. If a new hire is setting up their app locally and runs into issues when following the setup documentation, they should take the time to update the documentation with the correct steps. If your company is actively hiring, this ensures that fresh eyes will be following and improving the documentation every month. The same goes for every other current employee. Any time someone finds information in a wiki that is incorrect, they should do their due diligence and update the documentation. Ignoring the bad information won’t make things any easier for the next person who stumbles across the same page.


Bezier Curve Machine Learning Demonstration

This demonstration features ALGLIB, one of the better available numerical analysis libraries for C# programmers, offering several easy-to-use machine learning methods. (Later in this series, we will examine Microsoft CNTK for C# routines.) ALGLIB for C# is available and licensed appropriately as a free, single-threaded edition for individual experimentation and use or as a commercial, multi-threaded edition for purchase. For this demonstration, I will explain at some length how to download and build the free edition as a class library that must be included as a reference for the demonstration program to work. For reference, you can go to the ALGLIB Wikipedia page for a description and history, the ALGLIB website for download information and an excellent on-line User’s Guide, and to the download itself for a detailed User’s Manual in .html format. ... Our small demonstration data set was built by selecting pre-classified student histories at random, while ensuring a balanced set of demo data representing each status group.
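The ALGLIB demo itself isn't reproduced here, but the Bezier curve at the heart of the demonstration can be evaluated with De Casteljau's algorithm, sketched below in plain Python independently of ALGLIB or C#:

```python
def de_casteljau(points, t):
    """Evaluate a 2D Bezier curve at parameter t by repeated linear interpolation.

    `points` is the control polygon; the loop collapses it one level per pass
    until a single point -- the curve value at t -- remains.
    """
    pts = [tuple(p) for p in points]
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# Quadratic curve: endpoints (0, 0) and (2, 0), control point (1, 2).
mid = de_casteljau([(0, 0), (1, 2), (2, 0)], 0.5)
```

At t = 0 and t = 1 the curve passes through the endpoints exactly, and the interior control points pull the curve toward them, which is what makes Bezier curves a convenient smooth representation of trend data like the student histories in the demo.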


Top Qualities Hiring Managers Look For In Data Scientist Candidates

The ability to code is crucial for good data scientists; that’s why almost every data science role has a technical round. But what’s equally important, and sometimes overlooked, is the ability to understand the business. Without business acumen, data scientists will always be passive implementers of tasks instead of the active thought partners they should be. Moreover, only when you truly understand the asks and how they fit into the larger business are you able to problem-solve in creative ways without counting on others to prescribe a solution. ... On top of the technical challenge, construct a business case that candidates have to work through. The business case should closely align with the job description: if the role will involve conducting a lot of metrics analyses, the case could be a metrics-decomposition type of question; if the role will mainly involve building models, the case could be a realistic business situation for which candidates can brainstorm modeling solutions.


Why businesses should embrace multi-cloud

For many organisations, multi-cloud is inevitable. After all, it’s unlikely there is a single cloud out there that can support all your requirements. Organisations typically use anywhere from several to hundreds of SaaS products, as well as a handful of IaaS hosting services and development PaaS offerings. Some applications will work better on certain platforms – cloud native apps should be happy on AWS, Microsoft Azure or Google Cloud, but traditional apps might prefer Oracle Cloud or IBM Cloud. So, a multi-cloud approach enables you to create this best-of-breed environment. But there are also benefits to being able to run workloads across multiple hyperscale cloud environments – something that is being made easier through containerisation. The caveat is that success in a multi-cloud environment lies in bringing all the pieces together in harmony: ensuring the right workload is distributed to the most appropriate cloud and making sure all the cloud services can communicate with one another. Organisations need to establish and understand the core connectivity between, and governance around, these disparate environments.



Quote for the day:

"Distinguished leaders impress, inspire and invest in other leaders." -- Anyaele Sam Chiyson

Daily Tech Digest - January 03, 2022

Get the most value from your data with data lakehouse architecture

A data lakehouse is essentially the next breed of cloud data lake and warehousing architecture that combines the best of both worlds. It is an architectural approach for managing all data formats (structured, semi-structured, or unstructured) as well as supporting multiple data workloads (data warehouse, BI, AI/ML, and streaming). Data lakehouses are underpinned by a new open system architecture that allows data teams to implement data structures through smart data management features similar to data warehouses over a low-cost storage platform that is similar to the ones used in data lakes. ... A data lakehouse architecture allows data teams to glean insights faster as they have the opportunity to harness data without accessing multiple systems. A data lakehouse architecture can also help companies ensure that data teams have the most accurate and updated data at their disposal for mission-critical machine learning, enterprise analytics initiatives, and reporting purposes. There are several reasons to look at modern data lakehouse architecture in order to drive sustainable data management practices.


A CISO’s guide to discussing cybersecurity with the board

When you get a chance to speak with executives, you typically don’t have much time to discuss details. And frankly, that’s not what executives are looking for, anyway. It’s important to phrase cybersecurity conversations in a way that resonates with the leaders. Messaging starts with understanding the C-suite and boards’ priorities. Usually, they are interested in big picture initiatives, so explain why cyber investment is critical to the success of these initiatives. For example, if the CEO wants to increase total revenue by 5% in the next year, explain how they can prevent major unnecessary losses from a cyber attack with an investment in cybersecurity. Once you know the executive team and board’s goals, look to specific members, and identify a potential ally. Has one team recently had a workplace security breach? Does one leader have a difficult time getting his or her team to understand the makings of a phishing scheme? These interests and experiences can help guide the explanation of the security solution. If you’re a CISO, you’re well-versed in cybersecurity, but remember that not everyone is as involved in the subject as you are, and business leaders probably will not understand technical jargon.


Best of 2021 – Containers vs. Bare Metal, VMs and Serverless for DevOps

A bare metal machine is a dedicated server using dedicated hardware. Data centers have many bare metal servers that are racked and stacked in clusters, all interconnected through switches and routers. Human and automated users of a data center access the machines through access servers, high-security firewalls and load balancers. The virtual machine introduced an operating system simulation layer between the bare metal server’s operating system and the application, so one bare metal server can support more than one application stack with a variety of operating systems. This provides a layer of abstraction that allows the servers in a data center to be software-configured and repurposed on demand. In this way, virtual machines can be scaled horizontally, by configuring multiple parallel machines, or vertically, by configuring machines to allocate more power to a virtual machine. One of the problems with virtual machines is that the virtual operating system simulation layer is quite “thick,” so loading and configuring each VM takes significant time. In a DevOps environment, changes occur frequently.


Desktop High-Performance Computing

Many engineering teams rely on desktop products that only run on Microsoft Windows. Desktop engineering tools that perform tasks such as optical ray tracing, genome sequencing, or computational fluid dynamics often couple graphical user interfaces with complex algorithms that can take many hours to run on traditional workstations, even when powerful CPUs and large amounts of RAM are available. Until recently, there has been no convenient way to scale complex desktop computational engineering workloads seamlessly to the cloud. Fortunately, the advent of AWS Cloud Development Kit (CDK), AWS Elastic Container Service (ECS), and Docker finally make it easy to scale desktop engineering workloads written in C# and other languages to the cloud. ... The desktop component first builds and packages a Docker image that can perform the engineering workload (factor an integer). AWS CDK, executing on the desktop, deploys the Docker image to AWS and stands up cloud infrastructure consisting of input/output worker queues and a serverless ECS Fargate cluster.
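The input/output worker-queue pattern described above can be sketched locally. In the sketch below, Python's in-process `queue.Queue` stands in for the cloud queues, and trial-division factoring is the workload, matching the integer-factoring example in the article; the names and structure are illustrative, not the article's actual C#/ECS code:

```python
import queue

def factor(n: int) -> list:
    """Trial-division integer factorization -- the example workload."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# Local stand-ins for the cloud input/output worker queues.
inputs, outputs = queue.Queue(), queue.Queue()
inputs.put(1037)

# Each containerized worker in the cluster would run a loop like this,
# pulling jobs from the input queue and posting results to the output queue.
while not inputs.empty():
    job = inputs.get()
    outputs.put((job, factor(job)))

result = outputs.get()
```

Because each job is independent, scaling out is just running more copies of the same worker loop against the shared input queue, which is exactly what the serverless cluster provides without the desktop application needing to change.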


Micromanagement is not the answer

Neuroscience also reveals why micromanaging is counterproductive. Donna Volpitta, an expert in “brain-based mental health literacy,” explained to me that the two most fundamental needs of the human brain are security and autonomy, both of which are built on trust. Leaders who instill a sense of trust in their employees foster that sense of security and autonomy and, in turn, loyalty. When leaders micromanage their employees, they undermine that sense of trust, which tends to breed evasion behaviors in employees. It’s a natural brain response. “Our brains have two basic operating modes—short-term and long-term,” Volpitta says. “Short-term is about survival. It’s the freeze-flight-fight response or, as I call it, the ‘grasshopper’ brain that is jumping all over. Long-term thinking considers consequences [and] relationships, and is necessary for complex problem solving. It’s the ‘ant’ brain, slower and steadier.” She says micromanagement constantly triggers short-term, survival thinking detrimental to both social interactions and task completion.


Unblocking the bottlenecks: 2022 predictions for AI and computer vision

One of the key challenges of deep learning is the need for huge amounts of annotated data to train large neural networks. While this is the conventional way to train computer vision models, the latest generation of technology providers are taking an innovative approach that enables machine learning with comparatively less training data. This includes moving away from supervised learning to self-supervised and weakly supervised learning, where data availability is less of an issue. This approach, also known as few-shot learning, detects objects as well as new concepts with considerably less input data. In many cases the algorithm can be trained with as few as 20 images. ... Privacy remains a major concern in the AI sector. In most cases, a business must share its data assets with the AI provider via third-party servers or platforms when training computer vision models. Under such arrangements there is always the risk that the third party could be hacked or even exploit valuable metadata for its own projects. As a result, we’re seeing the rise of Privacy Enhancing Computation, which enables data to be shared between different ecosystems in order to create value, while maintaining data confidentiality.


How Automation Can Solve the Security Talent Shortage Puzzle

Supporting remote and hybrid work requires organizations to invest in and implement new technologies that facilitate needs such as remote access, secure file-sharing, real-time collaboration and videoconferencing. Businesses must also hire professionals to configure, implement and maintain these tools with an eye towards security – a primary concern here, as businesses of all sizes now live or die by the availability and integrity of their data. The increasing complexity of IT environments – many of which are now pressured to support bring-your-own-device (BYOD) policies – has only intensified the need for competent cybersecurity talent. It’s not surprising that the ongoing shortage of trained professionals makes it difficult for organizations to expand their business and adopt new technologies. Almost half a million cybersecurity jobs remain open in the U.S. alone, forcing businesses to compete aggressively to fill these roles. Yet, economic pressures make it particularly difficult for small- to mid-sized businesses (SMBs) to play this game. Most cannot hope to match the high salaries that large enterprises offer.


HIPAA Privacy and Security: At a Crossroads in 2022?

The likely expansion of OCR’s mission into protecting the confidentiality of SUD data comes as actions to enforce the HITECH Breach Notification and HIPAA Security Rule appear to be at a standstill. According to the data compiled by OCR, in 2021 there were more than 660 breaches of the unauthorized disclosure of unsecured PHI reported by HIPAA-covered entities and their business associates that compromised the health information of over 40 million people. A significant number of the breaches reported to OCR appear to show violations of the HIPAA standards due to late reporting and failure to adequately secure information systems or train workforce members on safeguarding PHI. In 2021, OCR announced settlements in two enforcement actions involving compliance with the HIPAA Security Rule standards. OCR has been mum on its approach to enforcement of the HIPAA breach and security rules. One explanation could be the impact being felt by the 5th Circuit Court of Appeals decision overturning an enforcement action against the University of Texas MD Anderson Cancer Center.


Will we see a GPT-3 moment for computer vision?

It is truly the age of large models, with each new model bigger and more advanced than the last. Take, for example, GPT-3 – when it was introduced in 2020, it was the largest language model, trained with 175 billion parameters. Fast forward one year, and we already have GLaM, a trillion-weight model. Transformer models like GPT-3 and GLaM are transforming natural language processing. There are active conversations about these models making roles like writer and even programmer obsolete. While such claims can be dismissed as speculation for now, one cannot deny that these large language models have truly transformed the field of NLP. Could this innovation be extended to other fields – like computer vision? Can we have a GPT-3 moment for computer vision? OpenAI recently released GLIDE, a text-to-image generator, in which the researchers applied guided diffusion to the problem of text-conditional image synthesis. For GLIDE, the researchers trained a 3.5 billion parameter diffusion model that uses a text encoder. Next, they compared CLIP (Contrastive Language-Image Pre-training) guidance and classifier-free guidance.
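The classifier-free guidance idea can be sketched in a few lines: the model's conditional and unconditional noise predictions are mixed, with a guidance scale amplifying the text conditioning. The arrays below are toy stand-ins for a real diffusion model's outputs, not GLIDE's actual code.

```python
import numpy as np

def classifier_free_guidance(eps_cond, eps_uncond, scale):
    # Extrapolate from the unconditional prediction toward the
    # conditional one; scale > 1 strengthens the text conditioning.
    return eps_uncond + scale * (eps_cond - eps_uncond)

# Toy noise predictions standing in for a diffusion model's outputs
eps_uncond = np.zeros(4)
eps_cond = np.ones(4)
guided = classifier_free_guidance(eps_cond, eps_uncond, scale=3.0)
print(guided)  # [3. 3. 3. 3.]
```

With scale set to 1.0 the function returns the plain conditional prediction; larger scales trade sample diversity for fidelity to the text prompt.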


What is Legacy Modernization?

To achieve a good level of agility, the systems supporting the organization also have to react and change quickly with the surrounding environment. Legacy systems place a constraint on agility since they are often difficult to change or provide inefficient support to business activities. This is not unusual: at the time of the system's design there were perhaps technology constraints that no longer exist, or the system was designed for a particular way of working that is no longer relevant. Legacy Modernization changes or replaces legacy systems, making the organization more efficient and cost-effective. Not only can this optimize existing business processes, but it can open new business opportunities. Security is an important driver for Legacy Modernization. Cyber-attacks on organizations are common and become more sophisticated over time. The security of a system degrades over time, and legacy systems may no longer have the support or the technologies required to deter modern attack methods, making them an easy target for hackers. This represents a significant business risk to an organization.



Quote for the day:

"Leadership is particularly necessary to ensure ready acceptance of the unfamiliar and that which is contrary to tradition." -- Cyril Falls

Daily Tech Digest - January 02, 2022

Updating Data Governance: Set Up a Cohesive Plan First

Just because a company has a Data Governance framework it used with a mature technology project, like a data warehouse, does not mean it is sufficient for newer technology initiatives, like machine learning. New business requirements need to be considered, especially where system integration is necessary. For example, Data Quality must be good for all data sets, across the entire enterprise, before machine learning can be applied to a new venture. Danette McGilvray, President and Principal at Granite Falls Consulting, said, “The cold brutal reality is that the data is not good enough to support machine learning in practically every company.” This is only one of many business needs that crop up before succeeding at such an undertaking. Revisiting Data Governance prior to starting a new data project reduces the risk of overlooking prerequisites and moves toward a unified Data Management approach. Rethinking older Data Governance plans alone, however, does not necessarily lead to more coherent Data Governance.


A Hands-On Guide to Outlier Detection with Alibi Detect

Data points that are unusually far from the rest of the observations in a dataset are known as outliers. They are primarily caused by data errors (measurement or experimental errors, data collection or processing errors, and so on) or by naturally singular behaviour that differs from the norm. For example, in medical applications very few people have upper blood pressure greater than 200, so if we keep those values in the dataset, our statistical analysis and modelling conclusions will be skewed; to name a few effects, they can alter the mean and standard deviation. As a result, it’s critical to accurately detect and handle outliers, either by removing them or capping them at a predefined value. Outlier detection is thus critical for identifying anomalies whose model predictions we can’t trust and shouldn’t use in production. The type of outlier detector that is appropriate for a given application is determined by the data’s modality and dimensionality, the availability of labelled normal and outlier data, and whether the detector is pre-trained (offline) or updated online.
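The blood-pressure example above can be sketched with a classic univariate rule. This is a hedged illustration of the general idea, not Alibi Detect's API:

```python
import numpy as np

def iqr_outliers(x, k=1.5):
    """Flag points outside [Q1 - k*IQR, Q3 + k*IQR], the classic
    interquartile-range rule for univariate outliers."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return (x < lo) | (x > hi)

# Upper blood pressure readings; 210 is the kind of rare-but-real
# extreme the excerpt mentions.
bp = np.array([118, 122, 125, 130, 135, 128, 121, 210])
print(bp[iqr_outliers(bp)])  # [210]
```

Multivariate or high-dimensional data needs more sophisticated detectors (isolation forests, autoencoders, and so on), which is exactly the choice the excerpt says depends on data modality and label availability.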


How a startup uses AI to put worker safety first

Computer vision has progressed from an experimental technology to one that can interpret patterns in images and classify them at scale using machine learning algorithms. Advances in deep learning and neural networks are expanding enterprise uses of computer vision, improving worker safety in the process. Computer vision techniques to reduce worker injuries and improve in-plant safety are based on unsupervised machine learning algorithms that excel at identifying patterns and anomalies in images. Computer vision platforms, including Everguard’s SENTRI360, rely on convolutional neural networks to categorize images and industrial workflows at scale. The quality of the datasets used to train supervised and unsupervised machine learning algorithms determines their accuracy. Convolutional neural networks also require large amounts of data to improve their precision in predicting events, fine-tuned through iterative cycles of machine learning models. Each iteration of a machine learning model then extracts specific attributes of an image and, over time, classifies those attributes.
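The convolution operation at the heart of those networks fits in a few lines of numpy. This is a didactic sketch of a single filter pass, not Everguard's actual pipeline:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: the core operation a CNN
    layer applies many times, with many learned kernels."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A hand-written vertical-edge detector; real CNNs learn such
# low-level filters automatically from training data.
img = np.array([[0, 0, 9, 9],
                [0, 0, 9, 9],
                [0, 0, 9, 9]], dtype=float)
edge = np.array([[-1.0, 1.0]])
response = conv2d(img, edge)
print(response)  # strong response exactly at the intensity jump
```

Deeper layers stack many such filters and nonlinearities, which is how the iterative training cycles described above gradually extract and classify image attributes.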


'Work’ in 2022: What’s next?

Undoubtedly, this needs a complete overhaul of existing policies, but we must not forget that in an evolving environment nothing can be treated as constant. Swift revision of new policies is therefore very important to keep up with changing scenarios while maintaining people-centricity as the central thought. Knowing the employee pulse will be key to creating or revising policies, for which regular surveys, town halls and leadership connections are extremely important. Employee safety and well-being will continue to hold top-of-mind space, and workplace culture transformation will lean towards empathy and flexibility, though organizations will still need to solve the challenge of ‘how much is too much’. They will have to rally together to find the sweet spot between productivity and the work-life balance of employees. ... If data is considered the new oil of the 21st century, ‘Trust’ will become its equivalent in the post-pandemic world, making the relationship between employer and employee go through a gradual transition in which managing expectations from both ends will be essential.


Blockchain technology developments help elevate food safety protocols

Blockchain technologies, something we have been discussing for a few years, are closer than we think. Transparency, traceability, and sustainability are vital to everyone in the industry. The FDA has outlined four core elements in the New Era of Smarter Food Safety Blueprint, and the first of these elements is tech-enabled traceability. Traceability processes are critical to ensure all food items are traced and tracked throughout the supply chain. Traceability is essential for food safety as well as operational efficiency. With a solid traceability program, it is possible to locate a product at any stage of the food chain within the supply chain — literally from farm to table. For this technology to work well, it must be user-friendly and affordable to all — small businesses and large corporations alike. When it is available and widely used, it will minimize foodborne illness outbreaks and significantly speed up finding the source if an outbreak does occur. Affordable digital technology connecting buyers with validated, verified sellers is at the forefront.


I followed my dreams and got demoted to software developer

I was just a UX person, not a coder. Surrounded by only the most freakishly good developers at Facebook (and then at Stack Overflow), I pushed whatever fantasies I had about coding professionally aside. During these few years in which I’ve been coding in earnest on the side, I also found myself regularly discouraged and confused by the sheer number of possible things that I could learn or do. I can’t count the number of quarter-finished games and barely-started projects I have in my private GitHub repos (actually, I can. It’s 15, and those are just the ones that made it there). Without much formal education in this field, I’d frequently get lost down documentation holes and find myself drowning in the 800 ways of maybe solving the problem that I had. Finally, I came to the conclusion that I needed more structure, and that I wouldn’t be able to get that structure in the hour of useful-brain-time I had after work each day. I started researching bootcamps and doing budget calculations and made plans to leave Stack Overflow. 


Quantum Computing In 2022: A Leap Into The Tremendous Future Ahead

The various analyses and initiatives around quantum computing by big tech and other organizations are opening an ocean of opportunities for CIOs and IT departments to apply the technology in real-world settings. Quantum computers are well suited to solving complex optimization problems and performing fast searches of unstructured data. As Prashanth Kaddi, Partner, Deloitte India, notes, “it can possibly bring disruptive change across areas, including drug discovery, medication research, distribution supply chains, traffic flow, energy optimization and many more.” Quantum computing also significantly reduces time to market and helps enhance customer delivery. For instance, a drug company may drastically reduce the time to market for new medicines. In finance, it could enable faster, more complex Monte Carlo simulations for uses like trading, trajectory optimization, market volatility and price optimization strategies, among others.


New data-decoding approach could lead to faster, smaller digital tech

Just one trifling issue: Encoding or decoding data in antiferromagnets can be a bit like trying to write with a dried-up pen or decipher the scribblings of a toddler. "The difficulty—and it's a significant difficulty—is how to write and read information," said Tsymbal, George Holmes University Professor of physics and astronomy. The same antiferromagnetic property that acts as a pro in one context—the lack of a net magnetic field preventing data corruption—becomes a con when it comes to actually recording data, Tsymbal said. Writing a 1 or 0 in a ferromagnet is a simple matter of flipping its spin orientation, or magnetization, via another magnetic field. That's not possible in an antiferromagnet. And whereas reading the spin state of a ferromagnet is similarly straightforward, it's not easy distinguishing between the spin states of an antiferromagnet—up-down vs. down-up—because neither produces a net magnetization that would yield discernible differences in the flow of electrons. Together, those facts have impeded efforts to develop antiferromagnetic tunnel junctions with practical use in actual devices.


IoT & AI Applications in Fisheries Industry Can Bring Another Blue Revolution, Read How?

IoT devices and AI are helping fisheries optimize where and when they fish; sensors identify fish and catch size, and onboard cameras aid in sorting the catch. The data can also help the wild fishing industry cut expenses by providing insights on how to reduce fuel usage and enhance fleet maintenance using AI predictive maintenance. According to McKinsey, if large-scale fishing enterprises around the world adopt this concept, they may save $11 billion in annual operating costs. With feed accounting for a considerable share of operating costs, and both under- and overfeeding having severe consequences for fish health and size as well as water quality, feeding optimization can result in significant savings and benefits. Many smart fish farms, such as eFishery, use feeders that rely on vibration and auditory cues to help with more accurate feeding. The use of technology in aquaculture management is also boosting efficiencies and reducing manpower demands, which is a significant financial and safety benefit for remote marine farms.


Advanced Analytics: A Look Back at 2021 and What's Ahead for 2022

Companies want to become more advanced in analytics in order to better compete. Yet, they are struggling with both keeping talent in house and building new talent to perform more advanced analytics. In other words, organizations need to build literacy to utilize tools such as self-service BI, and they need to either retain or grow talent to move forward with data science. Data literacy was a priority in 2021 and it will continue to be in 2022. Data literacy involves how well users understand and can interact with data and analytics and communicate the results to achieve business goals. It includes understanding data elements, understanding the business, framing analytics, critical interpretation, and communication skills. As part of this, we expect to see more organizations building literacy enablement teams to help educate their people. We expect to see modern analytics tools with more advanced and augmented features such as natural language search and the ability to surface descriptive insights becoming more popular. 



Quote for the day:

"The best leader brings out the best in those he has stewardship over." -- J. Richard Clarke

Daily Tech Digest - January 01, 2022

The Future Role of the Chief Innovation Officer

One of the key ideas highlighted today is making your business resilient. As enterprises move forward in a post-pandemic period, organizational resiliency will be among the main goals, and innovation will be at the heart of becoming resilient. Chief innovation officers will help companies develop the ability to rapidly respond to unexpected events and the associated shift in customer needs. The key here is how to make your products and services flexible to cope with significant business disruptions. Most people think that developing innovative ideas is solely the role of the chief innovation officer. But this is not true. Chief innovation officers do not have a monopoly of knowledge when it comes to new ideas. In most instances, the role of chief innovation officers is to cultivate a company culture that allows other people within the organization to develop new ideas. They do that by training people to become innovative and helping create a work environment that encourages individuals to contribute new ideas for the company's benefit. 


Analytics and AI in 2022: Innovation in the era of COVID

From low-code, we move to what we might call "low-data." Dr. Jans Aasman, CEO of Franz Inc., characterized the new normal of personal data analytics: "in 2022 we will see new ways for users to regain control of their data." That's great, but it's yet another disruption, forcing companies to personalize experiences without access to as much personal data. Jennifer Krizanek, President, NA and CMO of Contentserv, describes the challenge: "2022 will witness businesses strategizing on how to personalize the customer experience without breaking GDPR laws or infringing on consumers' data privacy rights." She further opines that next year "will be the year in which businesses learn how to operate, market and personalize their offerings to consumers without tracking their every move." But how can that be done? For openers, Denodo's Shankar says that "in 2022, organizations will leverage small data analytics to create hyper personalized experiences for their individual customers to understand customer sentiment around a specific product or service within a short time window." 


What the Rise in Cyber-Recon Means for Your Security Strategy

Enterprises need to be aware that an increase in new cybercriminals armed with advanced technologies will increase the likelihood and volume of attacks. Standard tools must be able to scale to address potential increases in attack volumes. These tools also need to be enhanced with artificial intelligence (AI) to detect attack patterns and stop threats in real time. Critical tools should include anti-malware engines using AI detection signatures, endpoint detection and response (EDR), advanced intrusion prevention system (IPS) detection, sandbox solutions augmented with MITRE ATT&CK mappings and next-gen firewalls (NGFWs). In the best-case scenario, these tools are deployed consistently across the distributed network (data center, campus, branch, multi-cloud, home office, endpoint) using an integrated security platform that can detect, share, correlate and respond to threats as a unified solution. Cybercriminals are opportunistic, and they’re also growing increasingly crafty. We’re now seeing them spend more time on the reconnaissance side of cyberattacks.


Is Apple making the same mistakes made by Nokia?

The fall of Nokia and the rise of Apple as a smartphone giant are deeply intertwined. Apple, which throughout the global recession near-unfailingly smashed Wall Street forecasts, is beginning to lose its aura of invincibility. Apple took a $6 billion hit to its sales during the fiscal fourth quarter due to persistent global supply chain problems. The company has missed Wall Street targets twice in under a year. CEO Tim Cook may now have to worry more about economic and product launch cycles, and the whims of fickle consumers. The blockbuster smartphone that adds a special gloss to the Apple brand is a highly cyclical product. Buyers emerge in droves every time a new version is launched, lining up at stores overnight, and snarling supply for the device. Its popularity has heightened speculation around the device every year as over 100 million yearly customers decide when to switch to a new model, whether to buy now or hold out for a better but same-priced phone.


2021 Information security in memes and tweets

For those who don’t know: Cellebrite produces equipment for so-called competent authorities, allowing employees to easily and conveniently hack into smartphones and retrieve information of interest from them. That’s why the company holds a special place in the hearts of privacy advocates. In late 2020, Cellebrite announced that their products were beginning to support Signal. In response, the Signal team published a study of vulnerabilities in Cellebrite hardware and accompanied it with an unparalleled teaser ... Colonial Pipeline, the largest U.S. pipeline system that moves petroleum products, was attacked by ransomware, disrupting gasoline and diesel supplies across the southeast coast of the country. The incident sparked a lot of discussion about how these kinds of businesses should be protected, and the company’s own announcement of a search for a new cybersecurity manager went viral on social media as a meme, with the comment “They probably have a decent budget now.” 


Understanding 5G in India in 2022

5G is not very different in principle from 4G. It builds on the same mobile network principles, but the 5G New Radio (NR) air interface enhances OFDM to deliver a much higher degree of scalability and flexibility. 5G also brings higher-bandwidth spectrum frequencies into play. Spectrum in the millimetre-wave bands and sub-6 GHz bands will be used for 5G services, the first time such high-frequency airwaves will be used to roll out mobile network services. This makes 5G very different from 4G when it comes to use cases and applications. The mobile ecosystem will touch new realms with 5G that weren’t possible with previous-generation technologies. Compared with 4G networks, 5G is faster, has more capacity, delivers significantly lower latency, is a more unified platform, and uses higher-frequency spectrum better than 4G ever could. 5G is designed to support more mission-critical applications and communications and also bring the massive Internet-of-Things (IoT) to the table.
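The OFDM scheme underlying both 4G and 5G NR can be sketched as an IFFT/FFT round trip: data symbols are placed on orthogonal subcarriers at the transmitter and recovered at the receiver. This is an illustrative toy that ignores cyclic prefixes, real channels, and NR's scalable subcarrier spacing:

```python
import numpy as np

rng = np.random.default_rng(0)
n_subcarriers = 64
# QPSK symbols, one per subcarrier
symbols = rng.choice([1+1j, 1-1j, -1+1j, -1-1j], n_subcarriers)

tx = np.fft.ifft(symbols)  # one OFDM time-domain symbol
rx = np.fft.fft(tx)        # demodulation over an ideal channel

print(np.allclose(rx, symbols))  # True
```

What 5G NR adds on top of this core transform is flexible numerology (wider subcarrier spacings for millimetre-wave bands), which is where the scalability the excerpt mentions comes from.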


Predictions 2022: Edge tech, enterprise 5G, cybersecurity in focus

The Edge discussion will separate into two focus areas – edge platforms that provide a stable pool of secure capacity for the diverse edge ecosystems and software defined edge workloads/software stacks that extend application and data systems into real world environments. We are already seeing this shift today. As we move into 2022, we expect edge platforms to become more capable and pervasive. ... Edge will become the new battleground for data management as the latter becomes a new class of workload. Data management and the edge will increasingly converge and reinforce each other. The cybersecurity industry is now moving from discussion of emerging security concerns to a bias toward action. Enterprises and governments are facing threats of greater sophistication and impact on revenue and services. As a result, the cybersecurity industry is responding with greater automation and integration. The industry is also pivoting from automated detection to prevention and response with a focus on applying AI and machine learning to speed remediation.


Top 6 Trends Shaping Digital Transformation in 2022

More enterprises will also see a shift toward low-code or no-code software adoption. These tools have long been the source of some controversy in the IT industry because many developers have feared their jobs would become obsolete if companies could operate on low-code software alone. However, this category of tools has proven to be an asset to developers in recent years. “No-code tools are great to solve simpler problems,” says Dean Guida, CEO and founder of Infragistics. “When you combine low-code tools and professional developers you can tackle the higher impact digital solutions that will give competitive advantage to organizations.” Low-code and no-code software means developers can prioritize innovation and business growth rather than spending time building tools for day-to-day operations. But that’s not where the benefits stop. Low-code solutions also provide an opportunity for professionals in other parts of the organization to make smarter decisions without confronting a steep learning curve.


Mind-Controlled Robots A Step Closer To Realization

The two research groups developed a computer program that made it possible to control a robot just by using thoughts; no voice control or touch is necessary. The system receives input from a person’s brain through a head cap equipped with electrodes for performing electroencephalogram (EEG) scans of the user’s brain signals. To use this robot arm, all the person has to do is look at it. The robot will automatically understand if it has done something wrong as our brains will emit a signal that tells the robot, “Not like that.” The system uses a combination of inverse reinforcement learning (IRL) techniques and brain-computer interfaces (BCI) that decode error-related potentials (ErrP), which allows the system to have a reward function depending on the user’s ErrP. In other words, through a trial-and-error method, the robot figures out the user’s needs by interpreting their brain signals. According to the lead author, Iason Batzianoulis, what was particularly difficult from a technical viewpoint was “‘translating’ a patient’s brain signals into actions performed by the robot.” They did that by using machine learning to link each brain signal to specific tasks.
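The feedback loop can be caricatured as a bandit-style update in which a decoded ErrP acts as negative reward. This toy sketch is our own illustration of the idea, not the researchers' actual IRL system; `user_intends` is a hypothetical ground truth standing in for the decoded brain signal:

```python
import random

random.seed(42)
actions = ["move_left", "move_right"]
preference = {a: 0.0 for a in actions}
user_intends = "move_right"  # hypothetical: what the user wants

for _ in range(200):
    # epsilon-greedy choice over the current preferences
    if random.random() < 0.1:
        a = random.choice(actions)
    else:
        a = max(actions, key=preference.get)
    errp = (a != user_intends)       # brain says "Not like that"
    reward = -1.0 if errp else 1.0
    preference[a] += 0.1 * (reward - preference[a])

print(max(actions, key=preference.get))  # move_right
```

The real system decodes ErrP from EEG and learns a full reward function over continuous motions via IRL, but the core loop is the same: actions that trigger the error signal are progressively suppressed.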


An overview of 7 tech trends reshaping enterprise architecture

While edge computing isn’t necessarily a new technology, it’s still a fairly new computing practice, not yet having gained the traction that hybrid cloud has. Hybrid cloud allows you to keep leveraging your on-premises infrastructure and your choice of a combination of public and private cloud services. It also enables you to scale on-demand and take advantage of AI/ML capabilities to help parse data faster. If a cloud environment can use a hybrid model, then so can a microservices environment. Integrate the two, and you’ve got the power of your legacy systems without having to deal with the complex layers of those systems. How? Because a microservices architecture allows you to componentize the most beneficial parts of your application. Microservice architecture, often used in cloud-native integration solutions, has made it easier for teams to code more efficiently, debug, and deploy faster, slowly replacing the massive, inflexible legacy systems. When managed correctly, microservices allow developers to better understand the source code without slowing down development because code review, application build, and deployment are streamlined compared to monolithic applications.



Quote for the day:

"Leadership is absolutely about inspiring action, but it is also about guarding against mis-action." -- Simon Sinek

Daily Tech Digest - December 31, 2021

Can blockchain solve its oracle problem?

The so-called oracle problem may not be intractable, however — despite what Song suggests. “Yes, there is progress,” says Halaburda. “In supply-chain oracles, we have for example sensors with their individual digital signatures. We are learning about how many sensors there need to be, and how to distinguish manipulation from malfunction from multiple readings.” “We are also getting better in writing contracts taking into account these different cases, so that the manipulation is less beneficial,” Halaburda continues. “In DeFi, we also have multiple sources, and techniques to cross-validate. While we are making progress, though, we haven’t gotten to the end of the road yet.” As noted, oracles are critical to the emerging DeFi sector. “In order for DeFi applications to work and provide value to people and organizations around the world, they require information from the real world — like pricing data for derivatives,” Sam Kim, partner at Umbrella Network — a decentralized layer-two oracle solution — tells Magazine, adding:
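One common cross-validation technique the excerpt alludes to is a medianizer: aggregate several feeds, take the median, and discard feeds that stray too far from it, which makes manipulating any single source less profitable. A hedged sketch with hypothetical feed names, not any specific oracle protocol:

```python
import statistics

def aggregate_price(feeds, max_dev=0.05):
    """Take the median of several oracle feeds and keep only those
    within max_dev (relative deviation) of it."""
    med = statistics.median(feeds.values())
    trusted = {src: p for src, p in feeds.items()
               if abs(p - med) / med <= max_dev}
    return med, trusted

# feedD is manipulated; the median ignores it and it gets flagged out
feeds = {"feedA": 100.2, "feedB": 99.8, "feedC": 100.1, "feedD": 140.0}
price, trusted = aggregate_price(feeds)
print(round(price, 2))   # 100.15
print(sorted(trusted))   # ['feedA', 'feedB', 'feedC']
```

With an even number of honest feeds, an attacker must compromise a majority of sources to move the median, which is the economic argument behind multi-source oracle designs.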


Putting the trust back in software testing in 2022

Millions of organisations rely on manual processes to check the quality of their software applications, despite a fully manual approach presenting a litany of problems. Firstly, with more than 70% of outages caused by human error, testing software manually still leaves companies highly prone to issues. Secondly, it is exceptionally resource-intensive and requires specialist skills. Given the world is in the midst of an acute digital talent crisis, many businesses lack the personnel to dedicate to manual testing. Compounding this challenge is the intrinsic link between software development and business success. With companies coming under more pressure than ever to release faster and more regularly, the sheer volume of software needing testing has skyrocketed, placing a further burden on resources already stretched to breaking point. Companies should be testing their software applications 24/7 but the resource-heavy nature of manual testing makes this impossible. It is also demotivating to perform repetitive tasks, which generally leads to critical errors in the first place.


December 2021 Global Tech Policy Briefing

CISA and the National Security Agency (NSA), in the meantime, offered a second revision to their 5G cybersecurity guidance on December 2. According to CISA’s statement, “Devices and services connected through 5G networks transmit, use, and store an exponentially increasing amount of data. This third installment of the Security Guidance for 5G Cloud Infrastructures four-part series explains how to protect sensitive data from unauthorized access.” The new guidelines run on zero-trust principles and reflect the White House’s ongoing concern with national cybersecurity. ... On December 9, the European Commission proposed a new set of measures to ensure labor rights for people working on digital platforms. The proposal will focus on transparency, enforcement, traceability, and the algorithmic management of what it calls, in splendid Eurocratese, “digital labour platforms.” The number of EU citizens working for digital platforms has grown 500 percent since 2016, reaching 28 million, and will likely hit 43 million by 2025. Of the current 28 million, 59 percent work with clients or colleagues in another country. 


10 Predictions for Web3 and the Cryptoeconomy for 2022

Institutions will play a much bigger role in DeFi participation — Institutions are increasingly interested in participating in DeFi. For starters, institutions are attracted to higher than average interest-based returns compared to traditional financial products. Also, cost reduction in providing financial services using DeFi opens up interesting opportunities for institutions. However, they are still hesitant to participate in DeFi. Institutions want to confirm that they are only transacting with known counterparties that have completed a KYC process. Growth of regulated DeFi and on-chain KYC attestation will help institutions gain confidence in DeFi. ...  DeFi insurance will emerge — As DeFi proliferates, it also becomes the target of security hacks. According to London-based firm Elliptic, total value lost to DeFi exploits in 2021 totaled over $10B. To protect users from hacks, viable insurance protocols guaranteeing users’ funds against security breaches will emerge in 2022. ... NFT Based Communities will give material competition to Web 2.0 social networks — NFTs will continue to expand in how they are perceived.


Firmware attack can drop persistent malware in hidden SSD area

Flex capacity is a feature in SSDs from Micron Technology that enables storage devices to automatically adjust the sizes of raw and user-allocated space to achieve better performance by absorbing write workload volumes. It is a dynamic system that creates and adjusts a buffer of space called over-provisioning, typically taking between 7% and 25% of the total disk capacity. The over-provisioning area is invisible to the operating system and any applications running on it, including security solutions and anti-virus tools. As the user launches different applications, the SSD manager adjusts this space automatically against the workloads, depending on how write or read-intensive they are. ... One attack modeled by researchers at Korea University in Seoul targets an invalid data area with non-erased information that sits between the usable SSD space and the over-provisioning (OP) area, and whose size depends on the two. The research paper explains that a hacker can change the size of the OP area by using the firmware manager, thus generating exploitable invalid data space.
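The capacity arithmetic behind the attack is simple to model. The toy function below is our own illustration (not Micron's firmware interface): shrinking the OP area enlarges the user-visible space, and the difference is the previously hidden region an attacker could expose or abuse:

```python
def op_size(total_gb, user_gb):
    """Over-provisioning (OP) area: raw capacity minus the space
    the OS can see. Toy model of the flex-capacity arithmetic."""
    return total_gb - user_gb

total = 256
op_before = op_size(total, user_gb=224)  # 32 GB OP (12.5% of raw)
op_after = op_size(total, user_gb=248)   # OP shrunk to 8 GB (~3%)
newly_visible = op_before - op_after     # region exposed by the change
print(op_before, op_after, newly_visible)  # 32 8 24
```

In the paper's attack model the firmware manager is what changes these sizes; because the OP region is invisible to the OS and security tools, data parked there escapes scanning until the boundary moves.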


'Businesses need to build threat intelligence for cybersecurity': Dipesh Kaura, Kaspersky

Organizations across industries face the challenge of cybersecurity, and the need to build threat intelligence holds equal importance for every business that thrives in a digital economy. While building threat intelligence is crucial, it is also necessary to have a solution that understands the threat vectors for every business, across every industry. A holistic threat intelligence solution digs into the nitty-gritty of an enterprise's security framework and extracts the best actionable insights. A threat intelligence platform must capture and monitor real-time feeds from across an enterprise's digital footprint and turn them into insights to build a preventive posture instead of a reactive one. It must diagnose and analyze security incidents on hosts and the network, and signals from internal systems against unknown threats, thereby minimizing incident response time and disrupting the kill chain before critical systems and data are compromised. 


IT leadership: 3 ways to show gratitude to teams

If someone on your team takes initiative on a project, let them know that you appreciate them. Pull them aside, look them in the eye, and speak truthfully about how much their extra effort means to you, the team, and the company. Make your thank-yous genuine, direct, and personal. Most individuals value physical tokens of appreciation in addition to expressed gratitude. If you choose to offer a gift, make it as personalized as you can. For example, an Amazon gift card is nice – but a cake from their favorite bakery is even nicer. Personalization means that you’ve thought about them as a person, taken the time to consider what they like, and recognized their contributions as an individual. Contrary to the common belief that we should be lavish with praise, I would argue that it’s better to be selective. Recognize behavior that lives up to your company’s values and reserve recognition for situations where it is genuinely deserved. If a leader showers praise when it’s not really warranted, they devalue the praise that is given when team members actually go above and beyond.


Top 5 AI Trends That Will Shape 2022 and Beyond

Under the umbrella of technology there are several terms you must already be familiar with, such as artificial intelligence, machine learning, deep learning, blockchain technology, cognitive technology, data processing, data science, and big data – the list is endless. Just imagine: how would we have survived the pandemic outbreak if there were no technology? What if there had been no laptops, PCs, tablets, smartphones, or gadgets of any sort during COVID-19? How would people have earned a living? What if there had been no Netflix to binge-watch and no social media applications during the coronavirus? Undoubtedly, that’s extremely intimidating and intriguing at the same time. Isn’t it giving you goosebumps to think how fast technology is advancing? Let’s flick through some jaw-dropping statistics first. Did you know that there are now more than 4.88 billion mobile phone users across the world? According to technology growth statistics, almost 62% of the world’s population owns a smartphone.


Introducing the Trivergence: Transformation driven by blockchain, AI and the IoT

Blockchain is the distributed ledger technology underpinning the cryptocurrency revolution. We call it the internet of value because people can use blockchain for much more than recording crypto transactions. Distributed ledgers can store, manage and exchange anything of value — money, securities, intellectual property, deeds and contracts, music, votes and our personal data — in a secure, private and peer-to-peer manner. We achieve trust not necessarily through intermediaries like banks, stock exchanges or credit card companies but through cryptography, mass collaboration and some clever code. In short, blockchain software aggregates transaction records into batches or “blocks” of data, then links and timestamps those blocks into a chain that provides an immutable record of transactions with any desired level of privacy or transparency. Each of these foundational technologies is uniquely and individually powerful. However, when viewed together, each is transformed. This is a classic case of the whole being greater than the sum of its parts.
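The block-linking idea described above can be sketched in a few lines. This is a deliberately minimal toy, not any real blockchain implementation: transactions are batched into a block, each block is timestamped and carries the hash of its predecessor, so altering any earlier block breaks every hash that follows it. All names are illustrative.

```python
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Batch transactions into a timestamped block linked to its predecessor."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    # Hash the block's canonical JSON form to fix its contents;
    # the hash is computed before being stored on the block itself.
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

# Build a tiny two-block chain.
genesis = make_block(["alice pays bob 5"], prev_hash="0" * 64)
second = make_block(["bob pays carol 2"], prev_hash=genesis["hash"])

# The chain is intact only while each prev_hash matches the block before it.
assert second["prev_hash"] == genesis["hash"]
```

Because each block's hash covers its predecessor's hash, rewriting an old transaction would change that block's hash and sever its link to every later block, which is the sense in which the ledger is immutable.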


Sustainability will be a key focus as the transport sector transitions in 2022

Delivery is also an area where we expect to see the movement towards e-fleets grow. We’ve already seen this being trialled, with parcel-delivery company DPD making the switch to a fully electric fleet in Oxford. It’s estimated that by replicating this in more cities, DPD could reduce CO2 emissions by 42,000 tonnes by 2025. While third-party delivery companies offer retailers an efficient service, carrying as many as 320 parcels a day, this model is challenged by customers’ growing expectation that they can receive deliveries within hours. Sparked by lockdowns, which led to a 48% increase in online shopping, the “rapid grocery delivery” trend looks set to grow in 2022. Grocery delivery company Getir, for example, built a fleet of almost 1,000 vehicles in 2021 to service this need – and is planning to spend £100m more to expand its offering. Given the current driver recruitment crisis affecting delivery and taxi firms, though, we are not expecting many other operators to invest that kind of money into building new fleets. Instead, you are more likely to see retailers working with existing fleets.



Quote for the day:

"Cream always rises to the top...so do good leaders." -- John Paul Warren