Daily Tech Digest - January 09, 2021

How to be cyber-resilient to head off cybersecurity disasters

Responsible parties in organizations should bite the bullet and choose security over convenience. For example, zero trust in digital communications means that anyone wanting to communicate with someone within the organization must be verified before any communication is allowed. This also applies to remote employees. "All users who request access to company resources, even those within the network, should be cleared based on variables such as the device used, project type, geographical location, and role," the authors note. "If anything is amiss, advanced verification has to be done." In addition, even with verification, user access should be limited using the least-privilege principle, in which users or processes are given only the privileges essential to perform the intended task. For example, there is no need to give a receptionist the privilege of installing software. In zero trust, those responsible for cybersecurity also need to worry about malicious domains. The authors explain, "To fully implement a zero-trust framework, security teams must perform domain-reputation assessments to prevent access to unreputable domains."
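The combination of contextual checks and least privilege can be sketched in a few lines. This is an illustrative toy policy, not a real product API; the roles, privileges, and trusted locations below are invented for the example.

```python
# Illustrative zero-trust access check: each request is evaluated on contextual
# variables (device, location, role); anything amiss triggers advanced
# verification, and granted privileges follow the least-privilege principle.

ROLE_PRIVILEGES = {
    "receptionist": {"read_calendar", "book_meeting"},
    "sysadmin": {"read_calendar", "install_software", "manage_users"},
}

TRUSTED_LOCATIONS = {"HQ", "branch-office"}

def evaluate_request(role, privilege, device_managed, location):
    """Return 'deny', 'step-up' (advanced verification required), or 'allow'."""
    # Least privilege: the role must explicitly hold the requested privilege.
    if privilege not in ROLE_PRIVILEGES.get(role, set()):
        return "deny"
    # Zero trust: an unmanaged device or unusual location requires extra verification.
    if not device_managed or location not in TRUSTED_LOCATIONS:
        return "step-up"
    return "allow"

print(evaluate_request("receptionist", "install_software", True, "HQ"))   # deny
print(evaluate_request("sysadmin", "install_software", True, "remote"))   # step-up
print(evaluate_request("sysadmin", "install_software", True, "HQ"))       # allow
```

Note how the receptionist's request to install software is denied outright, matching the article's example, while a verified administrator from an unknown location still has to pass advanced verification.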


2021 IT priorities require security considerations

AI's challenges include training the numerous deep learning algorithms that implement AI, the lack of labeled data for training and testing and, most importantly, issues with the explainability of what AI does and why. Organizations must have experts on hand who understand internal processes and data before they can use AI effectively. Furthermore, AI can observe phenomena in data that humans have difficulty comprehending. Therefore, humans cannot place 100% trust in the results and recommendations, especially for life-critical applications. The potential for cyber attacks to cause physical harm to people and damage to equipment is one of the greatest concerns. Examples include disrupting the power grid or supply chains, or internal attacks on the plethora of IoT devices used within companies. ... When executed mindfully, the cloud can provide a secure environment for organizations. Public cloud providers do an excellent job with security "of" the cloud, but it is up to organizations to manage security "in" the cloud. That is where a mindful security architecture and strategy come in, including ensuring that the core cloud architecture adheres to best practices. All major public cloud providers have established framework models to use.


The 2021 Crystal Ball for Emerging Tech

Asad Hussain, PitchBook’s lead mobility analyst, says battery electric growth won’t stop anytime soon—but he believes that 2021 will be “the year of the self-driving SPAC.” SPACs are an attractive option for the AV sector for the same reasons as the EV sector: Capital-intensive startups without much (if any) revenue typically need cash quickly, and SPACs provide that. ... Uber officially acquired Postmates earlier this month, DoorDash went public last week, and Instacart’s IPO could come as soon as Q1 2021. Virtually all of the space’s leaders have moved beyond solely food delivery and into areas like convenience and retail. That's led to an even hotter market for last-mile delivery tech: This year, electric vehicle startups Rivian and Arrival partnered with Amazon and UPS, respectively, on future fleets of electric delivery vans. Amazon and Walmart’s delivery drone battle entered a new phase. And shipping giants like FedEx are rolling out autonomous same-day delivery bots. ... In 2021, experts told us, we can expect demand for data engineers and others who can help integrate AI and ML tools into a business’s existing infrastructure. “Small- and medium-sized businesses alike need to bring on the right skilled professionals to help integrate the right tools and systems [for AI],” says Paylor.


Explain How Your Model Works Using Explainable AI

In the industry, you will often hear that business stakeholders tend to prefer more interpretable models such as linear models (linear/logistic regression) and trees, which are intuitive, easy to validate, and easy to explain to a non-expert in data science. In contrast, when we look at the complex structure of real-life data, interest in the model building and selection phase shifts mostly towards more advanced models, because that way we are more likely to obtain improved predictions. Models like these are called black-box models. As a model gets more advanced, it becomes harder to explain how it works. Inputs magically go into a box and voila! We get amazing results. ... What if our data is biased? That will make our model biased as well, and therefore untrustworthy. It is important to understand and be able to explain our models so that we can trust their predictions and maybe even detect issues and fix them before presenting the results to others. There are various techniques to improve the interpretability of our models, some of which we already know and use: traditional techniques such as exploratory data analysis, visualizations, and model evaluation metrics. With their help, we can get an idea of the model's strategy. However, they have some limitations.
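One simple, model-agnostic technique for peeking inside a black box is permutation importance: shuffle one feature at a time and measure how much the model's accuracy drops. The toy model and data below are invented purely for illustration.

```python
# Minimal sketch of permutation importance. The "model" here is a stand-in
# for any trained black-box model; the dataset is synthetic, with the label
# depending only on feature 0, so feature 1 should show ~zero importance.
import random

random.seed(0)

# Toy dataset: y depends only on feature 0.
X = [[random.random(), random.random()] for _ in range(200)]
y = [1 if row[0] > 0.5 else 0 for row in X]

def model_predict(row):
    return 1 if row[0] > 0.5 else 0

def accuracy(X, y):
    return sum(model_predict(r) == t for r, t in zip(X, y)) / len(y)

baseline = accuracy(X, y)
importances = {}
for feature in range(2):
    shuffled = [row[:] for row in X]          # copy rows
    column = [row[feature] for row in shuffled]
    random.shuffle(column)                    # destroy this feature's signal
    for row, v in zip(shuffled, column):
        row[feature] = v
    importances[feature] = baseline - accuracy(shuffled, y)
    print(f"feature {feature}: importance = {importances[feature]:.2f}")
```

Shuffling the feature the model actually relies on costs it roughly half its accuracy, while shuffling the irrelevant one costs nothing; this is the kind of signal traditional metrics alone cannot give you.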


How to Stay GDPR Compliant with Access Logs

Deleting user data from the database is easy. You have SQL for that. Deleting user PII from the log file is the tricky part. You might have different servers generating logs, and you might feed logs to different cloud services. This can complicate how you perform record deletion. ... You have one month to respond to a user's forget-me request. This actually means that you have one month to filter all user-related records out of your log files – for example, filter out user IP addresses. Or you can limit the log retention period to one month, so all older log entries are removed. This way you do not need to do anything besides a one-time configuration of the log retention period. ... PII found in the log events will be grouped together and encrypted. The initial setup includes a one-time generation of a log-entry password for each user. This password can, for example, be saved in the user profile stored in Databunker. As we need to know who the record owner is (to decrypt the record), we need to save the user ID together with the encrypted PII. So another level of encryption will be used with a generic password. For user-identified log events, PII will be encrypted twice: the first time, the data will be encrypted using the user's log-entry password.
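The two-layer scheme can be sketched as follows. Note the cipher here is a toy XOR keystream derived from SHA-256, used purely to make the layering visible; a real system must use a vetted cipher such as AES-GCM, and all names and passwords below are invented.

```python
# Toy sketch of the double-encryption idea: PII in a log event is first
# encrypted with the user's personal log-entry password, then the result,
# plus the user ID needed to locate the owner, is encrypted again with a
# generic service password. XOR keystream for illustration ONLY - not real crypto.
import hashlib

def keystream(password: str, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(f"{password}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def xor_crypt(data: bytes, password: str) -> bytes:
    # Symmetric: the same call encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, keystream(password, len(data))))

pii = b"ip=203.0.113.7 user=alice"
inner = xor_crypt(pii, "alice-log-entry-password")           # layer 1: per-user key
record = xor_crypt(b"uid=42|" + inner, "generic-password")   # layer 2: generic key

# Processing a forget-me request: peel layer 2 to find the owner, then the
# per-user layer can be decrypted or the user's key simply destroyed.
outer = xor_crypt(record, "generic-password")
assert outer.startswith(b"uid=42|")
assert xor_crypt(outer[len(b"uid=42|"):], "alice-log-entry-password") == pii
```

One appeal of the per-user key is crypto-shredding: deleting a user's log-entry password makes their PII in every archived log unrecoverable without rewriting the logs themselves.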


ThoughtSpot CEO - ‘I want to kill BI and I want all dashboards to die’

Nair argues that BI tools effectively decide what you want to see, which is counter to the idea of hyper-personalisation. ThoughtSpot is approaching this from a use case point of view. For example, Nair said that customer churn is an area where he believes the company can seriously 'move the needle' for its customers. He gave the example of a large bank, which is unlikely to win lots of new customers in a saturated market, and as such, pleasing and keeping its existing customers is key. In this use case, Nair said, take a bank customer who has a car loan but is now also looking for a new home loan. That same customer is annoyed with the bank, because they were charged interest on the car loan for making one payment a day late. This experience may put them off getting a home loan with the same bank, and if the bank is just using aggregate, historical data on all customers with car loans, it will not know the details of this unique customer. The problem is that just throwing more stuff at customers creates more noise, not signal, so you need to distil the personalised data that you have. If the bank could go back to that customer and say 'we messed up, we're sorry, here's the interest back, and by the way would you like a home loan?' - that's the bespoke experience, and where data matters.


Will Publicly-Backed Companies Finally Embrace Blockchain?

Worthy of note is the fact that blockchain is decentralized. It is not centrally controlled by any bank, government, or corporation; the system is owned and controlled by its participants. The more the network grows, the more decentralized it becomes, and the more decentralized it is, the safer the network. Many believe that this system of control – decentralization – is responsible for the attitude of governments and national central banks towards blockchain technology. Through blockchain networks, decentralized finance (DeFi) has become possible. DeFi aims to create an open-source, permissionless, and transparent financial service ecosystem that is available to everyone and operates without any central authority. But in spite of the massive growth potential it presents, decentralized finance still faces challenges such as stuck transactions, poor user experience, and impermanent loss, which may pose a limitation to its adoption in the long run. It might seem unfair to expect men and women, especially renowned investors, who have mastered the current system of transacting and have gone on to build wealth despite its frailties, to accept blockchain technology without question.


Malware Developers Refresh Their Attack Tools

The attack trends underscore that a multilayered approach to defenses is necessary to detect these attacks. While adversaries may manage to bypass one or more security measures, more potential points of detection will mean a greater chance of detecting intrusions before they become breaches. "Attackers will do what works," Unterbrink says. "If we prepared ourselves for a certain new bypass technique, they would just use a different one. It is more important to track, find, and detect new techniques used in the wild as soon as possible." In total, the LokiBot dropper uses three stages, each with a layer of encryption, to attempt to hide the eventual source of code. The LokiBot example shows that threat actors are adopting more complex infection chains and using more sophisticated techniques to install their code and compromise systems. Distributing malicious actions over a number of stages is a good way to hide, says Unterbrink. "Due to increased operating system security and endpoint and network protection, malware needs to distribute the malicious infection stages over different techniques," he says. "In some cases, multiple stages are also necessary because of a complex commercial malware distribution system used by the adversaries to sell their malware in the underground as a service."
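From an analyst's perspective, a multi-stage payload looks like nested reversible layers that must be peeled one by one. The sketch below uses three base64 layers as a harmless stand-in; the actual LokiBot stages use real encryption, and this is only an illustration of the layered structure, not of LokiBot itself.

```python
# Generic sketch of layered staging: a payload wrapped in several reversible
# layers (base64 standing in for encryption), and an analyst loop that peels
# layers until decoding fails, revealing the final stage.
import base64

payload = b"final-stage code"
wrapped = payload
for _ in range(3):                      # "attacker": wrap in three layers
    wrapped = base64.b64encode(wrapped)

stage, layers = wrapped, 0
while True:                             # "analyst": peel until decoding fails
    try:
        decoded = base64.b64decode(stage, validate=True)
    except Exception:
        break
    stage, layers = decoded, layers + 1

print(layers, stage)
```

Each extra layer is one more point where endpoint or network tooling can catch the decode-and-execute step, which is why multilayered detection pays off against multilayered hiding.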


Bot-As-A-Service: Present Is Great, Future Even Better

Over the years, messaging platforms have created immense potential for bots. Apart from carrying out primary chat services, chatbots' roles may soon diversify, and their usage may extend to personal assistance, entertainment, travel, news, advertising, and promotion. Intelligent chatbots will continue to grow in the coming years. Some of the trends that can be expected of BaaS are: Bots will be more open and universal, allowing users to instantaneously find and chat with a company's bot regardless of which messaging platform is being used. Bots will become more accessible, with a minimum complexity factor; this means that even non-developers will be able to build and operate a bot. Bots will become language-agnostic: currently, most bots use English as the medium for query solving, but with advances in NLP technology this is expected to extend to a larger pool of languages. One step towards making these bots 'universal' would be to develop a generalised framework that allows anyone to operate a bot. Intertwined with better sentiment analysis capabilities, chatbots can be trained to be more human-like. Apart from providing an effective response, chatbots in future will be able to deliver a delightful customer experience by responding to customer emotions accurately.


How to implement mindful information security practices

Employees are change-averse even if, ultimately, the change helps them. "People default to what is simple and what they know," write Kahn and Beckmann. "Therefore, open dialogue is critical. It must be clear, consistent, and anchored to a 'why' that resonates with employees and makes their life better (not just simpler, but better)." Making an employee's life better is the key to eliminating the "but this is how we have always done it" response and to having employees become mindful stewards of the organization's information, which in turn builds a culture of awareness. Achieving a mindful information culture: For the mindful information culture to move past short-term enthusiasm, Kahn and Beckmann suggest that, just as muscle memory automates physical movements, implementing repeatable and logical processes and directives will eventually become automatic. "A mature information culture is a state of being, like a never-ending marathon," contend Kahn and Beckmann. "Culture is not a 'sometimes thing,' it is an 'all the time thing.' Building a mindful information culture can be achieved only by implementing a persistent, evolving cycle of assessing, planning, implementing, communicating, monitoring, resolving, and repeating."



Quote for the day:

"Leadership is a matter of having people look at you and gain confidence, seeing how you react. If you're in control, they're in control." -- Tom Landry

Daily Tech Digest - January 08, 2021

Facial recognition: Now algorithms can see through face masks

This year, in response to the new imperatives brought by the COVID-19 pandemic, the rally has focused on evaluating the ability of AI systems to reliably collect and match images of individuals wearing an array of different face masks, with a view to eventually deploying the technology in international airports around the country. ... The results, however, varied greatly from one system to another: for example, the best-performing technology correctly identified individuals 96% of the time, even when they were wearing a mask. The worst-performing system tested during the rally, for its part, identified only 4% of masked individuals. "This isn't a perfect 100% solution," said Arun Vemury, director of S&T's Biometric and Identity Technology Center, "but it may reduce risks for many travelers, as well as the frontline staff working in airports, who no longer have to ask all travelers to remove masks."  Facial recognition is currently used in a select number of US airports as part of a program called Simplified Arrival, which is deployed by Customs and Border Protection (CBP). Under Simplified Arrival, the identity of international travelers who enter and exit the country can be verified at inspection points in the airport by the snap of a picture, rather than having to present a travel document.


How to make sure the switch to multicloud pays off

The first thing you need to think about before adopting the multicloud approach is whether you are actually ready for it. There are a number of things you need to have in place. For example, one non-negotiable element of your IT team is a DevOps culture. By being committed to agile processes and cross-team collaboration, you can make sure that you’re able to continuously make any necessary changes or updates to your product while the transition is underway. Not to mention, having a DevOps culture will enable teams to quickly adopt cutting-edge technologies made available by multicloud, like Spinnaker or Kubernetes. Next, you need to understand how to achieve high availability, resilience, and zero downtime strategies within your existing architecture. In addition, any legacy architecture will need to be modernized before launching a multicloud strategy. This will allow you to make use of modern cloud features like microservices and containerization, as well as achieve interoperability between clouds. For instance, applications that need to be split into multiple parts to run in separate clouds must be modernized, as legacy architectures would be unlikely to enable this.


AI Council advises government to do artificial intelligence moonshots

The roadmap document is partly based on 450 responses to a call, in October 2019, for input from what is described as an AI "ecosystem" of individuals interested in artificial intelligence. The introduction states "we need to 'double down' on recent investment the UK has made in AI [and] we must look to the horizon and be adaptable to disruption". It says the council stands ready "to convene workshops with the wider ecosystem to capture more detail and work together to ensure that a future National AI Strategy enables the whole of the UK to flourish". The Alan Turing Institute has a central place in the document. The council advises the government to "provide assured long-term public sector funding that will give the Turing Institute and others the confidence to plan and invest in strategic leadership for the UK in AI research, development and innovation". On the skills front, the council advocates a decade-long programme of "research fellowships, AI-relevant PhDs across disciplines, industry-led masters and level 7 apprenticeships". And it suggests tracking diversity data to "invest and ensure that underrepresented groups are given equal opportunity and included in all programmes".


Using Microsoft 365 Advanced Audit and Advanced eDiscovery to minimize impact

The Microsoft 365 Advanced Audit solution makes a range of data available that is focused on what will be useful to respond to crucial events and forensic investigations. It retains this data for one year (rather than the standard 90-day retention), with an option to extend the retention to ten years. This keeps the audit logs available to long-running investigations and to respond to regulatory and legal obligations. These crucial events can help you investigate possible breaches and determine the scope of compromise. ... In an account takeover, an attacker uses a compromised user account to gain access and operate as a user. The attacker may or may not have intended to access the user’s email. If they intend to access the user’s email, they may or may not have had the chance to do so. This is especially true if the defense in-depth and situational awareness discussed above is in place. The attack may have been detected, password changed, account locked, and more. If the user’s email has confidential information of customers or other stakeholders, we need to know if this email was accessed. We need to separate legitimate access by the mailbox owner during the account takeover from access by the attacker.
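Separating the owner's legitimate access from the attacker's boils down to filtering mailbox-access audit events by time window and access context. The sketch below uses simplified, invented record shapes and field names; real Microsoft 365 unified audit log entries (such as the Advanced Audit "MailItemsAccessed" operation) carry richer schemas, but the triage logic is the same.

```python
# Illustrative triage: given a known takeover window and the IPs the mailbox
# owner normally uses, flag mailbox-access events that look like the attacker.
# Record shape and values are invented stand-ins for real audit log entries.
from datetime import datetime

takeover_start = datetime(2021, 1, 3, 9, 0)
takeover_end = datetime(2021, 1, 3, 17, 0)
owner_known_ips = {"198.51.100.10"}   # IPs the owner normally signs in from

events = [
    {"time": datetime(2021, 1, 3, 8, 30),  "op": "MailItemsAccessed", "ip": "198.51.100.10"},
    {"time": datetime(2021, 1, 3, 10, 15), "op": "MailItemsAccessed", "ip": "203.0.113.9"},
    {"time": datetime(2021, 1, 3, 11, 0),  "op": "MailItemsAccessed", "ip": "198.51.100.10"},
]

suspect = [
    e for e in events
    if e["op"] == "MailItemsAccessed"
    and takeover_start <= e["time"] <= takeover_end
    and e["ip"] not in owner_known_ips
]
print(len(suspect), suspect[0]["ip"])
```

In practice the one-year (extendable to ten-year) retention matters precisely here: without the raw access events still on hand, there is no way to answer whether confidential email was actually read.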


5 New Year's Resolutions To Improve How Organizations Work With Data in 2021

To ensure successful data democratization and extract the maximum value from an organization’s investment in data and analytics, data literacy should no longer be ignored. We wouldn’t let people drive cars without passing a test. So, let’s exercise some caution to ensure employees have the necessary training and understanding of data, analysis, and foundational statistical knowledge before reaching conclusions from their data. Building data literacy within an organization will require resources and a structure for ongoing training and development. Upskilling employees and ensuring their knowledge is current should be at the top of the agenda if businesses want to remain competitive. This is critical, especially when you want to use an employee’s analysis and the resulting insights as the basis for making business decisions. ... We often read and hear that artificial intelligence (AI) and machine learning will deliver significant advances in automation and replace jobs in many industries. And while this is certainly a possibility, there are still humans behind the algorithms. And humans carry biases – we all do – so there’s a chance that biases are introduced into the algorithms we are exposed to on a daily basis.


Artificial intelligence accelerated by light

With the rise of AI, conventional electronic computing approaches are gradually reaching their performance limits and lagging behind the rapid growth of data available for processing. Among the various types of AI, artificial neural networks are widely used for AI tasks because of their excellent performance. These networks perform complex mathematical operations using many layers of interconnected artificial neurons. The fundamental operation that uses most of the computational resources is called matrix–vector multiplication. Various efforts have been made to design and implement specific electronic computing systems to accelerate processing in artificial neural networks. In particular, considerable success has been achieved using custom chips known as application-specific integrated circuits, brain-inspired computing and in-memory computing, whereby processing is performed in situ with an array of memory devices called memristors. Electrons are the carriers of information in electronic computing, but photons have long been considered an alternative option. Because the spectrum of light covers a wide range of wavelengths, photons of many different wavelengths can be multiplexed (transmitted in parallel) and modulated (altered in such a way that they can carry information) simultaneously without the optical signals interfering with each other.
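The operation that photonic accelerators target is easy to state concretely: every layer of an artificial neural network computes a weighted sum of its inputs for each neuron, i.e. a matrix–vector product. A minimal sketch in plain Python:

```python
# Matrix-vector multiplication, the operation that dominates neural-network
# compute: each output element is one neuron's weighted sum of all inputs.
def matvec(W, x):
    return [sum(w * v for w, v in zip(row, x)) for row in W]

W = [[0.5, -1.0, 2.0],   # 2 neurons, 3 inputs each
     [1.0,  0.0, 0.5]]
x = [2.0, 1.0, 4.0]

print(matvec(W, x))   # [8.0, 4.0]
```

Each output element requires as many multiply-accumulate steps as there are inputs, which is why performing all of these products in parallel on multiplexed wavelengths of light is such an attractive shortcut.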


Five real world AI and machine learning trends that will make an impact in 2021

Computer vision trains computers to interpret and understand the visual world. Using deep learning models, machines can accurately identify objects in videos, or images in documents, and react to what they see. The practice is already having a big impact on industries like transportation, healthcare, banking and manufacturing. For example, a camera in a self-driving car can identify objects in front of the car, such as stop signs, traffic signals or pedestrians, and react accordingly, said Jung. Computer vision has also been used to analyze scans to determine whether tumors are cancerous or benign, avoiding the need for a biopsy. In banking, computer vision can be used to spot counterfeit bills or for processing document images, rapidly robotizing cumbersome manual processes. In manufacturing, it can improve defect detection rates by up to 90 per cent. And it is even helping to save lives: cameras monitor and analyze power lines to enable early detection of wildfires. At the core of machine learning is the idea that computers are not simply trained based on a static set of rules but can learn to adapt to changing circumstances. "It's similar to the way you learn from your own successes and failures," said Jung. "Business is going to be moving more and more in this direction."


DevOps: Watch Out for These 5 Common Snags

Traditionalists often cling to waterfall methodology, which has long been favored in enterprise environments for its rigorous requirements of capture, documentation and governance. While there are times when waterfall may be appropriate, such as instances where customers want to see a clear product roadmap over a set time period, this is rarely the way the world works today. Upstarts are disrupting traditional business models at breakneck speed, with innovative, cutting-edge software applications being rolled out quickly. If an organization is to compete in this climate, it cannot afford the time spent using waterfall to manage and implement DevOps methods and features. That’s like trying to learn to speed row on a frozen lake. We believe that using agile and DevOps practices will help you transition to a faster and higher quality software delivery organization. The faster you can deliver new capabilities and features, the more competitive you’ll be. So, it’s best not to waste time using waterfall to implement DevOps if your ultimate goal is to produce software products that delight customers, ahead of your competition. The goal should always be progress, not perfection. There are many features and capabilities you can implement that will yield positive benefits.


Are No Code and Low Code Answers to the Dev Talent Gap?

The use of no-code and low-code platforms might give organizations ways to finally catch up on the talent gap that threatens to stall growth, says Katherine Kostereva, CEO and managing partner of low-code platform provider Creatio. She says there are almost 1 million IT jobs that remain unfilled in the UK alone. "The demand for IT staff is going to grow," Kostereva says. "The only way out is to get technology into the hands of employees and power users, and that is exactly what low-code is doing." Giving people who primarily come from the business operations side access to these platforms can help narrow the talent demand and address a common point of discord in many organizations. Kostereva says there is a continued misalignment where business teams have their own ideas on how interfaces and business processes should work, while IT teams must contend with limitations on resources and growing backlogs of change requests. The emerging market for low code, she says, can help business professionals take on more developer duties to a certain extent. This may be an inevitable trend as more organizations explore ways to use no-code and low-code platforms.


The nation state threat to business

As the threat grows, it’s important to take action to prevent state sponsored cyber-attacks. For some companies, surviving the impact of this type of cyber-assault simply isn’t possible, says Amanda Finch, Chartered Institute of Information Security CEO. This is partly because fines that come in the wake of an attack can be “crippling”, she warns, adding: “The incident can lead to a loss of confidence from investors and stakeholders. Being cut off from financial resources can stall a company into inactivity, and even cause a collapse.” To protect themselves, organisations need to construct threat models to drive their cyber threat intelligence (CTI) collection plan, says Thornton-Trump. At the same time, Thornton-Trump says, a firm’s CTI team should be equipped to analyse threat actor activity against the organisation’s security controls. “Knowing what a threat actor may use to target the organisation and applying that knowledge can provide a massive defensive advantage.” He explains how the ultimate goal of a CTI program is to understand key mistakes, exploits or unfortunate circumstances that have occurred in the past. “This information can be used to prevent similar attacks on the organisation.”



Quote for the day:

"What good is an idea if it remains an idea? Try. Experiment. Iterate. Fail. Try again. Change the world." -- Simon Sinek

Daily Tech Digest - January 07, 2021

How to deploy 802.1x for Wi-Fi using WPA3-Enterprise

The enterprise mode of WPA has always allowed you to give each user a unique username/password to login to the Wi-Fi or to utilize unique digital certificates for each user to install on devices for even more security. Now with WPA3-Enterprise, the security is increased as clients are now required to make sure it’s communicating with the real authentication server before sending login credentials. That verification was optional with the earlier two versions of WPA. ... The difficulty of setting up a RADIUS server varies based on what solution you choose, and it’s usually streamlined if using a wireless controller or APs. If using an external server, you usually have to enter the IP address of the wireless controller or each AP and specify a shared secret that you later input in the controller settings or each AP. For traditional RADIUS servers, these are usually entered in the Network Access Server (NAS) list. On the RADIUS server you also have to configure user credentials either with usernames and passwords in a local database or external database/directory, or by generating digital certificates that you later install on devices. Some RADIUS servers support optional attributes you can apply to individual users or groups of users that become part of the policy applied to individual clients.
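For a concrete sense of the shared-secret step described above: with FreeRADIUS, for instance, each AP or controller is registered as a client in clients.conf, and the same secret is then entered in the AP or controller settings. The names, address, and secret below are placeholders, not a recommended configuration.

```
# clients.conf (FreeRADIUS) -- one client block per AP or wireless controller
client office-ap-1 {
    ipaddr = 192.0.2.10              # the AP's IP address
    secret = replace-with-long-random-secret
}
```

Commercial RADIUS products expose the same idea through their NAS list: pair each access device's IP with a shared secret, then configure users either locally or against an external directory.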


Why IoT has failed to take off and the impact for the world when it does

An IoT explosion will create previously unthinkable, disruptive business models. The first enabler for this is the ability to turn non-connected ‘dumb products’, such as running machines, lawnmowers and hairdryers, into connected ‘smart experiences’ which result in big data goldmines for their manufacturers. As more of these products are launched, more pressure will be exerted on competitors to follow suit and this will, in my view, result in accelerated adoption of IoT. Think of it as a form of accelerated Darwinism for product design. It’s the survival of the fittest. Businesses that don’t adapt and evolve will succumb to natural selection driven by the consumer. We will see a rapid rise in innovation, the likes of which we haven’t seen in decades. To give you an example of how this is already beginning to happen, within retail we’ve seen Costa Coffee disrupting the coffee industry with its Costa Express machines. Costa has essentially taken the coffee shop experience and distilled it into a machine that delivers a highly personalised coffee retailing experience akin to, or arguably better than, a traditional coffee shop. So much so that they market it as ‘a Barista without a Beard’. 


The rise of developer-led culture and how it can benefit your business

In a developer-led culture, companies recognize developers as innovators instrumental to solving some of the world's most complex problems. While in the past, only a small number of companies were able to succeed with a developer-led culture, we'll see the dramatic rise of this mindset permeate throughout the enterprise as business leaders and developers take advantage of new tools that make it easier for developers to drive innovation and that enable more technical collaboration between business and IT. Not only will far more developers have a seat at the table, they'll be the key drivers of the next wave of business innovation. ... A developer-led culture is best when it's rooted in collaboration between business leaders and IT. Today, despite massive amounts of money and effort, application development struggles to be effective and still runs up against the same roadblocks it did 20 years ago. Painfully few companies (e.g. Google, MSFT, Apple, Facebook, etc.) have the ability to take advantage of the full stack of traditional development by locking in the best engineering talent in the world and building massive teams. Everyone else has been left out—understaffed and stuck with complex, unapproachable technology that keeps them from using applications to their advantage. But, to be able to compete, companies need differentiated software, and to be able to build it themselves so that it adapts to always-evolving needs.


Strategic Agility at Scale: Applying Agile Across Your Organization

The Disciplined Agile (DA) tool kit is more sophisticated than the agile software development frameworks you may be familiar with. With DA we choose to address the actual challenge that you face, not just part of the challenge. As a result, DA distinguishes between four process layers, as you can see in Figure 1. Each layer is organized into several process blades, each of which describes in detail a process area or capability within your organization. The DA layers are:  The Foundation layer provides the conceptual underpinnings of the DA tool kit. This includes the DA mindset; foundational concepts from agile, lean, and serial/traditional ways of working (WoW); people-oriented issues such as roles, responsibilities, and teaming structures; and of course how to choose your WoW.  ... Disciplined DevOps is the streamlining of IT solution development and IT operations activities, and supporting enterprise-IT activities, to provide more effective outcomes to your organization. ... The value streams layer encompasses the capabilities required to provide value streams to your customers. A value stream is the set of actions that take place to add value to a customer from the initial request through realization of value by the customer.


A Covid-19 response to supply chain fragility

At the very beginning of the crisis, as country after country went into lockdown, we saw the supply chain start to dry up and major retailers and delivery services admitting that stock availability was limited or had run out completely – an almost unprecedented situation in peacetime. Even now, as national lockdowns are – in the main – giving way to more localised control methods, it is still difficult for businesses to predict demand effectively and control stock – especially in the run-up to Christmas, traditionally the busiest time for retail. Coupled with this, as a response to the pandemic, many businesses have switched to a truly multi-channel approach in order to survive and thrive. However, running a successful omnichannel strategy requires a change in mindset for most businesses – as well as being underpinned by the right technologies. Disparate teams must gain an understanding of each other’s value propositions and strategies, which can allow companies to reduce operational costs, improve productivity and boost efficiency, as well as being able to predict demand and control stock more effectively. Of course, nurturing any inter-departmental collaboration is easier said than done. It is challenging to integrate different domains. 


The real deal: How edge and IoT technologies boost business

Utilities are using edge and IoT technologies to improve customer service. Hoping to enhance customer loyalty by delivering better service, one energy production company in Italy looked for ways to increase the uptime of its equipment. “Instead of calling service technicians to deal with outages, this utility decided to prevent them,” says Wallis. “Using edge and IoT services from SAP, the company began collecting data at the edge, which is wherever assets run.” Then the company went one step further: by using predictive maintenance software, the utility scores the health of its assets. “An asset can be more or less healthy,” she states. “We provide that information right on the shop floor and also on dashboards, which users can access remotely.” In fact, no human workers need to be on the shop floor to get insight into the asset health information. But as soon as a problem is spotted, technicians can perform preventative maintenance – avoiding downtime and creating happier customers. “Real-world IoT information about the asset, the product, the worker, and the shop floor matters,” she continues. “Otherwise you’re just ‘guess-timating.’ And all of your planning is better when it’s married to real-world data like IoT data. Then you can constantly baseline yourself against what is happening in reality.”
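The asset scoring Wallis describes can be sketched as a simple rule: normalize a few sensor readings against their limits and combine them into a 0-100 health score. This is a hedged illustration only; the sensor names, limits, and equal weighting are assumptions, not SAP's actual predictive maintenance model.

```python
# Illustrative asset-health scoring: normalize sensor readings against
# their limits and average them into a 0-100 score (lower = less healthy).

def health_score(temperature_c, vibration_mm_s, hours_since_service,
                 temp_limit=90.0, vib_limit=8.0, service_interval=2000.0):
    """Return a 0-100 health score for an asset."""
    # Each factor contributes its fraction of the allowed limit, capped at 1.0.
    factors = [
        min(temperature_c / temp_limit, 1.0),
        min(vibration_mm_s / vib_limit, 1.0),
        min(hours_since_service / service_interval, 1.0),
    ]
    degradation = sum(factors) / len(factors)
    return round(100.0 * (1.0 - degradation), 1)

def needs_maintenance(score, threshold=40.0):
    """Flag an asset for preventative maintenance when its score drops."""
    return score < threshold
```

A dashboard could then surface `health_score(...)` per asset and trigger a technician dispatch when `needs_maintenance` turns true, which is the "prevent outages rather than react to them" pattern the article describes.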


Disaster recovery lessons from an island struck by a hurricane

The first lesson from this disaster is one of the most profound: as important as backup and recovery systems are, they might not pose the most difficult challenges in a disaster recovery. Getting a place to recover and a network to use can prove much more difficult. Mind you, this is not a reason to slack off on your backup design. If anything, it’s a reason to make sure that at least the backups work when nothing else does. Local accounts that don’t rely on Active Directory would be a good start. Services such as Active Directory that are necessary to start a recovery should have at least a locally cached copy of the service that works without an Internet connection. A completely separate instance of such a service would be much more resilient. Rehearse large-scale recoveries as best as you can, and also make sure you are aware of how to do them without a GUI. Being able to log in to the servers via SSH and run restores on the command line is more power-efficient and flexible. As foreign as that seems to many people, a command-line recovery is often the only way to move forward. On Atlantis, electric service was at a premium, so using it to power monitors wasn’t really an option.


Healthcare Organizations Bear the Brunt of Cyberattacks Amid Pandemic

The shift to a remote work model for a lot of non-healthcare professionals, including IT and security personnel, also likely disrupted certain IT and security programs and operations, leaving organizations more vulnerable. The situation was likely exacerbated by the fact that the healthcare industry traditionally has lagged behind many other industries in IT. Zscaler's Desai says healthcare organizations often lack security controls that others have deployed and are often vulnerable to known issues. Prolonged FDA approvals also can hinder the adoption of more secure technology, making it harder for healthcare entities to implement new security controls. "For example, security in the healthcare sector is often hindered by legacy technology, with updates often delayed by prolonged FDA approvals," Desai says. They also face the challenge of preserving compliance with the security and privacy provisions of HIPAA while looking to migrate to potentially more secure channels of operation, he says. "Without unified controls and centralized visibility and policy enforcement, the healthcare industry will continue to face gaps in their security controls that will always draw the attention of cybercriminals," Desai notes.


Data Architecture with Data Governance: A Proactive Approach

Key features of an effective Data Architecture include a Data Strategy that is in alignment with business drivers, targets essential data, delineates clear activities and milestones, and is flexible enough to evolve with the business needs and the technology available. Most importantly, architecture must be manageable. “You can never sort out all your data everywhere. You need to focus on the things that really make a difference.” ... Turner outlined a simple path to a Data Strategy. Start with the Business Strategy and determine what data is critical to supporting that strategy. Evaluate the data you have and decide if it’s up to the task, and if it isn’t, decide what is needed to improve it. Turner pointed out that improvements may need to come from the business side, rather than exclusively from IT. For example, if every department uses a different code or term to indicate “customer,” “Then that obviously would influence the business strategy, which might need to change in order to accommodate that barrier.” ... The volumes of data that companies and organizations are handling have increased phenomenally in the last ten years. Ninety percent of all the data stored today was created in the last two years.
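The "different code for customer" barrier Turner mentions is often handled by mapping department-specific field names onto one canonical term before the data feeds downstream work. A minimal sketch, with made-up department codes:

```python
# Sketch: reconcile department-specific "customer" field names to one
# canonical key. The mappings below are illustrative examples only.

CANONICAL = {
    "cust": "customer",   # e.g. sales systems
    "clnt": "customer",   # e.g. support calls them clients
    "acct": "customer",   # e.g. billing tracks accounts
}

def normalize_record(record):
    """Rename any department-specific key to its canonical name."""
    return {CANONICAL.get(key, key): value for key, value in record.items()}

rows = [{"cust": "C-001", "amount": 10},
        {"clnt": "C-001", "ticket": 7}]
normalized = [normalize_record(r) for r in rows]
```

With a shared key in place, records from different departments can finally be joined and compared, which is the precondition for the kind of cross-business Data Strategy the article describes.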


The Convergence of Infrastructure and Security

Converging infrastructure and security allows an organization to put security anywhere on any edge -- the WAN and Remote Worker Edge (using things like SD-WAN and SASE), the cloud edge (using proxies), or the datacenter or LAN edge (through secured WiFi and ethernet controllers). This allows security to function as a fully integrated element of the network, and the integration of deployment, management, configuration, and orchestration ensures that all elements work together seamlessly across the entire network as a single framework. The convergence enabled by a security-driven networking strategy will be especially critical as new smart edge solutions are adopted. A smart edge is a collection of endpoint devices connected using a cloud-native, highly scalable, and secure virtual platform that enables Software-as-a-Service (SaaS) applications to be deployed in or as close to the network edge as possible. It relies on things like 5G to ensure high performance and reliable connectivity. With a smart edge network in place, enterprises and communications service providers can enable cloud-like services closer to the user, whether on the customer premises or at the network edge. But it absolutely depends on having a fully converged security and networking solution.



Quote for the day:

"Growth and change may be painful sometimes, but nothing in life is as painful as staying stuck where you don't belong" -- Daniel Goddard

Daily Tech Digest - January 06, 2021

Making CI/CD Work for DevOps Teams

The most fundamental people-related issue is having a culture that enables CI/CD success. "The success of CI/CD [at] HealthJoy depends on cultivating a culture where CI/CD is not just a collection of tools and technologies for DevOps engineers but a set of principles and practices that are fully embraced by everyone in engineering to continually improve delivery throughput and operational stability," said HealthJoy's Dam. At HealthJoy, the integration of CI/CD throughout the SDLC requires the rest of engineering to closely collaborate with DevOps engineers to continually transform the build, testing, deployment and monitoring activities into a repeatable set of CI/CD process steps. For example, they've shifted quality controls left and automated the process using DevOps principles, practices and tools. Component provider Infragistics changed its hiring approach. Specifically, instead of hiring experts in one area, the company now looks for people with skill sets that meld well with the team. "All of a sudden, you've got HR involved and marketing involved because if we don't include marketing in every aspect of software delivery, how are they going to know what to market?" said Jason Beres, SVP of developer tools at Infragistics.


How DNS Attack Dynamics Evolved During the Pandemic

The complexity of the DNS threat landscape has grown in the wake of COVID. According to Neustar’s “Online Traffic and Cyber Attacks During COVID-19” report, there was a dramatic escalation of the number of attacks and their severity across virtually every measurable metric from March to mid-May 2020 – particularly DNS-related attacks. That’s not surprising given the sharp rise in DNS queries from employees working from home. Whereas business networks tend to be relatively secure and protected by experienced security professionals, home routers are set up by employees with little security expertise, and are therefore more vulnerable to DNS exploits. Hackers are taking advantage of this vulnerability using a technique called DNS hijacking. They gain access to unsecured home routers and change the devices’ DNS settings. Users are then redirected to malicious sites and unwittingly give away sensitive information like credentials, or permit attackers to remotely access their company’s infrastructure. Neustar has seen a dramatic rise in this type of attack since the onset of the pandemic. Given that many home networks remain exposed, this problematic trend is poised to continue well into 2021. Similar, simpler techniques are also becoming more prevalent.
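One simple detection heuristic for the hijacking described above is to check whether the addresses a resolver returns fall inside the service's published IP ranges. A hedged sketch using only the standard library; the networks below are documentation-reserved example ranges, not any real provider's:

```python
# Sketch: flag a possible DNS hijack when a resolved address falls
# outside every trusted network for the service being looked up.

import ipaddress

# Illustrative allowlist (RFC 5737 documentation ranges, not real infra).
TRUSTED_NETS = [ipaddress.ip_network("203.0.113.0/24"),
                ipaddress.ip_network("198.51.100.0/24")]

def looks_hijacked(resolved_ips):
    """Return True if any resolved address is outside all trusted networks."""
    for ip in map(ipaddress.ip_address, resolved_ips):
        if not any(ip in net for net in TRUSTED_NETS):
            return True
    return False
```

In practice the resolved addresses would come from querying the (possibly tampered) local resolver, and a mismatch against the allowlist would prompt a closer look at the router's DNS settings.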


Top 12 IoT App Trends to Expect in 2021

Automation requirements are everywhere, and IoT is catering to them across industries. In industry, IoT has mainly been used to collect and analyze data and work routines from various devices and systems, and to automate their operation. Initially, the role of the technology was limited to increasing overall work efficiency and operations management through rationalization, automation, and applicable system maintenance in the manufacturing sector, mainly within smart factory environments. Going forward, the industrial vertical of IoT alone is touted to cross $123 billion. The technology is set to help industries optimize their work procedures, enable intelligent manufacturing and smart industry, manage asset performance, handle industrial control, and move towards an on-demand service model, among other things, even for cross-industry scenarios in the coming times. It is also set to revamp the ways industries provide services to customers and create new revenue models, and it has been actively promoting and enhancing industrial digital transformation.


‘The dawn of ‘Fintech 3.0’? ‘

“What we’re seeing is ecommerce moving up and down the value chain,” says Brear. “I don’t really know which one of the three credit cards I have is linked to Amazon. But I know, when I press that Amazon button, all of the fulfilment is done really well. Amazon is moving down that stack into the financial services space, and giving me three-to-four per cent cashback. Why would I not do that? “Universal banking as a principle was predicated on cross- and upselling, where banks were relying on the primacy of their customer relationship, and selling them 2.3 or 2.4 products, on average, to make the system work, from a profitability perspective. But, we’re now seeing that customer ‘ownership’ being unbundled and shared between other providers, whether Amazon or players like Snoop. They’re provoking customers into moving, and making it really easy for them to do so. “That’s the really scary thing. We’ve seen this play out in other industries – mobile network operators are a great example, because the consumer doesn’t care what that logo in the corner of the iPhone is now, they just care that it’s an iPhone. The networks have commoditised themselves into providing them with data and coverage, which every one of them does, so it doesn’t really matter [who they go with].


Why you should make cyber risk a business gain, not a loss

In a progressive approach to risk, compliance specialists come together with IT security and operations to improve posture and compliance across the organization. In theory, that means gathering and analyzing data on the regulatory environment, security and privacy, and configuration management at one time. Only through that deep level of operational alignment can true technology risk management take place. To do that effectively, we have to start by thinking of risk as something to gain, not to lose. In this view, risk becomes a window through which organizations can assess their health as it relates to operations, security and regulatory status—a view of the organization over time. ...  Many IT teams start their risk assessments by making decisions based on data from multiple products and discrete tasks. Unfortunately, this can result in a time-consuming process of reconciling these systems. ... Once data is gathered, it’s analyzed and categorized into various risk categories. Ideally, this is done continuously, not as a once-a-year effort. Infrequent assessments will fail to provide a clear and current picture of the organization’s risk posture. ... Once analysis is signed off, organizations should be well positioned to recommend or perform remediation actions to mitigate their risks.
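The gather-analyze-categorize loop described above can be sketched as a small aggregation: fold findings from multiple tools into risk categories with weighted scores, so the snapshot can be recomputed continuously rather than once a year. The category names and severity weights here are illustrative assumptions.

```python
# Sketch: aggregate findings into weighted per-category risk scores.
from collections import defaultdict

# Illustrative severity weights, not a standard.
WEIGHT = {"low": 1, "medium": 3, "high": 7, "critical": 10}

def categorize(findings):
    """findings: iterable of (category, severity) pairs.
    Returns a {category: total weighted score} snapshot."""
    scores = defaultdict(int)
    for category, severity in findings:
        scores[category] += WEIGHT[severity]
    return dict(scores)

# One pass over findings pulled from several products at once.
snapshot = categorize([
    ("configuration", "high"),
    ("configuration", "low"),
    ("privacy", "critical"),
])
```

Rerunning `categorize` on a fresh feed of findings gives the "view of the organization over time" the article calls for, instead of a stale annual assessment.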


What is a DataOps Engineer?

DataOps engineers’ holistic approach to the data development environment separates them from other technical team members. At CHOP, data engineers mostly work on ETL tasks while analysts serve on subject matter teams within the hospital. Mirizio, on the other hand, works on building infrastructure for data development. Some of his major projects have included building a metric platform to standardize calculations, creating an adaptor that allows data engineers to layer tests on top of their pipelines, and crafting a GitHub-integrated metadata catalogue to track document sources. On a day-to-day basis, he provides data engineers with guidance and design support around workflows and pipelines, conducts code reviews through GitHub, and helps select the tools the team will use. Prior to the creation of his position, CHOP’s data team relied on human beings to manually check Excel spreadsheets to ensure everything looked okay, engineers emailed proposed changes to code and metadata back and forth, and the lack of shared definitions meant different pipelines delivered conflicting data. Now, thanks to Mirizio, much of that process is automated and tools like Jira, GitHub, and Airflow help the team maintain continuous, high-quality integration and development.
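The "adaptor that allows data engineers to layer tests on top of their pipelines" can be sketched as a simple wrapper: each pipeline step is decorated with named checks that must pass before its output flows downstream. This is a hypothetical illustration of the pattern, not CHOP's actual tooling.

```python
# Sketch: wrap a pipeline step so declared checks validate its output.

def with_checks(step, checks):
    """Return a wrapped step that raises if any (name, check) fails."""
    def wrapped(data):
        result = step(data)
        for name, check in checks:
            if not check(result):
                raise ValueError(f"pipeline check failed: {name}")
        return result
    return wrapped

def dedupe(rows):
    """A toy pipeline step: drop duplicates and sort."""
    return sorted(set(rows))

safe_dedupe = with_checks(dedupe, [
    ("non_empty", lambda rows: len(rows) > 0),
    ("sorted", lambda rows: rows == sorted(rows)),
])
```

The payoff is exactly what the article describes: instead of humans eyeballing spreadsheets, a failing check halts the pipeline automatically and names the broken assumption.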


Unlocking Your DevOps Automation Mindset

Today, enterprises are shifting from waterfall to agile weekly and daily releases. My belief is that every enterprise needs to adopt a 100% agile methodology, just like BMW did. Testing and continuous integration/continuous delivery (CI/CD) are key for deploying code in small chunks and reducing merge issues and refactoring effort. Ultimately, this increases developer velocity and decreases lead time. The shift from a partial to a 100% agile model requires more than simply senior leadership’s resolve. It needs a dedicated pool of certified DevOps automation consultants, coaches and subject matter experts with experience in SAFe, LeSS, Scrum and Kanban frameworks. Best-in-class enterprises and OSS toolchains that cater to DevSecOps, service meshes and omnichannel apps are essential. Simultaneously, agile-based delivery coaching, audits and continuous support for existing and new delivery teams are a must. While DORA metrics can serve as a good measure of an enterprise’s DevOps performance, businesses will need tools to assess DevOps maturity, improve developer productivity and provide specific recommendations for improvement. Data will play an important role in decision making and aid every developer’s performance, more than at any time in the past.
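Two of the four DORA metrics mentioned above, deployment frequency and lead time for changes, are straightforward to compute from deployment records. A minimal sketch; the record format is an assumption for illustration:

```python
# Sketch: deployment frequency and mean lead time from deploy records.
from datetime import datetime

# Hypothetical records: when a change was committed and when it shipped.
deploys = [
    {"committed": datetime(2021, 1, 4, 9, 0), "deployed": datetime(2021, 1, 4, 15, 0)},
    {"committed": datetime(2021, 1, 5, 10, 0), "deployed": datetime(2021, 1, 5, 12, 0)},
]

def deployment_frequency(records, days):
    """Deploys per day over the observation window."""
    return len(records) / days

def mean_lead_time_hours(records):
    """Average commit-to-deploy time in hours."""
    total = sum((r["deployed"] - r["committed"]).total_seconds()
                for r in records)
    return total / len(records) / 3600
```

Tracking these numbers over time is what lets a team verify that "deploying code in small chunks" is actually increasing velocity and decreasing lead time rather than just feeling faster.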


5G, behavioural analytics & cyber security: the biggest tech considerations in 2021

With transmission speeds reaching ten gigabits per second, and latency four to five times lower than that of 4G, 5G will first and foremost revolutionise IoT and innovative new edge computing services. With this comes the potential for the wider adoption of driverless cars and the remote control of complex industrial machinery, to name but two applications. These examples, however, are just the headlines. Behind the scenes, 5G holds huge potential for businesses across all sectors looking to ramp up their digital capabilities. Lower latency and greater bandwidth mean that the finance and retail industries can perform data analytics in real time, paving the way for AI to power bespoke customer service experiences. Similar applications will be seen in the manufacturing and transportation sectors, where faster information gathering and enhanced IoT offer both safer and faster execution of services. An even bigger area of flux is in the relationship between IT and the workplace. Last year’s shift to remote working was one of the biggest occupational overhauls in recent memory, and as it stands, more than four-fifths of the global workforce is ruling out a return to the office full-time, creating new priorities for CIOs.


Top Considerations When Auditing Cloud Computing Systems

Securing data in your cloud environments comes with unique challenges and raises a new set of questions. What’s the appropriate governance structure for an organization’s cloud environment and the data that resides within them? How should cloud services be configured for security? Who is responsible for security, the cloud service provider or the user of that cloud service?  Cloud compliance is becoming front of mind for organizations of all sizes. Smaller companies with limited staff and resources tend to rely more on cloud vendors to run their businesses and to address security risks (we’ll get into why this is a bad idea later in this article). Often roles will overlap with team members wearing many hats in smaller operations. Larger enterprises frequently keep more security and compliance duties in-house, using vast resources to create individual teams for threat hunting, risk management, and compliance/governance programs. Regardless of size, the challenge of balancing security and business objectives looms large for all companies. Security must be built around the business, and Jacques accurately describes the nature of the relationship: “Security is always a support function around your business.”


Every CIO now needs this one secret 'superpower'

"Emotional intelligence is something we define as self-awareness, self-management and relationship management," Rob O'Donohue, senior director analyst at Gartner, who worked on the report, told ZDNet. "With emotional dexterity, it's the next level. You have the ability to adapt and adjust to challenges from a soft-skills, emotional perspective." Historically, said O'Donohue, CIO roles have tended to focus on technical skills rather than emotional ones. But as the COVID-19 pandemic swept through the world, forcing entire organizations to switch to remote working overnight, IT teams were in the spotlight as they worked relentlessly to keep businesses afloat. "This put CIOs in a position where they needed to keep a hands-on, door-open policy, and show themselves as a leader that is willing to listen," said O'Donohue. This is where emotional skills came in handy – not only to support employees, but first and foremost to better manage the crisis from a personal point of view. O'Donohue's research, which surveyed CIOs working directly throughout the crisis, showed that those who self-scored above average on performance metrics over the past year were also more likely to cite daily commitments to self-improvement and self-control practices that helped them weather the crisis.



Quote for the day:

"Your first and foremost job as a leader is to take charge of your own energy and then help to orchestrate the energy of those around you." -- Peter F. Drucker

Daily Tech Digest - January 05, 2021

IoT adds smarts to IT asset monitoring

The market for IoT tools that can monitor IT assets (as well as many other devices) has attracted major technology vendors including Cisco, Dell, HPE, Huawei, IBM, Microsoft, Oracle, SAP, and Schneider Electric, along with IoT specialists including Digi, Gemalto, Jasper, Particle, Pegasystems, Telit, and Verizon. IoT is often deployed in existing physical systems to increase the contextual understanding of the status of those systems, says Ian Hughes, senior analyst covering IoT at research firm 451 Research. "Compute resources tend to already have lots of instrumentation built in that is used to manage them, such as in data centers," he says. Companies can use IoT to provide additional information about the physical infrastructure of a building such as heating, ventilation, and air conditioning (HVAC) systems, Hughes says. Data centers would tend to need building- and environmental-related IoT equipment, to measure environmental conditions and possible security threats, he says. As with any IoT rollout, preparation is key. "Some approaches yield too much data, or non-useful content," Hughes says. "So understanding the context for measurement is important."
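Hughes's point about "context for measurement" can be made concrete: the same raw reading means different things depending on where the sensor sits. A hedged sketch with illustrative thresholds (not any standard's recommended limits):

```python
# Sketch: the same temperature reading is assessed against a
# location-specific threshold, since 30C is fine on an office floor
# but alarming in a data-center cold aisle. Limits are made up.

THRESHOLDS_C = {"cold_aisle": 27.0, "hot_aisle": 45.0, "office": 32.0}

def assess(reading_c, location):
    """Return 'alert' or 'ok' for a reading, given its location context."""
    limit = THRESHOLDS_C[location]
    return "alert" if reading_c > limit else "ok"
```

Attaching context like this at ingestion time is also one way to avoid the "too much data, or non-useful content" trap: only contextually anomalous readings need to leave the edge.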


Three ways formal methods can scale for software security

FM is a type of mathematical modelling where the system design and code are the subjects of the model. By applying mathematical reasoning, FM tools can answer security questions with mathematical certainty. For example, FM tools can determine whether a design has lurking security issues before implementation begins; show that an implementation matches the system design; and prove that the implementation is free of introduced defects such as low-level memory errors. That certainty distinguishes FM from other security technologies: unlike testing and fuzzing, which can only trigger a fraction of all system executions, an FM model can examine every possible system behavior. Like machine learning, the roots of formal methods lie in the 1970s, and also like machine learning, recent years have seen rapid adoption of FM technologies. Modern FM tools have been refined by global-scale companies like Microsoft, Facebook, and Amazon. As a result, these tools reflect the engineering practices of these companies: rapid pace of iteration, low cost of entry, and interoperability between many complementary tools.
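The contrast with testing can be illustrated on a toy model: for a small finite system we can enumerate every reachable state and check an invariant over all of them, rather than sampling some executions. This is only a sketch of the exhaustive-exploration idea; real FM tools reason symbolically and scale far beyond brute-force enumeration.

```python
# Toy exhaustive exploration: enumerate all reachable states of a
# small model and check an invariant over every one of them.

def reachable_states(initial, transitions):
    """Breadth-unordered exploration of the full reachable state space."""
    seen, frontier = set(), [initial]
    while frontier:
        state = frontier.pop()
        if state in seen:
            continue
        seen.add(state)
        frontier.extend(transitions(state))
    return seen

# A tiny mutual-exclusion model: two processes, each "idle" or "critical",
# where a process may enter its critical section only if the other is not in it.
def step(state):
    a, b = state
    successors = []
    if a == "idle" and b != "critical":
        successors.append(("critical", b))
    if a == "critical":
        successors.append(("idle", b))
    if b == "idle" and a != "critical":
        successors.append((a, "critical"))
    if b == "critical":
        successors.append((a, "idle"))
    return successors

states = reachable_states(("idle", "idle"), step)
# The invariant holds in *every* reachable state, not just tested ones.
mutual_exclusion = all(s != ("critical", "critical") for s in states)
```

A test suite might happen to miss the interleaving that violates an invariant; the exhaustive check, by construction, cannot, which is the "mathematical certainty" the article attributes to FM tools.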


Why IATA is banking on cloud to help the airline industry weather the Covid-19 crisis

“The Covid-19 crisis is impacting the way we are responding and means we have to adjust our resources to what we can afford at the moment,” he says. “Our team understands that we need to change the way we are working to avoid wasting time and resources.” By his own admission, running an airline in 2020 is a very different business to what it was in 2019, and this, in turn, has created an additional need for new, artificial intelligence-based predictive models that factor in the impact of the pandemic. “Now our airlines are asking us [if we can] use the data for the last month to tell us what will happen in the next three months, and that means we have to build new predictive models,” he says. “We have to use technology like artificial intelligence, we have to use a lot of innovation and we need an environment that will allow us to do that.” It is worth noting that when this body of work began, around 60% of the organisation’s IT footprint was already in the Amazon Web Services (AWS) cloud, but there was definite room for improvement with regard to how that environment was being managed and used, says Buchner. “The way we were using AWS in the past is different from the way that we want to use it today. ...”
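The "use the last month to tell us what will happen next" request can be grounded with a naive baseline of the kind any real predictive model would have to beat: extend the recent trend forward. This is purely illustrative; IATA's actual models are not public.

```python
# Sketch: a naive trend forecast as a baseline for demand prediction.

def trend_forecast(recent, horizon=3):
    """Extend the average step-to-step change of `recent` observations
    forward by `horizon` steps."""
    steps = [b - a for a, b in zip(recent, recent[1:])]
    slope = sum(steps) / len(steps)
    last = recent[-1]
    return [last + slope * (i + 1) for i in range(horizon)]
```

In a crisis, the limitation of such a baseline is exactly the problem the article raises: a straight-line extrapolation of last month's data cannot anticipate a lockdown, which is why IATA is turning to richer, AI-based models.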


Modern Operations Best Practices From Engineering Leaders at New Relic and Tenable

Beyond the technical challenges of creating RCAs, there is a human layer as well. Many organizations use these documents to communicate about incidents to customers involved. However, this may require adding a layer of obfuscation. Nic shares, “The RCA process is a little bit of a bad word inside of New Relic. We see those letters most often accompanied by ‘Customer X wants an RCA.’ Engineers hate it because they are already embarrassed about the failure and now they need to write about it in a way that can pass Legal review.” Dheeraj agrees, and believes that RCAs should have value to customers reading them. “Today, the industry has become more tolerant to accepting the fact that if you have a vendor, either a SaaS shop or otherwise, it is okay for them to have technical failures. The one caveat is that you are being very transparent to the customer. That means that you are publishing your community pages, and you have enough meat in your status page or updates." If legal has strict rules about what is publishable, RCAs can still be valuable. “We try to run a meaningful process internally. I use those customer requests as leverage to get engineering teams to really think through what's happened.”


What the critics get wrong about serverless costs

There are a few main areas where people misunderstand serverless costs. They often exclude the total cost of running services on the web. This includes the personnel requirements and the direct payments to the cloud provider I just discussed. Other times, they build bad serverless architectures. Serverless, like cloud, is not a panacea. It requires knowledge and experience about what works and what doesn't -- and why. If you use serverless correctly, it shifts significant costs to the cloud provider. They keep your services running, scaling up and down, and recovering from hardware, software and patching failures. Most companies that run mission-critical web applications and/or APIs have operations staff who do exactly this. This is not to say that adopting serverless means putting people out of work. Charity Majors, co-founder and CTO of Honeycomb, wrote a great article on how operations jobs are changing rather than going away. But if you can hand off patching operating system and software vulnerabilities to a cloud provider, then the people on your staff who previously handled those tasks become available for more strategic and differentiated tasks for your organization. There also seems to be a shocking number of people who try to build something with serverless without fully understanding the technology first.
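The direct-payment side of the comparison is easy to make concrete with back-of-envelope arithmetic: a per-invocation serverless bill versus an always-on instance. The prices below are round illustrative numbers, not any provider's rate card, and the comparison deliberately ignores the personnel costs the article argues are the bigger factor.

```python
# Sketch: monthly cost of a serverless workload vs. an always-on instance.

def serverless_monthly_cost(requests, gb_seconds_per_req,
                            price_per_million=0.20,
                            price_per_gb_second=0.0000167):
    """Request charge plus compute (GB-seconds) charge."""
    return (requests / 1_000_000 * price_per_million
            + requests * gb_seconds_per_req * price_per_gb_second)

def instance_monthly_cost(hourly_rate=0.05, hours=730):
    """A small instance billed for every hour of the month."""
    return hourly_rate * hours

# A low-traffic API: 2M requests/month, 0.1 GB-seconds each.
srv = serverless_monthly_cost(2_000_000, 0.1)
vm = instance_monthly_cost()
```

At low or spiky traffic the per-invocation model wins easily; at sustained high utilization the lines cross, which is why "bad serverless architectures" built without this analysis are a common source of the cost complaints the article pushes back on.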


Hack your APIs: interview with Corey Ball - API security expert

In Corey’s opinion, because most APIs are primarily used/consumed by developers and machines they often get overlooked during security assessments. Compounding this problem, many organizations would struggle to actually list all the APIs they have on their systems. Worse still, because APIs are so varied, they’re difficult to scan. Even within a single organization, similar-looking endpoints could have completely different specifications from one another. Corey points out that many vulnerability scanners lack the features to properly test APIs, and are consequently bad at detecting API vulnerabilities. If your API security testing is limited to running one of these scanners, and it comes back with no results, then you run the risk of accepting false negative results. You can see the results of this in the news. The 2018 USPS incident (above) happened because security was simply not taken into consideration during an API’s design. A researcher was able to compromise the USPS application’s security using trivial methods, despite a vulnerability assessment having been carried out a month beforehand. The assessment had failed to spot the glaring issue. ... You can define business logic vulnerabilities as “deliberately designed application functionality that can be used against the application to compromise its security”.
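The business-logic definition at the end is worth a concrete toy: code that is "working as designed" yet compromises security, so no scanner signature will flag it. The example below is entirely hypothetical and unrelated to the USPS incident.

```python
# Toy business-logic flaw: the coupon endpoint does exactly what it was
# designed to do, but nothing stops the same code being applied repeatedly.

def apply_coupon(cart_total, coupons_used, code):
    """Flawed by design: no check that the code was already used."""
    if code == "SAVE10":
        coupons_used.append(code)
        return round(cart_total * 0.9, 2)
    return cart_total

total, used = 100.0, []
for _ in range(5):            # an attacker simply replays the request
    total = apply_coupon(total, used, "SAVE10")
```

Every individual response here is valid, which is why this class of issue only surfaces when a tester reasons about the application's intent rather than its inputs, exactly the gap Corey says automated scanners leave.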


2021 Cybersecurity Trends: Bigger Budgets, Endpoint Emphasis and Cloud

Upheaval in staffing needs and continued dependence on a remote workforce will create a fertile attack vector for criminals looking to exploit insider threats. Forrester researchers believe the remote-workforce trend will drive an uptick in insider threats. They explain that 25 percent of data breaches are already tied to insider threats, and in 2021 that percentage is expected to jump to 33 percent. Forcepoint warns of the growth in 2021 of an “insider-as-a-service” model, which it describes as the organized recruitment of infiltrators who offer bad actors a highly targeted means of becoming trusted employees in order to gather sensitive IP. “These ‘bad actors,’ literally, will become deep undercover agents who fly through the interview process and pass all the hurdles your HR and security teams have in place to stop them,” said Myrna Soto, chief strategy and trust officer for Forcepoint. Endpoint security issues are among the most challenging today, and will remain so tomorrow. Inboxes are the chink in the armor of the security front lines, often the perfect vector for ransomware attacks, business email compromise scams and malware infection, according to a CrowdStrike analysis of the challenges. Moving forward, researchers warn that enterprises should expect a “major increase” in spear-phishing attacks in 2021, due to automation.


How the CTO can drive the enterprise’s shift to the cloud

The rapid rise of technology means that the CTO is no longer just seen as a business cost centre, but instead as something with the potential to generate increased revenue. One key ally for the CTO can be the CFO — to help them understand the difference in moving from a capex model to an opex one. The cloud and related services certainly make an attractive business case, with fewer sunk costs and investments into expensive hardware. However billing in the cloud space isn’t always as transparent as many CFOs might imagine, and re-structuring budgets and reporting will take time. For CTOs, all of the above will often require a mindset shift and a change in responsibility. ... In a global business environment, there’s an expectation that you can replicate, launch and relaunch your business anywhere on the planet. The reality is often far from this. CTOs need to be actively aware of potential pitfalls in plans to operate around the world, and limitations of the cloud. This can range from data regulations preventing part of your app from working, barriers that stop your services operating at an acceptable speed, or regional technology skills gaps that mean your onboarding costs will be excruciatingly high.


What experts say to expect from 5G in 2021

5G and open networking will likely be a successful pair, Nolle wrote in a CIMI blog post, because operators are guaranteed to deploy 5G even though it is unlikely to provide much revenue for them in 2021. As a result, 5G and any technology associated with it could have a sufficient financial life span. If operators want to head in the direction of open networking, they can pair their 5G timeline with their open network plans to ensure those plans get funding in the future. "When you're looking at operator technology initiatives, it's not the brilliance of the technology that matters, but how well the technology is funded," Nolle wrote. "Nobody questions 5G funding credibility for 2021, period. That makes 5G almost unique, and that makes things that are tied to 5G automatic concept winners." However, the potential for open models also forces operators to consider 3rd Generation Partnership Project standards for radio access and core networks, so operators don't start to deploy an open 5G network and, for any reason, have to reverse it or not fully deploy the open model. If operators conform to official standards, they can gradually implement an open model on a per-element basis, Nolle wrote. This could provide more flexibility and potentially lead to more widespread use of open networking models.


How the pandemic has affected women in the tech sector

It is important that employers understand the difference between remote and flexible working, and enable the latter to happen, points out Merici Vinton, founder and CEO of Ada’s List, a global community for women in tech. “It’s about the perception that people aren’t doing the work if they’re doing different hours, when really the important thing is outcomes and that the work gets done,” she says. “Enabling effective flexible working is about understanding the full picture of the employee experience while working at home.” Another problem relates to the risk of unconscious bias being compounded if people operate remotely, which can have a negative impact on their chances of career progression. A key challenge here, according to Rebecca George, president of BCS, is that “it’s easier for discriminatory behaviour to go unnoticed, or unchecked”. “Research has highlighted that managers often give ground to those who look like themselves, and with networking opportunities thin on the ground, it’s possible that without care and special attention, some people may have to work twice as hard to receive the opportunities and recognition they deserve,” she says.



Quote for the day:

"Great leaders go forward without stopping, remain firm without tiring and remain enthusiastic while growing" -- Reed Markham