Daily Tech Digest - November 13, 2020

Manufacturing is becoming a major target for ransomware attacks

For cyber criminals, manufacturing makes a highly strategic target: in many cases these are operations that can't afford to be out of action for long, so they may be more likely to give in to the attackers' demands and pay hundreds of thousands of dollars in bitcoin in exchange for getting the network back. "Manufacturing requires significant uptime in order to meet production and any attack that causes downtime can cost a lot of money. Thus, they may be more inclined to pay attackers," Selena Larson, intelligence analyst for Dragos, told ZDNet. "Additionally, manufacturing operations don't necessarily have the most robust cybersecurity operations and may make interesting targets of opportunity for adversaries," she added. The nature of manufacturing means industrial and networking assets are often exposed to the internet, providing avenues for hacking groups and ransomware gangs to gain access to the network via remote access technology such as remote desktop protocol (RDP) and VPN services, or via vulnerabilities in unpatched systems. As of October 2020, Dragos said there were at least 108 advisories containing 262 vulnerabilities impacting industrial equipment found in manufacturing environments this year alone.


Humanitarian data collection practices put migrants at risk

“Instead of helping people who face daily threats from unaccountable surveillance agencies – including activists, journalists and people just looking for better lives – this ‘aid’ risks doing the very opposite,” said PI advocacy director Edin Omanovic. To overcome the issues related to “surveillance humanitarianism”, the report recommends that all UN humanitarian and related bodies “adopt and implement mechanisms for sustained and meaningful participation and decision-making of migrants, refugees and stateless persons in the adoption, use and review of digital border technologies”. Specifically, it added that migrants, refugees and others should have access to mechanisms that allow them to hold bodies like the UNHCR directly accountable for violations of their human rights resulting from the use of digital technologies, and that technologies should be prohibited if they cannot be shown to meet equality and non-discrimination requirements. It also recommends that UN member states place “an immediate moratorium on the procurement, sale, transfer and use of surveillance technology, until robust human rights safeguards are in place to regulate such practices”. A separate report on border and migration “management” technologies published by European Digital Rights (EDRi), which was used to supplement the UN report ...


Machine Learning Testing: A Step to Perfection

Usually, software testing includes Unit tests, Regression tests and Integration tests. Moreover, there are certain rules that people follow: don’t merge the code before it passes all the tests; always test newly introduced blocks of code; when fixing bugs, write a test that captures the bug. Machine learning adds more actions to your to-do list. You still need to follow ML’s best practices. Moreover, every ML model needs not only to be tested but evaluated. Your model should generalize well. This is not what we usually understand by testing, but evaluation is needed to make sure that the performance is satisfactory. ... First of all, you split the dataset into three non-overlapping sets. You use a training set to train the model. Then, to evaluate the performance of the model, you use two sets of data: Validation set - Having only a training set and a testing set is not enough if you do many rounds of hyperparameter tuning (which is almost always the case), and that can result in overfitting. To avoid that, you can select a small validation data set to evaluate the model. Only after you get maximum accuracy on the validation set do you bring the testing set into the game; and Test set (or holdout set) - Your model might fit the training dataset perfectly well. ...
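
Below is a minimal sketch of the three-way split described above, using scikit-learn and a synthetic stand-in dataset (the article names no specific library, so this is just one common way to do it):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=42)  # stand-in data

# Hold back 30% of the data for evaluation; train on the rest.
X_train, X_tmp, y_train, y_tmp = train_test_split(
    X, y, test_size=0.3, random_state=42)

# Split the held-back 30% evenly into validation and test (holdout) sets.
# Tune hyperparameters against X_val; touch X_test only once, at the end.
X_val, X_test, y_val, y_test = train_test_split(
    X_tmp, y_tmp, test_size=0.5, random_state=42)
```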


How The Future Of Deep Learning Could Resemble The Human Brain

For deep learning, the model training stage is very similar to the initial learning stage of humans. During early stages, the model experiences a mass intake of data, which creates a significant amount of information to mine for each decision and requires significant processing time and power to determine the action or answer. But as training occurs, neural connections become stronger with each learned action and adapt to support continuous learning. As each connection becomes stronger, redundancies are created and overlapping connections can be removed. This is why continuously restructuring and sparsifying deep learning models during training time, and not after training is complete, is necessary. After the training stage, the model has lost most of its plasticity and the connections cannot adapt to take over additional responsibility, so removing connections can result in decreased accuracy. Current methods, such as the one unveiled in 2020 by MIT researchers, which attempt to shrink the deep learning model after the training phase, have reportedly seen some success. However, if you prune in the earlier stages of training when the model is most receptive to restructuring and adapting, you can drastically improve results.
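
As a hedged illustration of pruning during training (my sketch, not the specific method the article alludes to), PyTorch's built-in pruning utilities can remove low-magnitude weights every few epochs while the network is still plastic:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(9):
    x = torch.randn(32, 784)              # stand-in training batch
    y = torch.randint(0, 10, (32,))
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()
    # Sparsify while training, not after: every third epoch, zero out the
    # 20% of remaining weights with the smallest L1 magnitude.
    if epoch % 3 == 2:
        for module in model:
            if isinstance(module, nn.Linear):
                prune.l1_unstructured(module, name="weight", amount=0.2)
```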


Quantum Computing: A Bubble Ready to Burst?

If there is a quantum bubble, it’s inflated both by the new flurry of Sycamore-type academic work and a simultaneous push from private corporations to develop real-world quantum applications, like avoiding traffic jams, as a form of competitive advantage. We’ve known about the advantages that quantum physics can offer computing since at least the 1980s, when Argonne physicist Paul Benioff described the first quantum mechanical model of a computer. But the allure of the technology seems to have just now bitten enterprising businesspeople from the tiniest of startups to the largest of conglomerates. “My personal opinion is there’s never been a more exciting time to be in quantum,” says William Hurley. Strangeworks, the startup he founded in 2018, serves as a sort of community hub for developers working on quantum algorithms. Hurley, a software systems analyst who has worked for both Apple and IBM, says that more than 10,000 developers have signed up to submit their algorithms and collaborate with others. Among the collaborators—Austin-based Strangeworks refers to them as “friends and allies”—is Bay Area startup Rigetti Computing, which supplies one of the three computers that Amazon Web Services customers can access to test out their quantum algorithms.


C++ programming language: How it became the invisible foundation for everything, and what's next

As of September 2020, C++ is the fourth most popular programming language globally behind C, Java and Python, and – according to the latest TIOBE index – is also the fastest growing. C++ is a general-purpose programming language favored by developers for its power and flexibility, which makes it ideal for operating systems, web browsers, search engines (including Google's), games, business applications and more. Stroustrup summarizes: "If you have a problem that requires efficient use of hardware and also to handle significant complexity, C++ is an obvious candidate. If you don't have both needs, either a low-level efficient language or a high-level wasteful language will do." Yet even with its widespread popularity, Stroustrup notes that it is difficult to pinpoint exactly where C++ is used, and for what. "A first estimate for both questions is 'everywhere'," he says. "In any large system, you typically find C++ in the lower-level and performance-critical parts. Such parts of a system are often not seen by end-users or even by developers of other parts of the system, so I sometimes refer to C++ as an invisible foundation of everything."


Cybercrime To Cost The World $10.5 Trillion Annually By 2025

Cybercrime has hit the U.S. so hard that in 2018 a supervisory special agent with the FBI who investigates cyber intrusions told The Wall Street Journal that every American citizen should expect that all of their data (personally identifiable information) has been stolen and is on the dark web — a part of the deep web — which is intentionally hidden and used to conceal and promote heinous activities. Some estimates put the size of the deep web (which is not indexed or accessible by search engines) at as much as 5,000 times larger than the surface web, and growing at a rate that defies quantification. The dark web is also where cybercriminals buy and sell malware, exploit kits, and cyberattack services, which they use to strike victims — including businesses, governments, utilities, and essential service providers on U.S. soil. A cyberattack could potentially disable the economy of a city, state or our entire country. In his 2016 New York Times bestseller — Lights Out: A Cyberattack, A Nation Unprepared, Surviving the Aftermath — Ted Koppel reveals that a major cyberattack on America’s power grid is not only possible but likely, that it would be devastating, and that the U.S. is shockingly unprepared.


Role of FinTech in the post-COVID-19 world

As the global economy recovers from COVID-19, one particular area of focus for FinTech is financial inclusion. According to the World Bank, there are currently around 1.7 billion unbanked individuals worldwide, and FinTechs will be central to efforts to integrate these people into the global banking system. Doing so will help to mitigate the economic and social impact of the pandemic. According to Deloitte, FinTechs, in strategic partnerships with financial institutions, retailers and government sectors across jurisdictions, can help democratise financial services by providing basic financial services in a fair and transparent way to economically vulnerable populations. Digital finance is also expanding in other areas. Health concerns in the COVID-19 era have made physical cash payments less practical, opening the door to an increase in digital payments and e-wallets. Though cash use was predicted to decline in any case, COVID-19 has hastened that decline, due to concerns that handing over money can cause human-to-human transmission of the virus. According to a Mastercard survey looking at the implications of the coronavirus pandemic, 82 percent of respondents worldwide viewed contactless as the cleaner way to pay, and 74 percent said they will continue to use contactless payment post-pandemic.



DNS cache poisoning poised for a comeback: Sad DNS

Here's how it works: First, DNS is the internet's master address list. With it, instead of writing out an IPv4 address like "173.245.48.1," or an IPv6 address such as "2400:cb00:2048:1::c629:d7a2," one of Cloudflare's many addresses, you simply type in "http://www.cloudflare.com," DNS finds the right IP address for you, and you're on your way. With DNS cache poisoning, however, your DNS requests are intercepted and redirected to a poisoned DNS cache. This rogue cache gives your web browser or other internet application a malicious IP address. Instead of going to where you want to go, you're sent to a fake site. That forged website can then upload ransomware to your PC or grab your user name, password, and account numbers. In a word: Ouch! Modern defense measures -- such as randomizing both the DNS query ID and the DNS request source port, DNS-based Authentication of Named Entities (DANE), and Domain Name System Security Extensions (DNSSEC) -- largely stopped DNS cache poisoning. These DNS security methods, however, have never been deployed widely enough, so DNS-based attacks still happen. Now, though, researchers have found SAD DNS, a side-channel attack that can be successfully used against the most popular DNS software stacks.
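
Some back-of-the-envelope arithmetic (illustrative figures of mine, not numbers from the researchers' paper) shows why source-port randomization matters and what a port-leaking side channel undoes:

```python
# A blind spoofer must guess both the 16-bit DNS transaction ID and a
# randomized 16-bit source port: ~2^32 combinations. A side channel that
# reveals the port collapses the search back to the ID alone.
query_id_bits = 16
source_port_bits = 16

both = 2 ** (query_id_bits + source_port_bits)
id_only = 2 ** query_id_bits

print(f"ID + randomized port: 1 in {both:,} guesses")     # 1 in 4,294,967,296
print(f"Port leaked via side channel: 1 in {id_only:,}")  # 1 in 65,536
```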


CIOs tasked to make healthcare infrastructure composable

The composable healthcare organization is a healthcare organization that can reconfigure its capabilities -- both its business and operating model -- at the pace of market change. We have lived in a world and in an industry where there have been stable business and operational models. If you're a provider organization or a payer organization or a life sciences company, those heritage business models have been pretty stable. That's in terms of how organizations think, their culture, the way their business is architected -- so the organizational structures, the way they collaborate, all the way down to the way we've architected technology. They've really done that in service of a relatively stable business and operating model. What we're making here are three main points. On a very simple level it's this: Adaptability is more important than ever, adaptability is more possible than ever, and adaptability can be done by the people who you and I are speaking to -- the people you're reporting to and the people we work with on the Gartner health team. The idea of adaptability is nothing new to CIOs, in general. If you go back to when many of today's CIOs were in high school or even in college, there was reusable code, object-oriented programming -- and we've just gone through a decade-and-a-half of more data services and agile development.



Quote for the day:

"If you genuinely want something, don't wait for it--teach yourself to be impatient." -- Gurbaksh Chahal

Daily Tech Digest - November 12, 2020

The Ever-Expanding List of C-Level Technology Positions

In decades past, it was relatively uncommon for IT leaders to be part of the top tier of executive management. Even those who held the title of chief information officer (CIO) often reported to someone other than the chief executive officer (CEO). But digital transformation has changed that. As enterprises seek new ways of doing business, CIOs have begun playing a bigger role in directing the overall strategy of the business. Several different surveys have found that more than half of CIOs now report to CEOs, and many CEOs list their CIOs as one of their most trusted advisors. ... However, while they might not be ascending to the top job, IT leaders are finding more opportunities to join the executive team. The twin trends of digital transformation and the rise of big data analytics has led many enterprises to create new C-level positions directly related to technology. In fact, some industry analysts have begun to wonder if organizations have created too many new C-level technology roles. Some are forecasting that in the years ahead enterprises might be re-vamping their org structure to cut back on these new C-level positions. But for now, IT leaders seem to have more opportunities to fill C-level roles than ever before.


Applying Lean and Accelerate to Deliver Value: QCon Plus Q&A

It is important to understand that delay degrades the economic value of what we deliver - there is a cost to delays, and it can be significant. Think about the loss of opportunity or revenue if a software product is delivered late, especially in a highly competitive market segment. Delays also slow down feedback, which makes it harder to adapt to new information. You can also incur a significant risk of outages or customer turnover if features are delivered late. With this in mind, just as we spend so much time optimizing and tuning the latency and throughput of our software systems, we should spend time optimizing and tuning the latency and throughput of our development process. It turns out that when you look at the math and dynamics of product delivery pipelines, the biggest contributor to delay is letting queues back up. Unlike in manufacturing, these queues are invisible in software development, so it is important that we make an effort to make them visible, and then address them quickly and aggressively. Two powerful ways to reduce queues are limiting work in progress and keeping your batch sizes small.
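
Little's Law makes the queue argument concrete; the toy numbers below are mine, not the speaker's:

```python
# Little's Law: average wait = items waiting / throughput.
queue_depth = 30   # work items sitting in the backlog
throughput = 5     # items the team finishes per week

print(f"Average wait: {queue_depth / throughput:.0f} weeks")

# Limiting work in progress halves the wait without anyone working faster.
wip_limited_depth = 15
print(f"With WIP capped at {wip_limited_depth}: "
      f"{wip_limited_depth / throughput:.0f} weeks")
```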


Banking Trojan Can Spy on Over 150 Financial Apps

The Kaspersky researchers first came across the Ghimob Trojan in August while examining a Windows campaign related to another malware strain circulating in Brazil. "We believe this campaign could be related to the Guildma [Brazilian banking Trojan] threat actor for several reasons, but mainly because they share the same infrastructure," according to the report. "It is also important to note that the protocol used in the mobile version is very similar to that used for the Windows version." Unlike other types of Android-focused malware, the Ghimob Trojan does not disguise itself as a legitimate app hidden within the official Google Play Store. Instead, the fraudsters attempt to lure victims into installing a malicious file through a phishing or spam email that suggests the recipient has some kind of debt, according to the report. The message includes an "informational" link for the victim to click on, which starts the malware delivery. The malicious link is usually disguised to appear as either a Google Defender, a Google Doc or a WhatsApp Updater, according to the report. If opened, it installs the Ghimob Trojan on the device. The malware's first step is to check for any emulators or debuggers; if it detects one, it terminates itself.


How to stress-test your business continuity management

“You really need to be in a position to mitigate against any potential risks both before a system is live and afterwards, so there are no nasty surprises. End-to-end testing of every platform, both independently and in terms of its integration with the wider network of systems, is therefore critical. However, this needs to be balanced against the need to deliver with speed and certainty – so strong automated testing should be seen as a standard component of your production systems. “This will usually be provided by an independent quality assurance specialist. At Expleo we actually automate this process for clients to account for the complexity and speed of the technology and release cycles. Automated testing not only safeguards quality, but also adds value by providing immediate speed and efficiency gains. “First, ML cuts through the testing workload and sieves the data at scale, surfacing the highest-priority test cases. Then, AI analyses this data in real-time, so we can respond to risks before they become issues. This is used as the basis for predictive analysis – so you can predict where risk is going to emerge and mitigate it in the most cost-effective way.”


What's next for AI: Gary Marcus talks about the journey toward robust artificial intelligence

Marcus points out this is a really deep deficiency, and one that goes back to 1965. ELIZA, the first chatbot, just matched keywords and talked to people about therapy. So there's not much progress, Marcus argues, certainly not exponential progress as people like Ray Kurzweil claim, except in narrow fields like playing chess. We still don't know how to make a general purpose system that could understand conversations, for example. The counter-argument to that is that we just need more data and bigger models (hence more compute, too). Marcus begs to differ, and points out that AI models have been growing, and consuming more and more data and compute, but the underlying issues remain. Recently, Geoff Hinton, one of the forefathers of deep learning, claimed that deep learning is going to be able to do everything. Marcus thinks the only way to make progress is to put together building blocks that are there already, but which no current AI system combines. ... A connection to the world of classical AI. Marcus is not suggesting getting rid of deep learning, but using it in conjunction with some of the tools of classical AI. Classical AI is good at representing abstract knowledge, representing sentences or abstractions. The goal is to have hybrid systems that can use perceptual information.


Passage of California privacy act could spur similar new regulations in other states

The COVID-19 crisis has derailed a lot of legislative activity across the country, making it difficult to get a solid sense of where privacy initiatives are headed. “The challenge you're going to find is that post-pandemic most of the state legislatures said anything that's not COVID related is not being considered,” Stockburger says. After the pandemic recedes from its urgent priority status, many states could kick new legislative efforts into gear. “Next year, that's when you're going to see big new developments and introductions,” he says. ... Another question that remains is whether the federal government will step in to create a more consistent privacy law framework. In the past, Silicon Valley giants stood staunchly opposed to the stringent provisions of the CCPA and sought a national privacy law to preempt and water down the CCPA’s requirements. However, their resistance has weakened over the past several years. “At the federal level, there's just a real challenge in getting any type of omnibus legislative efforts pushed through,” Stockburger says. “That’s been a challenge since probably 2016 when the Democrats got whooped in the midterms, and since then, we've had divided Congress.”


5 Things We’ve Learned from Digital Transformation in the Last 5 Years

While mobile offerings may have been a luxury five years ago, they are now an indispensable channel. Many organizations previously viewed mobile services as a nice-to-have, or as an offering geared towards a younger generation of tech-savvy consumers. However, now that contactless operations are the norm, offerings that incorporate mobile capture and mobile onboarding are a must-have for meeting the needs of the new digital-first consumer. From check deposits to application submissions, mobile services can go a long way in providing convenience, accessibility and ease. Organizations that embrace mobile capabilities and seamlessly connect them with back-end systems are well-positioned to enhance the customer experience and improve customer retention. Five years ago, it wasn’t uncommon for an organization’s process discovery methods to be defined by one-on-one interviews, firsthand observations and manual analysis. It was typical for business leaders to map out processes via post-it notes — what used to be referred to as “walking the wall.” Now, however, organizations are turning to machine learning and predictive analytics to discover and analyze their processes in a more accurate way.


DDoS Protection for Workloads on AWS with GWLB & DefensePro VA

There are many ways to deploy DefensePro VA with AWS Gateway Load Balancer to achieve north-south and/or east-west inspection. AWS Gateway Load Balancer supports multiple deployment use cases and network architectures. The AWS Gateway Load Balancer provides the VPC Endpoint Service, which allows customers to mimic on-prem networking paradigms, such as hub-and-spoke, across different VPCs and accounts. Customers can create a VPC dedicated to DDoS inspection where a group of DefensePro appliances is deployed with a Gateway Load Balancer. By utilizing AWS Ingress Routing, customers have full control of traffic routing to and from the DDoS inspection VPC. The following network topology illustrates a simplified deployment of DefensePro VA in a dedicated DDoS inspection VPC. There are two VPCs: the Customer VPC, which is Internet-facing, and the DDoS-Inspection VPC. The Customer VPC has two Availability Zones for high availability of application instances. Each zone includes a Gateway Load Balancer endpoint (GWLBe) that steers traffic to/from the Gateway Load Balancer located in the DDoS-Inspection VPC. A group of DefensePro VAs is deployed in the DDoS Inspection VPC, spanning two Availability Zones, for high availability.
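
As a sketch of the wiring (not Radware's documented setup; the service name and resource IDs below are placeholders), the Gateway Load Balancer endpoint in each customer Availability Zone can be created with boto3:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a Gateway Load Balancer endpoint in the customer VPC that points
# at the endpoint service exposed by the GWLB in the DDoS-inspection VPC.
resp = ec2.create_vpc_endpoint(
    VpcEndpointType="GatewayLoadBalancer",
    ServiceName="com.amazonaws.vpce.us-east-1.vpce-svc-0123456789abcdef0",
    VpcId="vpc-0aaaaaaaaaaaaaaaa",            # customer (Internet-facing) VPC
    SubnetIds=["subnet-0bbbbbbbbbbbbbbbb"],   # one endpoint per AZ
)
print(resp["VpcEndpoint"]["VpcEndpointId"])
# Ingress route table entries then steer inbound traffic to this endpoint.
```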


Does Your Business Need a Digital Transformation?

Because a digital transformation inevitably involves new systems, processes, and skills, it can be daunting for many leaders and teams. Embracing new technology involves a willingness to disrupt current processes and to develop new ones. This can be uncomfortable and challenging, and it’s important for leaders to acknowledge that from the outset. For many businesses, a digital transformation means completely rethinking systems and processes in order to embed technology throughout them. From the start, leadership teams need to be willing to make these major changes in order to take advantage of new tools. ... Perhaps the most important thing you can do is to prepare your team. Whenever there are major changes, leaders should expect some pushback. It’s important to anticipate and proactively address this issue to ensure that your team is ready and supportive of upcoming changes. A simple way to prepare your team is by being transparent about the planning process, goals, and anticipated shifts. Involving them in the process as much as possible will lead to increased buy-in and engagement from all levels of your team.


Stop thinking of cybersecurity as a problem: Think of it as a game

Companies can’t afford large-scale cyberattacks at any time, but especially right now. The pandemic has caused consumers who may have lost significant income to be picky with their purchases and investments. Companies need to be focused on retaining customer relationships so that they’ll weather the pandemic, and a take-down of the network could undercut customer trust in unrecoverable ways. But many companies won’t take action. They may view their older systems as good enough to ride the wave to the other side of the pandemic, and once there, they’ll go back to what they had used before, unprepared for the next attack. They may get through, but nothing will have changed — things will not go back to how they were, and you will no longer be able to rely on systems that protected a pre-COVID world. Now, there’s an opportunity to huddle up, form a new strategy, and go on the offensive. The pandemic can be an opportunity for businesses to take a look at their vulnerabilities, map their attack surface, and take appropriate actions to secure and strengthen their systems.



Quote for the day:

"Leadership is familiar, but not well understood." -- Gerald Weinberg

Daily Tech Digest - November 11, 2020

The Role of Relays In Big Data Integration

The very nature of big data integration requires an organization to become more flexible in some ways; particularly when gathering input and metrics from such varied sources as mobile apps, browser heuristics, A/V input, software logs, and more. The number of different methodologies, protocols, and formats that your organization needs to ingest while complying with both internal and government-mandated standards can be staggering. ... What if, instead of just allowing all of that data to flow in from dozens of information silos, you introduced a set of intelligent buffers? Imagine that each of these buffers was purpose-built for the kind of input that you needed to receive at any given time: Shell scripts, REST APIs, federated DBs, hashed log files, and the like. Let’s call these intelligent buffers what they really are: Relays. They ingest SSL encrypted data, send out additional queries as needed, and provide fault-tolerant data access according to ACLs specific to the team and server-side apps managing that dataset. If you were to set up such a distributed relay architecture to deal with your big data integration chain, it might look something like the sketch below.
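
A minimal, hypothetical Python sketch of one such relay (names and fields are invented for illustration):

```python
import json
import queue

class LogRelay:
    """A purpose-built buffer: ingests raw log lines, normalizes them,
    and holds them for downstream consumers."""

    def __init__(self):
        self.buffer = queue.Queue()

    def ingest(self, raw_line: str) -> None:
        record = json.loads(raw_line)           # parse the source format
        record.setdefault("source", "app-log")  # enforce required fields
        self.buffer.put(record)                 # buffer for downstream apps

    def drain(self):
        while not self.buffer.empty():
            yield self.buffer.get()

relay = LogRelay()
relay.ingest('{"event": "login", "user": "alice"}')
for rec in relay.drain():
    print(rec)   # {'event': 'login', 'user': 'alice', 'source': 'app-log'}
```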


Malware Hidden in Encrypted Traffic Surges Amid Pandemic

Ransomware attacks delivered via SSL/TLS channels soared 500% between March and September, with a plurality of the attacks (40.5%) targeted at telecommunication and technology companies. Healthcare organizations were targeted more so than entities in other verticals and accounted for 1.6 billion, or over 25%, of all SSL-based attacks Zscaler blocked this year. Finance and insurance companies clocked in next with 1.2 billion or 18% of attacks blocked, and manufacturing organizations were the third-most targeted, with some 1.1 billion attacks directed against them. Deepen Desai, CISO and vice president of security research at Zscaler, says the trend shows why security groups need to be wary about encrypted traffic traversing their networks. While many organizations routinely encrypt traffic as part of their security best practices, fewer are inspecting it for threats, he says. "Most people assume that encrypted traffic means safe traffic, but that is unfortunately not the case," Desai says. "That false sense of security can create risk when organizations allow encrypted traffic to go uninspected."


Shadow IT: The Risks and Benefits That Come With It

Covid-19-induced acceleration of remote work has led to employees being somewhat lax about cybersecurity. Shadow IT might make business operations easier – and many companies certainly have been needing that in the last few months – but from the cybersecurity point of view, it also brings about more risks. If your IT team doesn’t know about an app or a cloud system that you’re using in your work, they can’t be responsible for any consequences of such usage. This includes those impacting the infrastructure of the entire organization. The responsibility falls on you to ensure the security of your company’s data whilst using the shadow IT app. Otherwise, your entire organization is at risk. It’s also easy to lose your data if your Shadow IT systems don’t back stuff up. If they’re your only method of storage and something goes wrong, you could potentially lose all your valuable data. If you work in government, healthcare, banking, or another heavily regulated sector, chances are that you have local normative acts regulating your IT usage. It’s likely that your internal systems wouldn’t even allow you to access certain websites or apps.


Refactoring Java, Part 2: Stabilizing your legacy code and technical debt

Technical debt is code with problems that can be improved with refactoring. The technical debt metaphor is that it’s like monetary debt. When you borrow money to purchase something, you must pay back more money than you borrowed; that is, you pay back the original sum and interest. When someone writes low-quality code or writes code without first writing automated tests, the organization incurs technical debt, and someone has to pay interest, at some point, for the debt that’s due. The organization’s interest payments aren’t necessarily in money. The biggest cost is the loss of technical agility, since you can’t update or otherwise change the behavior of the software as quickly as needed. And less technical agility means the organization has less business agility: The organization can’t meet stakeholders’ needs at the desired speed. Therefore, the goal is to refactor debt-ridden code. You’re taking the time to fix the code to improve technical and business agility. Now let’s start playing with the Gilded Rose kata’s code and see how to stabilize the code, while preparing to add functionality quickly in an agile way. One huge problem with legacy code is that someone else wrote it.
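
One standard stabilizing move (sketched here in Python for brevity, though the article works in Java) is a characterization test: assert whatever the legacy code currently does, right or wrong, so any refactoring that changes behavior fails loudly. The update_quality stand-in below is hypothetical, not the kata's actual code:

```python
def update_quality(name, sell_in, quality):
    # Pretend this is the tangled legacy routine you inherited.
    if name != "Aged Brie" and quality > 0:
        quality -= 1
    sell_in -= 1
    return sell_in, quality

def test_normal_item_degrades_by_one():
    # Pin today's observed behavior before touching the implementation.
    assert update_quality("Elixir", 5, 7) == (4, 6)

def test_aged_brie_quality_is_untouched_here():
    assert update_quality("Aged Brie", 5, 7) == (4, 7)
```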


Interactive Imaging Technologies in the Wolfram Mathematica

The range of mathematical problems that can be solved using computer algebra systems is constantly expanding. Considerable research effort is directed at developing algorithms for computing topological invariants of manifolds and knots, invariants of algebraic curves, cohomology of various mathematical objects, and arithmetic invariants of rings of integer elements in fields of algebraic numbers. Another example of modern research is quantum algorithms, which sometimes have polynomial complexity where existing classical algorithms have exponential complexity. Computer algebra is represented by theory, technology and software. Its applied results include algorithms and software for solving problems on a computer in which both the initial data and the results take the form of mathematical expressions and formulas. The main product of computer algebra has become computer algebra software systems. There are many systems in this category, many publications are devoted to them, and systematic updates are published presenting the capabilities of new versions.
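
For a concrete taste of what "initial data and results in the form of mathematical expressions" means, here is a small symbolic computation using SymPy (my example; the article discusses Mathematica):

```python
import sympy as sp

x = sp.symbols("x")

# Input and output are exact symbolic expressions, not floating-point numbers.
expr = sp.integrate(sp.sin(x) * sp.exp(x), x)
print(expr)                           # exp(x)*sin(x)/2 - exp(x)*cos(x)/2

# Differentiating recovers the original integrand exactly.
print(sp.simplify(sp.diff(expr, x)))  # exp(x)*sin(x)
```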


EU to introduce data-sharing measures with US in weeks

Companies will be able to use the assessment to decide whether they want to use a data transfer mechanism, and whether they need to introduce additional safeguards, such as encryption, to mitigate any data protection risks, said Gencarelli. The EC is expected to offer companies “non-exhaustive” and “non-prescriptive” guidance on the factors they should take into account. This includes the security of computer systems used, whether data is encrypted and how organisations will respond to requests from the US or other government law enforcement agencies for access to personal data on EU citizens. Gencarelli said relevant questions would include: What do you do as a company when you receive an access request? How do you review it? When do you challenge it – if, of course, you have grounds to challenge it? Companies may also need to assess whether they can use data minimisation principles to ensure that any data on EU citizens they hand over in response to a legitimate request by a government is compliant with EU privacy principles. The guidelines, which will be open for public consultation, will draw on the experience of companies that have developed best practices for SCCs and of civil society organisations.


Unlock the Power of Omnichannel Retail at the Edge

The Edge exists wherever the digital world and physical world intersect, and data is securely collected, generated, and processed to create new value. According to Gartner, by 2025, 75 percent of data will be processed at the Edge. For retailers, Edge technology means real-time data collection, analytics and automated responses where they matter most — on the shop floor, be that physical or virtual. And for today’s retailers, it’s what happens when Edge computing is combined with Computer Vision and AI that is most powerful and exciting, as it creates the many opportunities of omnichannel shopping. With Computer Vision, retailers enter a world of powerful sensor-enabled cameras that can see much more than the human eye. Combined with Edge analytics and AI, Computer Vision can enable retailers to monitor, interpret, and act in real-time across all areas of the retail environment. This type of vision has obvious implications for security, but for retailers it also opens up huge possibilities in understanding shopping behavior and implementing rapid responses. For example, understanding how customers flow through the store, and at what times of the day, can allow the retailer to put more important items directly in their paths to be more visible.


4 Methods to Scale Automation Effectively

An essential element of the automation toolkit is the value-determination framework, which guides the identification and prioritization of automation opportunity decisions. However, many frameworks apply such a heavy weighting to cost reduction that other value dimensions are rendered meaningless. Evaluate impacts beyond savings to capture other manifestations of value; this will expand the universe of automation opportunities and appeal to more potential internal consumers. Benefits such as improving quality, reducing errors, enhancing speed of execution, liberating capacity to work on more strategic efforts, and enabling scalability should be appropriately considered, incorporated, and weighted in your prioritization framework. Keep in mind that where automation drives the greatest value changes over time depending on both evolving organizational priorities and how extensive the reach of the automation program has been. Periodically reevaluate the value dimensions of your framework and their relative weightings to determine whether any changes are merited. Typically, nascent automation programs take an “inside-out” approach to developing capability, where the COE is established first and federation is built over time as ownership and participation extends radially out to business functions and/or IT. 


Digital transformation: 5 ways to balance creativity and productivity

One of the biggest challenges is how to ensure that creative thinking is an integral part of your program planning and development. Creativity is fueled by knowledge and experience. It’s therefore important to make time for learning, whether that’s through research, reading the latest trade publication, listening to a podcast, attending a (virtual) event, or networking with colleagues. It’s all too easy to dismiss this as a distraction and to think “I haven’t got time for that” because you can’t see an immediate output. But making time to expand your horizons will do wonders for your creative thinking. ... However, the one thing we initially struggled with was how to keep being innovative. We were used to being together in the same room, bouncing ideas off one another, and brainstorms via video call just didn’t have the same impact. However, by applying some simple techniques such as interactive whiteboards and prototyping through demos on video platforms, we’ve managed to restore our creative energy. To make it through the pandemic, companies have had to think outside the box, either by looking at alternative revenue streams or adapting their existing business model. Businesses have proved their ability to make decisions, diversify at speed, and be innovative. 


Google Open-Sources Fast Attention Module Performer

The Transformer neural-network architecture is a common choice for sequence learning, especially in the natural-language processing (NLP) domain. It has several advantages over previous architectures, such as recurrent neural networks (RNNs); in particular, the self-attention mechanism that allows the network to "remember" previous items in the sequence can be executed in parallel on the entire sequence, which speeds up training and inference. However, since self-attention can link each item in the sequence to every other item, the computational and memory complexity of self-attention is O(N²), where N is the maximum sequence length that can be processed. This puts a practical limit on sequence length of around 1,024 items, due to the memory constraints of GPUs. The original Transformer attention mechanism is implemented by a matrix of size NxN, followed by a softmax operation; the rows and columns represent queries and keys, respectively. The attention matrix is multiplied by the input sequence to output a set of similarity values. Performer's FAVOR+ algorithm decomposes the matrix into two matrices which contain "random features": random non-linear functions of the queries and keys.
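
The reordering trick can be shown in a few lines of NumPy. This sketch uses a generic nonnegative feature map rather than Performer's exact positive random features, so treat it as an illustration of the O(N²)-to-O(N) restructuring, not of FAVOR+ itself:

```python
import numpy as np

N, d, m = 1024, 64, 128   # sequence length, head dim, number of random features
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((N, d)) for _ in range(3))
W = rng.standard_normal((d, m)) / np.sqrt(d)   # random projection

def phi(X):
    # Simplified nonnegative random-feature map (ReLU features, for brevity).
    return np.maximum(X @ W, 0.0) + 1e-6

# Standard attention materializes the full N x N matrix: O(N^2).
A = np.exp(Q @ K.T / np.sqrt(d))
standard = (A / A.sum(axis=1, keepdims=True)) @ V

# Linear attention reassociates the product: phi(Q) @ (phi(K)^T V) is O(N).
numer = phi(Q) @ (phi(K).T @ V)                # (N, m) @ (m, d)
denom = phi(Q) @ phi(K).sum(axis=0)[:, None]   # per-row normalizer
linear = numer / denom

print(standard.shape, linear.shape)   # (1024, 64) (1024, 64)
```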



Quote for the day:

"Don't let your future successes be prisoners of your past failure, shape the future you want." -- Gordon Tredgold

Daily Tech Digest - November 10, 2020

CIOs to Hit the Gas on Digital Business in 2021

"We have to go into the year 2021 absolutely hating the word average," Lovelock said. "As soon as you say 'the average is', the only thing you are going to know for sure is that absolutely nobody is going to do that." Some CIOs spent on devices to get their workforces equipped to work from home. Others didn't. That's because executives looking to preserve cash in a crisis cut back where they could, according to Lovelock. "In 2020 devices is one of those first areas where you can save cash," he said. "When CIOs are faced with cash flow restrictions like they were in March and April, the first thing you save on or the first thing you defer is that deferable spending. That's mobile phones, laptops, all those hard things you can buy and pay cash up front form. You can sweat these assets." Meanwhile, categories that saw huge growth included desktop as a service and cloud-based video conferencing, according to Lovelock. These extremes in spending are part of what makes the 2020 recession different from the Great Recession of 2009 and 2010. That earlier economic downturn hit everyone across the board. "The decline in IT spending was more evenly spread," Lovelock said.


Microsoft adds a new Linux: CBL-Mariner

Investing in a lightweight Linux such as CBL-Mariner makes a lot of sense, considering Microsoft’s investments in container-based technologies. Cloud economics require hosts to use as few resources as possible, allowing services such as Azure to get a high utilization. At the same time, Kubernetes containers need as little overhead as possible, allowing as many pods per node as possible, and allowing new nodes to be launched as quickly as feasible. The same is true of edge hardware, especially the next generation of edge nodes intended for use with 5G networks. Here, like the public cloud, workloads are what’s most important, shifting them and data closer to users. Microsoft uses its growing estate of edge hardware as part of the Azure Content Delivery Network outside its main Azure data centers, caching content from Azure Web apps and from hosted video and file servers, with the aim of reducing latency where possible. The Azure CDN is a key component of its Jamstack-based Azure Static Web Apps service, hosting pages and JavaScript once published from GitHub. In the past Red Hat’s CoreOS used to be the preferred host of Linux containers, but its recent deprecation means that it’s no longer supported. Anyone using it has had to find an alternative.


The future of programming languages: What to expect in this new Infrastructure as Code world

While we still use COBOL and other older programming languages, we also keep inventing new languages, each with its own advantages and disadvantages. For example, we have Rust and C++ for low-level, performance-sensitive systems programming (with Rust adding the benefit of safety); Python and R for machine learning, data manipulation, and more; and so on. Different tools for different needs. But as we move into this Everything-as-Code world, why can't we just keep using the same programming languages? After all, wouldn't it be better to use the Ruby you know (with all its built-in tooling) rather than starting from scratch? The answer is "no," as Graham Neray, cofounder and CEO of oso, told me. Why? Because there is often a "mismatch between the language and the purpose." These general-purpose, imperative languages "were designed for people to build apps and scripts from the ground up, as opposed to defining configurations, policies, etc." Further, mixing declarative tools with an imperative language doesn't make things any easier to debug. Consider Pulumi, which bills itself as an "open source infrastructure-as-code SDK [that] enables you to create, deploy, and manage infrastructure on any cloud, using your favorite languages." Sounds awesome, right?
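
Pulumi's Python SDK illustrates the point: the infrastructure declaration below is ordinary Python (the resource names are mine), and running it requires a configured Pulumi project and AWS credentials:

```python
import pulumi
from pulumi_aws import s3

# Declaring a bucket is just constructing an object; Pulumi's engine turns
# the resulting resource graph into cloud API calls on `pulumi up`.
bucket = s3.Bucket("my-site-bucket")

pulumi.export("bucket_name", bucket.id)
```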


Did Dremio Just Make Data Warehouses Obsolete?

The first new thing was caching data in the Apache Arrow format. The company employs the creators of Arrow, the in-memory data format, and it uses Arrow in its computation engine. But Dremio was not using Arrow to accelerate queries. Instead, it used the Apache Parquet file format to build caches. However, because it’s an on-disk format, Parquet is much slower than Arrow. ... The second new thing that Dremio had to build was scale-out query planning. This advance enabled the massive concurrency that the biggest enterprise BI shops demand of their data warehouses. “Traditionally in the world of big data, people had lots of nodes to support big data sets, but they didn’t have lots of nodes to support concurrency,” Shiran says. “We now scale out our query planning and execution separately.” By enabling an arbitrary number of query planning coordinators in the Dremio cluster to go along with an arbitrary number of query executors, the software can now support deployments involving thousands of concurrent users. The third new element Dremio is bringing to the data lake is runtime filtering. By being smart about what database tables queries actually end up accessing during the course of execution, Dremio can eliminate the need to perform massive table scans on data that has no bearing on the results of the query.
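
The Parquet-on-disk versus Arrow-in-memory distinction is easy to see with pyarrow (an illustrative snippet of mine, not Dremio's code):

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Build an Arrow table in memory, then persist it as Parquet on disk.
table = pa.table({"order_id": [1, 2, 3], "amount": [9.99, 14.50, 3.25]})
pq.write_table(table, "orders.parquet")

# Reading Parquet yields Arrow's columnar in-memory layout, which engines
# can compute on directly -- no row-by-row deserialization step.
in_memory = pq.read_table("orders.parquet")
print(in_memory.column("amount").to_pylist())   # [9.99, 14.5, 3.25]
```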


What’s stopping job seekers from considering a career in cybersecurity?

The good news is that 71% of participants said that they view cybersecurity professionals as smart, technically skilled individuals, 51% view them as “good guys fighting cybercrime,” and 35% said cybersecurity professionals “keep us safe, like police and firefighters.” The bad news is that even though most view cybersecurity as a good career path, they don’t think it’s the right path for them. In fact, only 8% of respondents have considered working in the field at some point. “One of the most unexpected findings in the study is that respondents from the youngest generation of workers – Generation Z (Zoomers), which consist of those up to age 24 – have a less positive perception of cybersecurity professionals than any other generation surveyed. This issue in particular merits close attention by the cybersecurity industry at a time when employers are struggling to overcome the talent gap,” (ISC)² noted. The analysts posited that Generation Z’s perceptions of the cybersecurity field are shaped negatively by social media exposure, as social media platforms “tend to focus on the negative – arguments and venting.”


A Progressive Approach To Data Governance

For businesses, progressive data governance encourages fluid implementation using scalable tools and programs. The first step is to identify both a dataset and the relevant function. Using the same example as before, this could be the data in a reporting system the accounts department uses. That data could then be used during data literacy training hosted by a data governance software tool. Sticking with data literacy, after establishing one use case, an organization may decide to progress by expanding existing programs to other departments and then moving on to another function of data governance, such as identifying the roles and responsibilities of various data users or developing an internal compliance program. Businesses can scale the scope of the data they include in a governance program gradually, which gives them the chance to learn important lessons along the way. As an organization grows in confidence, it may widen its data scope and source it from other departments and locations. Progressive data governance can be described as a three-step process that incorporates the three C’s: catalog, collaborate and comply. Cataloging data assets makes data discoverable. 


The 4 Stages of Data Sophistication

As you start to rely on more data sources, and more frequently need to blend your data, you’ll want to build out a Data Lake—a spot for all of your data to exist together in a unified, performant source. Especially when you need to work with data from applications like Salesforce, Hubspot, Jira, and Zendesk, you’ll want to create a single home for this data so you can access all of it together and with a single SQL syntax, rather than many different APIs. ...  In the Lake stage, as you bring in more people to work with the data, you have to explain to them the oddities of each schema, what data is where, and what special criteria you need to filter by in each of the tables to get the proper results. This becomes a lot of work, and will leave you frequently fighting integrity issues. Eventually, you’ll want to start cleaning your data into a single, clean source of truth. ... When you have clean data and a good BI product on top of it, you should start noticing that many people within your company are able to answer their own questions, and more and more people are getting involved. This is great news: your company is getting increasingly informed, and the business and productivity results should be showing. 


Tales & Tips from the Trenches: In Defense of the Data Catalog

Most tools’ catalog interfaces provide many helpful features that together provide the context behind the data. The interface has many visual features that are certainly not vintage 1980s. For example, many data catalog products have data quality metrics built in, which show dashboards of an asset’s quality on many of the “data quality dimensions.” These dashboards can be visible to the user and can help them determine if the data is suitable for their purposes. ... Data lineage is an extremely important feature of data catalogs; the products vary in how they perform it and how deep the lineage goes. One of my government sponsors felt data lineage was critical to their understanding, especially the visual depiction of the lineage. The data catalog’s data lineage diagrams tell the whole “back story” of the data: where it comes from, where it’s going, how “good” it is (based on whatever quality metrics are relevant), and some products even show the level of protection in the lineage diagram. The interface is important because it displays a visual diagram of the data flow along with descriptive metadata. See Figure 2 from Informatica which shows column-to-column mappings as data flows from one system to another, from source to warehouse or data lake. Notice that the actual transformations can also be shown for a given column.


A Seven-Step Guide to API-First Integration

This approach drastically reduces project delays and cost overruns due to miscommunication between frontend and backend teams leading to changes in APIs and backend systems. After designing the APIs, it can take some time to get the live backend systems up and running for the frontend teams to make API calls and test the system. To overcome this issue, frontend teams can set up dummy services, called mock backends, that mimic the designed APIs and return dummy data. You can read more about it in this API mocking guide. There can be instances where the requirements are vague or the development teams aren’t sure about the right approach to design the APIs upfront. In that case, we can design the API for a reduced scope and then implement it. We can do this for several iterations, using multiple sprints until the required scope is implemented. This way, we can identify a design flaw at an earlier stage and minimize the impact on project timelines. ... In software engineering, the façade design pattern is used to provide a more user-friendly interface for its users, hiding the complexity of a system. The idea behind the API façade is also the same; it provides a simplified API of its complex backend systems to the application programmers. 
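
As a toy illustration of the façade idea (hypothetical endpoints, sketched with Flask), a single friendly API can hide several backend lookups; the same pattern, with canned data, also serves as the kind of mock backend described earlier:

```python
from flask import Flask, jsonify

app = Flask(__name__)

def fetch_profile(user_id):       # stand-ins for calls into the complex
    return {"name": "Alice"}      # backend systems the façade hides

def fetch_balance(user_id):
    return {"balance": 120.50}

@app.route("/api/v1/customers/<int:user_id>/summary")
def customer_summary(user_id):
    # The façade exposes one simple endpoint and hides the backend fan-out.
    return jsonify({**fetch_profile(user_id), **fetch_balance(user_id)})

if __name__ == "__main__":
    app.run(port=8080)
```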


Fintechs: transforming customer expectations and disrupting finance

With favourable tech regulation, massive mobile adoption, and shifting expectations across the demographics, digital challengers are well positioned to advance and evolve the personalised services they offer. Fintechs have the advantage of starting from scratch, without having to build on legacy IT infrastructure or navigate bureaucratic decision-making processes. They are lean and innovative, led by entrepreneurs on a mission to change the world. Using the latest technologies such as Artificial Intelligence (AI), Blockchain, Biometrics Security and Cloud, the processes, compliance requirements, policies and technology differ from conventional banks, providing lower operating and resource costs. With these foundations, they are well-positioned to pursue a highly customer-centric approach and rapid product innovation. By contrast, for traditional banks it can be an arduous task to innovate and reinvent. They are highly bureaucratic and slow-moving, with high-cost structures and substantial legacy tech. These characteristics prevent them from flexibly adapting to fast-changing consumer expectations. Service providers unable to live up to the expectations of best-in-class digital experiences will see high switching rates. As a result, providers are actively investing in initiatives that boost customer experience in a bid to increase long-term customer retention.



Quote for the day:

"Rarely have I seen a situation where doing less than the other guy is a good strategy." -- Jimmy Spithill

Daily Tech Digest - November 09, 2020

How to get the most out of intelligent automation

Getting a digital workforce up-and-running may prove difficult without buy-in from senior IT personnel, so ensure that their early support is gained. IT must also be involved from the start to support on many critical fronts, such as compliance with IT security, building and managing the underlying infrastructure and accessing target applications. Although digital workers are trained, governed and run by the business, not getting IT engagement is one of the fastest ways to curtail an automation program. In fact, working closely with IT must be an ongoing activity to ensure that the digital workforce operates optimally with minimal operational issues. For digital worker campaigns to be sustainable, they must also receive buy-in from senior executives who will sponsor and champion it. If they see it as a strategic business project, they’ll help provide the requisite financial and human resources and help remove any obstacles along the way. It’s important to plan where digital workers sit within the business so they most effectively scale as automation demand increases. While an organizational structure provides the foundation for standard operating procedures, selecting the right design for a digital workforce program is essential and should be based on key information, not gut feel.


Diversification and vision drive success in a crisis

What we learned the hard way is that some of our businesses are pretty dependent on supplies from a single country. For example, we get nearly all the vitamins for certain animal feed products from India. We knew this, but when you’re confronted immediately with a complete lockdown in India, you’re faced with the reality. A border closing is where the biggest risk is. We quickly started to find other sources to make sure we could continue supplying customers, and now we’re not dependent anymore on one location for the majority of our supplies. There were some scary moments that had more to do with closure of borders, but in the end, I don’t think the supply lines were that much disrupted. Safety stocks are now being held everywhere in the organization, which I understand on the one hand, because if I were in management, I would probably do the same. On the other hand, [doing so] eats into your liquidity because you’re building up a huge amount of working capital. We’re now focused on bringing that down again. An unrelated topic is Brexit, but that also doesn’t help, because in that area we need to prepare too [for the change in trade regulations]. I think there is a growing realization that [relying on] 100 percent just-in-time deliveries around the world might not be the best model.


EU moves closer to encryption ban after Austria, France attacks

The document states: “Law enforcement is increasingly dependent on access to electronic evidence to effectively fight terrorism, organised crime, child sexual abuse (particularly its online aspects), as well as a variety of cyber-enabled crimes. For competent authorities, access to electronic evidence is not only essential to conduct successful investigations and thereby bring criminals to justice, but also to protect victims and help ensure security. “The principle of security through encryption and security despite encryption must be upheld in its entirety. The European Union continues to support strong encryption. Encryption is an anchor of confidence in digitisation and in protection of fundamental rights and should be promoted and developed. “Protecting the privacy and security of communications through encryption and at the same time upholding the possibility for competent authorities in the area of security and criminal justice to lawfully access relevant data for legitimate, clearly defined purposes in fighting serious and/or organised crimes and terrorism, including in the digital world, are extremely important. Any actions taken have to balance these interests carefully.”


Why 90 percent of all machine learning models never make it into production

Companies aren’t bad at collecting data. However, many companies are highly siloed, which means that each department has its own ways of collecting data, preferred formats, places where they store it, and security and privacy preferences. Data scientists, on the other hand, often need data from several departments. Siloing makes it harder to clean and process that data. Moreover, many data scientists complain that they can’t even obtain the data they need. But how should you even start training a model if you don’t have the necessary data? Siloed company structures — and inaccessible data — might have been manageable in the past. But in an era where technological transformation is happening at breakneck speed, companies will need to step up and set up uniform data structures throughout. ... In addition, engineering isn’t always deemed essential for data scientists. This is a problem because engineers might not always understand all the details of what a data scientist envisions, or might implement things differently due to miscommunication. Therefore, data scientists who can deploy their models have a competitive edge over those who can’t, as StackOverflow points out.


What Quantum Computing Could Mean for Software Development

As speculative as quantum software development sounds, it is not an entirely alien concept. There is a broad class of quantum algorithms, said Yudong Cao, founder and CTO of startup Zapata Computing, that share features similar to machine learning models. “If you look at MLOps or AIOps, this is very much the sort of software engineering challenge [in quantum software] that people also face with AI.” He leads an effort at Zapata to provide software that might help industrial players explore the possibilities of quantum computing. Cao said that when he started in the field, quantum computing was still largely an academic discipline, with theoretical work predicting what might be done with a quantum computer and experimental work demonstrating what actually could be. “Today we’re seeing this gap become narrower and narrower,” he said. “On the theory side, we’re improving our algorithms to reduce the amount of resources. On the other side, new hardware is coming online.” A frontier is emerging for quantum computing as software solutions and hardware mature, but Cao said the confusing ecosystem needs to be sorted out. “What is needed is a set of tools that allow people to tap into this diverse landscape effectively,” he said.
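Cao’s MLOps comparison is easiest to see in variational quantum algorithms, where a circuit carries free parameters that an outer classical loop tunes, much like weights in a neural network. The toy sketch below uses Qiskit, chosen here purely for illustration (it is not Zapata’s own tooling), and the circuit and parameter value are invented for the example.

```python
# A toy illustration of why variational quantum algorithms resemble ML
# models: the circuit has a free parameter that an outer loop would tune,
# much like a weight in a neural network.
from qiskit import QuantumCircuit
from qiskit.circuit import Parameter
from qiskit.quantum_info import Statevector

theta = Parameter("theta")

qc = QuantumCircuit(2)
qc.ry(theta, 0)   # trainable rotation, analogous to a model weight
qc.cx(0, 1)       # entangling gate

# "Inference" at one parameter setting; an optimizer would search over theta.
bound = qc.assign_parameters({theta: 0.8})
probs = Statevector.from_instruction(bound).probabilities_dict()
print(probs)
```

An optimizer searching over `theta` to minimize some cost function would complete the training-loop analogy, and managing that loop across simulators and real hardware is exactly the MLOps-like challenge Cao describes.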


Companies gearing up for 5G era of industry digitisation

For Smart Cities, there will be considerably more ways to manage community life in near real time: enhancing security and mobility, measuring pollution, optimising energy consumption, improving waste management, and more. The list is virtually endless. Although existing 4G networks have already been deployed in smart cities around the world, they are limited in the number of connections they can support, the data they can transmit and, most importantly, the speed they can offer, all of which creates hurdles for smart city use cases. 5G is expected to overcome these hurdles, supporting a large number of connections with super-high bandwidth and ultra-low-latency communications to build a connected city – a smarter city. 5G will provide the infrastructure to roll out the innovations that look promising today, opening up a whole world of new possibilities. Up to the fourth generation of mobile networks (4G), consumers were the first to benefit from each new generation of wireless technology. With 5G, however, the main interest lies in business-to-business markets: this next generation of mobile telecommunications will become the backbone of industrial operations in the broadest sense.


When It Comes to Culture, Does Your Company Walk the Talk?

To address whether stated values shape employee behavior, we first measured what companies say they value. The simplest way to quantify corporate culture would be to treat each value as binary — a company either listed it as a core value or did not. When Charles Schwab lists innovation as one of four core values, it is presumably more focused on it than Quicken Loans, which includes innovation among a laundry list of 19 elements of its culture. To quantify each company’s relative focus on a value, we weighted it by the inverse of the total number of values listed. So innovation was weighted at 25% for Charles Schwab and 5% for Quicken Loans. (A company that didn’t list a specific value received a weighting of zero for that value.) To control for differences across sectors, we assigned each company to one of 33 industries. We then ranked each company in its industry based on the weighting for each value we measured. To assess how well companies live up to their stated values, we used data from the 2019 Culture 500, which ranks companies on nine of the most commonly cited values. Every Culture 500 company received a sentiment score that measured how positively employees talked about a specific value in the free text of their Glassdoor reviews.
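The weighting scheme is simple arithmetic and can be sketched directly. In the snippet below, only the value counts (4 for Charles Schwab, 19 for Quicken Loans) come from the text; the value lists themselves are hypothetical placeholders.

```python
# A sketch of the inverse-count weighting described above. The counts
# (4 and 19) match the text; the actual value lists are placeholders.
companies = {
    "Charles Schwab": ["innovation", "service", "trust", "teamwork"],
    "Quicken Loans": ["innovation"] + [f"value_{i}" for i in range(18)],
}

def value_weight(listed_values, value):
    """Weight a value by the inverse of the number of values listed;
    a company that does not list the value at all gets zero."""
    if value not in listed_values:
        return 0.0
    return 1.0 / len(listed_values)

for name, values in companies.items():
    w = value_weight(values, "innovation")
    print(f"{name}: innovation weight = {w:.0%}")
# Charles Schwab: innovation weight = 25%
# Quicken Loans: innovation weight = 5%
```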


10 ways the pandemic affected cloud investment

One initiative that emerged as among the most important for the enterprise was remote resiliency, which was put to the ultimate test in March, when the organizations that could go remote did. The pressure was on for IT departments, who had to rally to make sure telecommuters could reach company systems, that their personal devices were secure, and to keep vigilant watch as hackers took advantage of the mobilization the COVID-19 crisis imposed on the industry. As coronavirus cases and the resulting deaths rise, the burden of investing in the cloud only grows. A new report from SPR, "Cloud Investment in the Age of Accelerated Evolution," offers research culled from an investigation into the state of enterprise cloud adoption in what it calls "this uniquely transformative moment." The report cites 10 key findings, including the biggest barrier to effective cloud security: the organization's budget. Still, 41% of IT decision-makers said they plan to increase security budgets in the next 12 months. Engagement with the cloud had already been enthusiastic, but COVID-19 sent innovation into fast forward. That meant the enterprise could assess its cloud use more quickly, but because this was a digital transformation driven by urgent need rather than careful observation and experience, it is now time to make the cloud more efficient.


Stargate: A new way to think about databases

DataStax is getting there by open-sourcing the Cassandra coordinator code. DataStax started with Cassandra for obvious reasons: the company knows the database well, and Cassandra is a great option for handling distributed data requests. But it’s that coordinator code that is the heart of Stargate, as Gosnell explained. The hardest aspect of the logic between a customer’s API and their back end is distributed request coordination, i.e., ensuring proper load balancing, directing database requests to the right place, and so on. This is what Cassandra’s coordinator code does well. The company wants more developers to “join our community and help us prioritize which features we need next” in Stargate, Gosnell stressed. It’s a great story, one that helps DataStax, of course, but one that also has the potential to be useful for other vendors and other databases. And that is where Stargate could go from an interesting single-vendor project to something much more noteworthy. Consider Kubernetes, for example. It was cool technology when its Borg ancestor ran exclusively within Google, and it remained cool as an open source project to which Google employees, almost exclusively, contributed.
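To make “distributed request coordination” concrete, here is a generic, simplified sketch of the idea: hash a partition key onto a ring of nodes and select the replicas responsible for it. This is an illustration of what a coordinator does, not Stargate’s or Cassandra’s actual routing code (Cassandra’s real coordinator uses Murmur3 tokens, vnodes, and pluggable replication strategies that this toy omits).

```python
# A generic illustration of coordinator-style routing: hash a partition
# key onto a ring of nodes and walk clockwise to collect replicas.
import bisect
import hashlib

class Coordinator:
    def __init__(self, nodes, replication_factor=3):
        # Place each node on a hash ring, sorted by its token.
        self.ring = sorted((self._token(n), n) for n in nodes)
        self.rf = replication_factor

    @staticmethod
    def _token(key: str) -> int:
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def replicas(self, partition_key: str):
        """Walk clockwise from the key's token, collecting rf nodes."""
        tokens = [t for t, _ in self.ring]
        start = bisect.bisect(tokens, self._token(partition_key))
        return [self.ring[(start + i) % len(self.ring)][1]
                for i in range(self.rf)]

coord = Coordinator(["node-a", "node-b", "node-c", "node-d"],
                    replication_factor=2)
print(coord.replicas("customer:42"))  # e.g. ['node-c', 'node-d']
```

Hiding exactly this kind of routing behind a friendly API layer is the value Stargate is pitching to developers.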


Where Does Data Governance Fit into a Data Strategy

An organization’s data is a valuable asset: it enables improvements in product development, customer interaction and satisfaction, and quality, and it affects the bottom line through better decision-making, efficiency and effectiveness. Everybody, at all levels, must recognize that the organization’s data governance program enables data to be strategically managed as an asset, yielding accurate, trusted and secure data that delivers business intelligence and builds competitive advantage. The impact on the organization’s data will be significant. Employees and partners will benefit from everything stated as the purpose of the data governance program (in the previous answer). The impact on individuals’ roles will depend on their present relationship to the data; however, the intent at your organization is to take a less pervasive approach that minimizes disruption to normal business activities. Individuals who define data will be educated on the aspects of data definition that improve the organization’s confidence in the data it uses. Individuals who produce data will be educated in quality data production, including timeliness, accuracy, completeness and relevance.



Quote for the day:

"A real friend is one who walks in when the rest of the world walks out." -- Walter Winchell