Daily Tech Digest - July 18, 2020

Digital Is Great, But Where Are The New Business Models?

While executives are knowledgeable and aware of digital technologies, “the bad news is that most companies do not seem to act on this knowledge to transfer their business to the future,” according to the study’s co-authors, Philip Meissner, chair of strategic management and decision making at ESCP Business School, and Martin Mocker, research affiliate with MIT CISR. “And creating such a business model does not seem to be top of mind for most executives either. Only one-third said that they primarily think of digital business models when they think about digitization. Two-thirds focus on digital processes instead.” The single most important focus of a transitioned business model is the customer, pure and simple. “Digital business models take your company directly to the consumer, wherever they are,” Meissner and Mocker state. “Their smartphone is always with them, and so is your business.” The recent Covid-19 crisis demonstrated to the world the immense value of a digital, customer-focused business model, they add. “While some businesses saw revenues decrease by more than 80% within weeks, companies with a digital business model thrived.”


Top 5 Questions (and Answers) About GRC Technology

Business continuity plans (BCP) — and solid governance, risk, and compliance (GRC) policies, in general — can help businesses prepare for and navigate many disruptive events, including natural disasters, cybersecurity breaches, terrorist attacks, fraud, and embezzlement. We believe in the benefits of implementing technology to streamline policies, automate processes, and create repeatable workflows so organizations can quantify risk into digestible dashboards to gain a singular source of truth. [Editor's note: The author's company is one of several providers of GRC technology.] Most businesses, we've found, have the same questions about implementing tech to strengthen their GRC programs. So we asked our customer success team, who all come from GRC consulting backgrounds, what they're typically asked. ... Before choosing to implement any GRC technology, it's important that organizations align people and teams to a common goal and define the existing processes surrounding GRC. One of the biggest mistakes we see GRC leaders make during an implementation is overcomplicating a process that should be simple. Don't get distracted by shiny bells and whistles at initial go-live.


Augmented Intelligence: Blazing a Trail in Business Enterprises

With a number of headlines suggesting that AI will soon take over a great number of jobs, thereby leaving a large proportion of the workforce’s skills redundant, this advanced technology is often more feared than revered. However, our research shows that over half of UK workers (59%) don’t actually believe their jobs are at risk of being replaced by AI in the next decade, and instead embrace it as a tool to help enhance the way they work. 64% of UK employees see AI as making them more efficient. This is the definition of Augmented Intelligence – a combination of human power and AI to achieve stronger results, time after time. Above all, this concept relies on a seamless collaboration between people and AI to innovate, solve problems, and improve workplace processes with precision and ease. London’s black cab drivers are a prime example of how Augmented Intelligence can assist workers in performing their roles better. For decades, drivers have been required to pass the gruelling Knowledge test, which demands a virtually encyclopaedic mastery of London’s streets. However, GPS technology is now so advanced that it could eliminate the need for this extensive familiarity – and the tradition of acquiring it – in one fell swoop.


The Key Benefits for High Availability Load Balancing 

High availability, which is the ability of a system or system component to be continuously operational for a desirably long period of time, can help IT departments implement an architecture that uses redundancy and fault tolerance to enable continuous operation and fast disaster recovery. ... High availability begins with identifying and eliminating single points of failure in the infrastructure that might trigger a service interruption—for example, by deploying redundant components to provide fault tolerance in the event that one of the devices fails. Load balancing, whether provided through a standalone device or as a feature of an ADC, facilitates this process by performing health checks on servers, detecting potential failures, and redirecting traffic as needed to ensure uninterrupted service. While ensuring fault tolerance for servers is obviously critical, a high availability architecture must also consider the load balancing layer itself. If this becomes unable to perform its function effectively, the servers below run the risk of overflow, potentially compromising their own health as well as application performance and application availability. This makes redundancy just as important for the load balancer or ADC as for any other component in the data centre.
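The failover behaviour described here, health checks plus traffic redirection, can be sketched in a few lines. The class below is an illustration of the routing decision only, not any vendor's ADC API; a real load balancer also probes health on a timer and drains existing connections:

```python
class LoadBalancer:
    """Round-robin balancer that skips backends failing their health check."""

    def __init__(self, backends, health_check):
        self.backends = list(backends)
        self.health_check = health_check  # callable: backend -> bool
        self._cursor = 0

    def pick(self):
        # Try each backend at most once, starting at the round-robin cursor.
        for _ in range(len(self.backends)):
            backend = self.backends[self._cursor]
            self._cursor = (self._cursor + 1) % len(self.backends)
            if self.health_check(backend):
                return backend
        # Every backend failed its health check: the single-point-of-failure
        # scenario the article warns about at the load-balancing layer itself.
        raise RuntimeError("no healthy backends available")
```

With backends ["a", "b", "c"] and "b" marked unhealthy, successive calls to `pick()` alternate between "a" and "c", which is exactly the redirection behaviour the excerpt describes.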


Cybersecurity Recuperation: Ensuring a Safe Return to Work

Unlike the rushed, unexpected manner in which many organizations sent their employees home, the return to the office is something that can be planned and prepared for in a more organized and orderly fashion. Cybersecurity teams must not miss this window – they need to act now to ensure the necessary processes and tools are in place before employees head back to their workplace. To reduce risk and facilitate a quick return to normal operations, cybersecurity teams need to consider what threats employees may bring back with them to the office environment. Once these are identified, cybersecurity teams must take proactive steps to mitigate these risks. Below are three key factors to consider as organizations prepare to return to work. Patching: Remote working creates new cracks through which users can slip. For instance, a VPN might not be able to sustain the high traffic generated by so many employees working from home; with users not connecting to the VPN for extended periods, their laptops or desktops may fall behind on regular updates and patches. Some computers and servers left on-premise may have been shut down throughout the home-working period and could also have missed regular security upgrades; before returning to the office, cybersecurity teams must make sure that all software is patched across all devices, or they may expose users to cyber risks.


Digital public services: How to achieve fast transformation at scale

Navigating public services can be bewildering. Information about how to access services is often presented in hard-to-understand bureaucratic language, and users must visit different websites or offices for each service. Applications routinely require hard copies of supporting documents to still be printed and signed, and many online forms are just as complicated to complete as the paper versions. Furthermore, the user experience tends to vary across government websites, and users often require multiple accounts and digital IDs to manage their needs. All of this stands in stark contrast to expectations. More and more often, people see no reason why public services should be more complicated than shopping online. They want to be able to quickly find the most relevant services. They want information in clear and simple language and expect to complete all transactions via digital channels—ideally, through a single digital journey. For example, new parents could get a birth certificate, apply for child benefits, register for parental leave, and access other relevant services through one easy process instead of interacting with multiple agencies, often in-person, and sharing the same information multiple times.


Twitter hack jolts companies into a social media security check

While the nature of this hack suggests there was little account holders themselves could have done to avoid falling victim, there are several security measures any company that manages social media accounts should take regularly to avoid other potential risks. On the day following the hack, one large advertising company sent around internal communications emphasizing the importance of password security and reminding employees to ensure that people who no longer require access to advertising management accounts are removed from those systems. Similarly, employees were reminded that only people with a certain level of seniority and sign-off should have the ability to be administrators, according to an executive at that agency who declined to be named. On Twitter specifically, account holders can review the number of active “sessions” and opt to log out other users and devices within their account settings. Often in the advertising and media industries, mid-level employees can have access to powerful tools — from CMS access, to customer-relationship management software and client social media accounts.


Cybercriminals Targeted Streaming Services to Provide Pandemic Entertainment

Attackers not only sought access to video services, but also access to industry services—such as first-release movies—and data on the subscribers, such as their location. The increase likely had to do with a combination of attackers having time and an increase in demand for streaming content, says Steve Ragan, security researcher at Akamai. "Credential stuffing is a low-hanging, high-reward type of attack," he says. "Easy to do, and if successful, a complete ATO [account takeover] is the result. The trends show that the problem is consistent and continuing to rise." While much of the increase in the first quarter of 2020 can be attributed to a single campaign against a popular broadcast TV service—the identity of which Akamai declined to discuss—the overall trend underscores that digital services continue to be a major focus of credential-stuffing attacks. Such attacks attempt to use usernames and passwords stolen from one provider against other providers, in hopes that the victim reused their credentials across services. "The criminal economy is a chained instance, where everything is connected somehow, and no piece of information is without worth," Akamai stated in the report.
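From the defender's side, the attack described above has a recognizable signature: a single source attempting logins against many distinct accounts, as opposed to one user repeatedly mistyping their own password. A minimal sketch of that heuristic, with an invented class name and threshold rather than Akamai's actual detection logic:

```python
from collections import defaultdict


class StuffingDetector:
    """Heuristic credential-stuffing detector. Many *distinct* usernames
    failing from one source IP suggests a breached credential list being
    replayed; one user failing repeatedly is likely a forgotten password."""

    def __init__(self, max_accounts_per_ip=5):
        self.max_accounts = max_accounts_per_ip
        self.failed = defaultdict(set)  # ip -> usernames with failed logins

    def record_failure(self, ip, username):
        self.failed[ip].add(username)

    def is_suspicious(self, ip):
        return len(self.failed[ip]) > self.max_accounts
```

A production system would add time windows, IP reputation, and device fingerprinting, but the distinct-account count remains a core signal.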


5 Trends in Big Data and SQL to Be Excited About in 2020

SQL and analytics are becoming more collaborative. As discussed earlier, deriving insights from data is becoming more widespread. That means more people are getting involved in creating queries, analytics, and metrics. Collaborative work started with products like Google Sheets. The trend has continued to expand into SaaS products like Figma (collaborative design) and PopSQL (collaborative SQL). Technologies like PopSQL offer the ability for your team to collaborate and track your work on queries easily through folders and version control. Now you don’t have to worry about someone accidentally changing your query on a report or dashboard. Version control allows you to revert a query to a previously saved state. This ensures that your team is constantly on the same page as far as SQL and the logic you are using to calculate your metrics. You also can easily share queries, update them, fork them, and visualize data. Also, tools like Figma, Google Sheets, and PopSQL integrate easily with other collaborative tools like Slack. These integrations further allow your team to share charts, queries, designs, and insights with ease.
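The save-revert-fork workflow described above can be pictured as a simple linear version store. This is a toy illustration of the idea, not PopSQL's actual data model:

```python
class QueryHistory:
    """Toy linear version history for a SQL query: save, revert, fork."""

    def __init__(self, sql):
        self.versions = [sql]

    @property
    def current(self):
        return self.versions[-1]

    def save(self, sql):
        self.versions.append(sql)

    def revert(self, version):
        # Reverting re-saves an earlier state as the newest version,
        # so nothing is ever lost from the history.
        self.versions.append(self.versions[version])
        return self.current

    def fork(self):
        # A fork starts an independent history from the current state.
        return QueryHistory(self.current)
```

Because a revert appends rather than deletes, teammates can always see how a query evolved, which is the "same page" guarantee the excerpt describes.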


Banks need to think like Google and not just follow it

Banks have for a long time been huge IT organisations, with the biggest often recruiting more IT professionals than the major IT suppliers. But a change in recruitment practices was brought on by digital transformation and the need for banks to keep pace with a changing tech environment. Today it is more about recruiting senior thinkers rather than foot soldiers and the people that fit the bill often work for the tech giants. Gareth Lodge, analyst at Celent, said banks have always been IT companies that offer financial services, but the ethos within is changing. “It’s more a realisation that effective IT can be a competitive differentiator,” he said. “Until now, many banks have seen IT as how they deliver products.” One IT professional in the financial services sector agreed there has been a change in mindset, with banks realising they are increasingly IT-driven and happen to sell financial services. Now, through recruitment, they are “looking for inspiration on how to do that better”, he said. “It has taken banks a long time to accept that IT is no longer a painful cost to be outsourced and is the key to their future.” The need for a new approach to IT will require more recruitment from outside the banking sector because the tech-savviness of parts of the industry might be overestimated, according to David Bannister, analyst at Aite.



Quote for the day:

"It is the capacity to develop and improve their skills that distinguishes leaders from followers." -- Warren G. Bennis

Daily Tech Digest - July 17, 2020

Digitisation accelerated by Covid-19 will change the insurance industry forever

Intelligent automation and technologies such as natural language processing (NLP) can help. In the UK for example, Zurich is working to create digital mailrooms with all paper mail scanned and routed digitally on arrival; everything is delivered to a central location, scanned and put into a workflow, with links emailed to the appropriate teams. Apply Intelligent Character Recognition (ICR) and NLP to that process, and you can start to automatically triage and respond to documents. If you have a medical or legal file coming in that might be hundreds of pages long, a good NLP engine can extract and highlight the relevant information before securely passing it on to an assessor. The machine doesn’t make the decision, but it does a ‘pre-read’, which is a huge help to the assessor (who still has access to the full file), who can then spend more time where the real value is, in assessing the claim. It removes the admin work, and the NLP is continually learning and updating, based on changes made by the assessor. Of course, the first step here is digitising the paper process in the first place — something that the current climate has made necessary, but the benefits of which will far outlive the pandemic.
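The "pre-read" triage described above can be approximated, at its very simplest, by keyword scoring. A real deployment would use a trained NLP model; the categories and keyword lists below are invented purely for illustration:

```python
# Hypothetical categories and keywords; a production pre-read would use
# a trained NLP model rather than keyword matching.
TRIAGE_RULES = {
    "medical": {"diagnosis", "physician", "treatment", "prognosis"},
    "legal": {"plaintiff", "defendant", "liability", "settlement"},
}


def triage(document):
    """Score a document against each category by keyword hits and return
    categories ranked best-first, mimicking an NLP pre-read at toy scale."""
    words = set(document.lower().split())
    scores = {cat: len(words & kws) for cat, kws in TRIAGE_RULES.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

As in the excerpt, the machine does not decide the claim; it only routes and highlights, and the assessor keeps the full file.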


The TLS 1.2 Deadline is Looming, Do You Have Your Act Together?

Together with its precursor SSL, TLS has long been in the crosshairs of both attackers and security researchers who understand that a weak or non-existent deployment of the protocol makes it trivial enough to carry out man-in-the-middle and other attacks against the vulnerable target. In the last five years, SSL/TLS has been one of the most likely components tied to branded vulnerabilities, a la Heartbleed, POODLE, BEAST, DROWN, you name it. This high-profile activity has driven the crypto community to keep working hard to refine TLS. It’s why the biggest standards bodies and regulators, including the Internet Engineering Task Force, the National Institute of Standards and Technology, and the Payment Card Industry Security Standards Council, mandate that operators of web servers ensure that they’re using the most up-to-date version of the protocol, TLS 1.2, before the end of 2020. Additionally, TLS 1.0 and 1.1 have been (or are in the process of being) deprecated in one way or another by major browsers. This means that major web browsers are also planning on turning the screws to organizations in the latter half of 2020, warning that they’ll soon throw up browser warnings when a user visits a site that doesn’t support TLS 1.2.
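Operators can verify compliance from the client side with Python's standard `ssl` module. This sketch enforces the TLS 1.2 floor the standards bodies mandate, so a non-compliant server simply fails the handshake; the hostname is whatever the caller wants to audit:

```python
import socket
import ssl


def negotiated_tls_version(host, port=443):
    """Report which TLS version a server negotiates, refusing anything
    below TLS 1.2 so a non-compliant server fails the handshake."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # enforce the mandate
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g. "TLSv1.2" or "TLSv1.3"
```

Running this against a server stuck on TLS 1.0 or 1.1 raises an `SSLError` instead of returning a version string, which is the same behaviour browsers began rolling out in 2020.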


Microsoft's Android smartphone launcher just got a major makeover

To use the Microsoft Launcher app, your phone needs to be running Android 7.0 or higher. But there's only so much a launcher can do, as Microsoft explains. "You must download Microsoft Launcher from Google Play Store. Downloading Microsoft Launcher will replace the default launcher. Microsoft Launcher does not replicate the user's PC home screen on the Android phone. Users must still purchase and/or download any new apps from Google Play," Microsoft notes. There are still a few glitches with the app. Android 10 navigation gestures may not work on all phones, and system-level dark theme only works on Android 8.0 and above. Sticky notes sync issues may occur after upgrading to v6, and notification badges may need to be enabled again after the upgrade. Microsoft announced the Launcher update as part of the new Windows 10 build 20170 preview release. This preview for Windows Insiders in the Dev Channel is currently not available for PCs with an AMD processor due to a "bug impacting overall usability of these PCs". There's a new experience for sound settings at Settings > System > Sound > Manage sound devices. It lets users know which device is default and lets them pick a sound device as the default device or default communication device.


How Do CIOs Feel About a Return to On-Premises Work?

“The timelines are all over the map and subject to change. And sometimes frankly, they are meaningless for parents, if safe childcare or eldercare are not available. In this context, what date is chosen doesn’t matter but employer flexibility is of paramount importance,” said former CIO Joanna Young. CDO Jay Brodsky is concerned “about employees with families and the lack of adequate childcare options this summer.” For this reason, “planning must consider childcare and caring for people who might be high risk. There are a lot of considerations,” said CIO Martin Davis. Former CIO Michael Kail added, “I can't even get a haircut yet, so I am not sure how one plans to return to any semblance of on-premises work.” However, CIO David Seidl said, “In specific disciplines, we're getting ready. This includes wiring, classroom upgrades, network maintenance and limited tech support. We're handling delayed maintenance in physically distanced ways with safety measures in place. But how soon will we be back in a more significant way? There's a lot still up in the air. We are doing well with work from home for most of our IT organization. We'll be part of the overall planning and process.”


Haskell Web Framework IHP Aims to Make Web Development Type-Safe and Easy

While working through the software lifecycle we could see certain problems happen over and over, especially quality issues when working with very dynamic languages and issues related to package management. So we set out to solve these problems. While a lot of people think that the choice of programming languages does not matter, we believe that technology choices vary in power and have a strong impact on the product. We have looked at a lot of different technologies and found Haskell to be a great fit for our aim of the highest-quality software engineering and developer happiness. While Haskell is a great language, we could not find a really good framework to work with. We have been looking for something opinionated, well documented, and easy to get started with. After evaluating some solutions we decided to build it ourselves. The same process led us to pick nix as our primary package management solution: we want our developers to be able to switch projects very quickly, so we intended to make a completely standardized development environment.


Data Science: Why Humans Are Just as Important as Math

No amount of stress-testing could have prepared even the most sophisticated machine learning models for the extreme data variation that we’ve witnessed in the past few months. Analysts and data scientists have had to step in to calibrate models. The ability to apply a critical lens to data and insights is not one we can readily teach machines. Overlooking this important step of the process leaves us susceptible to falling into the hubris of big data and making decisions that miss important elements of context. For example, we saw an increase in demand for nonperishable foods across the supply chain, but once everyone has stockpiled their pantries, they’re unlikely to buy these items in similar quantities in the coming months. This will naturally lead to a drop in demand that we must prepare algorithms for, instead of automatically continuing to operate production lines as if such demand is the new normal. Another example is a machine learning application in cybersecurity, in which an algorithm may monitor for threats against a retailer’s website. To the model, a sudden tenfold increase in website visits may seem like an attack; but, if you were to factor in that it coincided with the retailer launching mask sales, you have the context to understand and accept the uptick in traffic.
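The mask-launch example above boils down to a rule a naive model lacks: a spike is only an anomaly if nothing in the business calendar explains it. A deliberately tiny sketch of that context check, with an invented threshold that a real system would learn from historical traffic:

```python
def is_anomalous(traffic, baseline, known_event, threshold=5.0):
    """Flag a traffic spike only when no known business event explains it.

    traffic:     current request volume
    baseline:    typical request volume
    known_event: True if e.g. a product launch coincides with the spike
    """
    spike = traffic > threshold * baseline
    # The human-supplied context suppresses the false positive.
    return spike and not known_event
```

The interesting part is not the arithmetic but the `known_event` input: it is exactly the contextual knowledge the article argues humans must keep injecting into models.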


New wave of attacks aiming to rope home routers into IoT botnets

This trend is concerning for several reasons. Cybercriminals are competing with each other to compromise as many routers as possible so they can be conscripted into botnets. These are then sold on underground sites either to launch DDoS attacks, or as a way to anonymize other attacks such as click fraud, data theft and account takeover. Competition is so fierce that criminals are known to uninstall any malware they find on targeted routers, booting off their rivals so they can claim complete control over the device. For the home user, a compromised router is likely to suffer performance issues. If attacks are subsequently launched from that device, their IP address may also be blacklisted – possibly implicating them in criminal activity and potentially cutting them off from key parts of the internet, and even corporate networks. As explained in the report, there’s a thriving black market in botnet malware and botnets-for-hire. Although any IoT device could be compromised and leveraged in a botnet, routers are of particular interest because they are easily accessible and directly connected to the internet.


BBVA explores quantum computing for banking

According to BBVA, the advancement is thanks to qubits, as opposed to bits, in traditional computing. “Qubits exponentially increase the computing capacity compared to classical computing. If the bits can perform calculations based on two possibilities (1 and 0), qubits can run calculations on all the possible combinations between 1 and 0 in parallel,” said the bank. Early results in the project show that quantum computing can resolve some complex problems quickly, accurately and efficiently, said BBVA. “Although this technology is still in an early stage of development, its potential to impact the sector is already a reality,” said Carlos Kuchkovky, BBVA global head of research and patents. “Our research is helping us to identify the areas where quantum computing could represent a greater competitive advantage, once the tools have sufficiently matured. We believe this will be, for certain concrete tasks, in the next two to five years.” A test done by BBVA’s team on the use of the technology for investment portfolio optimisation showed that it could be considerably faster when there are more than 100 variables in a calculation. But the advantages could be the same for less complex calculations as quantum hardware advances, said BBVA.
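The bank's description of qubits translates into state-space size: an n-qubit register spans all 2**n combinations of 1 and 0 at once, whereas a classical machine must enumerate them one by one. A short enumeration makes that exponential growth concrete:

```python
from itertools import product


def superposition_states(n_qubits):
    """Enumerate the 2**n basis states an n-qubit register spans.

    A classical computer must list (and process) each pattern separately;
    a quantum register holds an amplitude for every one simultaneously,
    which is the parallelism the excerpt describes.
    """
    return ["".join(bits) for bits in product("01", repeat=n_qubits)]
```

At 10 qubits the list already has 1,024 entries; at the 100-plus variables BBVA mentions for portfolio optimisation, classical enumeration becomes hopeless, which is where the claimed quantum advantage lies.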


Wirecard Fraud Is Risk Management Lesson For Fintech Companies

Financial companies are expected to adopt a risk management program that provides a thorough and consistent evaluation of the nature and extent of risks to which they are exposed. Central to this is Enterprise Risk Management (ERM), which articulates and codifies how an organization approaches and manages risk. The tenets of an ERM framework include articulating risk appetites, putting formal policies into place, conducting risk assessments, establishing strong internal controls, and ensuring oversight by both senior management and boards of directors. Wirecard’s 2018 Annual Report had extensive disclosure of its “efficiently organized [enterprise] risk management system.” The weaknesses later confirmed in the company’s internal control and governance procedures remind fintech managers of the challenges that must be overcome to make risk management truly operational in a dynamic technology-driven firm. For risk management to be effective, management and the board must own and address it, and the risk management system must be supported by a healthy risk culture throughout the group.


Innovation Startups Modeling Agile Culture

So far, we have considered the concept of a data-driven startup. It should also be noted that the data itself means nothing to a company. So, what is the real purpose of analyzing huge amounts of data? It is to create valuable information, Business Intelligence, that can efficiently connect a company with the market and its customers. This is a key point—every innovation startup knows the importance of collecting the right data to convert it into intelligence. The value of the Business Intelligence a startup provides in huge markets like the buildtech sector is the value of the service it provides. We all know what Business Intelligence is, but it is not easy to provide an example. So, here’s an example of how a data-driven startup focuses its business evolution on the intelligence it can generate. “Building Intelligence” is a fictitious company name in this example of a real startup in the buildtech sector. The vision of Building Intelligence is to avoid higher costs and overtime in construction projects due to failures. Nowadays, this company operates as a SaaS, taking photos of construction sites, creating 3D models, and comparing them with the architectural plans.



Quote for the day:

"Leaders know the importance of having someone in their lives who will unfailingly and fearlessly tell them the truth." -- Warren G. Bennis

Daily Tech Digest - July 16, 2020

The Advancements In Real World Artificial Intelligence

Ongoing advances in artificial intelligence have come essentially in areas where data scientists can replicate human recognition capabilities, for example, recognizing objects in images or words in acoustic signals. Learning to recognize patterns in complex signals, such as audio streams or images, is extremely powerful, powerful enough that many people wonder why we aren’t using deep learning techniques everywhere. Moving ahead, as teams become aligned in their objectives and techniques for using AI to achieve them, deep learning will become part of every data scientist’s toolbox. Consider this thought: we will have the option to incorporate object recognition in a system using a pre-trained artificial intelligence model. But in the end, we will understand that deep learning is simply another tool to use when it makes sense. Now let’s explore how AI is benefiting mankind and serving various fields like marketing, finance and banking in the real world. Marketing is a way to showcase your products to attract more customers. In the early 2000s, if we searched an online store for an item without knowing its precise name, finding that item could turn into a nightmare.


Using the new normal to break from the past and innovate

Arguably, the big reason for the failure of online sales efforts by traditional automakers was that the standard way of selling vehicles was seen as good enough, and the effort and investment required to create an online channel wasn't perceived as worthwhile when no one (aside from Tesla) offered a similar capability. People had been buying cars through dealers for a century, and designing and implementing the technology, relationships, marketing, and execution required to create an effective online sales channel was perceived as throwing money at fixing a process that wasn't broken. A time-tested business model centered around driving customers into an enclosed space full of strangers and ideally getting them to sit in an even smaller space with a stranger, with no idea when that space was last cleaned, suddenly doesn't look that great during a pandemic brought about by a virus that spreads primarily through human proximity. Suddenly, dealer networks that saw vehicle delivery as "too expensive" and online or phone purchasing as distractions were able to implement these practices in a matter of weeks.


Businesses express concerns around ethical risks for their AI initiatives

“As organizations become more invested in AI, it is imperative that they have a common framework, principles and practices for the board, C-suite, enterprise and third-party ecosystem to proactively manage AI risks and build trust with both their business and customers,” said Irfan Saif, principal and AI co-leader, Deloitte & Touche. “Our study results show that while early adopters of AI are still bullish, their competitive advantage may be waning as barriers to adoption continue to fall and more creative use of the technology grows.” “In the era of pervasive AI, where capabilities are readily available, organizations should go beyond efficiency and push boundaries to create new AI-powered products and services to be successful,” said Nitin Mittal, principal and AI co-leader, Deloitte Consulting. As purchasing barriers have dropped and AI is more available, choosing the right technology is more important than ever. Those AI adopters surveyed tend to “buy” their capabilities rather than “build” them. To become smarter consumers, companies should evaluate the landscape, find the most advanced AI and integrate those technologies into their infrastructure.


Tech Sector Job Interviews Assess Anxiety, Not Software Skills

“Technical interviews are feared and hated in the industry, and it turns out that these interview techniques may also be hurting the industry’s ability to find and hire skilled software engineers,” says Chris Parnin, an assistant professor of computer science at NC State and co-author of a paper on the work. “Our study suggests that a lot of well-qualified job candidates are being eliminated because they’re not used to working on a whiteboard in front of an audience.” Technical interviews in the software engineering sector generally take the form of giving a job candidate a problem to solve, then requiring the candidate to write out a solution in code on a whiteboard – explaining each step of the process to an interviewer. Previous research found that many developers in the software engineering community felt the technical interview process was deeply flawed. So the researchers decided to run a study aimed at assessing the effect of the interview process on aspiring software engineers. For this study, researchers conducted technical interviews of 48 computer science undergraduates and graduate students. Half of the study participants were given a conventional technical interview, with an interviewer looking on.


Taking the Pain Out of Buying and Selling Data

Narrative’s SaaS-based application provides a platform to connect buyers and sellers. On the buy side, it helps companies acquire and integrate second- and third-party data, typically for the purpose of AI or analytics. On the sell side, companies that license Narrative’s software have a mechanism for reaching multiple buyers in an orderly and streamlined fashion. There’s a lot of work that goes into buying and using data, on both sides of the equation, according to Jordan. There are all the usual questions about the format the data takes (CSV, Parquet, JSON, etc.) and the units of measurement. Once data scientists or analysts have studied a sample of the outside data and decided that it will work for their particular activity, data engineers are called in to build the ETL pipelines to move the data, which can often take months. On top of the logistical questions, there are legalities that must be taken into account. Buyers and sellers both must take measures to ensure that they’re not violating regulations for their particular geography. Finance teams typically get involved to obtain usage data and make the payments. And if anything changes to the data or the contract, all the engineers, analysts, data scientists, lawyers, and finance folks get to drop whatever they’re doing and revisit the matter.


The Twitter mega-hack. What you need to know

There are a number of ways in which online accounts can get hijacked. These include, for instance: You might have made the mistake of reusing your Twitter password elsewhere on the net. If the other place suffers a data breach, a hacker might try to use that same password against your Twitter account. Two-factor authentication can help protect against this, but the best advice of all is to never reuse passwords; You might have had your password stolen from you via a phishing attack or keylogging malware. Two-factor authentication can also help protect against this. In addition, password managers and security software can also provide a layer of defence; You might have mistakenly told someone your password. Passwords should be secret. It’s hard to believe, however, that someone is big enough buddies with Bill Gates, Kanye West, Uber, and the rest to have had their passwords discussed over a candlelit dinner; Your account could be hijacked by a third-party app that is compromised. If the app had access to your Twitter account it could post tweets without your permission. An attack just like that happened to my Twitter account a few years ago.


How Chase is using AI to update banking

In response to the COVID-19 crisis, the U.S. government launched the Paycheck Protection Program (PPP) a couple of months back to ensure money continues to roll into the workforce — this, in turn, led to significant paperwork for banks, which have had to deal with a mountain of applications. The Small Business Administration (SBA) reportedly had to process 75 years’ worth of loan applications in just two months, which gives some idea as to the scale of this undertaking. Faced with such an unprecedented challenge, one that affected the lives and livelihoods of literally millions of Americans, Chase had to come up with a way of classifying documents that its customers were uploading as part of the PPP application process. It did so with a view toward helping its business banking division and underwriters wade through as many applications as possible. “They needed a way to understand what documents our customers were uploading, which we hadn’t yet tagged every single document as part of our workflow,” Nudelman explained. “So instead, after the fact, we worked with the people building the process and technology to use natural language processing (NLP) to ensure the documents that have been uploaded were tagged appropriately, which helps the underwriters’ ability to process those applications, getting customers their loans faster.”
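The article doesn't describe Chase's actual model, but the after-the-fact tagging idea can be sketched with a toy keyword-scoring classifier. The categories and cue words below are invented for illustration; a production system would use a trained NLP model rather than keyword matching:

```python
from collections import Counter
import re

# Hypothetical document categories and cue words for a loan workflow.
CATEGORY_CUES = {
    "payroll": {"payroll", "wages", "salary", "employees"},
    "tax": {"irs", "940", "941", "withholding"},
    "bank_statement": {"statement", "balance", "deposit"},
}

def tag_document(text):
    """Return the best-matching category for an uploaded document,
    or 'unclassified' if no cue words appear."""
    counts = Counter(re.findall(r"[a-z0-9]+", text.lower()))
    scores = {
        cat: sum(counts[w] for w in cues)
        for cat, cues in CATEGORY_CUES.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unclassified"

print(tag_document("Quarterly payroll summary: total wages paid to employees"))
```

Even this crude version shows why tagging helps: once every upload carries a label, underwriters can route and batch documents instead of opening each one.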


Are we at the tipping point for global biometric payment card adoption?

Well, according to analysis from Goode Intelligence, there are several hurdles to overcome before biometric payment cards can be shipped to users in their millions – including cost and scheme certification. Despite being hailed as the future-tech solution to end our use of cash and cards, mobile payments haven’t reached anywhere near the expected level of public adoption in the UK. As of 2019, only around 19% of the UK population used mobile payments. The fact that Apple, a giant in the payment app space, launched a physical credit card last year, and that Google is set to follow suit, is further proof of customer demand for bank cards over mobile payments. It’s clear, therefore, that the majority of the population still prefer the ease and familiarity of contactless cards. In fact, IDEX research found that six in ten (60%) UK consumers would not give up their debit card in favour of mobile payments, so it’s crucial that banks continue to evolve smart bank cards for the next generation of payments. The cost caused by the manufacturing complexity of biometric payment cards has long been seen as the main barrier to mass adoption.


Open Data Institute releases funding for ethical data sharing projects

Open Data Manchester (ODM) is also set to receive funding, but differs in its focus on helping hundreds of small-scale energy and eco-efficiency cooperatives share data among their members. “With regards to data, cooperatives are in quite a unique space because they’re intrinsically democratic organisations, so there will be some kind of representation or governance process where every member’s view should be represented at a board level, which means that you’ve already got an environment of enhanced trust,” said Julian Tait, chief executive at ODM, adding that the relationship most people have with their current energy providers is “slightly begrudging” and one of “general dislike”. “If you’ve got an environment where you’re sharing data within the cooperative, they can understand my energy requirements [and]… you can start to design more responsive energy systems – that’s a bit harder to do, or it’s done very opaquely, in regards to the large energy providers.” He added that the funding will help ODM work with Carbon Cooperative to design how a data cooperative could look.


Establishing Change Agents within Organisations Using Shu-Ha-Ri

How this can help us achieve mastery of agile or business agility can be explained with a simple example. Let us take stand-ups. Shu: We need to make sure that teams start doing stand-ups and communicate the three basics of the stand-up: what was done yesterday, what will be done today, and whether there are any impediments. We need to make sure that teams continue to follow this until they become good at it. Ha: At this stage, teams can come up with certain deviations, like adding "any other business" as a fourth item or switching completely to a walk-the-board style to suit their requirements. Ri: This is the stage where the flow of information happens naturally and teams do not even need to think before doing stand-ups; it has become built in for the team. With these learning stages or paths, we can see organizations moving toward agility by getting to the heart of agile: first collaborating to understand the vision and motive, then delivering with real intent, and then introspecting and improving based on their needs. And when people in an organization start reaching the Ri stage, they are ready to do different things.



Quote for the day:

"Leadership involves finding a parade and getting in front of it." -- John Naisbitt

Daily Tech Digest - July 15, 2020

The Microsoft-Android transformation is about to affect us all

Unlike a traditional Android app, though, a progressive web app can run on a computer, too — any computer — in that same single form. And that means it's way easier and more economical for developers to maintain a single progressive web app and have that one version of their program run everywhere. And if the end result is just as good as what you'd get with a native app — or close enough to seem practically the same, for most real-world purposes — then there's no real downside. It's a win-win-win, for developers, for gatekeepers like Microsoft and Google, and for us feline-impersonating land-people who rely on Android phones. For Microsoft, the move means more and more apps could run in identical forms on both Windows and Android — and thus despite the fact that it's venturing into uncharted territory by fully embracing Android and steering folks into its own mini-ecosystem within Google's universe, it can begin to offer a surprisingly consistent experience for anyone embracing a mix of Android and Windows. For Google, it means the number of exceptional Android apps will only continue to grow and become more diverse. And remember, it isn't just about Android for Google, either; the company is equally interested in pushing Chrome OS forward.


Detecting and Resolving Database Connection Leaks with Java Applications

Here, removeAbandoned, when set to true, tries to remove abandoned connections and return them to the pool once the configured removeAbandonedTimeout (in seconds) has elapsed. Setting this to true can recover database connections from poorly written applications that fail to close a connection. The logAbandoned property is also very important, as it logs the complete stack trace of the code that may be leaking the connection, which is very useful for identifying a connection leak in an application. The stack trace is logged to the terminal itself; in Red Hat Fuse, the stack traces logged by logAbandoned appear in the Karaf terminal rather than in the application log or fuse.log file. All of these properties are described in the commons-dbcp documentation. The timeBetweenEvictionRunsMillis property, specified in milliseconds, can also be helpful: when set to a positive integer, a separate idle-object evictor thread runs at that interval to clean up idle connections. Its default value is -1, which means the evictor thread is not active; it takes effect only when set to a positive value.
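Under those assumptions, the properties might be wired together in a Tomcat-style DBCP datasource definition roughly as follows (connection details and timeout values are illustrative, not tuned recommendations):

```xml
<!-- Sketch of a pooled DataSource with commons-dbcp abandoned-connection
     settings; adjust names and values for your own environment. -->
<Resource name="jdbc/AppDB" auth="Container" type="javax.sql.DataSource"
          driverClassName="org.postgresql.Driver"
          url="jdbc:postgresql://localhost:5432/appdb"
          username="app" password="secret"
          maxActive="20" maxIdle="10"
          removeAbandoned="true"
          removeAbandonedTimeout="60"
          logAbandoned="true"
          timeBetweenEvictionRunsMillis="30000"/>
```

With this configuration, a connection borrowed and not returned within 60 seconds is reclaimed, and the offending call site's stack trace is logged so the leak can be fixed at the source rather than merely papered over.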


Hack Brief: Microsoft Warns of a 17-Year-Old ‘Wormable’ Bug

While those organizations rarely expose their Windows DNS servers to the internet, both Check Point and Williams warn that many administrators have made architectural changes to networks—often questionable ones—to better allow employees to work from home since the beginning of the Covid-19 pandemic. That could mean more exposed Windows DNS servers that are open to full remote exploitation. "The threat landscape of internet-exposed things has risen dramatically" in recent months, Williams says. The good news, Check Point says, is that detecting SigRed exploitation of a Windows DNS server is relatively easy, given the noisy communications necessary to trigger the vulnerability. The firm says that despite the 17 years that SigRed has lingered in Windows DNS, it has yet to find any indication of an attack on its clients' networks so far. "We're not aware of anyone using this, but if they did, hopefully now it will stop," Herscovici says. But in the short term at least, Microsoft's patch could also lead to more exploitation of the bug as hackers reverse engineer the patch to discover exactly how the vulnerability can be triggered.


CIA behind APT34 and FSB hacks and data dumps

In an exclusive today, Yahoo News reported that the agency used its newly acquired powers to orchestrate "at least a dozen operations" across the world. The CIA was already authorized to conduct silent surveillance and data collection, but the new powers allow it to go even further. "This has been a combination of destructive things - stuff is on fire and exploding - and also public dissemination of data: leaking or things that look like leaking," a former US government official told Yahoo News. ... Citing former US officials, Yahoo News claims that such operations would never have been approved under previous administrations, which were always very cautious about attacking foreign adversaries, fearing blowback. However, in 2018, President Trump departed from the White House's classic stance on the matter and signed a document called a presidential finding, granting the CIA the ability to plan and execute covert offensive cyber operations under its own judgment, rather than under the oversight of the National Security Council. The document effectively took the decision-making and approval process from the White House and the National Security Council and placed it with CIA leadership in an attempt to expedite foreign hacking operations.


Why You Should Consider Blockchain As A Technology To Learn

The blockchain provides an ideal infrastructure for the universal application of cryptography, which can effectively promote the universal application of cryptography protocols. Their application can effectively protect personal privacy and business secrets, ensure the standard implementation of contracts and processes, strengthen trust and prevent fraud, and thereby uphold the basic values of modern society: freedom, fairness and trust. ... The second-generation blockchain represented by Ethereum is equivalent to a computer where all nodes share state. On top of this infrastructure, smart contracts can encode and automate complex business actions in a clear way. If the asset is digitized, the smart contract can automatically manage the digital asset according to a predetermined contract. Smart contracts promote the idea of “code as law”. The biggest advantage of Ethereum is that it is a distributed consensus system without centralized control. In addition, because of the emergence of digital currency, we can use microeconomics to create a new system that subverts tradition in a new way. The emergence of smart contracts provides an effective way for the blockchain to process data in a programmable and automated manner.
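The "automatically manage the digital asset" idea can be illustrated off-chain with a toy escrow sketch. Real smart contracts run on-chain (on Ethereum, typically written in Solidity); everything below is an invented, simplified analogy of the "code as law" principle:

```python
class EscrowContract:
    """Toy illustration of smart-contract logic: the asset moves
    automatically once a predetermined condition is met, with no
    human intermediary deciding whether to settle."""

    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.delivered = False
        self.balances = {buyer: 0, seller: 0}

    def confirm_delivery(self):
        """The predetermined condition: delivery is confirmed."""
        self.delivered = True
        self._settle()

    def _settle(self):
        # Settlement is encoded, not discretionary.
        if self.delivered:
            self.balances[self.seller] += self.amount

c = EscrowContract("alice", "bob", 100)
c.confirm_delivery()
print(c.balances["bob"])  # → 100
```

The point of the sketch is the absence of a trusted middleman: once the condition fires, the code alone performs the transfer, which is what the article means by contracts that "automatically manage the digital asset".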


Juniper targets security portfolio at SASE race

Juniper uses AI-driven automation, insight and actions across the LAN, WLAN and WAN to optimize the end-to-end user experience, Madrid stated. This includes customized Service Level Expectations, event correlation across the LAN and WAN for rapid fault isolation and resolution, AI-driven support with proactive notifications, and an interactive Virtual Network Assistant (VNA) called Marvis to provide recommended actions and/or keep the network humming autonomously, Madrid stated. Juniper’s SASE plans come on the heels of recent announcements by other key players in what is expected to be a hot market. For example, VMware in June said it was advancing secure access for remote and mobile workers by mixing its Workspace ONE offering with its SD-WAN package. The resulting VMware SD-WAN Zero Trust Service promises to help enterprises handle growing distributed workloads for remote workers. The service also represents a big step toward SASE, the company said. “Speed and data are two of the most valuable business currencies in today’s rapid growth environment, both of which have rendered traditional security deployments insufficient and ineffective,” VMware stated.


How DigitalOps links together business models and digital platforms

“Step one is creating a shared and living ‘map’ of your business,” said Shearer. “We would recommend using Domain Driven Design, as it gives the DigitalOps team a good way to communicate with teams on important business elements, how they relate to one another, to users and to revenue. It also provides a pattern to follow when implementing new digital services. “Next you’ll need to determine the areas that are both mission critical and market differentiating. Everything else should be brought in or delivered with a partner. Focus on your core strengths and specialisms as this is where you stand the best chance of success. “Last but not least, this must be underpinned with a commitment to a culture of rapid innovation, with your users integrated into your product process. Without this, you simply can’t hope to succeed and good intentions can quickly turn into missed opportunities and lost competitive advantage.” Staying on the topic of culture within the workplace, White commented: “DigitalOps follows the same approach to other XOps approaches, such as DevOps, by focussing on the removal of barriers, silos and increasing collaboration between cross-functional teams.


Ensure remote users meet data protection standards

As measures to relax lockdowns are being delivered in phases, IT staff should recognize that the initial phase of business continuity has passed. The next phase requires a more measured approach. There was no time to train users and implement standard applications, but now administrators should audit all systems accessing corporate data and standardize on secure collaborative apps. This thorough approach is essential for remote data protection. IT administrators should contact users directly to ensure they are familiar with the standard work applications and processes. If administrators need to remove some consumer apps, they should explain why upending their established workflows is necessary. In many cases, these workers adopted new applications without much guidance. However, users will have to understand that the new best practices are the only way for IT to ensure data security going forward. Under no circumstances, however, should IT allow unsafe apps such as WhatsApp and Facebook Messenger to access business data; this is a direct threat to remote data security. Where users relied on personal devices for work, offer alternatives such as a unified endpoint management (UEM) policy with low restrictions.


Why the Merging of the DevOps Driven Cloud and Cybersecurity Will Create Dozens of New Category Leaders

The massive paradigm shift to cloud requires a very different skill set than on premises. Whereas once IT and DevOps were considered the foundation and cybersecurity was "a final 'check the box' for compliance", this model simply can’t exist in a dynamic cloud-based world. The acceleration with which remote and distributed activity is happening requires these two disciplines to mesh even faster. Everything that was once done on premises must now be done in the cloud and must be done using tools built and optimized for the cloud environment. That puts cloud-based cybersecurity innovators in a unique and valuable position of being revenue-generating quickly relative to other new categories, while simultaneously creating and defining a new space (cloud-first security products). ... It is important to note that the picks and shovels of the cloud will continue to be dominated by a handful of the biggest tech companies in the world. Over the last decade, AWS, Microsoft Azure and Google Cloud Platform have grown to over $80B in annual cumulative revenue. The fast followers trying to take share in this area are not start-ups, but rather IBM, Oracle and Alibaba.


A Modern Data Storage Paradigm; Reducing the High Cost of Data Management

The new paradigm combines a file-based Primary Tier and an object-based Perpetual Tier. The Primary Tier (or Project Tier) holds all in-progress data and active projects. It is made up of flash, DRAM, and high-performance disk drives to meet the requirements of critical data workflows dependent on response time. The Perpetual Tier can accommodate multiple storage media types – including any combination of cloud storage, object storage, network-attached storage (NAS) and tape – to address data protection, multi-site replication (sharing), cloud and data management workflows. Data moves seamlessly between tiers as it is manipulated, analyzed, shared and protected. Implementing a proper storage management strategy within a two-tier paradigm allows organizations to address today’s most relevant data storage problems, while creating an environment open to future growth, development and change. Modern storage management software (SMS) maximizes efficiency by ‘smartly’ migrating data to the appropriate level of storage.
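The tier-migration policy that such storage management software automates can be sketched in a few lines. The last-access-age threshold and file records below are hypothetical, purely to illustrate the Primary-to-Perpetual decision:

```python
import time

PRIMARY_MAX_AGE_DAYS = 90  # illustrative policy threshold

def plan_migration(files, now=None):
    """Given (name, last_access_epoch) pairs, decide which files stay on
    the fast Primary Tier and which should move to the Perpetual Tier.
    A toy sketch of the policy an SMS product would run continuously."""
    now = now if now is not None else time.time()
    cutoff = now - PRIMARY_MAX_AGE_DAYS * 86400
    primary, perpetual = [], []
    for name, last_access in files:
        (primary if last_access >= cutoff else perpetual).append(name)
    return primary, perpetual

files = [("active_project.mov", 1_600_000_000),
         ("archive_scan.tif", 1_500_000_000)]
primary, perpetual = plan_migration(files, now=1_600_000_000)
print(primary, perpetual)
```

A real product would also consider project membership, replication targets and retrieval cost, but the core idea is the same: data placement follows policy, not manual housekeeping.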



Quote for the day:

"Leaders need to strike a balance between action and patience." -- Doug Smith

Daily Tech Digest - July 14, 2020

Is The Business World Ready For A Chief Data Ethics Officer?

Data ethics is the new strategic imperative for leading corporations. In NewVantage Partners’ 2019 Executive Survey, more than half of executives — 55.7% — pointed to data ethics as a top business priority. In a recent Harvard Business Review article, Tom Davenport and I discussed the “seven key types of Chief Data Officer (CDO) jobs”, noting that one of the emerging roles of the Chief Data Officer is as Data Ethicist. We noted that CDO as Data Ethicist is “growing in popularity, is [focused] on the ethics of data management, specifically on how it’s collected, safeguarded and shared and who controls it”. We confirmed that, “there is no doubt that consumers, regulators, and legislators are becoming more concerned about the misuse of data”. I spoke recently with Dennis Hirsch, Professor of Law and Director of the Program on Data and Governance at The Ohio State University Moritz College of Law, and research fellow at The Risk Institute, about data ethics and corporate responsibility in the context of today’s emerging algorithmic economy. Hirsch’s research is focused on how data analytics can pose ethical risks, and how leading companies are responding.


SD-WAN and analytics: A marriage made for the new normal

“With the rapid increase in use of cloud services including video and IoT applications, which have only been accelerated with the ongoing global pandemic, wide area networking and remote connectivity stays a mission critical need for enterprise IT,” said Mehra. “Specifically, SD-WAN emerged as an evolution from enterprise routing and WAN optimization to address the needs of a more dynamic, intelligent architecture around these evolving application needs,” Mehra said. Probes or agents in vendors’ SD-WAN packages gather network, performance, security and other telemetry and combine it with historic customer and vendor-gathered data. Analysis of this data generates recommendations, policy changes or other actions to help IT keep the overall WAN environment humming. Analytics programs can also reduce the number of overall alerts IT teams deal with because the programs can focus on those things enterprises consider most important. Vendors such as Cisco, VMware, Versa, Silver Peak, Citrix, Cato and others have varying degrees of analytic sophistication in their SD-WAN packages, but all of them are marching toward supporting cloud-connected customers.
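The alert-reduction idea can be sketched roughly as follows. The field names, severity scale and "important sites" notion are invented for illustration and do not correspond to any vendor's telemetry format:

```python
def prioritize_alerts(alerts, important_sites, max_alerts=3):
    """Collapse raw SD-WAN telemetry alerts down to the few an operator
    should actually see: keep only alerts from sites the enterprise has
    marked important, deduplicate by (site, type), rank by severity."""
    seen = set()
    kept = []
    for a in sorted(alerts, key=lambda a: -a["severity"]):
        key = (a["site"], a["type"])
        if a["site"] in important_sites and key not in seen:
            seen.add(key)
            kept.append(a)
    return kept[:max_alerts]

alerts = [
    {"site": "branch-1", "type": "latency", "severity": 7},
    {"site": "branch-1", "type": "latency", "severity": 5},  # duplicate
    {"site": "lab", "type": "loss", "severity": 9},          # low-priority site
    {"site": "hq", "type": "loss", "severity": 8},
]
print(prioritize_alerts(alerts, {"hq", "branch-1"}))
```

Vendors' analytics engines are of course far more sophisticated (correlating events across LAN and WAN, learning baselines), but filtering and ranking against enterprise priorities is the essence of cutting alert noise.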


How enterprise IoT will evolve in a post-COVID world

With wired power and solar power both encumbered by significant drawbacks, businesses have been searching for a new solution: long-range wireless charging. Long-range wireless charging devices use infrared light or other physical phenomena to deliver power at a distance. Facility managers need only install the transmitter in a convenient area and then plug in or embed the receiver, which is roughly the size of a thumb drive, into their IoT device of choice. Long-range wireless charging can be installed from the ground up, and that’s often the case when it comes to new construction. The technology is already being widely adopted. However, long-range wireless charging can often be retrofitted into operations that are already in place. This is especially important in light of the rapid changes enterprise-level businesses are expected to make due to the COVID-19 pandemic. IT specialists are looking for practical and flexible solutions that can enable them to resume operations safely. It should be noted, of course, that for all its benefits, long-range wireless charging may not be the solution for every business, just as IoT won’t solve every COVID-19-related challenge. Long-range wireless charging might have limitations in power delivery, range or other aspects.


Drive to improve flash reliability

Originally, Nand flash was laid out in a plane, with each cell supporting two levels, or a single bit, writes Alex McDonald, chair, SNIA EMEA. This is a single-level cell (SLC). Since then we have progressed through MLC (four levels and two bits) and TLC (eight levels and three bits) up to 16 levels and four bits per cell (QLC), and improved densities by “building up” with a 3D stack of cells. The unit of write is not a single cell, but a page of cells of around 4KB. At a higher level, several tens of pages are organised into blocks. Before pages can be written, a block needs to be erased in preparation; together, these operations are known as program/erase (P/E) cycles. There are only so many P/E cycles that the flash can tolerate before failing. The advantage of QLC and 3D-based technology is the ability to build very high-density devices, with 32TB devices relatively commonplace. Generally, with more bits per cell, fewer P/E cycles can be performed. Interestingly, these limitations have not necessarily reduced the practical reliability of high-density Nand flash SSDs, because there are many techniques that are used to mitigate the possibly unreliable operation of single cells.
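The levels-to-bits relationship above follows directly from bits per cell = log2(voltage levels), which a few lines of Python confirm:

```python
import math

# Levels per cell for each Nand flash cell type named above.
CELL_LEVELS = {"SLC": 2, "MLC": 4, "TLC": 8, "QLC": 16}

for cell, levels in CELL_LEVELS.items():
    bits = int(math.log2(levels))  # bits per cell = log2(levels)
    print(f"{cell}: {levels} levels -> {bits} bit(s) per cell")
```

Each extra bit doubles the number of voltage levels a cell must distinguish, which is why density gains come at the cost of narrower margins and fewer tolerable P/E cycles.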


5 reasons AI isn't being adopted at your organization

It's important to remember that AI solutions are built by imperfect humans. We've seen examples of models that unintentionally generate discriminatory outcomes because the underlying data was skewed towards a particular segment of the population. Whether they resulted from bias in the dataset (e.g., exclusion or sample bias) or from humans' unconscious biases, these outcomes rightly erode trust in the technology and slow adoption. We must balance freedom, ethics and privacy with efficiency and the other benefits AI makes possible. This foundation for AI requires that people at all levels of an organization understand their role in building a governance structure. A strong governance system includes a set of ethical design and development principles that are regularly reviewed, creating a "feedback loop." It's important to consider these three points when developing a governance framework for AI: 1. Prioritize ethics early. 2. Build robust, transparent, and explainable systems that clearly yield an audit trail, with the understanding that these can adjust as the models learn. 3. Ensure measured, monitored roll-outs with robust governance and oversight, guided by clearly documented processes.


Digital ecosystems: the future of insurance innovation

The insurance industry stands on the precipice of a paradigm shift. Digitisation is accelerating at pace, with new and innovative technologies, the greater use of data and a mobile-first approach not only changing how the industry operates, but also how customers expect it to operate. So too is the competitive landscape changing the playing field for those incumbents in the industry, forcing them towards a better defined, service-based approach similar to that being adopted by many of the larger players in the financial services industry. But, unlike that industry, the insurance sector has typically been slower to adopt digitisation. That has to change. Digital can no longer be the preserve of the innovators or pioneers in the sector, it should permeate every level of the competitive landscape if insurers wish to reshape their business in line with customer expectations. Indeed, according to research by Accenture that analysed close to 20 industries, insurance is among those most susceptible to future disruption. Accenture explained that, by 2022, carriers that are slow to respond to digital - or ‘hyper-relevant’ - competitors could suffer market share erosion close to $200bn and miss the opportunity to pursue new growth activities worth $177bn.


Data Management: An Unwitting Game Of Russian Roulette

The fear all senior management teams should have is that, in the race to quickly automate and complete the digital transformation journey, the post-COVID-19 reality of fiscally austere conditions will cause their organizations to skimp or ignore the unglamorous yet vital data management systems and disciplines. In the cases where this happens, those companies could face disastrous consequences, ranging from disruption of vital processes to high-profile breaches of privacy and/or security. Of one thing we can be sure: Without mastery of the data, the automated systems and their digital platforms will fail. However, the consequences of these failures are hard to predict. Some will be small and manageable; others could be severe. Effectively, companies without data discipline and strong data management will play Russian roulette. Every day they will spin the revolver chamber, put the gun to their proverbial head and pull the trigger. Eventually, a live round will go off. Despite the often frustratingly obscure nature of the problem and the need for expensive investments without clear paybacks, the need to take data management seriously and adequately provision it with talented teams backed with the necessary investments is paramount.


The Future of Remote Work, According to Startups

No matter where in the world you log in from—Silicon Valley, London, and beyond—COVID-19 has triggered a mass exodus from traditional office life. Now that the lucky among us have settled into remote work, many are left wondering if this massive, inadvertent work-from-home experiment will change work for good. In the following charts, we feature data from a comprehensive survey conducted by UK-based startup network Founders Forum, in which hundreds of founders and their teams revealed their experiences of remote work and their plans for a post-pandemic future. While the future remains a blank page, it’s clear that hundreds of startups have no plans to hit backspace on remote work. Based primarily in the UK, almost half of the survey participants were founders, and nearly a quarter were managers below the C-suite. Prior to pandemic-related lockdowns, 94% of those surveyed had worked from an external office. Despite their brick-and-mortar setup, more than 90% were able to accomplish the majority of their work remotely.


Lead Through Volatility With Adaptive Strategy

Strategy defines the long-term choices and actions the enterprise must take to create, deliver and capture value as envisaged in the business model. But the more time spent creating a plan, the less time there is to execute it, increasing the risk that the world has moved on and the plan is out of date. Implementing promptly also helps to surface the plan’s flaws and identify where to improve. Adaptive strategy doesn’t require perfect or complete information to execute; it uses available information to identify the most immediate actions required to be successful. Given today’s highly disrupted conditions, few enterprises can afford to wait a year to review strategy as was typical when business context moved slowly, and disruption happened infrequently, if at all. Some now review their strategy on a quarterly or even a monthly basis, but a truly adaptive enterprise monitors its business context on an ongoing basis, initiating a strategy review whenever new information is available to reframe the context.  The vision that guides an adaptive strategy can still be long-term and bold — but should be continually extended (not changed completely once every few years) to push the boundaries of what the enterprise must do to succeed.


5 Key Research Findings on Enterprise Artificial Intelligence

The pandemic has caused a drastic shift in consumer behavior as individuals stay at home and adjust their daily routines. Many travel, hospitality, and restaurant workers are out of work, and those fortunate to still be employed have shifted their spending patterns. This in turn has put pressure on AI and machine learning teams to ensure the accuracy of their predictive models in this changed environment, yet only 33% are monitoring their models in production. ... While the board of directors and C-suite almost universally appreciate the importance of AI (100% of respondents indicate it is either partially or fully accepted as a strategic imperative), it does appear that there will be more pressure to show clear ROI and cut through the hype to provide a mature and sophisticated approach to AI. With 65% reporting that building a team with the right skills is a medium or large barrier to success, it's likely that teams will continue to invest in efficient processes, streamlined development-to-production environments, and centralized approaches to AI governance and skills and resource management.



Quote for the day:

"Think left and think right and think low and think high. Oh, the thinks you can think up if only you try!" -- Dr. Seuss

Daily Tech Digest - July 13, 2020

How to choose a robot for your company

There are lots of reasons a company might entertain automating processes with robots. According to Kern, the main reason is a labor shortage. Prior to COVID-19-related slowdowns, a competitive labor landscape and rising costs of living in many countries around the globe made hiring tough for skilled and unskilled positions alike. Automation, which often promises ROI efficiencies over time, particularly when it comes to repeatable tasks, is an attractive solution. "Robots can save money over time, not just by directly eliminating human labor, but by cutting out worker training and turnover," according to the Lux report for which Kern served as lead. "Most companies turn to automation and robotic solutions to deal with labor shortages, which is common in industries with repetitive tasks that have a high employee turnover rate. Companies also frequently use robots to automate dangerous tasks, keeping their employees out of harm's way." Post-COVID-19, there are also considerations like sanitation and worker volatility. As I've written, the perception of automation is changing almost overnight. Where robots were once, very recently, associated primarily with lost jobs, there's been a new spin in the industry to tout automation solutions as commonsense in a world where workers are risking infection when they show up at physical locations.


How the cloud fractures application delivery infrastructure ops

The traditional infrastructure team still operates ADCs and load balancers in the data center, while preferring the vendors they have worked with in the past. DevOps and CloudOps have taken control in the public cloud, choosing to use software and cloud provider services that are more integrated with their DevOps toolchains. This fractured operations model is problematic. Companies with divided Layer 4-7 operations are less likely to be successful with this infrastructure. EMA research participants also revealed why they feel a need to close this operational gap. First, 43% of enterprises said this situation has introduced security risks. In most enterprises, application delivery infrastructure is an important component of overall security architecture. Companies need to take a unified approach to network security. Research participants identified compliance problems (36%) and operational efficiency (36%) as the top secondary challenges associated with fractured operations. And 30% said platform problems -- such as issues with scale, performance, functionality or stability -- are a major challenge.


The enormous opportunity in fintech

Technology providers to specific areas of finance have created significant businesses. Across the insurance ecosystem, Guidewire, Applied Systems, and Vertafore capture $10 billion of value. Black Knight, the leading analytics provider to the mortgage industry, is an $11 billion business. Are you thinking about managing financial documents for your public company? You may turn to Broadridge, which makes a pretty penny in this business, boasting a $13 billion market cap. While these are massive markets, it is not easy to disrupt incumbents. A combination of regulatory hurdles, entrenched behavior, low risk-tolerance, and the benefits of larger balance sheets have kept upstarts at bay for decades. However, as venture capital supports the ecosystem, modern technology creeps into the sector (cloud, APIs), connectivity and data exchanges improve, and consumers grow tired of incumbents, the tide continues to shift. This shift and the challenge to the status quo by fintech upstarts will have lasting effects. Even when incumbents acquire their biggest disruptors, such as Visa’s acquisition of Plaid, innovations pioneered by those startups become integrated into the system and help move the industry forward.


Somehow, Microsoft is the best thing to happen to Chrome

What strange times we live in. Who’d have thought that I’d be writing an article on how Microsoft is the best thing to happen to Google Chrome? A few years ago the idea of Microsoft getting involved in an open source project would cause a mixture of laughter and dread. You know… Microsoft, the foe of open source who had a CEO that once said that Linux was “a cancer that attaches itself in an intellectual property sense to everything it touches.” The company that couldn’t make a decent web browser to save its life. But, believe it or not, I really do think that Microsoft’s involvement has made Chrome a much better browser. ... Basically, since dropping its opposition to open source, and not only embracing it, but putting its money where its mouth is, the thought of Microsoft being involved with an open source project is no longer the stuff of nightmares. It’s proved to be a valuable contributor to the open source community already. But how does this affect Google’s Chrome browser? Well, ever since Microsoft stopped using its own web engine, EdgeHTML, for its Edge web browser, and instead built a brand-new version that’s based on Chromium, it’s been contributing a steady stream of fixes and new features to Chromium – and those have not just been benefitting Edge, but Chrome as well.


IBM just changed the automation game. Hello Extreme Automation

The technology provides a low-code, cloud-based authoring experience that lets business users create bot scripts with a desktop recorder, without the need for IT. These scripts are executed by digital robots to complete tasks. Digital robots can be run on demand by the end user or by an automated scheduler. Arguably, WDG is on a par with Softomotive, which Microsoft acquired for considerably more money. What is clear is that these RPA firms offer pretty much the same functionality for basic scripting and recording. WDG is focused heavily on quality customer service ops and is great at integrating with chatbots, digital associates, and other AI tools. Pre-Covid, most RPA was focused on low-risk back-office processes, especially in finance. Now customers are desperate to automate customer-facing and revenue-generating processes and need tools proven to work in these environments. No one has a huge advantage in the CX automation space, so this is a greenfield opportunity for IBM. The WDG automation software sits under IBM Cognitive and Cloud, giving it a broader playing field to compete with the likes of MSFT, Pega, Appian, and even ServiceNow. Arguably, this is the real play that excites IBM’s top brass.


The Importance of Domain Experience in Data Science

Restated — domain knowledge is the learned skill to communicate fluently in a group’s data dialect. Its component parts are: general business acumen + vertical knowledge + data lineage understanding. For example, a data scientist in people analytics requires a foundational knowledge of the business + human resources + the inner-workings of their company’s HR tools and processes which create the data they work with. Those processes and other inputs to the dataset are crucial. A data scientist can’t create meaningful insights before they understand what the data is saying today. Is it telling a story? Is it, or subsets of it, too polluted to use today? Are some data points proxies for or inputs to others? The more complex your business processes and associated data lineage, the longer your data dialect will take to learn. For digital native companies whose data collection is automated with intuitive dialects (i.e. a “click” is a “click”), domain knowledge can be developed much more quickly than for large, longstanding companies which have undergone transformations, acquisitions and/or divestitures. If you hire a data scientist, how long will it take them to learn your data dialect? And can you provide air cover for them to do so before applying pressure to produce “insights?”


Hiring developers: While coding is important, there are other things to consider

A recruiter can learn a lot about the candidate in that half hour, including any side projects they might be involved in or games they've written. These "are often a window into a developer's willingness to take initiative," Volodarsky said. Learning what a developer does in their spare time can also provide great insight into their personality, he said. "Hiring great coders is important, but you also want to collaborate with interesting people, too." When it comes to hiring freelance developers it's important that they understand both the code and the nuances of the business they're contracting for, and this will come through in that conversation over a falafel, or the like, he said. In terms of motivating factors, not surprisingly, an overwhelming 70% said they were looking for better compensation, while 58.5% said they want to work with new technologies, and 57% said they were curious about other opportunities. Close to 70% of respondents said they learn about a company during a job hunt by turning to reviews on third-party sites such as Glassdoor and Blind. However, a large number also said they learned from viewing company-sponsored media, such as blogs and company culture videos.


Is Singapore ready to govern a digital population?

Singapore over the past several years has invested significant resources towards becoming a digital economy, rolling out an ambitious smart nation roadmap, driving the adoption of emerging technologies, and overhauling its own ICT infrastructure. With the global pandemic now adding new impetus to digital transformation, the government has made a concerted effort to drive digital adoption deeper into the business community and local population, establishing a new office to work alongside them to push the "national digitalisation movement". Initiatives include deploying 1,000 "digital ambassadors" to help stallholders and seniors go digital and setting up 50 digital community hubs across the island to offer one-to-one assistance on digital skills. A new ministerial committee will also coordinate the country's digitalisation efforts and focus on priorities such as helping people learn new skills and galvanising small businesses to go digital. More funds and resources have also been directed to facilitate digital transformation initiatives.


AIOps tools expand as users warm slowly to autoremediation

AIOps has generated industry hype since 2017, as advances in machine learning algorithms prompted IT monitoring vendors to envision a new method of automation for their products. At the same time, complex microservices infrastructures became impossible to manage entirely by human hands alone. Since then, AIOps tools have grown more sophisticated, adding automated remediation features to event correlation and automated root cause analysis, and AIOps vendors that began in specialized areas have also broadened the workloads their tools can support. Most recently, those vendors include Epsagon, which emerged in 2018 with AI-supported distributed tracing for serverless environments and expanded in 2019 to include container and cloud workloads. It now offers AIOps features it calls Applied Observability, which automate menial incident resolution tasks in response to metrics and logs in addition to traces. Last month, Epsagon launched a partnership with Microsoft centered on Kubernetes environments after previously inking a deal with AWS focused on its Lambda serverless compute service.
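Event correlation, the AIOps building block mentioned above, can be illustrated with a toy time-window grouping: alerts that fire close together are bundled into one incident so an operator (or an autoremediation rule) handles one problem instead of a flood of symptoms. The `Alert` shape and `correlate` function are hypothetical, not any vendor's API:

```typescript
interface Alert {
  source: string;    // e.g. a host, pod, or service name
  message: string;
  timestamp: number; // epoch milliseconds
}

// Group alerts whose timestamps fall within `windowMs` of the previous
// alert into a single incident. Real AIOps tools correlate on topology
// and learned patterns as well; time proximity is the simplest signal.
function correlate(alerts: Alert[], windowMs: number): Alert[][] {
  const sorted = [...alerts].sort((a, b) => a.timestamp - b.timestamp);
  const incidents: Alert[][] = [];
  for (const alert of sorted) {
    const current = incidents[incidents.length - 1];
    const last = current?.[current.length - 1];
    if (last && alert.timestamp - last.timestamp <= windowMs) {
      current.push(alert);
    } else {
      incidents.push([alert]);
    }
  }
  return incidents;
}
```

Automated root cause analysis then works on these incident groups rather than on raw alert streams.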


How Microfrontends Can Help to Focus on Business Needs

The concept of building sites from small web applications integrated via hyperlinks is (still) very common. There have also been a lot of concepts of rendering pages from smaller, independent building blocks in the past, such as Java Portlets. Even if the term microfrontend nowadays is used to refer to modern JavaScript apps, there are multiple possible approaches. So, when I use it in this article I refer to an application that: is basically a JavaScript Rich Client (for example a SPA or a Web Component) that runs isolated within an arbitrary DOM node and is as small and performant as possible; does not install global libraries, fonts, or styles; does not assume anything about the site it is embedded in; especially it does not assume any existing paths, so all the base paths to assets and APIs must be configurable; has a well-defined interface consisting of the startup configuration and some runtime messages (events); should be instantiable; ideally inherits the shared styles from the site and ships only styles absolutely necessary to define its layout.
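The properties listed above can be summed up as a small mount contract. The following is an illustrative sketch, not any particular framework's API; `MicrofrontendConfig`, `mount`, and the event names are all assumptions. Note how every base path arrives via startup configuration and how the only outbound channel is runtime messages:

```typescript
// Startup configuration: the microfrontend assumes nothing about the
// host site, so all paths to assets and APIs must be passed in.
interface MicrofrontendConfig {
  assetBasePath: string;
  apiBasePath: string;
  // Runtime messages (events) are the well-defined interface outward.
  onEvent?: (name: string, payload: unknown) => void;
}

interface MicrofrontendInstance {
  unmount(): void;
}

// Instantiable: each call returns an independent instance that runs
// isolated inside the given DOM node and installs nothing globally.
// (The node is typed structurally so the sketch also runs outside a browser.)
function mount(
  node: { innerHTML: string },
  config: MicrofrontendConfig
): MicrofrontendInstance {
  node.innerHTML = `<div class="mf-root"></div>`; // only local DOM, local styles
  config.onEvent?.("mounted", { api: config.apiBasePath });
  return {
    unmount() {
      node.innerHTML = "";
      config.onEvent?.("unmounted", {});
    },
  };
}
```

A host page would call `mount(document.getElementById("slot")!, config)` and listen to the emitted events, keeping the integration surface to exactly the configuration and messages described above.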



Quote for the day:

"Leadership is familiar, but not well understood." -- Gerald Weinberg