Daily Tech Digest - March 15, 2021

How to scale AI with a high degree of customization

Scaling machine learning programs is very different from scaling traditional software because the models have to be adapted to fit each new problem you approach. As the data you’re using changes (whether because you’re attacking a new problem or simply because time has passed), you will likely need to build and train new models. This takes human input and supervision. The degree of supervision varies, and that is critical to understanding the scalability challenge. A second issue is that the humans involved in training the machine learning model and interpreting the output require domain-specific knowledge that may be unique. So someone who trained a successful model for one business unit of your company can’t necessarily do the same for a different business unit where they lack domain knowledge. Moreover, the way an ML system needs to be integrated into the workflow in one business unit could be very different from how it needs to be integrated in another, so you can’t simply replicate a successful ML deployment elsewhere. Finally, an AI system’s alignment to business objectives may be specific to the group developing it. For example, consider an AI system designed to predict customer churn.


The snags holding back DevOps: culture, delivery and security

Cultural issues create this disjointed relationship between Dev and Ops. "Culture is the number one missing component, but there is also a failure to truly connect and automate across functional silos," Dawson says. "This results in lack of shared visibility, consistent feedback to drive improvement and, potentially, a negative experience which inhibits adoption." There are too many tools competing for Dev and Ops teams' mindshare as well. "A single team may have anywhere from 20 to 50 tools," says Kakran. "Separating signal from noise when you are bombarded by hundreds of alerts per hour is quite challenging." The continuous delivery piece is also a snag in the continuous integration / continuous delivery (CI/CD) pipeline that should flow effortlessly through DevOps. "Enterprises are lagging in test automation and are increasing efforts to automate continuous testing, which is a core component of CD," says Venky Chennapragada, DevOps architect with Capgemini North America. "Some enterprises are unable to adopt a high level of CI/CD because their application portfolio mostly consists of packaged software, legacy software or ERP systems."


PyTorch Code for Self-Attention Computer Vision

Self-Attention is gradually gaining a prominent place, from sequence modeling in natural language processing to medical image segmentation. It replaces conventional recurrent neural networks and convolutional neural networks in many applications, achieving new state-of-the-art results in the respective fields. Transformers and their variants and extensions make heavy use of self-attention mechanisms. Self-Attention Computer Vision, known technically as self_attention_cv, is a PyTorch-based library providing a one-stop solution for self-attention-based requirements. It includes a variety of self-attention-based layers and pre-trained models that can be simply employed in any custom architecture. Rather than building the self-attention layers or blocks from scratch, this library helps its users perform model building in no time. On the other hand, heavy pre-trained models such as TransUNet and ViT can be incorporated into custom models and can finish training in minimal time even in a CPU environment! According to its contributors Adaloglou Nicolas and Sergios Karagiannakos, the library is still under active development, with the latest models and architectures being added.
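To make the mechanism such libraries package concrete, here is a minimal single-head scaled dot-product self-attention sketch. It is written in NumPy rather than PyTorch purely for self-containment, and the shapes and random projection matrices are illustrative assumptions, not the library's actual API:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention.

    x: (tokens, dim) input sequence; w_q/w_k/w_v: (dim, dim) projections.
    Returns a (tokens, dim) sequence in which every output token is a
    weighted mix of all input tokens.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])   # (tokens, tokens) similarities
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 16))                  # 5 tokens, 16-dim embeddings
w = [rng.normal(size=(16, 16)) for _ in range(3)]
out = self_attention(x, *w)
print(out.shape)  # (5, 16)
```

The library's layers wrap this same computation (multi-headed and learnable) as drop-in PyTorch modules.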


How to Choose the Right Cybersecurity Framework

Start by setting goals for your cybersecurity program that align with the business's needs. Stakeholders from across the organization — from the C-suite and upper management to support teams and IT — should be involved in the initial risk-assessment process and setting a risk-tolerance level. While deciding where to start your implementation can feel like trying to boil the ocean, one way to make it less intimidating is to run a pilot program focused on a single department. This can help uncover lessons about what does and doesn't work, what tools will help you succeed, and best practices for a wider rollout. From there, identify the type of data the organization processes and map out its life cycle. A simple model will help lay a foundation for understanding the organization's cybersecurity risk and identify points along the supply chain to invest more time and resources. Business tools and software are often important sources and collectors of data, so ask vendors about their data privacy policies to ensure they reflect your goals. ... A good cybersecurity framework will help you identify risks, protect company assets (including customer data), and put steps in place to detect, respond, and recover from a cybersecurity event.


Is Data Science a science?

Before we tackle the question of whether Data Science is a science or not, something that doesn’t seem to have a definitive answer, let’s step back and look at the idea of proof. This word is used loosely, and there are many different kinds of proof: for example, there are scientific proofs, legal proofs, and mathematical proofs. In mathematics, a proof is an inferential argument that shows a statement is true as supported by axioms, definitions, theorems, and postulates. Mathematicians normally use deductive reasoning to show that the premises, also called statements, in a proof are true. A direct proof is one that shows a given statement is always true, and the proof is usually written in a symbolic language. In an indirect proof, mathematicians usually employ proof by contradiction, where they assume the opposite statement is true and eventually reach a contradiction, showing the assumption is false. In science, an inherently inductive enterprise,² we cannot prove any hypothesis to be true, as that would require an infinite number of observations, so the best we can hope to do is use inductive reasoning as the basis of our generalization and hold it to be provisionally true.
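As a concrete instance of indirect proof, consider the classical argument that the square root of two is irrational (an example added here for illustration; it does not appear in the source article):

```latex
\begin{proof}
Assume, for contradiction, that $\sqrt{2} = p/q$ with integers $p, q$
sharing no common factor. Squaring gives $p^2 = 2q^2$, so $p^2$ is
even, hence $p$ is even; write $p = 2r$. Substituting yields
$4r^2 = 2q^2$, i.e.\ $q^2 = 2r^2$, so $q$ is even as well. Then $p$
and $q$ are both divisible by $2$, contradicting the assumption that
they share no common factor. Hence no such $p/q$ exists, and
$\sqrt{2}$ is irrational.
\end{proof}
```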


Seven lessons on how technology transformations can deliver value

Not only do the transformations focused on talent strategy stand out in their value potential, but they are also much more commonplace at top-performing companies. Top-quartile respondents are more than three times likelier than their bottom-quartile peers (41 percent, compared with 12 percent) to say they’ve pursued a transformation of their talent strategy in recent years. Yet the need to address talent is universal and urgent. Respondents believe that more than 40 percent of their workforce will need to be either replaced or fundamentally retrained to make up for their organizations’ skills gaps. But only 15 percent of respondents say their companies plan to pursue a talent-strategy transformation in the next two years, even though the talent challenge remains considerable. At companies that have pursued recent transformations, the top challenges to doing so continue to revolve around talent as well as culture: namely, skill gaps and cultural differences, the difficulty of changing cultures and ways of working, and difficulty finding talent to fill new roles—which is as challenging for top performers as it is for everyone else. Talent also appears to impede progress at the companies that haven’t pursued technology transformations;


More Intelligent Medicine

The combination of human and machine intelligence could optimize the practice of clinical medicine and streamline health care operations. Machine learning-based AI tools could be especially valuable because they rely on adaptive learning. This means that with each exposure to new data, the algorithm gets better at detecting telltale patterns. Such tools have the capacity to transcend the knowledge-absorption and information-retention limits of the human brain because they can be “trained” to consider millions of medical records and billions of data points. Such tools could boost individual physicians’ decision-making by offering doctors accumulated knowledge from billions of medical decisions, billions of patient cases, and billions of outcomes to inform the diagnosis and treatment of an individual patient. AI-based tools could alert clinicians to a suboptimal medication choice, or they could triage patient cases with rare, confounding symptoms to rare-disease experts for remote consults. AI can help optimize both diagnostic and prognostic clinical decisions; it can help individualize treatment, and it can identify patients at high risk for progressing to serious disease or for developing a condition, allowing physicians to intervene preemptively.


Surviving Zombie Scrum

There’s not one specific cause of Zombie Scrum, but in relation to the symptoms we described earlier, we can share some common causes. Generally speaking, Zombie Scrum occurs in organizations that optimize for something other than actual agility. This creates problems that the teams usually cannot solve on their own. For example, Scrum Teams that operate in environments with Zombie Scrum rarely have a clear answer as to what makes their product valuable. Much like zombies that stumble around without a sense of direction, many Zombie Scrum Teams work hard on getting nowhere in particular. While they still produce something, the question remains whether they are actually effective. ... Another cause is the struggle many organizations face with shipping fast. Often-heard excuses are that the product is too complex, technology doesn’t support it, or customers aren’t asking for it. Shipping fast is perceived as a “nice to have”, instead of a necessary activity to manage risk and deliver value sooner. Without shipping fast, Scrum’s loop of Empirical Process Control collapses. In Zombie Scrum, organizations don’t create safety to fail. Teams can’t improve when they experience no room for uncertainty, doubt, or criticism. They often develop all kinds of defensive strategies to prevent uncertainty.


AI Must Play a Role in Data Cloud Management

Data intelligence, or the use of data to glean useful information, allows a business to increase both its revenue and its position in the market. But the continual multiplication of data and its sources is making an already substantial challenge even more laborious. This emphasis on data is where artificial intelligence (AI) can play an especially useful role. By leveraging the cloud and AI for the storage, collection, and analysis of data, a business can monetize information in a fast, effective manner. Indeed, mastering data management through the use of the cloud will continue to be top of mind for many IT groups as they are asked more and more to improve business agility through the fostering of better business intelligence. Thus, data science -- the large umbrella under which AI, machine learning, automation, data storage, and more all fall -- will see huge leaps in growth both this year and in the years ahead. The cloud is perfectly positioned to assist organizations in AI because of its unique ability to provide businesses with flexibility, agility, scalability, and speed that other models of infrastructure simply can’t achieve at the same level. If the core of a business isn’t managing a datacenter, then the cloud is all the more appealing, since it allows IT teams to focus on the value-driving projects that will truly make a difference for employees and customers.


Why data privacy will be the catalyst for digital identity adoption

Firstly, the story around digital identities needs to change. What they won’t be is a one-stop-shop to access every piece of personal information about you at the touch of a button, shareable and stealable. What digital identities could be, if we put data privacy at their core, is selective. We have the opportunity to create a technology that lets people share only the specific data they need at any one time, withholding as much data as they can while still getting the job done. This doesn’t seem too big of an ask, either. Mastercard recently partnered with Deakin University and Australia Post to test out a digital ID solution enabling students to register for their exams digitally. This removed the need for tiresome paperwork and trips to campus, but also reduced the amount of data shared about each student. Students created a digital identity with Australia Post, using this to gain access to their university exam portal. With each registration, only specific personal information was required to allow students entry to the exam portal – nothing was shared that didn’t need to be. Now imagine this in our banks, shops, and workplaces. Rather than revealing most of your ‘identity’ with every purchase of alcohol, you only show your ID documents when you first create the identity – to verify that you are who you say you are.
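The selective-disclosure idea can be sketched in a few lines of Python. The attribute names and the exam portal's policy below are invented for illustration; real digital-identity schemes use cryptographic credentials, not plain dictionaries:

```python
# A full identity held by the user; a verifier never sees all of it.
identity = {
    "name": "Jane Doe",
    "date_of_birth": "1990-04-02",
    "address": "12 High St",
    "student_id": "S1234567",
}

def disclose(identity, requested):
    """Release only the attributes a verifier actually asked for."""
    return {k: identity[k] for k in requested if k in identity}

# An exam portal needs only the student ID; nothing else is shared.
shared = disclose(identity, ["student_id"])
print(shared)  # {'student_id': 'S1234567'}
```

The point is that the portal's request defines the disclosure, so the date of birth and address never leave the holder's device.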



Quote for the day:

"Don't dare to be different, dare to be yourself - if that doesn't make you different then something is wrong." -- Laura Baker

Daily Tech Digest - March 14, 2021

The Agile Manifesto 20 years on: agility in software delivery is still a work in progress

What's missing from many agile initiatives is "ways to manage what you do based on value and outcomes, rather than on measuring effort and tasks," says Morris. "We've seen the rise of formulaic 'enterprise agile' frameworks that try to help you to manage teams in a top-down way, in ways that are based on everything on the right of the values of the Agile Manifesto. The manifesto says we value 'responding to change over following a plan,' but these frameworks give you a formula for managing plans that don't really encourage you to respond to change once you get going." ... Ritchie agrees that there's too much of a tendency to pigeonhole agile into rigid processes. "The first and most-common mistake is the interpretation of agile as simply a process, or something you can just buy and do to immediately call yourself agile," says Ritchie. "This more often than not results in process for the sake of process, frustration, and - contradictory to the intent of agile - an even further disconnect between business outcomes and the IT professionals chartered to deliver them." Related to this, he says, there can often be a "dogmatic agile zealot approach, where everything a particular framework says must be taken as gospel..."


Combining edge computing and IoT to unlock autonomous and intelligent applications

Edge computing isn’t limited to just sensors and other IoT devices; it can also involve traditional IT devices, such as laptops, servers, and handheld systems. Enterprise applications such as enterprise resource planning (ERP), financial software, and data management systems typically don’t need the level of real-time instantaneous data processing most commonly associated with autonomous applications. Edge computing has the most relevance in the world of enterprise software in the context of application delivery. Employees don’t need access to the whole application suite or all of the company’s data. Providing them just what they need with limited data generally results in better performance and user experience. Edge computing also makes it possible to harness AI in enterprise applications, such as voice recognition. Voice recognition applications need to work locally for fast response, even if the algorithm is trained in the cloud. “For the first time in history, computing is moving out of the realm of abstract stuff like spreadsheets, web browsers, video games, et cetera, and into the real world,” Thomason said. Devices are sensing things in the real world and acting based on that information.


From The Vault: Top Statistical Ideas Behind The Data Science Boom

Improved data collection strategies (think sensors, Internet) have resulted in enormous datasets. But data collection and curation consume nearly 80% of a data engineer’s typical day. Data is still a problem, and it was even more of one a couple of decades ago. The idea behind the bootstrap distribution is to use it as an approximation to the data’s sampling distribution. According to researchers, parametric bootstrapping, prior and posterior predictive checking, and simulation-based calibration allow replication of datasets from a model instead of directly resampling from the data. Calibrated simulation in the face of uncertain data volumes is a standard procedure rooted in statistics and helps in analysing complex models or algorithms. Gelman and Vehtari believe future research will lean more towards inferential methods, taking ideas such as unit testing from software engineering and applying them to problems of learning from noisy data. “As our statistical methods become more advanced, there will be a continuing need to understand the links between data, models, and substantive theory,” concluded the authors. The ideas mentioned above have laid the foundation for modern-day deep learning and other such tools.
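The bootstrap idea in the excerpt can be sketched in a few lines of Python; the sample data and the choice of statistic below are invented for illustration:

```python
import numpy as np

def bootstrap_se(data, statistic, n_resamples=2000, seed=0):
    """Approximate the sampling distribution of `statistic` by
    resampling the observed data with replacement, then return its
    standard error (the std. dev. of the bootstrap distribution)."""
    rng = np.random.default_rng(seed)
    n = len(data)
    stats = np.array([
        statistic(rng.choice(data, size=n, replace=True))
        for _ in range(n_resamples)
    ])
    return stats.std()

data = np.array([2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9])
se = bootstrap_se(data, np.mean)
# For the mean, the bootstrap SE should land near s / sqrt(n).
print(round(se, 3))
```

Because the only input is the observed sample, the same function works unchanged for statistics (median, trimmed mean, correlation) whose sampling distributions have no simple closed form.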


Driving innovation with emotional intelligence

EQ is increasingly recognized as a competitive advantage, according to a survey by Harvard Business Review Analytic Services. It found that emotionally intelligent organizations get an innovation premium. These organizations reported more creativity, higher levels of productivity and employee engagement, significantly stronger customer experiences, and higher levels of customer loyalty, advocacy, and profitability. Organizations that did not focus on emotional intelligence had “significant consequences, including low productivity, lukewarm innovation, and an uninspired workforce,” said the report. ... Verizon surveyed senior business leaders both before and after COVID-19. Before the pandemic, less than 20% of respondents said EQ would be an important skill for the future. But since COVID-19, EQ has increased in significance for 69% of respondents. ... “A sure way to stifle innovation is to not have the emotional maturity to recognize that innovation and creativity can come from many sources,” says Steele. “I think that our agency has hugely benefited from research institutes, large businesses, small businesses, and individual contributors.” She continues, “The capacity to recognize untapped sources of innovation, then bringing them together in a system, is a great ability to have.”


Three flaws that sat in Linux kernel since 2006 could deliver root privileges to attackers

While the vulnerabilities “are in code that is not remotely accessible, so this isn’t like a remote exploit,” said Nichols, they are still troublesome. They take “any existing threat that might be there. It just makes it that much worse,” he explained. “And if you have users on the system that you don’t really trust with root access, it breaks them as well.” Referring to the theory that ‘many eyes make all bugs shallow,’ Linux code “is not getting many eyes or the eyes are looking at it and saying that seems fine,” said Nichols. “But, [the bugs] have been in there since the code was first written, and they haven’t really changed over the last 15 years.” As a matter of course, GRIMM researchers try “to dig in” and see how long vulnerabilities have existed when they can – a more feasible proposition with open source. That the flaws slipped detection for so long has a lot to do with the sprawl of the Linux kernel. It “has gotten so big” and “there’s so much code there,” said Nichols. “The real strategy is make sure you’re loading as little code as possible.”


Master Data Management: Much More Than Technology

Industry experts define data governance as the “authority over the management of data assets” and assigning “accountability for the quality of your organization’s data.” Having authority over data assets is the function of data ownership. Being accountable for the quality of these data assets is the function of data stewardship. Data is a business asset, and business assets are controlled by business people. Therefore, data owners and data stewards should be business people. They must be careful not to manage their data within the narrow focus of their own business unit (department or division); instead, they must ensure that their data is managed from an enterprise perspective so that it can be used and shared by all business units. Enterprise information management (EIM) is about the administration of data. One industry expert describes EIM as “a function, typically dedicated to an organization in IT, for maintaining, cataloging, and standardizing corporate data.” This is done with the help of data stewards under the umbrella of a data strategy, and by establishing data-related standards, policies, and procedures.


Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI?

Light travels faster than electrons. The concept of using light as a substitute for carrying out heavy tasks (aka photonic computing/optical computing) dates back to the 1980s, when Nokia Bell Labs, an American industrial research and scientific development company, tried to develop a light-based processor. However, due to the impracticality of creating a working optical transistor, the concept didn’t take off. We experience optical technology in cameras, CDs, and even in Blu-ray discs. But these photons are usually converted into electrons before deployment in chips. Four decades later, photonic computing gained momentum when IBM and researchers from the Universities of Oxford and Münster developed a system that uses light instead of electricity to perform several AI model-based computations. At the same time, Lightmatter’s new AI chip has created a buzz in the industry. According to the company website, Envise can run the largest neural networks at three times the inferences per second of the Nvidia DGX-A100, with seven times the inferences per second per watt on BERT-Base with the SQuAD dataset.


Why Data Management Needs An Aggregator Model

The traditional approach to managing unstructured data has been storage-centric; you move data to a storage system, and the storage system manages your data and gives you some tools to search it and report on it. This approach worked and made things easier when data volumes were small and all of an enterprise's data could fit in a single storage solution. As enterprises shift to a hybrid multicloud architecture, they can no longer manage data within each storage silo, search for data within each storage silo and pay a heavy cost to move data from one silo to another. As GigaOm analyst Enrico Signoretti pointed out: "The trend is clear: The future of IT infrastructures is hybrid ... [and] it requires a different and modern approach to data management." Another key reason an aggregator model for data management is needed is that customers want to extract value from their data. To analyze and search unstructured data, vital information is stored in what is called "metadata" — information about the data itself. Metadata is like an electronic fingerprint of the data. For example, a photo on your phone might have information about the time and location when it was taken as well as who was in it. Metadata is very valuable, as it is used to search, find and index different types of unstructured data.
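The role metadata plays in cross-silo search can be sketched with a toy index in Python. The field names and records below are invented for illustration; a real aggregator would harvest this "fingerprint" information from many storage systems at much larger scale:

```python
# Each file is represented only by its metadata fingerprint,
# regardless of which storage silo holds the actual bytes.
files = [
    {"path": "/photos/beach.jpg", "taken": "2020-07-04", "location": "Lisbon"},
    {"path": "/photos/hike.jpg",  "taken": "2020-08-12", "location": "Porto"},
    {"path": "/docs/report.pdf",  "taken": "2021-01-15", "location": "Lisbon"},
]

def search(index, **criteria):
    """Find files whose metadata matches every given field, without
    ever reading the underlying file contents."""
    return [
        f["path"] for f in index
        if all(f.get(k) == v for k, v in criteria.items())
    ]

print(search(files, location="Lisbon"))
# ['/photos/beach.jpg', '/docs/report.pdf']
```

Because the index holds only metadata, it can span multiple storage systems and clouds without moving the data itself, which is the aggregator model's central claim.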


3 Ways To Improve Board-Level Focus on Third-Party Risk Management

Assessing the risks that third parties bring to your business shouldn’t begin once you have signed the contract. Instead, security and procurement teams should be reviewing known risks in potential vendors during the sourcing and selection stage of the vendor lifecycle. Unfortunately, though, only 31% of companies conduct thorough pre-contract due diligence, indicating there is a long way to go to overcome this obstacle. ... Third-party risk management can’t be a one-and-done task. It needs to be a continuous process built into the risk DNA of the enterprise. However, most organizations can get easily tripped up with performing vendor risk assessments, since half are still using manual spreadsheets to manage their vendors, and a further 34% say it takes over a month to complete an assessment of a top-tier vendor. This traditional static annual assessment approach must give way to a more dynamic process that incorporates real-time risk metrics. Agility should be the order of the day in assessing third parties. ... Effectively reducing vendor risk requires an understanding of how vendors are performing against expectations – both security and performance-related.


Compromised devices and data protection: Be prepared or else

While encryption alone isn’t sufficient to secure data, it’s also the case that multiple layers of encryption are often necessary to ensure that any exposed data is rendered unreadable and unusable. For example, an encryption tool like BitLocker, if used on its own, can leave data vulnerable in certain scenarios, such as if a power failure interrupts the encryption process or if a system administrator’s credentials are compromised. In the wrong hands, a system administrator account will be able to view all files as decrypted and in clear text. However, deploying a solution like the Encrypting File System (EFS) as a secondary encryption layer on top of BitLocker will provide additional file-level encryption. In this way, EFS makes it possible to ensure the encryption of sensitive data, even if an attacker has gained access to device hardware and has powerful credentials in hand. This approach provides the added benefit of making it possible to service devices without allowing data access or presenting any risk of exposure. By implementing a layered encryption strategy with protection at both the full-drive and file levels, organizations can take comfort that the loss of a particular device is hardly a loss at all.
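The layering argument can be illustrated with a toy two-layer scheme in Python. The keystream construction here is purely illustrative and is NOT a secure cipher; real deployments would combine full-disk encryption such as BitLocker with file-level encryption such as EFS:

```python
import hashlib
from itertools import count

def keystream_xor(data: bytes, key: bytes) -> bytes:
    # Toy XOR stream cipher (NOT secure; illustration only).
    # Expands the key into a keystream via repeated SHA-256 hashing.
    stream = b""
    for i in count():
        if len(stream) >= len(data):
            break
        stream += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

secret = b"quarterly payroll figures"
disk_key, file_key = b"disk-layer-key", b"file-layer-key"

# Encrypt at the file level first, then at the (simulated) disk level.
ciphertext = keystream_xor(keystream_xor(secret, file_key), disk_key)

# Compromising only the disk layer (e.g. stolen admin credentials)
# recovers the file-level ciphertext, not the plaintext.
one_layer = keystream_xor(ciphertext, disk_key)
print(one_layer == secret)                            # False
print(keystream_xor(one_layer, file_key) == secret)   # True
```

The takeaway matches the excerpt: defeating one layer still leaves the attacker staring at ciphertext, which is why the loss of a device need not mean the loss of its data.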



Quote for the day:

"Becoming a leader is synonymous with becoming yourself. It is precisely that simple, and it is also that difficult." -- Warren G. Bennis

Daily Tech Digest - March 13, 2021

4 ways to keep the cybersecurity conversation going after the crisis has passed

The report recommends adding a business information security officer (BISO) to improve business security alignment, building a top-down measurable program, and changing reporting structures so the CISO reports directly to the CEO. Ultimately, analysts say it’s the CISO’s responsibility to build relationships with executives and the board and have regular conversations with them. “It’s not just the board ignoring things or executives minimizing things, but cybersecurity people staying in their lane,” says Jon Oltsik, senior principal analyst at Enterprise Strategy Group and author of the report. “We need progressive and proactive CISOs to kind of shake the world up.” To maintain momentum, CISOs must keep the board’s attention with a steady stream of relevant information delivered in business terms and presented in the form of risk and strategy for cybersecurity, not just tech solutions. Security leaders and analysts offer some tips, tools, and frameworks to help translate security into strategy and keep the conversation going. If CISOs want to speak in board terms, “you have to speak strategically, and there are strategic business tools to do that,” says Lance Spitzner, director of SANS security awareness. 


The concept of justifiable healthcare and how big data can help us to achieve it

As a first necessary but not sufficient step, the evaluation requires consideration of any evidence on the efficacy of the intervention. AI and big data can be of help to mine and digest existing literature and evidence, at a much higher speed and in a more exhaustive manner than is currently possible using human skills. It is crucial that for this evaluation of efficacy, standardized core outcome sets are applied. These are sets including only outcomes that are: relevant to patients; measurable in an accurate and reliable manner; and discriminative. At the micro-level of the healthcare professional and the patient, justifiable healthcare is in fact an essential element to allow for genuine shared decision making. Indeed, justifiable healthcare provides all stakeholders involved with the argumentation and information necessary to decide which intervention has the highest probability to lead to the desired outcome given this specific condition affecting this specific patient and taking into account other available interventions. Big data and AI can be of help to present alternative options in a way that both patients and healthcare workers can easily understand, fine-tuned for the specific case of the patient.


Quantum computing: Quantum annealing versus gate-based quantum computers

All is not lost for gate-based methods – quite the contrary, in fact. GSK's researchers foresee that the expected increase in qubit count in computers like these will allow quantum devices to show a significant performance advantage over classical hardware, for pharmaceutically-relevant life science problems but also many other types of application. The results of the scientists' experiments are still in pre-print and have yet to be certified by peer review; in addition, the trials focus only on a specific problem – the use of quantum computing to assist drug discovery. Nevertheless, the research offers a valuable overview of the capabilities of quantum devices as they stand, and of the limitations of different approaches to quantum computing. The problem addressed by the scientists is well-established in classical computing. Called codon optimization, it consists of finding sequences of genetic code, called codons, that will ultimately lead to the expression of a particular gene. Up to six different codons can encode a single amino acid, and amino acids in turn form the proteins that a gene expresses.
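A toy classical version of codon optimization can be sketched in Python. The codon usage frequencies below are invented for illustration; a real optimizer would use measured usage tables for the target organism plus further constraints (GC content, secondary structure), and the quantum formulations in the study cast this search as an optimization over codon choices:

```python
# Hypothetical codon usage table: amino acid -> {codon: frequency}.
usage = {
    "L": {"CTG": 0.40, "CTC": 0.20, "CTT": 0.13, "TTA": 0.07},  # Leucine
    "S": {"AGC": 0.24, "TCC": 0.22, "TCT": 0.18},               # Serine
    "K": {"AAA": 0.43, "AAG": 0.57},                            # Lysine
}

def optimize(protein: str) -> str:
    """Greedy codon optimization: for each amino acid in the protein,
    pick the codon with the highest usage frequency."""
    return "".join(max(usage[aa], key=usage[aa].get) for aa in protein)

print(optimize("KLS"))  # AAGCTGAGC
```

The greedy choice is trivial here; the problem becomes a hard combinatorial search once interactions between neighbouring codons are scored, which is what motivates both annealing and gate-based quantum approaches.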


Could No-Code Enable Everything Ops?

“The story of B2B software is largely about automation,” said Chou. If we consider classic examples of corporate applications, we see one common thread — B2B software has always sought to automate traditional office work. And so, it came to pass that computing and software devoured all these office tasks, from accounting to time-tracking, communications and many other areas. Though computer-based office work delivered huge efficiency boosts, these interfaces often introduced new hurdles; bad UI design, difficult navigation and a hundred open tabs, just to name a few. From there, robotic process automation (RPA) aimed to operationalize a growing number of tedious digital workflows across Excel spreadsheets, web apps and desktop apps. In Chou’s words, “RPA took the concepts of test automation software, and then pointed it toward production systems.” Though RPA’s process of recording screen interactions gathered much interest and attention (and funding), screen scraping is ultimately too fragile to be effective. It adds technical debt over legacy systems. It can be expensive to implement. Lastly, it’s not process-centric and doesn’t implement reusable software-defined APIs.


Here's how digital transformation and sustainability can flourish together

The rapid digitalisation catalysed by COVID-19 presents the opportunity to rethink how we make decisions and how we apply technology in new and meaningful ways. Immense opportunity exists for enterprises that can capture the value of data to drive more sustainable solutions. For example, it’s estimated that the value unlocked by artificial intelligence in helping design out waste for food, keeping products and materials in use, and regenerating natural systems, could be up to $127 billion a year in 2030. The digital transformations of today must be purpose-led, delivering for all stakeholders as a requisite for company success. Spearheading that effort is the Forum's CEO Champions group on Accelerating Digital Transformation in a Post-COVID-19 World, which is led by Antonio Neri, the CEO of Hewlett Packard Enterprise (HPE). Today, this group published a playbook, Bridging Digital and Environmental Goals, designed to provide leaders with recommended actions and examples to leverage data-led insights and create products, strategies and business models that minimise their impact on the planet.


Security awareness programs: The difference between window dressing and behavior change

Instead of merely checking the annual compliance security box, good security awareness programs focus entirely on real-world outcomes and results. To achieve measurable results, companies need to make a real change in educating employees on cybersecurity and their role in protecting their companies. The core issue with “cookie-cutter” security training, in which all employees receive the same phishing simulation, is that it often fails to target at-risk users at the critical moment when a potential attack is in progress. Nor is it conducted with enough frequency to remain top of mind for employees. By implementing policies, controls, and technologies that focus on the individual, organizations can more effectively teach employees the right behaviors that will result in a cyber-savvy culture. ... Taking a behavior-based approach to security awareness training is more effective than traditional initiatives, reduces costs, and provides a measurable ROI for organizations. Consider lane assist technology. While the reasons a driver might drift into another lane range from fatigue to inattention to an inability to see the lines, alerting drivers exactly when they might be dangerously drifting into another lane helps them avoid a collision.


AI and you: how confusion about the technology that runs our world threatens democracy

Machine learning occupies an interesting position in the story of scientific progress. On one hand, it’s a natural outcome of developments in computer science that began in the 1980s. On the other hand, its total dependence on information — and its ability to make do with all sorts of information, including things like your keystrokes and heart rate — marks what could turn out to be a more radical break with previous technologies. Machine learning uses existing information to generate new information. But it also allows that new information to be put to a variety of questionable uses, including surveillance and manipulation. If you’ve ever been recommended products while shopping online, you’ve probably been profiled. Ever been denied a credit card application in short order? Again, you’ve probably been profiled. Algorithmic profiling presents a host of ethical and legal challenges, particularly around discrimination and privacy. But profiling is just the tip of an ever-expanding iceberg. Many uses of big tech pose a threat to individuals as individuals, which is bad enough.


The cyber security risks of working from home

The most obvious risk is that most of our tasks are conducted online. After all, if something’s on the Internet, there’s always the possibility of a cyber criminal compromising it. They might attempt to do this by cracking your password. This could be easier than ever if you’re reusing login credentials across the various online apps you need to stay in touch with your team. Meanwhile, according to Cisco’s 2020 CISO Benchmark Report, organisations are struggling to manage remote workers’ use of phones and other mobile devices: 52% of respondents said that mobile devices are now challenging to protect from cyber threats. ... Organisations should also be concerned about remote employees using their own devices. This might have been unavoidable given how quickly the pandemic spiralled and the suddenness of the government’s decision to implement lockdown measures. Still, where possible, all work should be done on a corporate laptop subject to remote access security controls. This should include, at the very least, 2FA (two-factor authentication), which will mitigate the risk of a crook gaining access to an employee’s account.
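
The 2FA codes generated by authenticator apps are usually time-based one-time passwords (TOTP, RFC 6238). A minimal sketch of how such a code is derived, using only Python's standard library; the shared secret is the RFC's published test value, not a real credential:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, for_time: float, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password (RFC 6238 style, HMAC-SHA1)."""
    counter = int(for_time) // step                      # 30-second time window
    msg = struct.pack(">Q", counter)                     # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation (RFC 4226)
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

# Server and client derive the same code from the shared secret and the clock.
shared_secret = b"12345678901234567890"   # RFC test secret, not a real key
code = totp(shared_secret, for_time=59)
print(code)  # -> 287082, the published RFC test vector
```

With the RFC test secret and timestamp 59 this reproduces the published six-digit vector, 287082; a real deployment would compare codes in constant time and accept a small clock-skew window.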


The future of data privacy: confidential computing, quantum safe cryptography take center stage

Quantum-safe cryptography aims to tackle the problems that will arrive the day we have a working quantum machine. While quantum computing is being actively worked on by engineers worldwide, with Honeywell, for example, ramping up the capacity of its System Model H1 to a quantum volume of 512, it is estimated that a full-capacity quantum computer could exist within the next 10 to 15 years. When that day arrives, however, the high computational power of these machines would render "virtually all electronic communication insecure," according to IBM, because quantum computers can factor large numbers, a problem whose difficulty underpins much of today's cryptography. To resolve this, standards based on lattice cryptography have been proposed. Lattice cryptography hides data in complex algebraic structures and is considered an attractive option for future-proofing data privacy architectures. According to IBM cryptographer Vadim Lyubashevsky, adopting lattice frameworks is unlikely to impact end users, and may actually improve computational performance. But why bother now, when full quantum machines do not exist?
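
The "complex algebraic structures" behind the proposed standards usually come down to the learning-with-errors (LWE) problem: recovering a secret from noisy inner products is believed hard even for quantum computers. A toy, deliberately insecure sketch with illustrative parameters (real schemes use vastly larger dimensions and carefully chosen error distributions):

```python
import random

q, n, samples = 97, 8, 40  # illustrative parameters, far too small to be secure

# Secret key: a random vector s.
s = [random.randrange(q) for _ in range(n)]

# Public key: noisy inner products (a, b = <a, s> + e mod q).
pub = []
for _ in range(samples):
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-1, 0, 1])                     # small error term
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    pub.append((a, b))

def encrypt(bit: int) -> tuple:
    """Sum a random subset of public samples; shift by q//2 to encode a 1."""
    subset = random.sample(pub, 4)
    a_sum = [sum(col) % q for col in zip(*(a for a, _ in subset))]
    b_sum = (sum(b for _, b in subset) + bit * (q // 2)) % q
    return a_sum, b_sum

def decrypt(ct: tuple) -> int:
    """The residue is near 0 for a 0-bit and near q//2 for a 1-bit."""
    a_sum, b_sum = ct
    residue = (b_sum - sum(ai * si for ai, si in zip(a_sum, s))) % q
    return 0 if min(residue, q - residue) < q // 4 else 1

decrypted = [decrypt(encrypt(bit)) for bit in (0, 1, 1, 0)]
print(decrypted)  # -> [0, 1, 1, 0]
```

Because each summed ciphertext accumulates at most four unit errors, well under q//4, decryption with the secret always succeeds here, while an attacker sees only the noisy samples.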


Enterprise architecture: a tool for business recovery?

From entirely new ways of working to permanent shifts in customer behaviour and operational networks, the world beyond the crisis is set to look drastically different. To emerge from the pandemic in a stronger position, organisations will need to directly address the vulnerabilities the pandemic has exposed. For instance, people may continue to be averse to gathering in large groups, ecommerce is unlikely to lose the gains it has made during multiple lockdowns, and of course, businesses globally have realised the benefits that the work-from-home model brings. These emerging trends will significantly alter the roadmap ahead, but more importantly, they’ll accelerate the exploration of new digital tools. A recent McKinsey report shows that nearly all organisations, whether traditional companies or startups, are re-orienting their business models to be more digital as a direct result of the impact Covid-19 has had on changing consumer behaviours, and many of these changes will outlive the current landscape. As we delve into this virtual world, we must prepare and ask ourselves: could parts of hospitality and tourism be replaced by VR? Will business meetings make use of holographic technology for a blended experience? Will self-driving vehicles or delivery drones spearhead the future of retail?



Quote for the day:

"The world is full of obvious things which nobody by any chance ever observes." -- Arthur Conan Doyle

Daily Tech Digest - March 12, 2021

Hiring developers? Here's how to keep them happy and productive

"Whiteboard coding was another thing that was just totally broken in engineering hiring. Asking people to code on a whiteboard is a different skill set. People don't do it for their day-to-day. It was silly for us to ask people to put code on a whiteboard, but we did it for years!" A better strategy for interviewing developers remotely, Pillar says, is a sort of BYOD policy, whereby hiring managers ask candidates to bring their laptops along to the interview with the understanding that they'll be performing some form of on-the-spot coding while they share their screen with the interviewer. "That's a way more productive way to get an excellent signal about the quality of a developer, because it's actually their environment and you can see them using the tools that they're familiar with," he explains. ... "A meeting is an extremely expensive thing for an engineer. It's way easier, unfortunately, to interrupt an engineer's flow in a remote world with a meeting because their calendar is open; you can just throw it in there and you don't even really think about it." Some software providers now offer analytics tools that will measure how workers' time is spent, some of which include the ability to measure interrupted time, also known as 'friction time'.


How Security Architecture Is Shaping Up for 2021

Access is often referred to as zero-trust network access, which seems incorrect to me, since it's application access, which is network access, which is the old traditional VPN piece. But that architecture makes no difference whether you're on or off the network. It uses an access proxy to provide a security and control context, and it provides identity components for users and devices. So it gives you this application [information] and then contextual information as applied per session. That's one architecture. One of the problems, obviously, when you have some of these architectures is that you try to build it and you've got five different vendors: you're trying to build it from an endpoint solution, you need a proxy, you need security, you need identity, you need these other contextual engineering management [techniques]. So customers have trouble when they try to build it across maybe five or six vendors. That's why I think it's a really important architecture, especially when people are gonna be on and off the network, backward and forward, more and more often; it doesn't really matter whether it's a zero trust architecture. So that's one really important component.


IT security strategy: A CISO's 5 essentials

One of the most common cyberattack vectors remains exploiting known vulnerabilities in OS software and applications. To combat these attacks, stay on top of the maintenance level of your hardware and software. Unsupported components should be upgraded or replaced as soon as possible. Conduct vulnerability scans of the full infrastructure monthly, and correct issues promptly. Ensure your scans include third-party products and applications. ... A famous baseball coach once said, “You can observe a lot by just looking.” Make better use of the logs and reports provided by the systems and applications running your business. Delineate baselines and metrics defining security health. A change in activity patterns or metrics may be an early indicator of trouble brewing. Develop, maintain, and test a practical security incident management plan so you will know what to do if faced with a real incident. Building a secure foundation isn’t easy in the best of times. While these five tips may not be as exciting as hunting for hackers or implementing a sophisticated security information and event management (SIEM) system, they are the building blocks of a strong foundation and offer the best way to move organizations forward safely.


Power Equipment: A New Cybersecurity Frontier

While IoT has been the catalyst for many positive developments, there are challenges with these expanding interconnections. For power management, the ability to connect backup equipment like an uninterruptible power supply (UPS) can prove helpful in enabling IT teams to monitor and maintain essential infrastructure more efficiently. However, like any other network-connected devices, they become assets that need to be secured against potential cyber breaches. Though a UPS doesn't traditionally come to mind when envisioning ways cybercriminals infiltrate a network, the same could be said for other inconspicuous devices like HVAC units. Yet that's exactly what hackers pursued when they gained access to Target's system and stole data on over 40 million credit and debit cards. And consider how hackers penetrated the network of a North American casino using an Internet-connected thermometer inside an aquarium. Finding the vulnerability in a fish tank, of all places, allowed hackers to access the casino's database and ultimately steal private customer data.


How do I select a SOAR solution for my business?

A SOAR solution should enable teams to automate the identification and response process across significant volumes of disparate data streams, so that the prioritisation of threats and vulnerabilities becomes almost seamless and far more operationally efficient. Implemented correctly, SOAR solutions help Security Operations Centres (SOCs) deal with threats faster and more efficiently. Integrating SOAR with other security tools, such as Security Information and Event Management (SIEM), can transform SOC teams' business and technology outcomes through automation while also increasing efficiency. By combining forces, organisations can use SOAR to augment the capabilities of SIEM, offering a comprehensive solution. SIEMs collect and store data in a useful manner, which SOAR can use to automatically investigate and respond to incidents, reducing the need for manual operations. What's more, in tackling one of the biggest challenges for SOC teams to date, SOAR solutions can help to ingest, sort, prioritise and combine duplicate alerts to reduce the number of false positives.
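
The de-duplication and prioritisation step can be sketched in a few lines. A toy triage pass; the alert fields and severity levels are hypothetical, not taken from any particular SOAR product:

```python
from collections import defaultdict

SEVERITY = {"critical": 0, "high": 1, "medium": 2, "low": 3}

def triage(alerts):
    """Combine duplicate alerts (same source and rule) and sort the
    de-duplicated queue by severity, most urgent first."""
    merged = defaultdict(lambda: {"count": 0})
    for a in alerts:
        key = (a["source"], a["rule"])
        merged[key]["count"] += 1
        merged[key].update(source=a["source"], rule=a["rule"],
                           severity=a["severity"])
    return sorted(merged.values(), key=lambda a: SEVERITY[a["severity"]])

alerts = [
    {"source": "fw-1",  "rule": "port-scan",         "severity": "medium"},
    {"source": "fw-1",  "rule": "port-scan",         "severity": "medium"},
    {"source": "edr-3", "rule": "ransomware-beacon", "severity": "critical"},
    {"source": "fw-1",  "rule": "port-scan",         "severity": "medium"},
]

queue = triage(alerts)
print(queue)  # the critical alert first; three port-scan duplicates collapsed to one
```

Four raw alerts become a two-item queue: the critical detection surfaces first, and the repeated port-scan alerts collapse into one entry carrying a count.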


Fintech Innovation Done Right: Be A Creator

Fintech can also create entirely new product categories. One mechanism I’ve explored previously is the embedded fintech strategy: a financial product can be embedded into other products to change their nature, availability and customer engagement model. Companies like Opendoor give customers the ability to make cash offers for homes to make them more competitive. Boost allows companies to launch insurance products and bundle them into a broader offering. Zola bundles loans and mobile repayments with Pay-As-You-Go financing to unlock demand for home solar systems in Africa. Without the built-in financing, the systems would be unaffordable, making the loan a core piece of the business model rather than a feature. Similarly, many boot camps engage in income sharing agreements: rather than charging tuition, the program is repaid through a percentage of future earnings for a set period of time. Finally, players like ZhongAn have created fully automated insurance built into products. For instance, in a partnership with a telephone provider, they can automatically detect a broken screen.


Untangl CEO discusses how Insurtech startups are disrupting finance markets

“There’s no doubt that technology is going to disrupt the insurance sector like it has any other industry,” said Stewart. “But I think insurance has been particularly slow when it comes to modernising, and that’s been highlighted by the rapid shift to the cloud. “The pandemic has been another catalyst in a rethink of operations going forward, but a cultural problem has persisted in an industry that’s underinvested in technology while finding it difficult to innovate in such a risk-averse, high-margin landscape.” Stewart went on to explain that companies in the space can spend up to 18 months making decisions in response to potential problems, an approach he described as “a reflection of how not to do it”. However, while the insurance sector has found innovating quickly with short-term projects more difficult than other sectors, the past year has seen areas such as personal lines become more agile and intuitive. “It’s not easy because the industry has experts in their complex fields, who are representing stakeholders with billions in capital behind them, and any mistakes can be financially disastrous,” Stewart added.


The Brain of Security

In fairness, security analysts do seek to make risk-informed decisions, much as the human brain does instinctively. However, they can only do so based on the information they are given, and there are not many security programmes in which business context is provided to the analyst to aid decision making. Recognising this reality, organisations are seeking to quantify their cyber risk to better align security with the business, drive remediation and response activities, support investment decisions and demonstrate return on security investment. Many have already embraced the move to a quantified understanding of risk, only to be let down because current approaches require too much manual data collection, too much training and professional services support, don’t connect this newfound understanding with the ability to take action, and fail to meet the need to efficiently and cost-effectively mitigate risk. Organisations need to acknowledge that understanding and quantifying risk is critical to building an effective security programme in this day and age. Solely orchestrating and automating security actions with an intelligence-led approach is not enough.


CIO Agenda for Right Now: Priorities a Year Into the Pandemic

First, the COVID-19 pandemic brought a period of rapid change and challenges for organizations, and that has accelerated technological change. Future conditions will be significantly different from the past and even from the present, according to White. Second, operating models have had to change. Now that the dust has settled, organizations will be using the rest of 2021 to review and consolidate all of the changes that have happened, White said. Third, the pandemic has raised new business priorities. Work from home has been one of them. But deeper in that trend, the pandemic has disrupted traditional research conducted by business and has raised different priorities for innovators, according to White. Plus, the work-from-home trend will drive significant organizational changes. Remote leadership poses challenges for presence and influence, according to White. Leaders and managers will need to adapt their styles to encompass non-line-of-sight supervision and performance management. Fourth, the CIO role has changed and will continue to change. Technology and the CIO's response to the pandemic, lockdown, and economic downturn meant that many organizations were able to survive the initial crisis.


OpenAI’s state-of-the-art machine vision AI is fooled by handwritten notes

Researchers from machine learning lab OpenAI have discovered that their state-of-the-art computer vision system can be deceived by tools no more sophisticated than a pen and a pad. As illustrated in the image above, simply writing down the name of an object and sticking it on another can be enough to trick the software into misidentifying what it sees. “We refer to these attacks as typographic attacks,” write OpenAI’s researchers in a blog post. “By exploiting the model’s ability to read text robustly, we find that even photographs of hand-written text can often fool the model.” They note that such attacks are similar to “adversarial images” that can fool commercial machine vision systems, but far simpler to produce. Adversarial images present a real danger for systems that rely on machine vision. Researchers have shown, for example, that they can trick the software in Tesla’s self-driving cars to change lanes without warning simply by placing certain stickers on the road. Such attacks are a serious threat for a variety of AI applications, from the medical to the military.



Quote for the day:

"The most difficult thing is the decision to act, the rest is merely tenacity." -- Amelia Earhart

Daily Tech Digest - March 11, 2021

Microsoft cracks the RPA market! ... not really

The big question is, what do you actually automate as individual users? People, more often than not, have no idea what it is they actually want to do with the software. This will change as eureka moments are shared and Windows users begin to understand that they no longer have to produce weekly reports, update databases or populate CRM systems by hand. But this is not a disruptive moment in the history of RPA; it's more akin to a dripping tap eventually filling up the bath. Microsoft is late to the game, very late. While RPA is new to some and still unheard of by others, it's now considered a fairly mainstream business software solution, propped up by process discovery, process mining and a plethora of consulting firms all benefiting from the success of the technology. Of course, the very entrance of Microsoft into the market is much more than 'a bit of news', but it really just validates the potential for RPA to evolve further and might even justify the hype that some find so distasteful and worthy of their online wrath. Contrary to what the detractors claim, most, if not all, of the large RPA implementations are delivering high ROI. Technology tends to succeed or fail based on what it delivers.


New free software signing service aims to strengthen open-source ecosystem

Sigstore uses the OpenID authentication protocol to tie certificates to identities. This means a developer can use their email address or an account with an existing OpenID identity provider to sign their software. This is different from traditional code signing, which requires obtaining a certificate from a certificate authority (CA) that's trusted by the maintainers of a particular software ecosystem, for example Microsoft or Apple. Obtaining a traditional code signing certificate requires going through special procedures that include identity verification or joining a developer program. The sigstore signing client generates an ephemeral key pair and contacts the sigstore PKI (public-key infrastructure) service, which will be run by the Linux Foundation. The PKI service checks for a successful OpenID Connect grant and issues a certificate based on the key pair, which is then used to sign the software. The signing event is logged in the public log, after which the keys can be discarded. This is another difference from existing code signing: each signing event generates a new key pair and certificate. Ultimately, the goal is to have public proof that a particular identity signed a particular file at a particular time.
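
The "public log" that makes discarded keys safe is a tamper-evident transparency log. A toy model of the hash-chaining idea only, not sigstore's actual Rekor implementation; the artifact digests and identity here are made up:

```python
import hashlib
import json

class TransparencyLog:
    """A toy append-only log: each entry's hash chains over the previous
    entry's hash, so tampering with any record breaks every later hash."""

    def __init__(self):
        self.entries = []        # list of (record, chained_hash) pairs

    def append(self, record: dict) -> str:
        prev = self.entries[-1][1] if self.entries else "0" * 64
        payload = prev + json.dumps(record, sort_keys=True)
        h = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append((record, h))
        return h

    def verify(self) -> bool:
        prev = "0" * 64
        for record, h in self.entries:
            payload = prev + json.dumps(record, sort_keys=True)
            if hashlib.sha256(payload.encode()).hexdigest() != h:
                return False
            prev = h
        return True

log = TransparencyLog()
log.append({"artifact": "sha256:abc123", "identity": "dev@example.com"})
log.append({"artifact": "sha256:def456", "identity": "dev@example.com"})
print(log.verify())            # untampered log verifies

log.entries[0][0]["identity"] = "attacker@example.com"
print(log.verify())            # rewriting history breaks the chain
```

Because anyone can recompute the chain, a signing event cannot be quietly altered or removed after the fact, which is the property that lets the signing keys themselves be thrown away.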


Sustainability and Availability Big Wins for Digital Twins

“The impact of digital twins goes beyond the asset and extends to logistics,” Weiss explains. “This has significant implications for mission readiness if an asset deploys to remote or hazardous locations. By understanding the condition of any given asset at any given time, sustainment leaders can anticipate maintenance requirements, ensuring the right components and personnel are in the right place at the right time, making real the concept of condition-based maintenance plus (CBM+).” CBM, sometimes called predictive maintenance (PdM), is a maintenance methodology that utilizes sensors to assess the health of the system. The health information, in addition to other inputs, helps to drive maintenance activities. In a CBM environment, operating platforms, embedded sensors, inspections, and other triggering events determine when restorative maintenance tasks are required, based on evidence of need. Combined, these benefits drive affordable, resource-optimized sustainment operations and data-informed decisions that significantly increase operational availability. In a single use-case assessment of two aircraft engine components, a U.S. military service branch reported potential savings of approximately $42 million annually from using a digital twin.
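
The "evidence of need" logic at the heart of CBM can be sketched as a simple rules check over sensor readings. The thresholds and field names below are purely illustrative, not drawn from any real maintenance standard:

```python
def needs_maintenance(reading, vibration_limit=7.0, temp_limit=110.0):
    """Condition-based check: trigger restorative maintenance only when
    sensor evidence shows the component is actually degrading."""
    triggers = []
    if reading["vibration_mm_s"] > vibration_limit:
        triggers.append("vibration above alarm threshold")
    if reading["bearing_temp_c"] > temp_limit:
        triggers.append("bearing temperature high")
    return triggers

healthy = {"vibration_mm_s": 3.1, "bearing_temp_c": 85.0}
degraded = {"vibration_mm_s": 9.4, "bearing_temp_c": 118.0}

print(needs_maintenance(healthy))    # no triggers: keep operating
print(needs_maintenance(degraded))   # two triggers: schedule work and parts
```

Contrast this with calendar-based maintenance, which would service the healthy unit anyway; the condition-based rule is what drives the savings the excerpt describes.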


Enabling Your Workforce for the Digital Workplace

Changing company culture is just one step on the path to successful digital transformation. It requires the resources, knowledge, and skills to support and sustain the initiative. Providing sufficient training and targeted reskilling or upskilling for current employees is a necessary part of building the workforce of tomorrow. Digital literacy dovetails with key soft skills, including adaptability, problem-solving, effective communication, and emotional intelligence, which all correlate with effective teamwork and leadership. At the same time, investing in the workforce’s digital literacy helps nurture employee engagement and can improve retention. The costs of recruiting and training new staff can be substantial, so assessing and augmenting the current workforce’s digital literacy should be a central feature of any strategy. Building those foundational skills helps support adoption. Cybersecurity is another crucial focus area, and building a culture of security is central to organizational resilience. That’s why it’s no surprise to see that 59% of CFOs plan to increase budgets for IT in 2021. The number and sophistication of cyber threats have risen substantially, and a company is only as secure as its weakest link. This is especially true for the digital workplace.


When AI Alone Isn’t Enough?

Things are always more complicated than we would hope, especially when it comes to the promises of new technologies. After more than 30 years as a consultant and analyst in emerging technologies, I’ve learned that delivering on those promises is always more difficult and problematic than we expect. How is this playing out with machine learning models? First, there is the basic problem of the data itself. Without a clear cycle of data management, machine learning models may cause more problems than they solve. You must then think about how a business can explain how decisions derived from a machine learning model were made. Here are a few questions that may arise: What are the hidden and not-so-hidden biases in the data selected to create a model? What are the ethical challenges that must be managed as we move to this brave new world of artificial intelligence? How, for example, can a manager explain what business processes and rules are behind a model, and are those rules ethical? Do these models adhere to corporate or governmental requirements? How does an organization know if there is bias in a model that can impact business outcomes?
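
One concrete first step toward the bias question is comparing a model's decision rates across groups, a crude demographic-parity check. A minimal sketch over made-up decisions; real fairness audits use several complementary metrics:

```python
from collections import defaultdict

def approval_rates(decisions):
    """Per-group approval rate: a large gap between groups is a first
    signal that a model's outputs warrant a bias review."""
    totals = defaultdict(lambda: [0, 0])      # group -> [approved, seen]
    for group, approved in decisions:
        totals[group][0] += int(approved)
        totals[group][1] += 1
    return {g: round(a / n, 2) for g, (a, n) in totals.items()}

# (group, model approved?) pairs for two hypothetical applicant groups
decisions = [("A", True), ("A", True), ("A", False), ("A", True),
             ("B", False), ("B", False), ("B", True), ("B", False)]

print(approval_rates(decisions))  # -> {'A': 0.75, 'B': 0.25}
```

A 75% versus 25% approval gap does not prove the model is unfair, but it is exactly the kind of measurable signal that turns the questions above into something an organization can monitor.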


Multiple Attack Groups Exploited Microsoft Exchange Flaws Prior to the Patches

Security firms including Red Canary and FireEye are now tracking the exploit activity in clusters and anticipate the number of clusters will grow over time. ESET researchers have detected at least ten APT groups using the critical flaws to target Exchange servers. When used in an attack chain, the exploits for these vulnerabilities could allow an attacker to authenticate as the Exchange server and deploy a Web shell so they can remotely control the target server. When Microsoft released patches for the four Exchange server zero-days, it attributed the activity with high confidence to a Chinese state-sponsored group called Hafnium. Now, as researchers observe Web shells stemming from suspected Exchange exploitation, they believe far more groups are responsible for the growth in attack activity. In a blog post released March 9, Red Canary analysts report none of the clusters they observe significantly overlap with the group Microsoft calls Hafnium; as a result, they are now tracking these clusters separately. "We don't know who is behind these clusters – we aren't sure if it's the same adversaries working together or different adversaries completely," the researchers write. "We're focusing narrowly on what we observe on victim servers for our clustering."


Clearview AI sued in California by immigrant rights groups, activists

Clearview AI, the controversial firm behind facial-recognition software used by law enforcement, is being sued in California by two immigrants' rights groups to stop the company's surveillance technology from proliferating in the state. The complaint, which was filed Tuesday in California Superior Court in Alameda County, alleges Clearview AI's software is still used by state and federal law enforcement to identify individuals even though several California cities have banned government use of facial recognition technology. The lawsuit was filed by Mijente, NorCal Resist, and four individuals who identify as political activists. The suit alleges Clearview AI's database of images violates the privacy rights of people in California broadly and that the company's "mass surveillance technology disproportionately harms immigrants and communities of color." ... The lawsuit is the latest attempt by grassroots groups to clamp down on facial-recognition software, which is not widely regulated in the United States. In the absence of clear federal rules regarding the usage of the technology, a number of cities — such as San Francisco, Boston, and Portland, Oregon — have banned the technology in some capacity.


DeFi and tokenization together reshape the financial system

By integrating tokenized assets and equity into DeFi protocols, the functions of existing liquidity pools and interest-rate-based protocols, for instance, can be effectively applied to real-world assets, fostering mainstream adoption of DeFi. With tokenization, even features such as atomic swaps or flash loans become possible for real-world assets. This will substantially alleviate the current illiquidity in the DeFi sector and also enable a far more diverse range of investment opportunities from the perspective of CeFi (centralized finance). Tokenization efficiently bridges the gap between these two worlds. Semi-automated lending and trading systems atop blockchain networks are the future of finance. Here, cutting out intermediaries while democratizing access through blockchain-based governance models makes room for new solutions. In theory, anyone with a connection to the internet can leverage DeFi protocols. Individuals worldwide can now conduct financial transactions more efficiently and at lower cost, especially when compared to the global remittances market. Currently, this still requires users to own crypto-assets.
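
Atomic swaps typically rest on hash-locked contracts: an asset can be claimed only by revealing a secret preimage, and revealing it on one chain lets the counterparty claim on the other, which is what makes the swap all-or-nothing. A toy sketch of the hashlock itself; the on-chain contract and timeout-refund mechanics are omitted:

```python
import hashlib

def hashlock(secret: bytes) -> str:
    """The lock published on-chain: only the hash, never the secret."""
    return hashlib.sha256(secret).hexdigest()

def claim(lock: str, preimage: bytes) -> bool:
    """Funds locked under `lock` are released only to whoever reveals
    the matching preimage."""
    return hashlock(preimage) == lock

# Alice picks a secret and locks her asset under its hash on chain A;
# Bob locks his asset under the same hash on chain B.
secret = b"correct horse battery staple"   # illustrative secret
lock = hashlock(secret)

print(claim(lock, secret))          # Alice claims, revealing the secret
print(claim(lock, b"wrong guess"))  # nobody else can claim
```

Once Alice claims Bob's asset she has necessarily published the preimage, so Bob can immediately use it to claim hers: either both legs complete or neither does.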


How To Build A Career In IoT?

The Internet of Things has truly lived up to its hype. Between 2014 and 2019, the share of businesses employing IoT technologies grew from 13 percent to 25 percent. According to McKinsey, the number of IoT-connected devices will touch 43 billion by 2023. The blooming IoT domain has opened up a world of possibilities for skilled engineers and professionals, and the growing demand has widened the supply-demand gap. Research from the Inmarsat Research Programme showed that as many as 47 percent of surveyed companies lacked appropriate IoT skills and were forced to outsource projects. A suboptimal IoT workforce means up to 75 percent of IoT projects take twice as much time to complete, as noted by a Gartner survey. There are no fixed eligibility criteria to enter this field; however, engineering graduates specialising in IT, computer science, or electrical and electronics engineering are better fits. A few colleges provide undergraduate courses in IoT or offer a computer science specialisation with IoT as a major subject. ... Since an IoT engineer deals with a large amount of data that is often unreliable, the ability to manage such data is of paramount importance.


Getting your application security program off the ground

Like water, attackers always look for the path of least resistance. It should not be surprising that instead of trying to get through sophisticated defenses on the infrastructure side, they explore vulnerabilities in web and mobile applications and web services. ... A potential attacker could exploit a vulnerability by executing an API call with specially crafted parameters or a payload, which may lead to an injection attack and result in the attacker obtaining sensitive data (e.g., financial information) or executing unauthorized actions (e.g., transferring money to a bank account controlled by the attacker). In many cases, Dzihanau notes, such actions would be difficult to distinguish from typical application usage. ... “Unfortunately, automation is not everything, and developers need to obtain the necessary knowledge and make security part of their day-to-day work. Security aspects need to be addressed not only during testing but continuously in design, development and deployment too. While the terms security by design and shift-left are well known, organizations are only now starting to realize what changes and implications this brings to the development process.”
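
The "specially crafted parameters" attack is easy to reproduce against string-built SQL, and just as easy to block with bound parameters. A sketch using Python's built-in sqlite3; the table and values are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (user TEXT, balance REAL)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100.0), ('bob', 55.5)")

malicious = "alice' OR '1'='1"   # crafted API parameter

# Vulnerable: the parameter is spliced into the SQL string, so the
# attacker's OR clause matches every row.
leaked = conn.execute(
    f"SELECT * FROM accounts WHERE user = '{malicious}'"
).fetchall()

# Safe: a bound parameter is treated strictly as data, never as SQL,
# so the crafted value matches nothing.
safe = conn.execute(
    "SELECT * FROM accounts WHERE user = ?", (malicious,)
).fetchall()

print(len(leaked), len(safe))  # -> 2 0
```

The two queries differ by one character of API usage, which is why parameterised queries (and their equivalents for NoSQL, LDAP and shell calls) are the standard shift-left control for this whole vulnerability class.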



Quote for the day:

"Leadership is working with goals and vision; management is working with objectives." -- Russel Honore

Daily Tech Digest - March 10, 2021

7 challenges women face in getting ahead in IT

It’s a razor-thin line that’s nearly impossible to walk. Not being assertive won’t get you promoted. Being assertive makes you unlikeable. Either one makes promotion difficult. Not being liked holds women back, according to Lean In, an organization dedicated to overcoming these obstacles, because women who aren’t liked are seen as less competent. ... “My advice to women is to make sure you speak up,” says Easwar. “Don’t let your ability to express your opinions be impacted by prejudice. You have to break that mold, because not speaking up can be more detrimental and a disservice to other women trying to follow a similar growth path.” Women can’t change this without help, though, and altering the way your company manages meetings can go a long way toward preventing this. “We switch who runs meetings every week,” says Breanne Acio, co-founder and CEO of Sēkr, a female-run tech startup. ... Women are often left out of group events or conspicuously ignored because they are not members of the “in group,” which, in IT, is often white men. Whether this is conscious or unconscious, it is a bias that makes it difficult to get promoted.


How to mitigate security risks as cloud services adoption spikes

As organizations continue to move to the cloud, it’s vital to build a trusted foundation for computing. Security is only as strong as the layer below it, which is why it’s important to have trusted security technologies rooted in hardware for the cloud. For example, technologies such as disk and network traffic encryption protect data in storage and during transmission, but data can be vulnerable to interception and tampering while in use in memory. “Confidential computing” is a rapidly emerging usage category that protects cloud data while it’s in use in a Trusted Execution Environment (TEE). TEEs enable application isolation in private memory regions called enclaves to protect code and data while in use on a CPU. For instance, healthcare organizations can securely protect data with TEEs — including electronic health records — and create trusted computing environments that preserve patient privacy across the cloud. Full memory encryption is another technology used to protect hardware platforms in the cloud, ensuring that memory accessed by CPUs is encrypted, including customer credentials, encryption keys, and other IP or personal information on the external memory bus.


What is the future of the Proptech space?

With Proptech companies needing to increasingly digitise their operations to meet evolving customer demands, they will also need to continue paying attention to regulations, which may be equally liable to change. Roger Tyrzyk, director of global gaming and sales for UK/I at IDnow, believes that such rules would need to be reconsidered for true innovation to occur, and that this could be increasingly prioritised by regulators in the future. “Consumers have changed; they expect processes to be much quicker, more efficient and certainly not an inconvenience,” said Tyrzyk. “In response, the property sector – both conveyancing and rental markets – must adapt and the only way to do this is to leverage the innovative technologies that already exist. “However, to empower sectoral change, property regulators and the Government must revisit the archaic laws that currently exist in the UK and align them with the 21st Century. For example, under current laws, every letter a solicitor sends out must be wax-sealed. To deliver a truly digitised process and harness the power of Proptech, this must be addressed and overhauled.”


How to protect backups from ransomware

If your backup system is writing backups to disk, do your best to make sure they are not accessible via a standard file-system directory. For example, the worst possible place to put your backup data is E:\backups. Ransomware products specifically target directories with names like that and will encrypt your backups. This means that you need to figure out a way to store those backups on disk in such a way that the operating system doesn’t see those backups as files. For example, one of the most common backup configurations is a backup server writing its backup data to a target deduplication array that is mounted to the backup server via server message block (SMB) or network file system (NFS). If a ransomware product infects that server, it will be able to encrypt those backups on that target deduplication system because those backups are accessible via a directory. You need to investigate ways to allow your backup product to write to your target deduplication array without using SMB or NFS. All popular backup products have such options.
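As a rough illustration of the audit step this implies, a script can cross-check backup directories against SMB/NFS mounts, since anything reachable through a standard file-system path is fair game for ransomware. The mount table and paths below are hypothetical (Linux `/proc/mounts` format is assumed):

```python
# Hedged sketch: flag backup directories that sit under SMB/NFS mounts,
# i.e. backups reachable as ordinary files. Mount data is hypothetical.
NETWORK_FS = {"cifs", "smbfs", "nfs", "nfs4"}

def exposed_backup_paths(mount_table: str, backup_paths: list) -> list:
    """Return backup paths that live under a network file-system mount.

    mount_table uses /proc/mounts format: device mountpoint fstype options...
    """
    net_mounts = []
    for line in mount_table.strip().splitlines():
        device, mountpoint, fstype = line.split()[:3]
        if fstype in NETWORK_FS:
            net_mounts.append(mountpoint)
    return [
        p for p in backup_paths
        if any(p == m or p.startswith(m.rstrip("/") + "/") for m in net_mounts)
    ]

mounts = """\
/dev/sda1 / ext4 rw 0 0
dedupe01:/backups /mnt/backups nfs rw 0 0
//nas/share /mnt/share cifs rw 0 0
"""
print(exposed_backup_paths(mounts, ["/mnt/backups/daily", "/var/backups"]))
# flags the NFS-mounted path; the local path is not reported
```

Any path the check flags is one to move behind a backup product's native protocol rather than an SMB/NFS share.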


How the UK Has Set its Sights on Becoming a Fintech Haven in the Wake of Brexit

With global competition for fintech talent within the sector, cities like London face fresh competition from European destinations like Berlin, Barcelona and Amsterdam, which are becoming increasingly popular with fintech professionals who hold the right to work across the EU. This exodus is exactly what the UK is looking to prevent, and the danger posed by the situation has been underlined by Ricky Knox, CEO at fintech bank Tandem: “Tech visas are a great thing and essential if we are going to keep a competitive tech and fintech sector. Over half of our coders are from outside the UK and some have already left due to Brexit.” Another aspect of the review has called on the UK to revise its approach to the regulation of crypto-assets as a means of welcoming more fintech businesses in the future. Recent restrictive measures by UK regulators, including bans on the sale of crypto derivatives and an anti-money laundering register, have created a somewhat hostile environment for blockchain and decentralised finance fintech businesses looking to set up camp in London.


The world's most powerful supercomputer is now up and running

"The ultra-high-performance computer Fugaku is about to go into full-scale operation," said RIST president Yasuhide Tajima. "I look forward to seeing this most powerful 'external brain' ever developed by humanity helping to expand our knowledge, enabling us to gain deeper insights into the foundation of matter in both time and space, giving us better structural and functional analysis of life, society, and industry, enabling more accurate predictions; and even designing the unknown future of humanity." Fugaku is designed to carry out high-resolution, long-duration, and large-scale simulations, and boasts up to 100 times the application performance of its predecessor, the K supercomputer, which was decommissioned in 2019. This unprecedented compute power has earned the device the top spot for two consecutive terms in the Top500 list, which classifies the 500 most powerful computer systems around the world. At 442 petaflops, Fugaku stands a long way ahead of competitors, with three times more capability than the number two system on the list, IBM's Summit, which has a performance of 148.8 petaflops.


Researchers Describe a Second, Separate SolarWinds Attack

Secureworks describes Supernova as a Trojanized version of the legitimate dynamic link library used by the SolarWinds Orion network monitoring platform. The hackers were spotted using Supernova to conduct further reconnaissance on a SolarWinds client's network, which eventually led to the exfiltration of some credentials, the researchers say. SolarWinds confirmed to ISMG that the Supernova attack researched by Secureworks is associated with earlier warnings of a second group of hackers exploiting a SolarWinds Orion flaw, which has since been patched. Secureworks' researchers discovered the Spiral attack on one organization in November 2020 when they spotted hackers exploiting a SolarWinds Orion API vulnerability on an internet-facing SolarWinds server during an incident response effort. "The threat actor exploited a SolarWinds Orion API authentication bypass vulnerability (CVE-2020-10148) to execute a reconnaissance script and then write the Supernova web shell to disk," the researchers say. "There is no known connection between Spiral activity and the recently reported compromises of on-premises Microsoft Exchange servers," says McLellan.


The Perfect Pair: Digital Twins and Predictive Maintenance

Sensors are the heart of any measurement, control, and diagnostic device. Device telemetry is collected using the smart sensors available in the hardware/software environment and then used to create the digital twin model of the physical equipment. All of the data is then aggregated and compiled to generate actionable information. The digital twin model is continuously updated to mirror the current state of the physical thing. It can then be used to effectively model, monitor, and manage devices from a remote location. It also enables continuous intelligence and an estimate of the time until the next needed maintenance, which the maintenance system can use to schedule service at the optimal time. ... Not all use cases or business problems can be effectively solved by predictive maintenance using a digital twin. Here are important qualifying criteria that need to be considered during use case qualification: The problem must be predictive in nature, meaning there should be a target or an outcome to predict; the problem should have a record of the operational history of the equipment that contains both good and bad outcomes; ... 
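The update-and-predict loop described above can be sketched in miniature. This is an illustration under assumed names (a hypothetical `PumpTwin` with a single wear indicator and failure threshold), not a reference implementation:

```python
# Minimal sketch of a digital twin that mirrors a pump's state from
# telemetry and extrapolates a wear trend to the next maintenance time.
# All names and thresholds here are hypothetical.
from dataclasses import dataclass, field

WEAR_LIMIT = 1.0  # assumed failure threshold for the wear indicator

@dataclass
class PumpTwin:
    wear: float = 0.0
    history: list = field(default_factory=list)  # (hour, wear) samples

    def ingest(self, hour: float, wear_reading: float) -> None:
        """Update the twin to mirror the physical pump's current state."""
        self.wear = wear_reading
        self.history.append((hour, wear_reading))

    def hours_until_maintenance(self) -> float:
        """Linear extrapolation of the wear trend to the failure threshold."""
        if len(self.history) < 2:
            return float("inf")  # not enough operational history yet
        (t0, w0), (t1, w1) = self.history[0], self.history[-1]
        rate = (w1 - w0) / (t1 - t0)  # wear units per hour
        if rate <= 0:
            return float("inf")  # no degradation observed
        return (WEAR_LIMIT - self.wear) / rate

twin = PumpTwin()
for t, w in [(0, 0.10), (100, 0.20), (200, 0.30)]:
    twin.ingest(t, w)
print(twin.hours_until_maintenance())  # roughly 700 hours at this wear rate
```

A production twin would model many correlated signals and use a trained model rather than a straight line, but the shape of the loop — ingest telemetry, mirror state, predict the next service window — is the same.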


Talent and workforce effects in the age of AI

To meet their AI aspirations, companies will likely need the right mix of talent to translate business needs into solution requirements, build and deploy AI systems, integrate AI into processes, and interpret results. However, most early adopters face an AI skills gap and are looking for expertise to boost their capabilities. In fact, 68 percent of executives surveyed report a moderate-to-extreme skills gap, and more than a quarter (27 percent) rate their skills gap as “major” or “extreme.” The gap is evident across all countries surveyed, ranging from 51 percent reporting moderate-to-extreme gaps in China to 73 percent reporting the same in the United Kingdom. What do leaders regard as the “most needed” roles to fill their company’s AI skills gap? The top four most-needed roles are “AI builders,” who are instrumental in creating AI solutions: researchers to invent new kinds of AI algorithms and systems, software developers to architect and code AI systems, data scientists to analyze and extract meaningful insights from data, and project managers to ensure that AI projects are executed according to plan.


Digitally Transforming Trusted Transactions Through Biometrics, ML & AI

The EIU study found that biometrics will become the dominant customer-authentication method for payments: "The survey shows optimism that evolving technological innovations like biometrics (fingerprint, facial, or voice recognition) could further reduce the trade-off between fraud, security, and [consumer experience], with 85% expecting biometrics to be used to authenticate the vast majority of payments in the next 10 years." Biometrics eliminates one of the riskiest forms of authentication: the username/password combination. Although there are some risks involved with biometrics — identity thieves have figured out ways to steal a fingerprint — they continue to be safer than other forms of authentication. During digital transactions, they give customers peace of mind that their data is not at risk. In fact, a recent study by biometrics company Fingerprints found that 56% of consumers would prefer to use a biometric sensor on their payment card instead of a PIN, signaling market appetite for embedding biometric authentication solutions into the payment process. Pattern recognition also plays a major role in fraud detection. Historical data shows patterns of how customers behave, what they are searching for, etc.
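As a toy illustration of the pattern-recognition idea (the data and threshold are made up, and real systems use far richer models), a fraud check can compare a new transaction against a customer's historical spending distribution:

```python
# Illustrative baseline for pattern-based fraud detection: flag a transaction
# whose amount deviates sharply from a customer's historical spending.
# The data and the 3-sigma threshold are hypothetical.
import statistics

def is_anomalous(history: list, amount: float, threshold: float = 3.0) -> bool:
    """Flag amounts more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(amount - mean) > threshold * stdev

history = [42.0, 38.5, 51.0, 45.5, 40.0, 47.0]  # typical card spend
print(is_anomalous(history, 44.0))   # in line with past behavior
print(is_anomalous(history, 950.0))  # far outside the pattern
```

Production fraud engines layer many such signals — merchant category, location, velocity — and learn the patterns from labeled history rather than a fixed rule, but the underlying idea is the same comparison of new behavior against established patterns.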



Quote for the day:

"The key to successful leadership today is influence, not authority." -- Ken Blanchard