Daily Tech Digest - December 29, 2020

The lessons SaaS businesses must learn from 2020

Put simply, there are two types of churn for SaaS businesses. Voluntary, or “active”, churn is when a customer chooses to cancel their subscription with a business. Involuntary, or “passive”, churn comes from subscriptions being cancelled for accidental reasons, such as failed payments. Typically, you would expect 20-40% of churn to be involuntary, most of it coming from card payments that fail to process successfully. This is positive, because it means you can put measures in place for better payment acceptance to stop involuntary churn from happening. However, because of the Covid crisis, and the higher number of companies competing for the same pool of customers, the percentage of voluntary churn is likely to be higher in 2021 as customers shop around for a better deal. At Paddle, we talk to around 200 new software companies a month, as well as our 2,000 existing customers, when advising them on how to sell into over 200 countries across the world, so we have seen first-hand the impact churn reduction strategies can have on a software business’s growth.
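
As a rough illustration of that split, here is a minimal Python sketch (the cancellation events and reason codes are invented for the example) that computes the involuntary share of churn:

    # Toy cancellation log; a real system would pull this from billing data.
    cancellations = [
        {"customer": "a", "reason": "cancelled_by_user"},  # voluntary
        {"customer": "b", "reason": "payment_failed"},     # involuntary
        {"customer": "c", "reason": "card_expired"},       # involuntary
        {"customer": "d", "reason": "cancelled_by_user"},  # voluntary
    ]

    INVOLUNTARY = {"payment_failed", "card_expired"}

    involuntary = sum(1 for c in cancellations if c["reason"] in INVOLUNTARY)
    print(f"involuntary share: {involuntary / len(cancellations):.0%}")  # 50% here; 20-40% is typical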


Mac Attackers Remain Focused Mainly on Adware, Fooling Users

A recent report by The Citizen Lab at the University of Toronto underscored that the commercial sale of zero-click iMessage exploits, for example, continues to allow governments to buy access to target dissidents. Now, malware families that previously targeted only Windows, and sometimes Linux, are also being ported to target Macs, says Ian Davis, a senior threat researcher at BlackBerry. "Historically MacOS threats mainly centered around adware and trojanized downloaders of well-known software," he says. "While these less-than-lethal families are still the majority of encountered samples, advanced attacks and toolsets are now being developed and deployed along with their counterparts for Windows and Linux." Overall, the sophistication of MacOS threats is increasing, researchers say, and families previously seen on Windows or Linux are now targeting MacOS systems as well. In 2020, the community saw increased cases of ransomware, botnet campaigns, and information-stealing backdoors in MacOS environments. The Mac user, in other words, is the vulnerability: while at least a quarter of the threats encountered by Windows systems are malware, less than 1% of those encountered by Mac systems are considered malware, Malwarebytes stated in its February report. Instead, attackers targeting the Mac look to fool the user into taking the steps necessary to allow malware to run.


2021 blockchain predictions from Energy Web

Numerous countries—from China and Singapore in Asia to Sweden and France in Europe to Saudi Arabia and the United Arab Emirates in the Middle East—are all exploring central bank digital currency (CBDC) equivalents of their respective fiat currencies. Crypto exchanges like Kraken are taking the unprecedented step of getting bank licenses. Decentralized exchanges are overtaking centralized incumbents (in August, for example, Uniswap surpassed Coinbase Pro in trading volume). And in mid-December Bitcoin reached an all-time high, cresting US$23,000 for the first time, driven mainly this time by the interest of large enterprises. Meanwhile, the ‘data for free’ model that has existed for years is coming to an end, and not just because of legislation such as the EU’s GDPR and California’s CCPA. Consumers are fighting back against losing control of their own data as tech giants find themselves the target of lawsuits. In April, a U.S. federal appeals court revived litigation that accused Facebook of violating users’ privacy rights by illegally tracking their Internet activity. In September, a coalition of Canadian provinces sued Google in a proposed class action lawsuit alleging the Internet giant was collecting data without consent. That same month the Irish Data Protection Commission issued a preliminary decision to halt Facebook’s trans-Atlantic data transfers.


Team-Level Agile Anti-Patterns - Why They Exist and What to Do about Them

At the team level, lack of adequate training, mentoring and coaching is responsible for a good bit of it, but it is hard to divorce the team from the organisation. A negative organisational culture will of course affect its teams. Agile can be counterintuitive, especially when it contradicts traditional business experience, but a good Scrum Master/Coach should not only explain a best practice, but also explain why it is best practice and what bad things happen if the anti-pattern remains unaddressed. Some examples from my personal experience: I once worked on a team where a tech lead met with the rest of the Development team immediately after Sprint Planning to allocate Stories to each member of the team. I initially didn’t know this was happening, but my suspicions were soon raised by a couple of things: Sprint Backlog items were not being picked up in priority order, and the tech lead only worked on the easier items. I asked individuals why they were working on lower-priority stories when there was a higher-priority story remaining in the To Do column. That’s when it all came out in the wash. The tech lead didn’t mean any harm; when I spoke with him, he told me that’s what was expected of him by managers in his previous postings.


The Great Data Protection Debate: India’s new Data Protection Bill

The Data Protection Bill suggests that personal data should include data “…relating to a natural person who is directly or indirectly identifiable, having regard to any characteristic, trait, attribute or any other feature of the identity… or any combination of such features, or any combination of such features with any other information…” [Section 3(28)]. Verbiage aside, the Bill essentially says that any data that identifies you, in connection with any other information, is your personal data. Naturally, this creates a recipe for competing claims. What if ‘any other information’ were to include somebody else’s personal data? Such complications have led data experts to argue that citizens should hold control over their data collectively, rather than individually. These ‘data co-operatives’ would act like trade unions within conventional markets. Among other things, they may negotiate rates for data, ensure quality digital output, invoice organizations that benefit from the output, and distribute the profits. Global data trusts may not be far away. In January, at the World Economic Forum, Microsoft CEO Satya Nadella called for greater respect for “data dignity” - meaning individuals should have greater control over their data and a larger share in the value it creates.


The need for zero trust security a certainty for an uncertain 2021

After a few years of relative predictability, data privacy promises to get more “interesting” in 2021. The GDPR and CCPA regulatory regimes each notched milestones in 2020. Under the GDPR, regulators had (as of this writing) assessed a record level of fines totaling €220 million. California’s CCPA enforcement kicked in on July 1st, and voters in that state passed additional privacy restrictions via a November ballot initiative (the California Privacy Rights Act, or CPRA). The CPRA extends and modifies the CCPA, with new mandates taking effect at the end of 2022. Here’s where things are going to get interesting. Optimistically, effective COVID-19 vaccines will make a return to in-person work possible by mid-year. But it’s just as likely that delays in distribution, reluctance to inoculate and lingering stress on the healthcare system will extend work-from-home practices for many through 2021. Either way, organizations will face obligations and temptations to collect more data on their employees – about their immunization status, health situation, work habits, even their social interaction patterns – than ever before. Today, most practitioners focus on risks from external threat actors. But with a bracing action in October, a GDPR authority showed that regulators are equally concerned with human resources data when it slapped clothing retailer H&M with a €35 million fine for illegal employee surveillance.


DevSecOps: The good, the bad, and the ugly

DevSecOps requires patience and tenacity. Any DevSecOps implementation takes a minimum of a year—anything less than that is incomplete. It will involve a lot of planning and design before you start setting up the solution. You must first identify the gaps in your current process and then determine the tools required to support the process you intend to implement. You will need to coordinate with a variety of teams to get buy-in and instruct them to implement the required changes. None of this happens overnight. Making changes to your process affects all people involved in the process and all applications following the process. If all your applications are being scanned using a common set of libraries, any change in these libraries will impact all apps unless you put specific conditions in place. Adding a new application to this process may take a long time. Onboarding .Net applications usually takes a lot more time because they must build correctly from the command line. Visual Studio tends to hide a lot of build errors and provides dependencies at runtime; this is less true for MSBuild. So when an app team builds an application using Visual Studio and checks it in, an automated process using the MSBuild command line can break for a variety of reasons.
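
One cheap guard against this class of onboarding failure is to rebuild the checked-in solution with the MSBuild command line before it enters the shared pipeline. A minimal sketch of such a gate in Python (the solution name and flags are illustrative, not from the article):

    import subprocess
    import sys

    # Rebuild from a clean slate so errors Visual Studio papers over surface here.
    result = subprocess.run(
        ["msbuild", "MyApp.sln", "/t:Rebuild", "/p:Configuration=Release"],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        print("Command-line build failed; fix before onboarding the app:")
        print(result.stdout[-2000:])  # tail of the build log
        sys.exit(1)
    print("Solution builds cleanly under MSBuild.")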


Reference Architecture For Healthcare  – Core Capabilities 

Users of the reference architecture are planners, managers, and architects. They need to be able to deal with various aspects – the delivery of healthcare, use of technology, commercial viability, adherence to quality, regulatory compliance. They need to plan, establish, and maintain the capabilities required in their healthcare organization. For these users, we need to provide a formal and versioned specification that outlines the elements of the reference architecture and how these elements relate to each other. In addition, this specification needs to provide guidance on how to implement and use the reference architecture. Making the reference architecture actionable calls for a reference implementation, which is a released model of the specification. Ideally, the authors of the reference architecture should make this reference implementation available for download. Let us assume the reference implementation is developed in a specific modeling tool. For users of different modeling tools, the reference implementation should also be available in a neutral industry-standard exchange format, such as XMI or MOF. ... In many countries, healthcare organizations need to establish a Quality Management System. They want to use a blueprint to achieve compliance with ISO 9001 for healthcare.


Trends push IT and OT convergence opportunities and challenges

Historically, IT excluded real-time localized OT data, and OT lacked IT’s data aggregation capabilities. Edge AI capabilities require both real-time computing and aggregation. Organizations have struggled to incorporate IoT and edge data into current processes because the data must be actionable in real time, Devine said. Organizations must feed in the data from the physical OT system to learn from it and make decisions based on it. To aggregate data, organizations must break down data silos in different systems, such as manufacturing supply chains. Approximately 75% of data loses its value in milliseconds, and data is only valuable to organizations if it is actionable, Devine said. If organizations must send data from the edge to the cloud, then real-time actions aren't viable. The challenge is getting an aggregate view across data silos to take localized action, but when real-time aggregation is achieved, organizations can derive more insights and look for new revenue opportunities. "IoT is the great provider of data. CEOs and CIOs [must] continually look to see how data can fuel digital transformation and drive innovation. IoT data is the fuel for analytics, machine learning… but it's also the source for CIOs to help fuel new business models [such as] as-a-service [and] work from anywhere," Turner said.
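
The pattern Devine describes, act locally in milliseconds and aggregate upstream, can be sketched in a few lines of Python; the sensor values, threshold, and uplink stub below are invented for illustration:

    from statistics import mean

    TEMP_LIMIT_C = 90.0  # hypothetical safety threshold

    def send_to_cloud(summary):  # stand-in for a real uplink
        print("aggregate ->", summary)

    def on_reading(buffer, temp_c):
        if temp_c > TEMP_LIMIT_C:      # localized real-time action at the edge
            print("local action: throttle machine")
        buffer.append(temp_c)
        if len(buffer) >= 60:          # ship only periodic aggregates upstream
            send_to_cloud({"avg_temp_c": round(mean(buffer), 1), "n": len(buffer)})
            buffer.clear()

    buf = []
    for t in (85.1, 91.7, 88.2):
        on_reading(buf, t)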


Using Microsoft 365 Defender to protect against Solorigate

From the threat analytics report, you can quickly locate devices with alerts related to the attack. The Devices with alerts chart identifies devices with malicious components or activities known to be directly related to Solorigate. Click through to get the list of alerts and investigate. Some Solorigate activities may not be directly tied to this specific threat but will trigger alerts due to generally suspicious or malicious behaviors. All alerts in Microsoft 365 Defender provided by different Microsoft 365 products are correlated into incidents. Incidents help you see the relationship between detected activities, better understand the end-to-end picture of the attack, and investigate, contain, and remediate the threat in a consolidated manner. ... The threat analytics report also provides advanced hunting queries that can help analysts locate additional related or similar activities across endpoint, identity, and cloud. Advanced hunting uses a rich set of data sources, but in response to Solorigate, Microsoft has enabled streaming of Azure Active Directory (Azure AD) audit logs into advanced hunting, available for all customers in public preview. These logs provide traceability for all changes done by various features within Azure AD.
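
For teams that prefer to run such hunts programmatically, the advanced hunting API can execute the same queries. A hedged Python sketch follows; the endpoint, the AdvancedHunting.Read.All permission, and the query are drawn from Microsoft's public Solorigate guidance, but verify all three against current documentation before relying on them:

    import json
    import urllib.request

    TOKEN = "<Azure AD app token with AdvancedHunting.Read.All>"  # acquisition not shown
    QUERY = """
    DeviceProcessEvents
    | where InitiatingProcessFileName =~ "solarwinds.businesslayerhost.exe"
    | summarize count() by DeviceName, FileName
    """

    req = urllib.request.Request(
        "https://api.security.microsoft.com/api/advancedhunting/run",
        data=json.dumps({"Query": QUERY}).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["Results"][:5])  # first few matching rows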



Quote for the day:

"As a leader, you set the tone for your entire team. If you have a positive attitude, your team will achieve much more." -- Colin Powell

Daily Tech Digest - December 28, 2020

6 habits of successful IT leaders in 2021

“One of the many things we have learned from this crisis is how much improvement many of us need as IT leaders. Getting into the habit of working on developing our emotional intelligence daily will make us better leaders. This is often pointed out in others; however, we need to examine ourselves and find better ways to deal with the many emotions that arise from our current circumstances. IT leaders need to examine their own level of empathy as they manage folks they may no longer be able to walk over to for a conversation whenever they please. As we lead during this time of flexible schedules and a distributed workforce, focus on developing more empathy and, honestly, just a bit more grace.” “Be vulnerable and provide an atmosphere that will allow your team to feel supported to still do their best work even in this difficult time. Do not be that leader with a team that looks to get as far away from you as possible following this crisis, or the leader whose team members throw in the towel before this crisis ends just to maintain their sanity.” – Cedric Wells, Director, IT Infrastructure Services, The Gorilla Glue Company ... Meditation is a powerful habit that can unlock this superpower. Many top business leaders, like Ray Dalio, and bestselling authors, like Yuval Harari, credit much of their success to meditation.


SolarWinds Attack Gives Rise to New Runtime Security Models

A critical observation to make about this attack is that even though the attackers already had a digitally signed backdoor, they still needed to bring additional malicious code into the environment. The backdoor was a pretty big chunk of code and contained several C2 (command and control) functions compiled as part of the legitimate product. And yet, even this unusually big backdoor had no means to spread and perform sophisticated injection and theft scenarios. It required a post-deployment fileless malware (FireEye called it TEARDROP). It is thought that TEARDROP deployed a version of the Cobalt Strike BEACON payload, a penetration testing tool made for red teams that can also be used by attackers. This fact is critical since it is true of almost any attack and of most other backdoor cases. They look like tiny innocent coding oversights – basically, like any other vulnerability created as an honest mistake. From this point on, intentional backdoors and accidental vulnerabilities are used in very similar ways. Both are utilized to bring real malicious code – the exploit – into the target environment and perform the actual attack.


2021 will be the year open source projects overcome their diversity problems

In October 2020, the Linux Foundation announced a new Software Developer Diversity and Inclusion project to draw on science and research to deliver resources and best practices that increase diversity and inclusion in software engineering. Following the age-old tenet that “you cannot manage what you don’t measure”, the Hyperledger Diversity, Civility, and Inclusion (DCI) Working Group is focused on “measuring and improving the health of our open source community.”  In the OpenJS community, the Node+JS diversity scholarship program provides support to those from traditionally underrepresented or marginalized groups in the technology or open source communities who may not otherwise have the opportunity to attend the event for financial reasons. At KubeCon + CloudNativeCon this year, The Cloud Native Computing Foundation announced The Inclusive Naming Initiative to help remove harmful, racist, and unclear language in software development. At IBM, we had a similar program underway, and we have joined the CNCF initiative to further the cause. ... The AI Inclusive initiative seeks to increase the representation and participation of gender minority groups in AI. It offers events, tutorials, workshops, and discussions to guide community members in their AI careers.


Homomorphic Encryption: The 'Golden Age' of Cryptography

The origins of homomorphic encryption date back to 1978. That's when a trio of researchers at MIT developed a framework that could compute a single mathematical operation (usually addition or multiplication) under the cover of encryption. The concept gained life in 2009, when Craig Gentry, now a research fellow at the blockchain-focused Algorand Foundation, developed the first fully homomorphic encryption scheme for his doctoral dissertation at Stanford University. Gentry's initial proof was simply a starting point. Over the past decade, security concerns related to cloud computing, the Internet of Things (IoT), and the growing demand for shared and third-party data have all pushed the concept forward. Along the way, more powerful homomorphic algorithms have emerged. Today, the likes of IBM and Microsoft have entered the space, along with the US Defense Advanced Research Projects Agency (DARPA) and an array of startups. "There is a tremendous benefit to being able to perform computations directly on encrypted data," says Josh Benaloh, senior cryptographer at Microsoft Research. "This allows computations to be outsourced without risk of exposing the data."
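
You can get a feel for computing on encrypted data with the open-source python-paillier library (pip install phe). Paillier encryption is only partially homomorphic, supporting addition of ciphertexts and multiplication by plaintext constants, but it illustrates the core idea:

    from phe import paillier

    public_key, private_key = paillier.generate_paillier_keypair()

    enc_a = public_key.encrypt(17)
    enc_b = public_key.encrypt(25)

    enc_sum = enc_a + enc_b   # addition performed entirely on ciphertexts
    enc_scaled = enc_a * 3    # multiplication by a plaintext constant

    print(private_key.decrypt(enc_sum))     # 42
    print(private_key.decrypt(enc_scaled))  # 51

A fully homomorphic scheme like Gentry's removes the restriction to those operations, at a much higher computational cost.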


How to securely hash and store passwords in your next application

A "salt" is a random piece of data that is often added to the data you want to hash before you actually hash it. Adding a salt to your data before hashing it will make the output of the hash function different than it would be if you had only hashed the data. When a user sets their password (often on signing up), a random salt should be generated and used to compute the password hash. The salt should then be stored with the password hash. When the user tries to log in, combine the salt with the supplied password, hash the combination of the two, and compare it to the hash in the database. Without going into too much detail, hackers commonly use rainbow table attacks, dictionary attacks, and brute-force attacks to try and crack password hashes. While hackers can't compute the original password given only a hash, they can take a long list of possible passwords and compute hashes for them to try and match them with the passwords in the database. This is effectively how these types of attacks work, although each of the above works somewhat differently. A salt makes it much more difficult for hackers to perform these types of attacks. Depending on the hash function, salted hashes take nearly exponentially more time to crack than unsalted ones. 


SaaS security in 2021

It’s clear to IT leaders that unvetted SaaS solutions (shadow IT) pose a variety of risks, including exposure of sensitive information, data ownership issues and regulatory compliance problems. The question is who is best suited to mitigate those risks, and in 2021, more companies will find that it takes a multidisciplinary strategy. A proactive governance approach requires a defined process involving a multidisciplinary team that ensures visibility and directly addresses risks to keep exposure within acceptable levels. Companies have to classify data in terms of integrity, confidentiality and availability to find the ideal balance between security and costs and determine acceptable risk levels. Cloud providers share responsibility to keep data secure along with the company, so it’s important to define exactly who is responsible for what. Companies typically manage user access, endpoint devices and data while SaaS vendors oversee apps, virtual machines, databases, etc. To fulfill their governance objectives, IT leaders will look for SaaS providers that offer multiple configuration options, including password settings/identity federations and authorization models, as well as availability plans to meet goals related to recovery time and recovery points.


Top five telecoms trends for 2021

5G has had many false starts, but 2021 could be the year when it really starts to take a predominant role in the telecoms space. With so many people now working remotely and using collaboration and messaging tools or video calls to communicate, we’ve started to see the demise of the traditional phone call. 5G is an ideal replacement for landlines, using a SIM card to deliver fixed wireless access (FWA) via a cell tower rather than having to physically install fibre cables into streets and homes. Investing in 5G infrastructure to give more workers around the country access to high-quality, superfast connectivity is looking more and more like a political imperative to keep as much of the economy as possible working and productive. If it’s in the national interest, we might even see government support being provided to networks to deliver widespread 5G… or so networks will be hoping. ... Working from home has been a dominant theme of the coronavirus pandemic. Even if vaccination programmes return life to “normal” next year, some workplaces may not reopen their doors, on the basis that there is no longer a compelling commercial case to maintain a physical presence. All the necessary infrastructure businesses need to function, including telecoms, can be hosted in the public cloud. Remote connectivity is all they need.


The future of work is happening now thanks to Digital Workplace Services

It’s absolutely true that the pandemic elevated digital workplace technology from being a nice-to-have, or a luxury, to being an absolute must-have. We realized after the pandemic struck that public sector, education, and more parts of everyday work needed new and secure ways of working remotely. And it had to become instantaneously available for everyone. You had every C-level executive across every industry in the United States shifting to the remote model within two weeks to 30 days, and it was also needed globally. Who better than Dell on laptops and these other endpoint devices to partner with Unisys globally to securely deliver digital workspaces to our joint customers? Unisys provided the security capabilities and wrapped those services around the delivery, whereas we at Dell have the end-user devices. ... One of the big challenges in a merger or acquisition is how to quickly get the acquired employees working as first-class citizens as quickly as possible. That’s always been difficult. You either give them two laptops, or two desktops, and say, “Here’s how you do the work in the new company, and here’s where you do the work in the old company.” 


7 predictions for what lies ahead for health tech in 2021

The industry has heard about advances in AI for years, but in 2021, healthcare will start to see the benefits of machine learning in solutions that are highly scalable, predicts Tom Knight, CEO of Invistics. New technology funded by the National Institutes of Health (NIH), for example, can detect and fix many problems with medication administration, while helping to raise hospital revenues by millions of dollars annually, Knight said. “Providers are increasingly accepting AI’s role in medicine and the capability to identify sequences and trends in data that humans cannot,” said Yann Fleureau, co-founder and CEO of Cardiologs. Kimberly Powell, vice president and general manager at NVIDIA Healthcare, predicts that hospitals will get “smarter.” Similar to the experience at home, smart speakers and smart cameras will help automate and inform activities. The technology, when used in hospitals, will help scale the work of nurses on the front lines, increase operational efficiency and provide virtual patient monitoring to predict and prevent adverse patient events, said Powell. ... John Matthews, managing director of healthcare and life sciences at Teradata, said the smart money is on leaders that recognize the difference between solutions that solve problems and trends that attract mob mentality and next-silver-bulletism.


Remote work: 10 ways to upgrade your working from home setup

Cybersecurity is more important than ever for this newly distributed and heterogeneously equipped workforce, for whom commuting is a fading memory (along with real-world interaction with colleagues and clients). Although there are obvious downsides to remote working, including work/life balance and long-term mental health, many of us are likely to continue working from home on a regular basis after the pandemic. That being so, it's obviously a good idea to have the best equipment for the job: there's a big difference between spending a couple of hours on your laptop at the kitchen table outside normal working hours and making this arrangement your primary workspace. To get an idea of the kind of setups that knowledge workers should be looking at in 2021 and beyond, it's worth examining the contents of ZDNet contributors' home offices, as featured on this site over recent weeks. These are journalists who have been working from home for years, and who are also, by definition, up-to-speed with the latest technology. This means that their gear is mostly at the power-user end of the knowledge worker spectrum, giving a good indication of what may become standard fare in the 'new normal'.



Quote for the day:

“People rarely succeed unless they have fun in what they are doing.” -- Dale Carnegie

Daily Tech Digest - December 27, 2020

Adopting AI Responsibly to Prevent Risks to Your Brand and Customer Experience

For years, AI has been successfully used in a variety of sectors, from chatbots to the automotive industry. The easier question to answer would be: Where is AI not being used yet? The challenge is how to place ethics above boardroom priorities, which put profit ahead of people. I wish I could say this is a thing of the past, but many companies still have this goal, even if it’s not intentional. I am not saying that profit is not or should not be the aim of businesses. I mean that fairness toward customers should be considered above any self-interest. Profit is often the result of substantial and desirable ethical work for employees, customers, partners, and stakeholders. Unfortunately, we still have companies that unintentionally (?) break customer trust, and in some cases were even aware of the issue. This can seriously affect any past efforts to develop a brand people trust. Even if your CX is perceived as great, with one breach of trust you will lose your key assets: employees and customers. Customers oftentimes love AI and machine learning (ML). Both businesses and customers alike reap the benefits of these complementary technologies. What we all want – based on a recent BCG survey – is something quite simple.


Why Is Cloud Governance Critical for the Success of Cloud Adoption?

The objective of cloud governance is not to limit access to cloud resources but to streamline it and manage it more efficiently. An intelligently designed cloud governance policy lays the foundation for the success of your cloud adoption. ... The absence of a cloud governance framework usually means that an alternate governance policy, typically a more centralized one, exists. Such a centralized approach inhibits fast decision-making, which works against organizational goals like agility and flexibility in adding new revenue streams, expanding to new markets, reducing time-to-market, and so on. It impedes the company’s responsiveness to market dynamics, giving the competition more room to meet customer needs. A comprehensive cloud governance framework complements an organization’s goals without compromising its resource utilization objectives. ... Cloud service providers recommend that their customers move multi-tenant workloads residing in a single account into separate cloud accounts. This allows organizations to give users precise control over only the workloads that are relevant to them. Such siloed management of access drastically limits the financial and security fallout resulting from an issue – be it a technical issue or a security issue.


Implications Of AI On Everyday Documents

Industries generate millions of documents every month as a byproduct of business processes, and each of these documents contains meaningful snippets of information hidden deep inside. Once the datasets are identified, collected, and cleansed, it is time to move to the next step. The next step involves giving meaning to text extracted from digital assets such as documents, text files, and scanned images, and using these datasets to feed downstream business apps, set up workflows and optimise business processes. Previous attempts at using AI to understand documents failed because they focused on the co-occurrence of individual words and phrases in individual business documents. It was time to move beyond that by creating tools that could understand different portions of a document and their unique usage in any organisation. The result is an intelligent model that can look for specific entities such as dates, contract numbers, purchase order numbers, etc. in different documents in minutes to generate meaningful insights and accelerate business outcomes. Think itinerary processing, financial compliance, auditing, renewal follow-up, invoice processing, and so on, all reviewed, identified, and automated.
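
At its simplest, entity extraction of this kind can be sketched with regular expressions; production systems use trained models, and the patterns and sample text below are invented for illustration:

    import re

    text = "Invoice under PO-48213, contract C-2020-0917, due 2021-01-15."

    patterns = {
        "purchase_order":  r"\bPO-\d+\b",
        "contract_number": r"\bC-\d{4}-\d{4}\b",
        "date":            r"\b\d{4}-\d{2}-\d{2}\b",
    }

    entities = {name: re.findall(rx, text) for name, rx in patterns.items()}
    print(entities)
    # {'purchase_order': ['PO-48213'], 'contract_number': ['C-2020-0917'], 'date': ['2021-01-15']}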


The digital revolution: eight technologies that will change health and care

Wearable devices are in a newer category of technologies encompassing smartwatches (eg, an Apple Watch), activity trackers (eg, a Fitbit) and connected patches (eg, a smart bandage or smart plaster). These are generally in direct contact with the wearer for long durations, generating large quantities of data on specific biometrics or behaviours. Many large technology companies are positioning these devices as health or wellness devices not medical devices – currently side-stepping regulatory requirements. However, there is potential for these devices to be widely used in health and care, as well as by individuals to improve their health and care. For example, a wearable sensor measuring heart rate can give a truer indication of a person’s heart-rate at various stress levels (sitting, standing, walking, etc) over time instead of a single one-off measure in a surgery which could be erroneous due to patient anxiety or stress. App stores already feature thousands of health and wellbeing apps, encompassing everything from diet diaries and mindfulness guidance to period trackers and musculoskeletal rehabilitation support. However, the uptake by the health and care system has been patchy due to a range of issues including quality, evidence, clinician knowledge, confidence and skills, and integration into pathways.


Diversity and inclusion make IT stronger

“Diversity at its core takes advantage of people’s unique backgrounds and skill sets and makes for richer content and discovery of what we do,” says Darren Dworkin, senior vice president and CIO at Cedars-Sinai; the healthcare organization was ranked No. 3 for D&I on the 2020 Best Places to Work in IT list. “The IT group spends a lot of time working closely with all sorts of departments and stakeholders … and having folks with different backgrounds and skills reflects that and helps us relate and translate so we can be a department that contributes to the mission of the organization.” For Erickson Living, diversity & inclusion has long been a core value, exemplified through one of the three pillars in its Employee Transformation initiative, which includes curiosity, competence, and community. Erickson, which ranked No. 1 for diversity on the 2020 Best Places to Work in IT list, maintains a robust D&I council made up of employees and leadership and offers D&I training at all levels to instruct managers, directors, and rank-and-file employees in such issues as how to overcome unconscious bias in the workforce and how to promote civil treatment among colleagues.


The Season for Nonprofit Cybersecurity Risks to Reach New Heights

Nonprofits and charities frequently outsource a lot of their day-to-day IT work or make use of cloud-hosted solutions, such as software-as-a-service (SaaS) options. However, there’s a growing trend among threat actors to exploit third-party providers in order to gain access to their customers’ data assets. As the past year’s high-profile breach of Blackbaud should remind us, nonprofits may be at extra risk from these types of attacks. It’s vital to examine the data privacy and security policies that your provider has in place, as well as to check that all relevant compliance audits are up to date. Cloud use has grown across industries over the course of 2020. This, in turn, puts increased pressure on managed service providers, who are challenged to find the talent they need to manage their data in the cloud. (Cloud environments require more specialized skills because they’re so complex.) Finally, it may be worthwhile to purchase cybersecurity insurance if you don’t already carry it. A well-chosen policy can absorb many of the financial risks that come with collecting donations online. 


Digital Transformation: The Key To Tackling Climate Change

The starting point is to understand how and where energy is consumed, lost or wasted and here digital technologies are key. Sensors that can monitor performance, software that can connect operations with IT systems, automation and analytics will equip organisations and individuals alike with the ability to better manage and optimise their environment whether at work or home. The good news is that most of the energy and digital automation technology already exists to enable us to do so. Consider data centres. According to a report by the International Energy Agency in June, there was an increase in global internet traffic of 40% between February and April as the use of the cloud for remote working became more prevalent. The physical aspects of work across many industries have changed, and we can expect digital channels and e-commerce to remain the default for the ways things are done for the foreseeable future and long after the pandemic has been consigned to history. Yet far from being like the ‘dark satanic mills’ William Blake wrote about in the first industrial revolution, data centres can be carbon neutral, carbon positive, even. EcoDataCenter in Sweden is the world's first climate-positive data centre.


Top Digital Banking Transformation Trends for 2021

Financial marketers were thrust into the spotlight with the COVID crisis, needing to respond to unforeseen circumstances in an instant. Instead of selling services, marketers were being asked to customize communication to help customers deal with the financial impact of the pandemic. Blanket communication around branch closures quickly needed to transform into personalized messages around loan payment deferrals and how to use unfamiliar digital tools. The coming year will bring the next evolution of financial marketing, leveraging data and advanced analytics to provide predictive personalization. This use of AI and machine learning will result in tailored websites, real-time financial recommendations, and a level of test-and-learn capability far beyond what was imagined just a few years ago. For the vast majority of financial institutions, achieving this level of personalization means playing catch-up with consumer expectations. Accenture found that nearly half (48%) of consumers abandoned a purchase process when the website did not personalize the experience, with nearly all consumers (91%) saying they are more likely to do business with brands that know them, look out for them and reward them.


8 Soft Skills That Make You an Even Better Leader

Emotional intelligence (EQ) is defined as “the capacity to be aware of, control and express one’s emotions, and to handle interpersonal relationships judiciously and empathetically.” Those with high EQ are better able to handle high-pressure situations, conflict resolution, constructive criticism, and more. This ability is highly sought-after for teams, especially ones made up of differing backgrounds. According to a survey conducted by CareerBuilder, 75% of hiring managers valued EQ over IQ. Hard skills and intelligence are more easily taught to employees while EQ takes more time and understanding to grasp. ... Overwhelm is something many entrepreneurs must deal with. That’s the world we live in today. That’s where self-motivation comes into play. We must all learn how to manage our energy. Energy comes not just from having a balanced diet, but also from our personal drive to achieve, resilience, and commitment. A personal drive to achieve is directly linked to our mindset. Research shows that those with a growth mindset are far more likely to succeed in the endeavors they engage in because they believe they can improve. Resilience is born out of courage to overcome challenges.


Data Distribution in Apache Ignite

Inevitably, the evolution of a system that requires data storage and processing reaches a threshold: either too much data accumulates, so the data simply does not fit into the storage device, or the load increases so rapidly that a single server cannot manage the number of queries. Both scenarios happen frequently. Usually, in such situations, two solutions come in handy—sharding the data storage or migrating to a distributed database. The solutions share a common feature: a set of nodes manages the data. ... The problem of data distribution among the nodes of the topology can be described in terms of the set of requirements that the distribution must comply with: the algorithm must allow topology nodes and front-end applications to discover unambiguously on which node or nodes an object (or key) is located; the more uniform the distribution of data among the nodes, the more uniform the workloads on the nodes are (here, I assume that the nodes have approximately equal resources); and if the topology is changed because of a node failure, the changes in distribution should affect only the data that was on the failed node. It should also be noted that, if a node is added to the topology, no data should be swapped among the nodes that are already present in the topology.
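
One classic algorithm that meets all three requirements is rendezvous (highest-random-weight) hashing; Apache Ignite's default affinity function is a variant of it. A minimal Python sketch with invented node names:

    import hashlib

    def score(node: str, key: str) -> int:
        h = hashlib.sha256(f"{node}:{key}".encode()).digest()
        return int.from_bytes(h[:8], "big")

    def owner(nodes, key: str) -> str:
        # Every caller computes the same owner independently (requirement one).
        return max(nodes, key=lambda n: score(n, key))

    nodes = ["node-a", "node-b", "node-c"]
    keys = [f"key-{i}" for i in range(10_000)]
    before = {k: owner(nodes, k) for k in keys}

    # Requirement three: when node-c fails, only its keys are redistributed.
    after = {k: owner(["node-a", "node-b"], k) for k in keys}
    moved = sum(1 for k in keys if before[k] != after[k])
    on_c = sum(1 for v in before.values() if v == "node-c")
    print(moved, on_c)  # equal: only node-c's data moved

Uniformity (requirement two) falls out of the hash: each node wins roughly an equal share of keys.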



Quote for the day:

"If you find a path with no obstacles, it probably doesn't lead anywhere." -- Frank A Clark

Daily Tech Digest - December 26, 2020

Ransomware: Attacks could be about to get even more dangerous and disruptive

Ransomware attacks have become more powerful and lucrative than ever before – to such an extent that advanced cyber-criminal groups have switched to using it over their traditional forms of crime – and it's very likely that they're just going to become even more potent in 2021.  For example, what if ransomware gangs could hit many different organisations at once in a coordinated attack? This would offer an opportunity to illicitly make a large amount of money in a very short amount of time – and one way malicious hackers could attempt to do this is by compromising cloud services with ransomware. "The next thing we're going to see is probably more of a focus on cloud. Because everyone is moving to cloud, COVID-19 has accelerated many organisations' cloud deployments, so most organisations have data stored in the cloud," says Andrew Rose, resident CISO at Proofpoint. We saw a taster of the extent of the widespread disruption that can be caused when cyber criminals targeted smartwatch and wearable manufacturer Garmin with ransomware. The attack left users around the world without access to its services for days. If criminals could gain access to cloud services used by multiple organisations and encrypt those, it would cause widespread disruption to many organisations at once.


Overcoming Data Scarcity and Privacy Challenges with Synthetic Data

Synthetic data is data that is artificially generated rather than collected by real-world events. It is data that serves the purpose of resembling a real dataset but is entirely fake in nature. Data has a distribution, a shape that defines the way it looks. Picture a dataset in a tabular format. We have all these different columns and there are hidden interactions between the columns, as well as inherent correlations and patterns. If we can build a model to understand the way the data looks, interacts, and behaves, then we can query it and generate millions of additional synthetic records that look, act, and feel like the real thing. Now, synthetic data isn’t a magical process. We can’t start with just a few poor-quality data points and expect to have a miraculous high-quality synthetic dataset from our model. Just like the old saying goes, "garbage in, garbage out," in order to create high-quality synthetic data, we need to start with a dataset that is both high-quality and plentiful in size. With this, it is possible to expand our current dataset with high-quality synthetic data points.
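
As a deliberately simple illustration of "model the distribution, then sample from it," the sketch below fits a multivariate Gaussian to two numeric columns and draws synthetic records; real synthetic-data tools use far richer models, and the columns here are invented:

    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-in for a real table with two correlated numeric columns (e.g. age, income).
    real = rng.multivariate_normal([35, 60_000], [[64, 40_000], [40_000, 1.44e8]], size=500)

    mu = real.mean(axis=0)
    cov = np.cov(real, rowvar=False)  # captures the correlation between columns

    synthetic = rng.multivariate_normal(mu, cov, size=10_000)  # fake rows, real shape
    print(synthetic[:3].round(1))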


How Brexit Could Help London Evolve From A Fintech Center Into A DeFi Hub

The popularity of DeFi—using crypto technology to recreate traditional financial instruments such as loans and insurance—has exploded over the last year or so, growing to a $16 billion global market. The price of ethereum, the world's second largest cryptocurrency by value, has soared this year as investors pour funds into DeFi projects that are built on top of it. "There's more and more DeFi innovators in London," said Stani Kulechov, the founder and chief executive of London-based technology company and DeFi protocol Aave, speaking over the phone. "Up until recently, fintechs and banks have been all about innovating on the front-end—the user experience. Now, DeFi is helping the back-end innovate." Aave, a money market for lending and borrowing assets, has become one of the top DeFi protocols since it was created in 2017 and was given an Electronic Money Institution license in July by the U.K.'s Financial Conduct Authority (FCA). "I think we'll see London emerge as a hub for DeFi," added Kulechov. The City of London, a financial powerhouse rivaled only by New York, is currently under threat as the U.K. prepares to end its transition out of the European Union at the end of this month.


Outlook 2021: Designing data governance policies to promote domestic startups

With more and more startups relying on data-driven business models and analytics for improving their services and products, and using data for their competitive advantage, data governance laws with steep compliance requirements are a cause for worry. The regulations will have a direct effect on how businesses deal with the data available to them and the data that is on the market. The regulatory uncertainty in matters pertaining to handling data and drawing economic value from data also has an indirect impact on long-term innovation and investment. Investors that are looking to facilitate growth in the domestic market are also deeply concerned about the current trend of steep compliance, excessive government access to data and regulatory uncertainty. In this context, the commonalities in the two frameworks are pertinent to note. Firstly, both the PDP and the NPD framework restrict cross-border data flows, citing the sensitivity of the underlying data. While the concerns regarding harm are valid, the solution to address them might be misplaced. The assumption is that security is better served if the data is stored within the territorial limits of the country, and that assumption rests on shaky ground.


Why cybersecurity tools fail when it comes to ambiguity

"Cybersecurity is very good at identifying activities that are black or white--either obviously bad and dangerous or clearly good and safe," writes Margaret Cunningham, PhD, psychologist and principal research scientist at Forcepoint's Innovation Lab, in her research paper Exploring the Gray Space of Cybersecurity with Insights from Cognitive Science. "But, traditional cybersecurity tools struggle with ambiguity--our algorithms are not always able to analyze all salient variables and make a confident decision whether to allow or block risky actions." For example, an employee accessing sensitive files after company business hours might not be a security issue--the person could be traveling and in a different time zone. "We don't want to stop the person from doing work because the access is flagged as an unapproved intrusion due to the time," says Cunningham. "Building the capability to reason across multiple factors, or multiple categories, will help prevent the kinds of concrete reasoning mistakes that result in false positives and false negatives in traditional cyber toolsets." The success of cybercriminals, admits Cunningham, is in large part due to their ability to quickly morph attack tools, and cybersecurity tech cannot keep pace.


The Benefits of Automating Data Lineage in the Initial Phases of a Data Governance Initiative

If you are putting in place a data governance framework, you can’t put controls and data quality reports on every single piece of data throughout your organisation. But if you have data lineage, it will help you identify the areas where your data is most at risk of something going wrong, enabling you to put in place appropriate checks, controls and data quality reports. Having data lineage also allows you to speed up data discovery. So many organisations have vast quantities of data that would be valuable to them, if only they knew it existed. Finally, as I mentioned at the start of this article, for many industries there is a regulatory requirement to have data lineage in place. It’s clear that having data lineage has lots of benefits, but on so many occasions data lineage is captured and documented manually. Whether you do data lineage automatically or manually, you will achieve the benefits mentioned above, but taking a manual approach to data lineage requires considerable effort. When I first started capturing data lineage, I tried starting at the beginning, where data first comes into the organisation, and tried to follow it as it flowed. However, this approach fails because a lot of people who produce or capture data have absolutely no idea where it goes.



Why Credit Karma Crafted a Tool to Automate Its DevOps Cycle

Unruh says part of his challenge when he joined Credit Karma about three years ago was to increase the efficiency of releasing code across the company. The engineers there had been using an older Jenkins-style system, he says, which served as a generic job runner. Developing products on that system meant clearing a few hurdles along the way, Unruh says, including jumping through a remote desktop running on a Windows computer. On top of that, teams building new microservices were required to write custom deployment code to move production forward, he says; that code formed the basis of the job the system executed for the service. That meant everything was different, because every team took its own approach, which slowed them down. “It linearly required 15 steps just to deploy your service into production,” Unruh says. “It was really cumbersome and there was no way for us to standardize.” Looking for ways to improve efficiency, he wanted to eliminate the need to jump to another host just to access the system. Unruh says he also sought to end the need for custom code for deploying a service. “I just build a service and I can deploy it,” he says.


Q&A on the Book Retrospectives Antipatterns

Retrospectives antipatterns are patterns I have seen recurring in many retrospectives. The way I have described them in the book is: the context in which you would normally find them, the antipattern "solution" that is often used for various reasons, such as haste, ignorance, or fear, and the refactored solution to the antipattern. Some of the antipatterns have a refactored solution that will get you out of the pickle right away, but for some of the others it is more a warning of things to avoid, because if you find yourself in that antipattern there is nothing better to do than to consider other options for the next retrospective. ... The prime directive was written by Norm Kerth in his book "Project Retrospectives: A Handbook for Team Reviews" and it goes like this: "Regardless of what we discover, we understand and truly believe that everyone did the best job they could, given what they knew at the time, their skills and abilities, the resources available, and the situation at hand." It basically means that when we enter a retrospective, we should strive to be in a mindset that allows us to think that everybody did the best they could at all times, given the circumstances.


Here’s How CIOs Can Create More Inclusive Cultures In Their Tech Teams

Often, diversity and inclusion outcomes are directly linked to recruitment and outreach efforts. But while many people fret about flaws in the education system that seem to discourage young women from pursuing tech-related subjects, Barrett has found in her work with Girls Who Code that the problem lies elsewhere. It’s not a lack of interest amongst female students, she said. Instead, it’s the culture of the technology industry. Girls who complete the organization’s program go on to major in computer science at a rate of 15 times the national average. But, Barrett noted, “our girls still don’t feel welcome in tech.” According to a recent report by Girls Who Code and consulting firm Accenture, it’s possible to lower the attrition rate for female employees by 70% over the next decade. The study’s recommendations include establishing supportive parental leave policies, creating external goals and targets around diversity, providing workplace support for women and creating inclusive networking opportunities. Role models are also crucial. “We know very often that women report that it’s hard to be what they can’t see,” Barrett said. “It’s hard to feel connected to an organization when they don’t see women in tech thriving.”


Commonwealth entities left to self-assess security in cloud procurement

Macquarie Government managing director Aidan Tudehope said he was disappointed by the decision to discontinue the CCSL certification regime. "This is about more than simply the physical geographic location where data is stored. Data sovereignty is about the legal authority that can be asserted over data because it resides in a particular jurisdiction, or is controlled by a cloud service provider over which another jurisdiction extends," he said. "Data hosted in globalised cloud environments may be subject to multiple overlapping or concurrent jurisdictions as the debate about the reach of the US CLOUD Act demonstrates. As the ACSC points out, globalised clouds are also maintained by personnel from outside Australia, adding another layer of risk." He believes the only way to guarantee Australian sovereignty is ensuring data is hosted in an Australian cloud, in an accredited Australian data centre, and is accessible only by Australian-based staff with appropriate government security clearances. "Taken alongside Minister Robert's planned sovereign data policy, this guide opens new opportunities for Australian cloud service providers," he said.



Quote for the day:

"The most important quality in a leader is that of being acknowledged as such." -- Andre Maurois

Daily Tech Digest - December 25, 2020

An introduction to data science and machine learning with Microsoft Excel

To most people, MS Excel is a spreadsheet application that stores data in tabular format and performs very basic mathematical operations. But in reality, Excel is a powerful computation tool that can solve complicated problems. Excel also has many features that allow you to create machine learning models directly in your workbooks. While I’ve been using Excel’s mathematical tools for years, I didn’t come to appreciate its use for learning and applying data science and machine learning until I picked up Learn Data Mining Through Excel: A Step-by-Step Approach for Understanding Machine Learning Methods by Hong Zhou. Learn Data Mining Through Excel takes you through the basics of machine learning step by step and shows how you can implement many algorithms using basic Excel functions and a few of the application’s advanced tools. While Excel will in no way replace Python machine learning, it is a great window into learning the basics of AI and solving many basic problems without writing a line of code. Linear regression is a simple machine learning algorithm that has many uses for analyzing data and predicting outcomes, and it is especially useful when your data is neatly arranged in tabular format. Excel has several features that enable you to create regression models from tabular data in your spreadsheets.
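
For reference, the fit behind this, which Excel exposes directly through its SLOPE and INTERCEPT worksheet functions (and more generally through LINEST), is ordinary least squares:

    \hat{\beta} = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^{2}},
    \qquad
    \hat{\alpha} = \bar{y} - \hat{\beta}\,\bar{x},
    \qquad
    \hat{y} = \hat{\alpha} + \hat{\beta}\,x

In a worksheet, =SLOPE(B2:B100, A2:A100) and =INTERCEPT(B2:B100, A2:A100) compute these for y-values in column B against x-values in column A (the cell ranges are, of course, just an example).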


The Four Mistakes That Kill Artificial Intelligence Projects

Humans have a “complexity bias,” or a tendency to look at things we don’t understand well as complex problems, even when it’s just our own naïveté. Marketers take advantage of our preference for complexity. Most people would pay more for an elaborate coffee ritual with specific timing, temperature, bean grinding and water pH over a pack of instant coffee. Even Apple advertises its new central processing unit (CPU) as a “16-core neural engine” instead of a chip and a “retina display” instead of high-definition. It’s not a keyboard; it’s a “magic keyboard.” It’s not gray; it’s “space gray.” The same bias applies to artificial intelligence, which has the unfortunate side effect of leading to overly complex projects. Even the term “artificial intelligence” is a symptom of complexity bias because it really just means “optimization” or “minimizing error with a composite function.” There’s nothing intelligent about it. Many overcomplicate AI projects by thinking that they need a big, expensive team skilled in data engineering, data modeling, deployment and a host of tools, from Python to Kubernetes to PyTorch. In reality, you don’t need any experience in AI or code.


Three reasons why context is key to narrowing your attack surface

Security has become too complex to manage without a contextual understanding of the infrastructure, all assets and their vulnerabilities. Today’s typical six-layer enterprise technology stack consists of networking, storage, physical servers, as well as virtualization, management and application layers. Tech stacks can involve more than 1.6 billion versions of tech installations for 300+ products provided by 50+ vendors, per Aberdeen Research. This sprawl is on top of the 75 security products that an enterprise leverages on average to secure their network. Now, imagine carrying over this identical legacy system architecture but with thousands of employees all shifting to remote work and leveraging cloud-based services at the same time. Due to security teams implementing new network configurations and security controls essentially overnight, there is a high potential of new risks being introduced through misconfiguration. Security teams have more ingress and egress points to configure, more technologies to secure and more changes to properly validate. The only way to meaningfully address increased risk while balancing limited staff and increased business demands is to gain contextual insight into the exposure of the enterprise environment that enables smarter, targeted risk reduction.


How to Really Improve Your Business Processes with “Dashboards”

A manager’s success is measured mainly by this task, and rightly so in most cases, since he or she usually receives a bonus based on how well the KPIs under his or her responsibility perform over a given period. Our goal is to design an information system that supports this task. The best way to do that is to help the manager answer the main question at each step based on the data we have: Is there a problem? What caused the problem? Which actions should we take? Were the actions successful? Looking at these questions, you can already tell that the dashboard described at the beginning of this article only helps answer the first one; most of the value an automated analytics solution could deliver is left out. Let’s look at how a more sophisticated solution could answer all of them. I took the screenshots from one of the actual dashboards we implemented at my company. ... The main idea is that a bad result is, in most cases, not caused by the average. Most of the time, outliers drag down the overall result. Consequently, showing a top-level KPI without a quick path to root-cause analysis leads to ineffective actions, because the vast majority of dimension members are not the problem.
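A minimal sketch of that drill-down idea (my illustration with made-up data and column names, not the author’s dashboard): compare the top-level KPI against its per-segment values, and the outlier segments dragging down the average surface immediately.

    import pandas as pd

    # Hypothetical KPI: average delivery delay in days, sliced by region.
    df = pd.DataFrame({
        "region": ["North", "North", "South", "South", "East", "East"],
        "delay_days": [1.0, 1.2, 0.9, 1.1, 6.5, 7.0],
    })

    overall = df["delay_days"].mean()
    by_region = df.groupby("region")["delay_days"].mean()
    outliers = by_region[by_region > 2 * overall]    # crude outlier rule

    print(f"top-level KPI: {overall:.2f} days")      # 2.95, looks mediocre
    print(outliers)                                  # East alone explains it

North and South are fine; one region is the whole problem, which is exactly what a lone top-level number hides.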


The top 5 open-source RPA frameworks—and how to choose

RPA has the potential to reduce costs by 30% to 50%, making it a smart investment that can significantly improve the organization’s bottom line. It is very flexible and can handle a wide range of tasks, including process replication and web scraping. RPA can help prevent errors and reduce or eliminate entire processes. It also helps you stay ahead of the competition through intelligent automation, and it can improve the digital customer experience by enabling personalized services. One way to get started with RPA is to use open-source tools, which have no up-front licensing fees. Below are five options to consider for your first RPA initiative, with the pros and cons of each, along with advice on how to choose the right tool for your company. ... Compared with commercial RPA tools, open source reduces your software licensing costs. On the other hand, it may require additional implementation expense and preparation time, and you’ll need to rely on the open-source community for support and updates. Yes, there are trade-offs between commercial and open-source RPA tools; I’ll get to those in a minute. But when used as an operational component of your RPA implementations, open-source tools can improve the overall ROI of your enterprise projects. Here’s our list of contenders.
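As a taste of the kind of task these frameworks script, here is a generic web-scraping step in Python (my sketch, not tied to any of the five tools; the URL and CSS selector are hypothetical):

    # Fetch a page, extract values, and hand them to the next step
    # of a flow; the sort of action an RPA bot performs repeatedly.
    import requests
    from bs4 import BeautifulSoup

    resp = requests.get("https://example.com/invoices")   # hypothetical page
    resp.raise_for_status()

    soup = BeautifulSoup(resp.text, "html.parser")
    totals = [td.get_text(strip=True) for td in soup.select("td.total")]  # hypothetical selector
    print(totals)   # a later step might validate these or enter them into another system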


What happens when you open source everything?

If you start giving away the product for free, it’s natural to assume sales will slow. The opposite happened. (Because, as Ranganathan pointed out, the product wasn’t the software, but rather the operationalizing of the software.) “So on the commercial side, we didn’t lose anybody in our pipeline [and] it increased our adoption like crazy,” he said. I asked Ranganathan to put some numbers on “crazy.” The company tracks two things closely: creation of Yugabyte clusters (an indication of adoption) and activity on its community Slack channel (engagement being an indication of production usage). At the beginning of 2019, before the company opened up completely, Yugabyte had about 6,000 clusters (and no Slack channel). By the end of 2019, it had roughly 64,000 clusters (a 10x jump), with 650 people in the Slack channel. The Yugabyte team was happy with the results and had hoped to see 4x cluster growth in 2020. As of mid-December, clusters had grown to nearly 600,000, which could well give Yugabyte another 10x growth year before 2020 closes. As for Slack activity, the channel is now at 2,200 members, with people asking about use cases, requesting features and more.


Simplifying Cybersecurity: It’s All About The Data

The most effective way to secure data is to encrypt it and then decrypt it only when an authorized entity (a person or an application) requests and is granted access. Data moves between being at rest in storage, in transit across a network and in use by applications. The first step is to encrypt data at rest and in transit everywhere, which makes data security pervasive within the organization. If you do not encrypt your network traffic inside your “perimeter,” you aren’t fully protecting your data. If you encrypt your primary storage but leave secondary storage unencrypted, you are not fully protecting your data. And while data is often encrypted at rest and in transit, it is rarely encrypted while in use by applications. Any application or cybercriminal with access to a server can see Social Security numbers, credit card numbers and private healthcare data simply by looking at the server’s memory while an application is using them. A new technology called confidential computing makes it possible to encrypt data and applications even while they are in use. Confidential computing uses hardware-based trusted execution environments (TEEs), called enclaves, to isolate and secure the CPU and memory used by the code and data from potentially compromised software, operating systems or other VMs running on the same server.
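The at-rest and in-transit pieces are routine with standard libraries; the in-use piece is what TEEs add in hardware. As a minimal at-rest sketch, here is authenticated symmetric encryption with Python’s widely used cryptography package (the sample plaintext is obviously made up):

    # Encrypt data at rest so it is unreadable in storage.
    # Confidential computing goes further: the enclave shields even
    # the decrypted working copy from the host OS, which is a
    # hardware property and not expressible in a few lines here.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()              # in practice, fetch from a KMS/HSM
    f = Fernet(key)

    token = f.encrypt(b"SSN: 000-00-0000")   # ciphertext, safe to store
    print(f.decrypt(token))                  # decrypt only on authorized access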


Why the US government hack is literally keeping security experts awake at night

One reason the attack is so concerning is who may have been victimized by the spying campaign. At least two US agencies have publicly confirmed they were compromised: the Department of Commerce and the Agriculture Department. The Department of Homeland Security's cyber arm was also compromised, CNN previously reported. But the range of potential victims is much, much larger, raising the troubling prospect that the US military, the White House or public health agencies responding to the pandemic may have been targeted by the foreign spying, too. The Justice Department, the National Security Agency and even the US Postal Service have all been cited by security experts as potentially vulnerable. All federal civilian agencies have been told to review their systems in an emergency directive from DHS officials. It is only the fifth such directive issued by the Cybersecurity and Infrastructure Security Agency since the agency was created in 2018. And it isn't just the US government in the crosshairs: the elite cybersecurity firm FireEye, itself a victim of the attack, said companies across the broader economy were vulnerable to the spying, too.


Creating the Corporate Future

With the move to the later 20th century, the post-industrial age began, with systems thinking at its core. The initiating dilemma for this change was that not all problems could be solved by the prevailing worldview, analysis. It is unfortunate to think how many MBAs graduated with analysis at their core. As enterprise architects know, when a system is taken apart, it loses its essential properties. A system is a whole that cannot be understood through analysis alone; what is needed instead is synthesis, the putting of things together. In sum, analysis focuses on structure, whereas synthesis focuses on why things operate as they do. At the beginning of the industrial age, the corporation was viewed as a legal mechanism and as a machine. In the post-industrial age, however, Ackoff suggests a new view: the corporation as a purposeful system that is part of larger purposeful systems and whose parts, people, have purposes of their own. Leaders need to be aware of the interactions of corporations at the societal, organizational and individual levels. At the same time, they need to understand how an organization's parts affect the whole system and how external systems affect it.


Microservices vs. Monoliths: An Operational Comparison

There are a number of factors at play when considering complexity: the complexity of developing the software and the complexity of running it. On the development side, the size of the codebase can grow quickly when building microservice-based software. Multiple codebases are involved, often using different frameworks and even different languages. Since microservices need to be independent of one another, there will often be code duplication, and different services may use different versions of libraries because release schedules are not in sync. For the running and monitoring aspect, the number of services involved in a request is highly relevant. A monolith only talks to itself, so it has one potential partner in its processing flow, whereas a single call in a microservice architecture can hit multiple services, which may sit on different servers or even in different geographic locations. In a monolith, logging is as simple as viewing a single log file. With microservices, tracking down an issue may involve checking multiple log files: it is necessary not only to find all the relevant log output but also to put it together in the correct order. Distributed tracing solves this by attaching a unique ID, a trace ID with span IDs for the individual calls, to each request, as sketched below.
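A minimal sketch of that correlation idea (service and logger names are hypothetical; real systems such as OpenTelemetry automate this): generate an ID at the edge, pass it along with every downstream call, and stamp it on every log line so output from multiple services can be reassembled in order.

    # Correlate log lines across services with a shared trace ID.
    import logging
    import uuid

    logging.basicConfig(format="%(asctime)s trace=%(trace_id)s %(name)s: %(message)s",
                        level=logging.INFO)

    def handle_order():                        # hypothetical edge service
        trace_id = uuid.uuid4().hex[:8]        # generated once per request
        log = logging.LoggerAdapter(logging.getLogger("orders"),
                                    {"trace_id": trace_id})
        log.info("order received")
        charge_payment(trace_id)               # the ID travels with the call
        log.info("order completed")

    def charge_payment(trace_id):              # hypothetical downstream service
        log = logging.LoggerAdapter(logging.getLogger("payments"),
                                    {"trace_id": trace_id})
        log.info("payment charged")

    handle_order()   # all three lines carry the same trace ID

Grep the shared ID across every service’s logs and the correct ordering falls out of the timestamps.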



Quote for the day:

"If something is important enough, even if the odds are against you, you should still do it." -- Elon Musk