Daily Tech Digest - March 14, 2021

The Agile Manifesto 20 years on: agility in software delivery is still a work in progress

What's missing from many agile initiatives is "ways to manage what you do based on value and outcomes, rather than on measuring effort and tasks," says Morris. "We've seen the rise of formulaic 'enterprise agile' frameworks that try to help you to manage teams in a top-down way, in ways that are based on everything on the right of the values of the Agile Manifesto. The manifesto says we value 'responding to change over following a plan,' but these frameworks give you a formula for managing plans that doesn't really encourage you to respond to change once you get going." ... Ritchie agrees that there's too much of a tendency to pigeonhole agile into rigid processes. "The first and most common mistake is the interpretation of agile as simply a process, or something you can just buy and do to immediately call yourself agile," says Ritchie. "This more often than not results in process for the sake of process, frustration, and - contradictory to the intent of agile - an even further disconnect between business outcomes and the IT professionals chartered to deliver them." Related to this, he says, is that there often can be a "dogmatic agile zealot approach, where everything a particular framework says must be taken as gospel."


Combining edge computing and IoT to unlock autonomous and intelligent applications

Edge computing isn’t limited to just sensors and other IoT devices; it can also involve traditional IT devices, such as laptops, servers, and handheld systems. Enterprise applications such as enterprise resource planning (ERP), financial software, and data management systems typically don’t need the level of real-time data processing most commonly associated with autonomous applications. Edge computing has the most relevance in the world of enterprise software in the context of application delivery. Employees don’t need access to the whole application suite or all of the company’s data. Providing them just what they need with limited data generally results in better performance and user experience. Edge computing also makes it possible to harness AI in enterprise applications, such as voice recognition. Voice recognition applications need to work locally for fast response, even if the algorithm is trained in the cloud. “For the first time in history, computing is moving out of the realm of abstract stuff like spreadsheets, web browsers, video games, et cetera, and into the real world,” Thomason said. Devices are sensing things in the real world and acting based on that information.


From The Vault: Top Statistical Ideas Behind The Data Science Boom

Improved data collection strategies (think sensors, the Internet) have resulted in enormous datasets. But data collection and curation still consumes nearly 80% of a data engineer’s typical day. Data is still a problem today, and it was even more so a couple of decades ago. The idea behind the bootstrap distribution is to use it as an approximation to the data’s sampling distribution. According to researchers, parametric bootstrapping, prior and posterior predictive checking, and simulation-based calibration allow replication of datasets from a model instead of directly resampling from the data. Calibrated simulation in the face of uncertain data volumes is a standard procedure rooted in statistics and helps in analysing complex models or algorithms. Gelman and Vehtari believe future research will lean more towards inferential methods, taking ideas such as unit testing from software engineering and applying them to problems of learning from noisy data. “As our statistical methods become more advanced, there will be a continuing need to understand the links between data, models, and substantive theory,” concluded the authors. The ideas mentioned above have laid the foundation for modern-day deep learning and other such tools.
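The bootstrap idea described above can be sketched in a few lines: resample the observed data with replacement many times, recompute the statistic each time, and treat the resulting distribution as an approximation to the statistic's sampling distribution. A minimal illustration (data values and names are invented for the example):

```python
import random
import statistics

def bootstrap_means(sample, n_resamples=1000, seed=42):
    """Approximate the sampling distribution of the mean by
    resampling the observed data with replacement."""
    rng = random.Random(seed)
    n = len(sample)
    return [statistics.mean(rng.choices(sample, k=n)) for _ in range(n_resamples)]

data = [2.3, 3.1, 2.8, 3.6, 2.9, 3.3, 2.7, 3.0]
means = sorted(bootstrap_means(data))
interval = (means[25], means[-26])  # rough 95% percentile interval for the mean
```

The same recipe works for any statistic, not just the mean; parametric bootstrapping differs only in that the resamples are drawn from a fitted model rather than directly from the data.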


Driving innovation with emotional intelligence

EQ is increasingly recognized as a competitive advantage, according to a survey by Harvard Business Review Analytic Services. It found that emotionally intelligent organizations get an innovation premium. These organizations reported more creativity, higher levels of productivity and employee engagement, significantly stronger customer experiences, and higher levels of customer loyalty, advocacy, and profitability. Organizations that did not focus on emotional intelligence had “significant consequences, including low productivity, lukewarm innovation, and an uninspired workforce,” said the report. ... Verizon surveyed senior business leaders both before and after COVID-19. Before the pandemic, fewer than 20% of respondents said EQ would be an important skill for the future. But since the pandemic began, EQ has increased in significance for 69% of respondents. ... “A sure way to stifle innovation is to not have the emotional maturity to recognize that innovation and creativity can come from many sources,” says Steele. “I think that our agency has hugely benefited from research institutes, large businesses, small businesses, and individual contributors.” She continues, “The capacity to recognize untapped sources of innovation, then bringing them together in a system, is a great ability to have.”


Three flaws that sat in Linux kernel since 2006 could deliver root privileges to attackers

While the vulnerabilities “are in code that is not remotely accessible, so this isn’t like a remote exploit,” said Nichols, they are still troublesome. They take “any existing threat that might be there. It just makes it that much worse,” he explained. “And if you have users on the system that you don’t really trust with root access, it breaks them as well.” Referring to the theory that ‘many eyes make all bugs shallow,’ Linux code “is not getting many eyes or the eyes are looking at it and saying that seems fine,” said Nichols. “But, [the bugs] have been in there since the code was first written, and they haven’t really changed over the last 15 years.” As a matter of course, GRIMM researchers try “to dig in” and see how long vulnerabilities have existed when they can – a more feasible proposition with open source. That the flaws slipped detection for so long has a lot to do with the sprawl of the Linux kernel. It “has gotten so big” and “there’s so much code there,” said Nichols. “The real strategy is make sure you’re loading as little code as possible.”


Master Data Management: Much More Than Technology

Industry experts define data governance as the “authority over the management of data assets” and assigning “accountability for the quality of your organization’s data.” Having authority over data assets is the function of data ownership. Being accountable for the quality of these data assets is the function of data stewardship. Data is a business asset, and business assets are controlled by business people. Therefore, data owners and data stewards should be business people. They must be careful not to manage their data within the narrow focus of their own business unit (department or division); instead, they must ensure that their data is managed from an enterprise perspective so that it can be used and shared by all business units. Enterprise information management (EIM) is about the administration of data. One industry expert describes EIM as “a function, typically dedicated to an organization in IT, for maintaining, cataloging, and standardizing corporate data.” This is done with the help of data stewards under the umbrella of a data strategy, and by establishing data-related standards, policies, and procedures.


Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI?

Light travels faster than electrons. The concept of using light in place of electricity for heavy computational tasks (aka photonic computing, or optical computing) dates back to the 1980s, when Bell Labs (now Nokia Bell Labs) tried to develop a light-based processor. However, due to the impracticality of creating a working optical transistor, the concept didn’t take off. We experience optical technology in cameras, CDs, and even Blu-ray discs, but these photons are usually converted into electrons before the data reaches a chip. Four decades later, photonic computing gained momentum when IBM and researchers from the Universities of Oxford and Münster developed a system that uses light instead of electricity to perform several AI model-based computations. Meanwhile, Lightmatter’s new AI chip has created a buzz in the industry. According to the company website, Envise can run the largest neural networks at three times the inferences per second of the Nvidia DGX-A100, with seven times the inferences per second per watt on BERT-Base with the SQuAD dataset.


Why Data Management Needs An Aggregator Model

The traditional approach to managing unstructured data has been storage-centric: you move data to a storage system, and the storage system manages your data and gives you some tools to search it and report on it. This approach worked and made things easier when data volumes were small and all of an enterprise's data could fit in a single storage solution. As enterprises shift to a hybrid multicloud architecture, they can no longer manage and search for data silo by silo, nor keep paying a heavy cost to move data from one silo to another. As GigaOm analyst Enrico Signoretti pointed out: "The trend is clear: The future of IT infrastructures is hybrid ... [and] it requires a different and modern approach to data management." Another key reason an aggregator model for data management is needed is that customers want to extract value from their data. To make unstructured data analyzable and searchable, vital information is stored in what is called "metadata" — information about the data itself. Metadata is like an electronic fingerprint of the data. For example, a photo on your phone might carry the time and location at which it was taken, as well as who was in it. Metadata is very valuable, as it is used to search, find and index different types of unstructured data.
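As a toy illustration of how metadata makes unstructured data searchable without touching the underlying files, consider an in-memory index over photo metadata (all file names, fields and values here are invented for the example):

```python
# Invented example records: each entry carries metadata about a photo,
# not the photo's pixels themselves.
photos = [
    {"file": "IMG_001.jpg", "taken": "2021-03-14", "location": "Paris", "people": ["Ana"]},
    {"file": "IMG_002.jpg", "taken": "2021-03-14", "location": "Lyon",  "people": ["Ben", "Ana"]},
    {"file": "IMG_003.jpg", "taken": "2021-02-01", "location": "Paris", "people": []},
]

def search(index, location=None, person=None):
    """Return files whose metadata matches every supplied criterion."""
    hits = []
    for photo in index:
        if location and photo["location"] != location:
            continue
        if person and person not in photo["people"]:
            continue
        hits.append(photo["file"])
    return hits

search(photos, location="Paris")  # ['IMG_001.jpg', 'IMG_003.jpg']
search(photos, person="Ana")      # ['IMG_001.jpg', 'IMG_002.jpg']
```

An aggregator model essentially runs queries like these across every storage silo at once, over metadata it has harvested from each.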


3 Ways To Improve Board-Level Focus on Third-Party Risk Management

Assessing the risks that third parties bring to your business shouldn’t begin once you have signed the contract. Instead, security and procurement teams should be reviewing known risks in potential vendors during the sourcing and selection stage of the vendor lifecycle. Unfortunately, though, only 31% of companies conduct thorough pre-contract due diligence, indicating there is a long way to go to overcome this obstacle. ... Third-party risk management can’t be a one-and-done task. It needs to be a continuous process built into the risk DNA of the enterprise. However, most organizations can easily get tripped up when performing vendor risk assessments, since half are still using manual spreadsheets to manage their vendors, and a further 34% say it takes over a month to complete an assessment of a top-tier vendor. This traditional static annual assessment approach must give way to a more dynamic process that incorporates real-time risk metrics. Agility should be the order of the day in assessing third parties. ... Effectively reducing vendor risk requires an understanding of how vendors are performing against expectations – both security and performance-related.


Compromised devices and data protection: Be prepared or else

While encryption alone isn’t fully sufficient to secure data, multiple layers of encryption are often necessary to ensure that any exposed data is rendered unreadable and unusable. For example, an encryption tool like BitLocker, if used on its own, can leave data vulnerable in certain scenarios, such as if a power failure interrupts the encryption process, or if a system administrator’s credentials are compromised. In the wrong hands, a system administrator account will be able to view all files as decrypted and in clear text. However, deploying a solution like the Encrypting File System (EFS) as a secondary encryption layer on top of BitLocker will provide additional file-level encryption. In this way, EFS makes it possible to ensure the encryption of sensitive data even if an attacker has gained access to device hardware and has powerful credentials in hand. This approach provides the added benefit of making it possible to service devices without allowing data access or presenting any risk of exposure. By implementing a layered encryption strategy with protection at both the full-drive and file levels, organizations can have peace of mind that the loss of a particular device is hardly a loss at all.
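The value of layering can be shown with a deliberately simplified sketch. The toy XOR keystream below is emphatically not real cryptography and is no substitute for BitLocker or EFS; it only illustrates the principle that data wrapped in two independent layers stays unreadable when just one key is compromised:

```python
import hashlib

def keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Applying it twice with the same key recovers the input.
    Illustrative only -- do not use for real secrets."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

secret = b"quarterly payroll data"
disk_key, file_key = b"full-disk-layer-key", b"per-file-layer-key"

# Layer 1 (file level), then layer 2 (full-drive level):
ciphertext = keystream_xor(keystream_xor(secret, file_key), disk_key)

# An attacker holding only the disk-level key recovers gibberish:
partially_decrypted = keystream_xor(ciphertext, disk_key)
assert partially_decrypted != secret

# Both layers must be stripped to recover the plaintext:
assert keystream_xor(partially_decrypted, file_key) == secret
```

In the real setup the article describes, the roles of the two keys are played by BitLocker's volume key and the per-user EFS key, which is what lets a device be serviced without exposing file contents.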



Quote for the day:

"Becoming a leader is synonymous with becoming yourself. It is precisely that simple, and it is also that difficult." -- Warren G. Bennis

Daily Tech Digest - March 13, 2021

4 ways to keep the cybersecurity conversation going after the crisis has passed

The report recommends adding a business information security officer (BISO) to improve business security alignment, building a top-down measurable program, and changing reporting structures so the CISO reports directly to the CEO. Ultimately, analysts say it’s the CISO’s responsibility to build relationships with executives and the board and have regular conversations with them. “It’s not just the board ignoring things or executives minimizing things, but cybersecurity people staying in their lane,” says Jon Oltsik, senior principal analyst at Enterprise Strategy Group and author of the report. “We need progressive and proactive CISOs to kind of shake the world up.” To maintain momentum, CISOs must keep the board’s attention with a steady stream of relevant information delivered in business terms and presented in the form of risk and strategy for cybersecurity, not just tech solutions. Security leaders and analysts offer some tips, tools, and frameworks to help translate security into strategy and keep the conversation going. If CISOs want to speak in board terms, “you have to speak strategically, and there are strategic business tools to do that,” says Lance Spitzner, director of SANS security awareness. 


The concept of justifiable healthcare and how big data can help us to achieve it

As a first necessary but not sufficient step, the evaluation requires consideration of any evidence on the efficacy of the intervention. AI and big data can be of help to mine and digest existing literature and evidence, at a much higher speed and in a more exhaustive manner than is currently possible using human skills. It is crucial that for this evaluation of efficacy, standardized core outcome sets are applied. These are sets including only outcomes that are: relevant to patients; measurable in an accurate and reliable manner; and discriminative. At the micro-level of the healthcare professional and the patient, justifiable healthcare is in fact an essential element to allow for genuine shared decision making. Indeed, justifiable healthcare provides all stakeholders involved with the argumentation and information necessary to decide which intervention has the highest probability of leading to the desired outcome, given the specific condition affecting this specific patient and taking into account other available interventions. Big data and AI can be of help to present alternative options in a way that both patients and healthcare workers can easily understand, fine-tuned for the specific case of the patient.


Quantum computing: Quantum annealing versus gate-based quantum computers

All is not lost for gate-based methods – quite the contrary, in fact. GSK's researchers foresee that the expected increase in qubit count in computers like these will allow quantum devices to show a significant performance advantage over classical hardware, not only for pharmaceutically relevant life science problems but also for many other types of application. The results of the scientists' experiments are still in pre-print, and are yet to be certified by peer review; in addition, the trials only focus on a specific problem – the use of quantum computing to assist drug discovery. Nevertheless, the research offers a valuable overview of the capabilities of quantum devices as they stand, and of the limitations of different approaches to quantum computing. The problem addressed by the scientists is well-established in classical computing. Called codon optimization, it consists of finding sequences of genetic code, called codons, that will ultimately lead to the expression of a particular gene. Up to six different codons can encode a single amino acid, and amino acids in turn form the proteins that a gene expresses.
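The combinatorics that make codon optimization a hard search problem can be seen directly. A sketch using a small slice of the standard codon table (the peptide and the subset of amino acids chosen here are arbitrary examples):

```python
from itertools import product

# Partial codon table: methionine (M) has one codon, leucine (L) six,
# lysine (K) two. The full table covers all 20 amino acids.
CODONS = {
    "M": ["ATG"],
    "L": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "K": ["AAA", "AAG"],
}

def codon_sequences(peptide):
    """Enumerate every DNA sequence that encodes the given amino-acid string."""
    return ["".join(combo) for combo in product(*(CODONS[aa] for aa in peptide))]

seqs = codon_sequences("MLK")  # 1 * 6 * 2 = 12 candidate sequences
```

For a three-residue peptide there are only twelve candidates, but the count multiplies with every residue, which is why realistic proteins produce search spaces large enough to motivate quantum (and sophisticated classical) optimization.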


Could No-Code Enable Everything Ops?

“The story of B2B software is largely about automation,” said Chou. If we consider classic examples of corporate applications, we see one common thread — B2B software has always sought to automate traditional office work. And so, it came to pass that computing and software devoured all these office tasks, from accounting to time-tracking, communications and many other areas. Though computer-based office work delivered huge efficiency boosts, these interfaces often introduced new hurdles: bad UI design, difficult navigation and a hundred open tabs, just to name a few. From there, robotic process automation (RPA) aimed to operationalize a growing number of tedious digital workflows across Excel spreadsheets, web apps and desktop apps. In Chou’s words, “RPA took the concepts of test automation software, and then pointed it toward production systems.” Though RPA’s process of recording screen interactions gathered much interest and attention (and funding), screen scraping is ultimately too fragile to be effective. It adds technical debt over legacy systems. It can be expensive to implement. Lastly, it’s not process-centric and doesn’t implement reusable software-defined APIs.


Here's how digital transformation and sustainability can flourish together

The rapid digitalisation catalysed by COVID-19 presents the opportunity to rethink how we make decisions and how we apply technology in new and meaningful ways. Immense opportunity exists for enterprises that can capture the value of data to drive more sustainable solutions. For example, it’s estimated that the value unlocked by artificial intelligence in helping design out waste for food, keeping products and materials in use, and regenerating natural systems, could be up to $127 billion a year in 2030. The digital transformations of today must be purpose-led, delivering for all stakeholders as a requisite for company success. Spearheading that effort is the Forum's CEO Champions group on Accelerating Digital Transformation in a Post-COVID-19 World, which is led by Antonio Neri, the CEO of Hewlett Packard Enterprise (HPE). Today, this group published a playbook, Bridging Digital and Environmental Goals, designed to provide leaders with recommended actions and examples to leverage data-led insights and create products, strategies and business models that minimise their impact on the planet.


Security awareness programs: The difference between window dressing and behavior change

Instead of merely checking the annual compliance security box, good security awareness programs are focused entirely on real-world outcomes and results. To achieve measurable results, companies need to make a real change in educating employees on cybersecurity and their role in protecting their companies. The core issue with “cookie-cutter” security training, in which all employees receive the same phishing simulation, is that they often do not target at-risk users at the critical moment when a potential attack is in progress. Nor are they conducted with enough frequency to remain top of mind for employees. By implementing policies, controls, and technologies that focus on the individual, organizations can more effectively teach employees the right behaviors that will result in a cyber-savvy culture. ... Taking a behavior-based approach to security awareness training is more effective than traditional initiatives, reduces costs, and provides a measurable ROI for organizations. Consider lane assist technology. While the reason why a driver might drift into another lane can range from fatigue to inattention to an inability to see the lines, alerting drivers exactly when they might be dangerously drifting into another lane helps drivers avoid a collision.


AI and you: how confusion about the technology that runs our world threatens democracy

Machine learning occupies an interesting position in the story of scientific progress. On one hand it’s a natural outcome of developments in computer science that began in the 1980s. On the other hand, its total dependence on information — and its ability to make do with all sorts of information, including things like your keystroke and heart rate — marks what could turn out to be a more radical break with previous technologies. Machine learning uses existing information to generate new information. But it also allows that new information to be put to a variety of questionable uses, including surveillance and manipulation. If you’ve ever been recommended products while shopping online, you’ve probably been profiled. Ever been denied an application for a credit card in short order? Again, you’ve probably been profiled. Algorithmic profiling presents a host of ethical and legal challenges, particularly around discrimination and privacy. But profiling is just the tip of an ever-expanding iceberg. Many uses of big tech pose a threat to individuals as individuals, which is bad enough.


The cyber security risks of working from home

The most obvious risk is that most of our tasks are conducted online. After all, if something’s on the Internet, then there’s always the possibility of a cyber criminal compromising it. They might attempt to do this by cracking your password, which could be easier than ever if you’re reusing login credentials for the various online apps you need to stay in touch with your team. Meanwhile, according to Cisco’s 2020 CISO Benchmark Report, organisations are struggling to manage remote workers’ use of phones and other mobile devices; 52% of respondents said that mobile devices are now challenging to protect from cyber threats. ... Organisations should also be concerned about remote employees using their own devices. This might have been unavoidable given how quickly the pandemic spiralled and the suddenness of the government’s decision to implement lockdown measures. Still, where possible, all work should be done on a corporate laptop subject to remote access security controls. This should include, at the very least, 2FA (two-factor authentication), which will mitigate the risk of a crook gaining access to an employee’s account.
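2FA most often takes the form of a time-based one-time password (TOTP), the six-digit code an authenticator app rotates every 30 seconds. A minimal RFC 6238 sketch using only the Python standard library (real deployments should rely on a vetted authentication library, not hand-rolled code):

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, timestamp=None, step=30, digits=6):
    """RFC 6238 time-based one-time password (HMAC-SHA-1 variant)."""
    now = time.time() if timestamp is None else timestamp
    counter = struct.pack(">Q", int(now // step))          # 8-byte big-endian time step
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                              # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at t=59s
# yields 94287082, whose 6-digit form is 287082.
totp(b"12345678901234567890", timestamp=59)  # '287082'
```

Because both the server and the employee's device derive the code from a shared secret plus the current time, a stolen password alone is no longer enough to log in.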


The future of data privacy: confidential computing, quantum safe cryptography take center stage

Quantum-safe cryptography aims to tackle the problems that will arrive the day we have a working quantum machine. While quantum computing is being actively worked on by engineers worldwide, with Honeywell, for example, ramping up the capacity of its own System Model H1 to a quantum volume of 512, it is estimated that a full-capacity quantum computer could exist within the next 10 to 15 years. When that day arrives, however, the high computational power of these machines would render "virtually all electronic communication insecure," according to IBM, as quantum computers can efficiently factor large numbers -- a problem whose hardness is a core precept of today's cryptography. To resolve this, standards based on lattice cryptography have been proposed. This approach hides data in complex algebraic structures and is considered an attractive option for future-proofing data privacy architectures. According to IBM cryptographer Vadim Lyubashevsky, adopting lattice frameworks is unlikely to impact end-users -- and may actually improve computational performance. But why bother now, when full quantum machines do not exist?


Enterprise architecture: a tool for business recovery?

From entirely new ways of working to permanent shifts in customer behaviour and operational networks, the world beyond the crisis is set to look drastically different. To emerge from the pandemic in a stronger position, organisations will need to directly address the vulnerabilities the pandemic has exposed. For instance, people may continue to be averse to gathering in large groups, ecommerce is unlikely to lose the gains it has made during multiple lockdowns, and of course, businesses globally have realised the benefits that the work-from-home model brings. These emerging trends will significantly alter the roadmap ahead, but more importantly, they'll accelerate the exploration of new digital tools. A recent McKinsey report shows that nearly all organisations, whether traditional companies or startups, are re-orienting their business models to be more digital as a direct result of the impact COVID-19 has had on changing consumer behaviours, and many of these changes will outlive the current landscape. As we delve into this virtual world, we must prepare and ask ourselves: could parts of hospitality and tourism be replaced by VR? Will business meetings make use of holographic technology for a blended experience? Will self-driving vehicles or delivery drones spearhead the future of retail?



Quote for the day:

"The world is full of obvious things which nobody by any chance ever observes." -- Arthur Conan Doyle

Daily Tech Digest - March 12, 2021

Hiring developers? Here's how to keep them happy and productive

"Whiteboard coding was another thing that was just totally broken in engineering hiring. Asking people to code on a whiteboard is a different skill set. People don't do it for their day-to-day. It was silly for us to ask people to put code on a whiteboard, but we did it for years!" A better strategy for interviewing developers remotely, Pillar says, is a sort of BYOD policy, whereby hiring managers ask candidates to bring their laptops along to the interview with the understanding that they'll be performing some form of on-the-spot coding while they share their screen with the interviewer. "That's a way more productive way to get an excellent signal about the quality of a developer, because it's actually their environment and you can see them using the tools that they're familiar with," he explains. ... "A meeting is an extremely expensive thing for an engineer. It's way easier, unfortunately, to interrupt an engineer's flow in a remote world with a meeting, because their calendar is open; you can just throw it in there and you don't even really think about it." Some software providers now offer analytics tools that measure how workers' time is spent, some of which include the ability to measure interrupted time, also known as "friction time".


How Security Architecture Is Shaping Up for 2021

Access is often referred to as zero-trust network access, which seems incorrect to me, since it's application access, not network access; network access is the old, traditional VPN piece. But that architecture makes no difference whether you're on or off the network. It uses an access proxy to provide a security and control context, and it provides identity components for users and devices. So it gives you this application [information] and then contextual information, applied per session. One of the problems, obviously, when you have some of these architectures is that you try to build it and you've got five different vendors: you're trying to combine an endpoint solution, an access proxy, security, identity, and these other contextual management [techniques]. So customers have trouble when they try to build it across maybe five or six vendors. That's why I think it's a really important architecture, especially as people are going to be on and off the network more and more, backwards and forwards; in a zero-trust architecture it shouldn't really matter where they are. So that's one really important component.


IT security strategy: A CISO's 5 essentials

One of the most common cyberattack vectors remains exploiting known vulnerabilities in OS software and applications. To combat these attacks, stay on top of the maintenance level of your hardware and software. Unsupported components should be upgraded or replaced as soon as possible. Conduct vulnerability scans for the full infrastructure monthly, and correct issues as soon as possible. Ensure your scans include third-party products and applications. ... A famous baseball coach once said, “You can observe a lot by just looking.” Make better use of the logs and reports provided by the systems and applications running your business. Delineate baselines and metrics defining security health. A change in activity patterns or metrics may be an early indicator of trouble brewing. Develop, maintain, and test a practical security incident management plan so you will know what to do if faced with a real incident. Composing a secure foundation isn’t easy in the best of times. While these five tips may not be as exciting as hunting for hackers or implementing a sophisticated security incident event management (SIEM) system, they are the building blocks of a strong foundation and offer the best way to move organizations forward safely.
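The advice above to "delineate baselines and metrics" and watch for changes in activity patterns can be made concrete with a simple statistical baseline check over any security metric (the metric name, history values and threshold below are illustrative assumptions, not from the article):

```python
import statistics

def flag_anomaly(history, latest, threshold=3.0):
    """Flag a metric whose latest value deviates more than `threshold`
    standard deviations from its historical baseline."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # guard against a flat history
    return abs(latest - mean) / stdev > threshold

# Example: daily failed-login counts from the past week form the baseline.
failed_logins = [12, 9, 14, 11, 10, 13, 12]

flag_anomaly(failed_logins, 90)   # sudden spike -> worth investigating
flag_anomaly(failed_logins, 13)   # within normal variation -> no alert
```

Even a crude check like this, run against the logs and reports the systems already produce, turns "just looking" into an early-warning signal.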


Power Equipment: A New Cybersecurity Frontier

While IoT has been the catalyst for many positive developments, there are challenges with these expanding interconnections. For power management, the ability to connect backup equipment like an uninterruptible power supply (UPS) can prove helpful in enabling IT teams to monitor and maintain essential infrastructure more efficiently. However, like any other network-connected devices, they become assets that need to be secured from potential cyber breaches. Though a UPS doesn't traditionally come to mind when envisioning ways cybercriminals infiltrate a network, the same could also be said for other inconspicuous devices like HVAC units. Yet, that's exactly what hackers pursued when they were able to gain access to Target's system and steal data on over 40 million credit and debit cards. And consider how hackers were able to penetrate the network of a North American casino utilizing an Internet-connected thermometer inside an aquarium. Finding the vulnerability in a fish tank, of all places, allowed hackers to access the casino's database and ultimately steal private customer data.


How do I select a SOAR solution for my business?

A SOAR solution should enable teams to automate the identification and response process across significant volumes of disparate data streams, so that the prioritisation of threats and vulnerabilities becomes almost seamless and far more operationally efficient. If implemented correctly, Security Operations Centres (SOCs) can benefit from using SOAR solutions to deal with threats faster and more efficiently. Integrating SOAR with other security tools, such as Security Information and Event Management (SIEM), can transform SOC teams' business and technology outcomes through automation, while also increasing efficiency. By combining forces, organisations can use SOAR to augment the capabilities of SIEM, offering an all-comprehensive solution. SIEMs collect and store data in a useful manner, which SOAR can use to automatically investigate and respond to incidents and reduce the need for manual operations. What's more, in tackling one of the biggest challenges for SOC teams to date, SOAR solutions can help to ingest information, then sort, prioritise and combine duplicate alerts to reduce the number of false positives.
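The de-duplication and prioritisation step described above can be sketched as grouping alerts that share a signature and host, so analysts see one ranked queue entry instead of many raw alerts (the alert fields and values below are invented for the example, not any particular SOAR product's schema):

```python
from collections import defaultdict

# Invented raw alert stream as it might arrive from a SIEM.
alerts = [
    {"signature": "brute-force", "host": "web-01", "severity": 3},
    {"signature": "brute-force", "host": "web-01", "severity": 3},
    {"signature": "malware",     "host": "db-02",  "severity": 5},
    {"signature": "brute-force", "host": "web-01", "severity": 3},
]

def deduplicate(raw):
    """Merge duplicate alerts by (signature, host) and rank the result
    by severity, then by how often the alert repeated."""
    counts = defaultdict(int)
    severity = {}
    for a in raw:
        key = (a["signature"], a["host"])
        counts[key] += 1
        severity[key] = max(severity.get(key, 0), a["severity"])
    merged = [
        {"signature": sig, "host": host, "severity": severity[(sig, host)], "count": n}
        for (sig, host), n in counts.items()
    ]
    return sorted(merged, key=lambda m: (m["severity"], m["count"]), reverse=True)

queue = deduplicate(alerts)  # four raw alerts collapse into two queue entries
```

A production SOAR pipeline layers enrichment, playbook execution and case management on top, but the core win is the same: fewer, better-ranked items in front of the analyst.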


Fintech Innovation Done Right: Be A Creator

Fintech can also create entirely new product categories. One mechanism I’ve explored previously is embedded fintech strategies. A financial product can be embedded into other products to change a product's nature, availability, and customer engagement model. Companies like Opendoor give customers the ability to make cash offers for homes to make them more competitive. Boost allows companies to launch insurance products and bundle them into a broader offering. Zola bundles loans and mobile repayments with Pay-As-You-Go financing to unlock demand for home solar systems in Africa. Without the built-in financing, the systems would be unaffordable, making the loan a core piece of the business model rather than a feature. Similarly, many boot camps engage in income-sharing agreements – rather than charging tuition, the program is repaid through a percentage of future earnings for a set period of time. Finally, players like ZhongAn have created fully automated insurance built into products. For instance, in a partnership with a telephone provider, they can automatically detect a broken screen.


Untangl CEO discusses how Insurtech startups are disrupting finance markets

“There’s no doubt that technology is going to disrupt the insurance sector like it has every other industry,” said Stewart. “But I think insurance has been particularly slow when it comes to modernising, and that’s been highlighted by the rapid shift to the cloud. “The pandemic has been another catalyst in a rethink of operations going forward, but a cultural problem has been present around an industry that’s underinvested in technology, while finding it difficult to innovate in such a risk-averse, high-margin landscape.” Stewart went on to explain that companies in the space can often spend up to 18 months making decisions in response to potential problems, an approach he described as “a reflection of how not to do it”. However, while the insurance sector has found innovating quickly with short-term projects more difficult than other sectors, the past year has seen areas such as personal lines become more agile and intuitive. “It’s not easy because the industry has experts in their complex fields, who are representing stakeholders with billions in capital behind them, and any mistakes can be financially disastrous,” Stewart added.


The Brain of Security

In fairness, security analysts do seek to make risk-informed decisions; the human brain does this instinctively. However, they can only do so based on the information they are provided, and there are not many security programmes where business context is provided to the analyst to aid decision making. Recognising this reality, organisations are seeking to quantify their cyber risk to better align security to the business, drive remediation and response activities, support investment decisions and demonstrate return on security investment. Many have already embraced the move to a quantified understanding of risk, only to be let down: current approaches require too much manual data collection, too much training and professional services support, fail to connect this newfound understanding with the ability to take action, and fail to mitigate risk efficiently and cost-effectively. Organisations need to acknowledge that understanding and quantifying risk is critical to building an effective security programme in this day and age. Solely orchestrating and automating security actions with an intelligence-led approach is not enough.
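As a concrete, if simplified, illustration of quantified risk, the classic annualised loss expectancy (ALE) model multiplies single loss expectancy by the expected annual rate of occurrence. The figures below are invented for illustration and not drawn from the article:

```python
# Simplified cyber-risk quantification via annualised loss expectancy (ALE).
# All figures are illustrative only.

def annualised_loss_expectancy(asset_value, exposure_factor, annual_rate):
    """ALE = SLE * ARO, where SLE = asset value * exposure factor."""
    sle = asset_value * exposure_factor   # single loss expectancy
    return sle * annual_rate              # annualised loss expectancy

# Scenario: a $2M customer database, a breach impacts 30% of its value,
# and such a breach is expected roughly once every four years.
ale = annualised_loss_expectancy(2_000_000, 0.30, 0.25)
print(f"Expected annual loss: ${ale:,.0f}")

# A control costing $100k/yr that halves the occurrence rate is only worth
# funding if the risk it removes exceeds its cost:
residual = annualised_loss_expectancy(2_000_000, 0.30, 0.125)
print(f"Risk reduction: ${ale - residual:,.0f}")
```

Here the control removes $75k of annual expected loss but costs $100k, so the numbers argue against it; this is exactly the kind of investment decision the quantified view is meant to support.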


CIO Agenda for Right Now: Priorities a Year Into the Pandemic

First, the COVID-19 pandemic brought a period of rapid change and challenges for organizations, and that has accelerated technological change. Future conditions will be significantly different from the past and even from the present, according to White. Second, operating models have had to change. Now that the dust has settled, organizations will be using the rest of 2021 to review and consolidate all of the changes that have happened, White said. Third, the pandemic has raised new business priorities. Work from home has been one of them. But deeper in that trend, the pandemic has disrupted traditional research conducted by business and has raised different priorities for innovators, according to White. Plus, the work-from-home trend will drive significant organizational changes. Remote leadership poses challenges for presence and influence, according to White. Leaders and managers will need to adapt their styles to encompass non-line-of-sight supervision and performance management. Fourth, the CIO role has changed and will continue to change. Technology, and the CIO's response to the pandemic, lockdown, and economic downturn, meant that many organizations were able to survive the initial crisis.


OpenAI’s state-of-the-art machine vision AI is fooled by handwritten notes

Researchers from machine learning lab OpenAI have discovered that their state-of-the-art computer vision system can be deceived by tools no more sophisticated than a pen and a pad. As illustrated in the image above, simply writing down the name of an object and sticking it on another can be enough to trick the software into misidentifying what it sees. “We refer to these attacks as typographic attacks,” write OpenAI’s researchers in a blog post. “By exploiting the model’s ability to read text robustly, we find that even photographs of hand-written text can often fool the model.” They note that such attacks are similar to “adversarial images” that can fool commercial machine vision systems, but far simpler to produce. Adversarial images present a real danger for systems that rely on machine vision. Researchers have shown, for example, that they can trick the software in Tesla’s self-driving cars into changing lanes without warning simply by placing certain stickers on the road. Such attacks are a serious threat for a variety of AI applications, from the medical to the military.



Quote for the day:

"The most difficult thing is the decision to act, the rest is merely tenacity." -- Amelia Earhart

Daily Tech Digest - March 11, 2021

Microsoft cracks the RPA market! ... not really

The big question is, what do you actually automate as individual users? People, more often than not, have no idea what it is they actually want to do with the software. This will change as eureka moments are shared and Windows users begin to understand that they no longer have to produce weekly reports, update databases or populate CRM systems by hand. But this is not a disruptive moment in the history of RPA; it's more akin to a dripping tap eventually filling up the bath. Microsoft is late to the game, very late. While RPA is new to some and still unheard of by others, it's now considered a fairly mainstream business software solution, propped up by process discovery, process mining and a plethora of consulting firms all benefiting from the success of the technology. Of course, the very entrance of Microsoft into the market is much more than 'a bit of news', but it really just validates the potential for RPA to evolve further and might even justify the hype that some find so distasteful and worthy of their online wrath. Contrary to what the detractors are claiming, most, if not all, of the large RPA implementations are delivering high ROI. Technology tends to succeed or fail based on what it delivers.


New free software signing service aims to strengthen open-source ecosystem

Sigstore uses the OpenID authentication protocol to tie certificates to identities. This means a developer can use their email address or account with an existing OpenID identity provider to sign their software. This is different from traditional code signing, which requires obtaining a certificate from a certificate authority (CA) that's trusted by the maintainers of a particular software ecosystem, for example Microsoft or Apple. Obtaining a traditional code signing certificate requires going through special procedures that include identity verification or joining a developer program. The sigstore signing client generates a short-lived ephemeral key pair and contacts the sigstore PKI (public-key infrastructure), which will be run by the Linux Foundation. The PKI service checks for a successful OpenID Connect grant and issues a certificate based on the key pair, which will be used to sign the software. The signing event is logged in the public log and then the keys can be discarded. This is another difference from existing code signing, because each signing event generates a new key pair and certificate. Ultimately, the goal is to have public proof that a particular identity signed a particular file at a particular time.
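The public-log idea above, recording who signed which artifact and when, in a way that cannot be quietly rewritten, can be sketched conceptually. This is not the real sigstore client or its log API, just an illustration of the kind of hash-chained record a transparency log keeps:

```python
# Conceptual sketch of an append-only transparency log: each entry records an
# artifact hash and an identity, and is chained to the previous entry so that
# history cannot be rewritten without detection. Illustration only; the real
# sigstore infrastructure works differently.
import hashlib, json, time

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class TransparencyLog:
    def __init__(self):
        self.entries = []

    def append(self, artifact: bytes, identity: str) -> dict:
        prev = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        record = {
            "artifact_sha256": sha256(artifact),
            "identity": identity,            # e.g. an OpenID-verified email
            "timestamp": int(time.time()),
            "prev": prev,                    # chain to the previous entry
        }
        record["entry_hash"] = sha256(json.dumps(record, sort_keys=True).encode())
        self.entries.append(record)
        return record

    def verify_chain(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "entry_hash"}
            if e["prev"] != prev:
                return False
            if sha256(json.dumps(body, sort_keys=True).encode()) != e["entry_hash"]:
                return False
            prev = e["entry_hash"]
        return True

log = TransparencyLog()
log.append(b"release-v1.0.tar.gz contents", "dev@example.com")
log.append(b"release-v1.1.tar.gz contents", "dev@example.com")
print(log.verify_chain())  # True
```

Tampering with any earlier entry changes its hash, breaking the chain for every entry after it, which is what makes the log a credible public record.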


Sustainability and Availability Big Wins for Digital Twins

“The impact of digital twins goes beyond the asset and extends to logistics,” Weiss explains. “This has significant implications for mission readiness if an asset deploys to remote or hazardous locations. By understanding the condition of any given asset at any given time, sustainment leaders can anticipate maintenance requirements, ensuring the right components and personnel are in the right place at the right time, making real the concept of condition-based maintenance plus (CBM+).” CBM, sometimes called predictive maintenance (PdM), is a maintenance methodology that utilizes sensors to assess the health of the system. The health information, in addition to other inputs, helps to drive maintenance activities. In a CBM environment, operating platforms, embedded sensors, inspections, and other triggering events determine when restorative maintenance tasks are required based on evidence of need. Combined, these benefits drive affordable and resource-optimized sustainment operations and data-informed decisions that significantly increase operational availability. In a single use-case assessment of two aircraft engine components, a U.S. military service branch reported potential savings of approximately $42 million annually by using a digital twin.
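A minimal sketch of the condition-based idea: maintenance is triggered by evidence of need in sensor data rather than by a fixed calendar. The sensor names, thresholds, and readings below are invented:

```python
# Condition-based maintenance sketch: flag an asset for restorative work when
# telemetry shows evidence of need, not on a fixed schedule. Thresholds and
# readings are illustrative only.

THRESHOLDS = {"vibration_mm_s": 7.1, "bearing_temp_c": 95.0, "oil_particles_ppm": 400}

def needs_maintenance(telemetry: dict) -> list:
    """Return the sensor readings that breach their thresholds."""
    return [
        f"{name}={telemetry[name]} (limit {limit})"
        for name, limit in THRESHOLDS.items()
        if telemetry.get(name, 0) > limit
    ]

engine = {"vibration_mm_s": 8.4, "bearing_temp_c": 91.2, "oil_particles_ppm": 410}
issues = needs_maintenance(engine)
if issues:
    print("Schedule restorative maintenance:", ", ".join(issues))
```

In a real CBM+ system these triggers would feed a scheduling system that lines up parts and personnel, which is the logistics benefit Weiss describes.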


Enabling Your Workforce for the Digital Workplace

Changing company culture is just one step on the path to successful digital transformation. It requires the resources, knowledge, and skills to support and sustain the initiative. Providing sufficient training and targeted reskilling or upskilling for current employees is a necessary part of building the workforce of tomorrow. Digital literacy dovetails with key soft skills, including adaptability, problem-solving, effective communication, and emotional intelligence, which all correlate with effective teamwork and leadership. At the same time, investing in the workforce’s digital literacy helps nurture employee engagement and can improve retention. The costs of recruiting and training new staff can be substantial, so assessing and augmenting the current workforce’s digital literacy should be a central feature of any strategy. Building those foundational skills helps support adoption. Cybersecurity is another crucial focus area, and building a culture of security is central to organizational resilience. That’s why it’s no surprise to see that 59% of CFOs plan to increase budgets for IT in 2021. The number and sophistication of cyber threats have risen substantially, and a company is only as secure as its weakest link. This is especially true for the digital workplace.


When AI Alone Isn’t Enough?

Things are always more complicated than we would hope, especially when it comes to the promises of new technologies. After more than 30 years as a consultant and analyst in emerging technologies, I’ve learned that delivering on those promises is always more difficult and problematic than we expect. How is this playing out with machine learning models? First, there is the basic problem of the data itself. Without a clear cycle of data management, machine learning models may cause more problems than they solve. You must then think about how a business can explain how decisions derived from a machine learning model were made. Here are a few questions that may arise: What are the hidden and not-so-hidden biases in the data selected to create a model? What are the ethical challenges that must be managed as we move to this brave new world of artificial intelligence? How, for example, can a manager explain what business processes and rules are behind a model, and are those rules ethical? Do these models adhere to corporate or governmental requirements? How does an organization know if there is bias in a model that can impact business outcomes?
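The last question, detecting bias that affects outcomes, has a common first-pass check: compare the model's positive-decision rate across groups. The data below is fabricated for illustration, and a large gap is a signal to investigate, not proof of bias on its own:

```python
# First-pass bias check: compare approval rates across groups (a
# "demographic parity" difference). Fabricated data for illustration.
from collections import defaultdict

decisions = [  # (group, model_approved)
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", False), ("B", False), ("B", True), ("B", False),
]

totals, approved = defaultdict(int), defaultdict(int)
for group, ok in decisions:
    totals[group] += 1
    approved[group] += ok

rates = {g: approved[g] / totals[g] for g in totals}
gap = max(rates.values()) - min(rates.values())
print(rates)                     # {'A': 0.75, 'B': 0.25}
print(f"parity gap: {gap:.2f}")  # 0.50: worth investigating
```

Checks like this can run continuously against production decisions, giving managers at least a starting answer to "how would we know?".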


Multiple Attack Groups Exploited Microsoft Exchange Flaws Prior to the Patches

Security firms including Red Canary and FireEye are now tracking the exploit activity in clusters and anticipate the number of clusters will grow over time. ESET researchers have detected at least ten APT groups using the critical flaws to target Exchange servers. When used in an attack chain, the exploits for these vulnerabilities could allow an attacker to authenticate as the Exchange server and deploy a Web shell so they can remotely control the target server. When Microsoft released patches for the four Exchange server zero-days, it attributed the activity with high confidence to a Chinese state-sponsored group called Hafnium. Now, as researchers observe Web shells stemming from suspected Exchange exploitation, they believe far more groups are responsible for the growth in attack activity. In a blog post released March 9, Red Canary analysts report none of the clusters they observe significantly overlap with the group Microsoft calls Hafnium; as a result, they are now tracking these clusters separately. "We don't know who is behind these clusters – we aren't sure if it's the same adversaries working together or different adversaries completely," the researchers write. "We're focusing narrowly on what we observe on victim servers for our clustering."


Clearview AI sued in California by immigrant rights groups, activists

Clearview AI, the controversial firm behind facial-recognition software used by law enforcement, is being sued in California by two immigrants' rights groups to stop the company's surveillance technology from proliferating in the state. The complaint, which was filed Tuesday in California Superior Court in Alameda County, alleges Clearview AI's software is still used by state and federal law enforcement to identify individuals even though several California cities have banned government use of facial recognition technology. The lawsuit was filed by Mijente, NorCal Resist, and four individuals who identify as political activists. The suit alleges Clearview AI's database of images violates the privacy rights of people in California broadly and that the company's "mass surveillance technology disproportionately harms immigrants and communities of color." ... The lawsuit is the latest attempt by grassroots groups to clamp down on facial-recognition software, which is not widely regulated in the United States. In the absence of clear federal rules regarding the usage of the technology, a number of cities — such as San Francisco, Boston, and Portland, Oregon — have banned the technology in some capacity.


DeFi and tokenization together reshape the financial system

By integrating tokenized assets and equity into DeFi protocols, the functions of, for instance, existing liquidity pools and interest-rate-based protocols can be effectively applied to real-world assets, fostering mainstream adoption of DeFi. But with tokenization, even features such as atomic swaps or flash loans become possible for real-world assets. This will substantially alleviate the current illiquidity in the DeFi sector and also enable a far more diverse range of investment opportunities from the perspective of CeFi (centralized finance). Tokenization efficiently bridges the gap between these two worlds. Semi-automated lending and trading systems atop blockchain networks are the future of finance. Here, cutting out intermediaries while democratizing access through blockchain-based governance models makes room for new solutions. In theory, anyone with a connection to the internet can leverage DeFi protocols. Individuals worldwide can now conduct financial transactions more efficiently and at lower costs, especially when compared to the global remittances market. Currently, this still requires users to own crypto-assets.
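As a concrete example of the liquidity-pool mechanics mentioned above, many DeFi exchanges use a constant-product rule (x · y = k): a swap must leave the product of the two reserves unchanged, and that constraint is what sets the price. A minimal sketch, with the token names, fee, and reserve sizes invented:

```python
# Minimal constant-product liquidity pool (x * y = k), the mechanism behind
# many DeFi exchanges. Token names, fee, and amounts are illustrative.

class LiquidityPool:
    def __init__(self, reserve_token, reserve_stable, fee=0.003):
        self.x = reserve_token    # e.g. units of a volatile token
        self.y = reserve_stable   # e.g. units of a stablecoin
        self.fee = fee

    def swap_token_for_stable(self, dx):
        """Deposit dx of token, withdraw dy of stable, keeping x*y constant."""
        dx_after_fee = dx * (1 - self.fee)
        k = self.x * self.y
        new_x = self.x + dx_after_fee
        dy = self.y - k / new_x   # output implied by the invariant
        self.x, self.y = new_x, self.y - dy
        return dy

pool = LiquidityPool(reserve_token=100.0, reserve_stable=200_000.0)
out = pool.swap_token_for_stable(1.0)   # swap 1 token
print(f"received {out:.2f} stable")     # a bit below the 2,000 spot price
```

The output is slightly less than the 2,000 spot price because the swap itself moves the reserve ratio; larger trades against the same pool get progressively worse prices, which is why pool depth (liquidity) matters.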


How To Build A Career In IoT?

The Internet of Things has truly lived up to its hype. Between 2014 and 2019, the number of businesses employing IoT technologies grew from 13 percent to 25 percent. According to McKinsey, the number of IoT-connected devices would touch 43 billion by 2023. The blooming IoT domain has opened up a world of possibilities for skilled engineers and professionals, and the growing demand has widened the supply-demand gap. Research from the Inmarsat Research Programme showed that as many as 47 percent of surveyed companies lacked appropriate IoT skills and were forced to outsource projects. A suboptimal IoT workforce means up to 75 percent of IoT projects take twice as much time to complete, as noted by a Gartner survey. There are no fixed eligibility criteria to enter this field. However, engineering graduates specialising in IT, computer science, or electrical and electronics engineering are better fits. A few colleges provide undergraduate courses in IoT or offer a computer science specialisation with IoT as a major subject. ... Since an IoT engineer deals with a large amount of data that is often unreliable, the ability to manage such data gains paramount importance.


Getting your application security program off the ground

Like water, attackers always try to find the path of least resistance. It should not be surprising that instead of trying to get through sophisticated defenses on the infrastructure side, they explore vulnerabilities in web and mobile applications and web services. ... A potential attacker could exploit a vulnerability by executing an API call with specially crafted parameters or payload, and this may lead to an injection attack and result in the attacker obtaining sensitive data (e.g., financial information) or executing unauthorized actions (e.g., transferring money to a bank account controlled by the attacker). In many cases, Dzihanau notes, such actions would be difficult to distinguish from typical application usage. ... “Unfortunately, automation is not everything, and developers need to obtain the necessary knowledge and make security part of their day-to-day work. Security aspects need to be addressed not only during testing but continuously in design, development and deployment too. While the terms ‘security by design’ and ‘shift-left’ are well known, organizations are only now starting to realize what changes and implications this brings to the development process.”
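The injection risk described above usually comes down to queries built from strings. The sketch below, using SQLite with an invented schema, shows the vulnerable pattern next to the parameterised fix:

```python
# Why crafted parameters are dangerous: a string-built SQL query versus a
# parameterised one. Schema and data are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (user TEXT, balance REAL)")
db.executemany("INSERT INTO accounts VALUES (?, ?)",
               [("alice", 120.0), ("bob", 3500.0)])

malicious = "alice' OR '1'='1"   # a crafted API parameter

# VULNERABLE: the attacker's input becomes part of the SQL text,
# so the injected OR clause returns every row in the table.
leaked = db.execute(
    f"SELECT user, balance FROM accounts WHERE user = '{malicious}'"
).fetchall()
print("vulnerable query returned", len(leaked), "rows")   # all accounts leak

# SAFE: the input is bound as data and never parsed as SQL.
safe = db.execute(
    "SELECT user, balance FROM accounts WHERE user = ?", (malicious,)
).fetchall()
print("parameterised query returned", len(safe), "rows")  # no such user
```

Note also Dzihanau's point: from the server's perspective, the vulnerable query is a perfectly ordinary SELECT, which is why such abuse is hard to distinguish from normal usage.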



Quote for the day:

"Leadership is working with goals and vision; management is working with objectives." -- Russel Honore

Daily Tech Digest - March 10, 2021

7 challenges women face in getting ahead in IT

It’s a razor thin line that’s nearly impossible to walk. Not being assertive won’t get you promoted. Being assertive makes you unlikeable. Either one makes promotion difficult. Not being liked holds women back, according to Lean In, an organization dedicated to overcoming these obstacles, because women who aren’t liked are seen as less competent. ... “My advice to women is to make sure you speak up,” says Easwar. “Don’t let your ability to express your opinions be impacted by prejudice. You have to break that mold because not speaking up can be more detrimental and can be a disservice to other women trying to follow a similar growth path.” Women can’t change this without help, though, and altering the way your company manages meetings can go a long way to preventing this. “We switch who runs meetings every week,” says Breanne Acio, co-founder and CEO of SÄ“kr, a female-run tech startup. ... Women are often left out of group events or conspicuously ignored because they are not a member of the “in group,” which, in IT, is often white men. Whether this is conscious or unconscious, it is a bias that makes it difficult to get promoted.


How to mitigate security risks as cloud services adoption spikes

As organizations continue to move to the cloud, it’s vital to build a trusted foundation for computing. Security is only as strong as the layer below it, which is why it’s important to have trusted security technologies rooted in hardware for the cloud. For example, technologies such as disk and network traffic encryption protect data in storage and during transmission, but data can be vulnerable to interception and tampering while in use in memory. “Confidential computing” is a rapidly emerging usage category that protects cloud data while it’s in use in a Trusted Execution Environment (TEE). TEEs enable application isolation in private memory regions called enclaves to protect code and data while in use on a CPU. For instance, healthcare organizations can securely protect data with TEEs — including electronic health records — and create trusted computing environments that preserve patient privacy across the cloud. Full memory encryption is another technology used to protect hardware platforms in the cloud, ensuring that memory accessed by CPUs is encrypted, including customer credentials, encryption keys, and other IP or personal information on the external memory bus.


What is the future of the Proptech space?

With Proptech companies needing to increasingly digitise their operations to meet evolving customer demands, they will also need to continue paying attention to regulations, which may be equally liable to change. Roger Tyrzyk, director of global gaming and sales for UK/I at IDnow, believes that such rules would need to be reconsidered for true innovation to occur, and that this could be increasingly prioritised by regulators in the future. “Consumers have changed; they expect processes to be much quicker, more efficient and certainly not an inconvenience,” said Tyrzyk. “In response, the property sector – both conveyancing and rental markets – must adapt and the only way to do this is to leverage the innovative technologies that already exist. “However, to empower sectoral change, property regulators and the Government must revisit the archaic laws that currently exist in the UK and align them with the 21st Century. For example, under current laws, every letter a solicitor sends out must be wax sealed. To deliver a truly digitised process and harness the power of Proptech, this must be addressed and overhauled.”


How to protect backups from ransomware

If your backup system is writing backups to disk, do your best to make sure they are not accessible via a standard file-system directory. For example, the worst possible place to put your backup data is E:\backups. Ransomware products specifically target directories with names like that and will encrypt your backups. This means that you need to figure out a way to store those backups on disk in such a way that the operating system doesn’t see those backups as files. For example, one of the most common backup configurations is a backup server writing its backup data to a target deduplication array that is mounted to the backup server via server message block (SMB) or network file system (NFS). If a ransomware product infects that server, it will be able to encrypt those backups on that target deduplication system because those backups are accessible via a directory. You need to investigate ways to allow your backup product to write to your target deduplication array without using SMB or NFS. All popular backup products have such options.


How the UK Has Set its Sights on Becoming a Fintech Haven in the Wake of Brexit

With global competition for fintech talent within the sector, cities like London face fresh competition from European destinations like Berlin, Barcelona and Amsterdam, which are becoming increasingly popular with fintech professionals who hold the right to work across the EU. This exodus is exactly what the UK is looking to prevent, and the danger posed by the situation has been underlined by Ricky Knox, CEO of fintech bank Tandem: “Tech visas are a great thing and essential if we are going to keep a competitive tech and fintech sector,” he said. “Over half of our coders are from outside the UK and some have already left due to Brexit.” Another aspect of the review has called on the UK to revise its approach to the regulation of crypto-assets as a means of welcoming more fintech businesses in the future. Recent restrictive measures by UK regulators include bans on the sale of crypto derivatives and an anti-money laundering register, which have created a somewhat hostile environment for blockchain or decentralised finance fintech businesses looking to set up camp in London.


The world's most powerful supercomputer is now up and running

"The ultra-high-performance computer Fugaku is about to go into full-scale operation," said RIST president Yasuhide Tajima. "I look forward to seeing this most powerful 'external brain' ever developed by humanity helping to expand our knowledge, enabling us to gain deeper insights into the foundation of matter in both time and space, giving us better structural and functional analysis of life, society, and industry, enabling more accurate predictions, and even designing the unknown future of humanity." Fugaku is designed to carry out high-resolution, long-duration, and large-scale simulations, and boasts up to 100 times the application performance of its predecessor, the K supercomputer, which was decommissioned in 2019. This unprecedented compute power has earned the device the top spot for two consecutive terms in the Top500 list, which ranks the 500 most powerful computer systems in the world. At 442 petaflops, Fugaku stands a long way ahead of its competitors, with three times the capability of the number two system on the list, IBM's Summit, which has a performance of 148.8 petaflops.


Researchers Describe a Second, Separate SolarWinds Attack

Secureworks describes Supernova as a Trojanized version of the legitimate dynamic link library used by the SolarWinds Orion network monitoring platform. The hackers were spotted using Supernova to conduct further reconnaissance on a SolarWinds client's network, which eventually led to the exfiltration of some credentials, the researchers say. SolarWinds confirmed to ISMG that the Supernova attack researched by Secureworks is associated with earlier warnings of a second group of hackers exploiting a SolarWinds Orion flaw, which has since been patched. Secureworks' researchers discovered the Spiral attack on one organization in November 2020 when they spotted hackers exploiting a SolarWinds Orion API vulnerability on an internet-facing SolarWinds server during an incident response effort. "The threat actor exploited a SolarWinds Orion API authentication bypass vulnerability (CVE-2020-10148) to execute a reconnaissance script and then write the Supernova web shell to disk," the researchers say. "There is no known connection between Spiral activity and the recently reported compromises of on-premises Microsoft Exchange servers," says McLellan.


The Perfect Pair: Digital Twins and Predictive Maintenance

Sensors are the heart of any measurement, control, and diagnostic device. Device telemetry is collected using the smart sensors available in the hardware/software environment and then used to create the digital twin model of the physical equipment. All of the data is then aggregated and compiled to generate actionable information. The digital twin model is continuously updated to mirror the current state of the physical thing. It can then be used to effectively model, monitor, and manage devices from a remote location. It also enables continuous intelligence and an estimated time for the next needed maintenance, which the maintenance system can use to schedule work at the optimal time. ... Not all use cases or business problems can be effectively solved by predictive maintenance using a digital twin. Here are important qualifying criteria that need to be considered during use-case qualification: The problem must be predictive in nature, meaning there should be a target or an outcome to predict; The problem should have a record of the operational history of the equipment that contains both good and bad outcomes; ...
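The "estimated time for the next needed maintenance" above can be illustrated with a straight-line degradation fit to telemetry history. All readings and the wear limit are invented, and a real digital twin would use far richer physics-based or machine learning models:

```python
# Estimating time to next maintenance by fitting a linear degradation trend
# to telemetry history. Readings and the wear limit are illustrative only.

hours   = [0, 100, 200, 300, 400, 500]            # operating hours
wear_mm = [0.02, 0.11, 0.19, 0.31, 0.40, 0.52]    # measured wear over time
WEAR_LIMIT = 1.0                                  # maintenance required at 1 mm

# Ordinary least-squares fit: wear ~ slope * hours + intercept
n = len(hours)
mean_x, mean_y = sum(hours) / n, sum(wear_mm) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(hours, wear_mm))
         / sum((x - mean_x) ** 2 for x in hours))
intercept = mean_y - slope * mean_x

hours_at_limit = (WEAR_LIMIT - intercept) / slope
remaining = hours_at_limit - hours[-1]
print(f"wear rate: {slope:.5f} mm/hour")
print(f"schedule maintenance in ~{remaining:.0f} operating hours")
```

The resulting estimate is exactly the kind of input a maintenance system can use to schedule work at the optimal time, as the excerpt describes.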


Talent and workforce effects in the age of AI

To meet their AI aspirations, companies will likely need the right mix of talent to translate business needs into solution requirements, build and deploy AI systems, integrate AI into processes, and interpret results. However, most early adopters face an AI skills gap and are looking for expertise to boost their capabilities. In fact, 68 percent of executives surveyed report a moderate-to-extreme skills gap, and more than a quarter (27 percent) rate their skills gap as “major” or “extreme.” The gap is evident across all countries surveyed, ranging from 51 percent reporting moderate-to-extreme gaps in China to 73 percent reporting the same in the United Kingdom. What do leaders regard as the “most needed” roles to fill their company’s AI skills gap? The top four most-needed roles are “AI builders,” who are instrumental in creating AI solutions: researchers to invent new kinds of AI algorithms and systems, software developers to architect and code AI systems, data scientists to analyze and extract meaningful insights from data, and project managers to ensure that AI projects are executed according to plan.


Digitally Transforming Trusted Transactions Through Biometrics, ML & AI

The EIU study found that biometrics will become the dominant customer-authentication method for payments: "The survey shows optimism that evolving technological innovations like biometrics (fingerprint, facial, or voice recognition) could further reduce the trade-off between fraud, security, and [consumer experience], with 85% expecting biometrics to be used to authenticate the vast majority of payments in the next 10 years." Biometrics eliminates one of the riskiest forms of authentication: the username/password combination. Although there are some risks involved with biometrics — identity thieves have figured out ways to steal a fingerprint — they continue to be safer than other forms of authentication. During digital transactions, they give customers peace of mind that their data is not at risk. In fact, a recent study by biometrics company Fingerprints found that 56% of consumers would prefer to use a biometric sensor on their payment card instead of a PIN, signaling market appetite for embedding biometric authentication solutions into the payment process. Pattern recognition also plays a major role in fraud detection. Historical data shows patterns of how customers behave, what they are searching for, etc.



Quote for the day:

"The key to successful leadership today is influence, not authority." -- Ken Blanchard

Daily Tech Digest - March 09, 2021

We don’t need to go back to the office to be creative, we need AI

The solution to this dilemma will come from artificial intelligence (AI). The inherent trade-off between exploration and efficiency is well known to AI researchers. One question that those working in AI often have to grapple with is how often an algorithm should take actions that it hasn’t tried, as against actions it has already tried that will usually lead to some reward. Untried actions can yield spectacular results. For example, when the DeepMind computer program AlphaGo beat Go world champion Lee Sedol in 2016, it did so by exploring moves most human players had never seen before. Prior to move 37 in the second match against Sedol, AlphaGo had calculated that there was a one-in-ten-thousand chance that a human player would make that same move. And the adventurous gamble paid off. Human innovation involves a similar process of exploration and, to facilitate innovation, companies must get their employees to “collide”. Before the pandemic, this was achieved through open-plan architecture that encouraged “water-cooler” moments of unplanned encounters. But, with many employees working from home, corporations will have to find different ways to facilitate these kinds of random interactions.
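The exploration/efficiency trade-off described above is the classic multi-armed bandit problem. An epsilon-greedy strategy, sketched below with made-up reward probabilities, deliberately tries untried or unpromising actions a fraction of the time:

```python
# Epsilon-greedy bandit: with probability eps the agent explores a random
# action; otherwise it exploits the best action seen so far. The reward
# probabilities are made up for illustration.
import random

random.seed(42)
TRUE_REWARD_PROB = [0.3, 0.5, 0.8]   # hidden from the agent; arm 2 is best

def epsilon_greedy(steps=5000, eps=0.1):
    n_arms = len(TRUE_REWARD_PROB)
    counts = [0] * n_arms
    values = [0.0] * n_arms          # running average reward per arm
    for _ in range(steps):
        if random.random() < eps:
            arm = random.randrange(n_arms)    # explore an arbitrary arm
        else:
            arm = values.index(max(values))   # exploit the best estimate
        reward = 1 if random.random() < TRUE_REWARD_PROB[arm] else 0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
    return values, counts

values, counts = epsilon_greedy()
print("estimated arm values:", [round(v, 2) for v in values])
print("pulls per arm:", counts)   # the best arm dominates despite exploration
```

With eps set to zero the agent can lock onto a mediocre early winner and never discover the best arm; the occasional "wasted" exploratory pull is what makes finding a move-37-style surprise possible.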


Why Data Monitoring is Critical in a Hybrid Cloud Environment

Effective capacity management is not the only benefit of data monitoring, however. While cloud providers typically have strong security protocols, agencies and organizations must remain vigilant for cyberattacks. Data monitoring software provides an effective means to spot problems and mitigate issues before they can affect or damage the network and limit operations. When it comes to data, agencies and organizations must be able to “properly protect their data whether it is on-prem, in the cloud, or in transit,” Grunewald explained. While the path to the cloud has become clearer, Grunewald cautions that there must be an ordered approach to migrating software. That process can be roughly divided into six steps: 1) assessment, 2) prioritization, 3) roadmap, 4) optimization, 5) build, and 6) migration. From start to finish, the entire process is designed to encourage frank conversations about what material and processes are worth the transition and how to best utilize the available resources. There are incredible gains to be made by transitioning to the cloud.


Attack on Exchange Servers Gives Impetus to Move Email to the Cloud

Moving Exchange to the cloud began in 2005 but only became mainstream after the release of Office 365 in 2011. I spoke about the perils of moving to Exchange Online at the Exchange Conferences of 2012 and 2014. On-premises servers were still attractive in 2014, but the situation is very different now, both in terms of the threat to on-premises servers and our knowledge of what it’s like for companies to run email in the cloud. According to data shared at the TEC 2020 conference, Exchange Online supports 5.5 billion mailboxes. That number seems enormous in the context of 250-odd million monthly active Office 365 users, but more reasonable when you consider that the figure includes Outlook.com users (400 million switched over to use the Exchange Online infrastructure in 2017), shared mailboxes, group mailboxes, resource mailboxes, and a very large number of system mailboxes used by the Microsoft 365 substrate. Exchange Online is a massive online service running on 275,000 mailbox servers. The attack penetrated none of these servers.


Infrastructure as code: Create and configure infrastructure elements in seconds

The final part of the IaC topology is the one that will be most visible and familiar to IT Ops teams. These are the container orchestrators that control the way in which containers are deployed and provisioned. The most widespread of these orchestrators is undoubtedly Kubernetes. However, the ubiquity of Kubernetes has created something of a problem—most IT Ops staff, and indeed many developers, think that IaC and Kubernetes are synonymous. I don’t mean this as a criticism of Kubernetes. The system is perhaps the purest expression of the IaC paradigm: eminently portable, but also capable of being adapted to run efficiently on a wide variety of hardware. However, Kubernetes is far from representing the full range of what can be achieved by a careful mix of containers, VMs, and a creative use of container orchestrators. In other words—Kubernetes is the start of your IaC journey, not the end. While it’s a great place to begin to explore adaptive provisioning and continuous integration, many firms will need to develop bespoke containerization strategies in order to work with specialized systems, such as those that interact with legacy hardware.
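The core IaC idea the passage points to is that desired infrastructure state is expressed as data that an orchestrator reconciles toward. A minimal sketch, assuming a standard Kubernetes Deployment shape (the names `web` and `nginx:1.21` are illustrative), generating a manifest programmatically:

```python
import json

def deployment_manifest(name, image, replicas=3):
    """Build a declarative Kubernetes Deployment spec as plain data:
    the desired state is described, and the orchestrator makes the
    cluster converge to it."""
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

# kubectl accepts JSON as well as YAML, so this can be applied directly.
print(json.dumps(deployment_manifest("web", "nginx:1.21"), indent=2))
```

Because the spec is just data, the same declarative approach extends beyond Kubernetes to the broader mix of containers and VMs the author describes.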


Are Passwords Becoming A Thing Of The Past?

Compounding the problem of a weak password is the tendency of many users to reuse the same password across multiple sites and accounts, ranging from their social media to their financial accounts. One study revealed that more than half of respondents use an average of just five passwords across multiple services. One compromised password could spell disaster for a user, and even more so if they possess sensitive information about their company. Many users may already have their passwords out in the open – the website Have I Been Pwned has a database of over 613 million passwords that data breaches have exposed. Passwords acquired through these breaches are usually sold en masse to other hackers who use automated attacks like credential stuffing to find a password match to an account. One of the more common ways to strengthen authentication is to use a password manager, especially one that generates unique passwords and automatically changes passwords every few months. Changing passwords frequently greatly reduces the risk of user data being affected in the event of a password breach. Two-factor authentication is another common method of going semi, if not fully, passwordless.
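The unique, high-entropy passwords that managers generate are what defeat credential stuffing: a credential leaked from one service matches nothing elsewhere. A minimal sketch of such a generator, using Python's cryptographically secure `secrets` module (the function name is illustrative):

```python
import secrets
import string

def generate_password(length=16):
    """Generate a random, high-entropy password drawn from letters,
    digits, and punctuation, as a password manager would."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return ''.join(secrets.choice(alphabet) for _ in range(length))
```

Using `secrets` rather than `random` matters here: `random` is predictable and unsuitable for security-sensitive generation.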


No More Wasted Data: Why More Companies Are Turning Data Into Action

Though the majority of data collected by businesses currently goes to waste, more tools are emerging to help companies unify consumed data, automate insights, and apply machine learning to better leverage data to meet business goals. First, it's important to take a step back to evaluate the purpose and end goals here. Collecting data for the sake of having it won't get anyone very far. Companies need to identify the issues or opportunities associated with the data collection. In other words, they need to know what they're going to do with every single piece of data collected. To determine the end goals, start by analyzing and assessing different types of data collected to determine whether each was beneficial to the desired outcome, or had the potential to be but wasn't leveraged. This will help identify any gaps where other data should be tracked. It will also help hone the focus on the more important data sets to integrate and normalize, ultimately making data analysis a more painless process that produces more usable information. Next, make sure the data is useful -- that it's standardized, integrated across as few tech platforms as possible, and that the collection of specific data follows company rules and industry regulations.
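The "integrate and normalize" step above usually starts with mapping each platform's source-specific field names onto one standard schema. A minimal sketch, where the record, field names, and mapping are all hypothetical examples:

```python
def normalize_record(record, field_map):
    """Map a source-specific record onto a standard schema so data from
    different platforms can be combined and analyzed together.
    field_map maps source field names to standard field names."""
    return {std: record.get(src) for src, std in field_map.items()}

# Hypothetical CRM export with its own field names:
crm_record = {"cust_email": "a@example.com", "cust_name": "Ada"}
mapping = {"cust_email": "email", "cust_name": "name"}
standardized = normalize_record(crm_record, mapping)
```

Running every source through a mapping like this is what lets downstream analysis treat records from different tools as one consistent data set.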


Akash Network Launches Akash MAINNET 2, the First Decentralized Open-Source Cloud

As the first open-source cloud and the only viable decentralized cloud alternative to centralized cloud providers like Amazon Web Services, Google Cloud, and Microsoft Azure, Akash MAINNET 2 empowers developers to break free from the limitations of traditional cloud infrastructure. The platform accelerates growth and scale in the blockchain ecosystem by enabling developers and companies to decentralize their cloud infrastructure, deploying applications faster, more efficiently, and at lower cost. Through the platform, individuals, companies, and data centers with underutilized computing capacity will also be able to monetize and lease their cloud compute to those who need it, recouping the high costs of server maintenance and capital expenditure. Recently, Akash announced an integration with Equinix Metal, the world's largest data center and colocation infrastructure provider with 220 data centers in 25 countries, to expand access to global, low-latency, and powerful cloud infrastructure. For the first time, developers will be able to launch applications such as DeFi apps, blogs, games, data visualizations, block explorers, blockchain nodes, and other blockchain network components on a decentralized cloud.


Number of ransomware attacks grew by more than 150%

On a technical level, public-facing RDP servers were the most common target for many ransomware gangs last year. Against the backdrop of the pandemic that caused many people to work from home, the number of such servers grew exponentially. In 52% of all attacks analyzed by Group-IB, publicly accessible RDP servers were used to gain initial access, followed by phishing (29%) and exploitation of public-facing applications (17%). Big Game Hunting – targeted ransomware attacks against wealthy enterprises – continued to be one of the defining trends in 2020. Hoping to secure the biggest ransom possible, the adversaries went after large companies. Big businesses cannot afford downtime, which averaged 18 days in 2020. The operators were less concerned about the industry and more focused on scale. It’s no surprise that most of the ransomware attacks Group-IB analyzed occurred in North America and Europe, where most of the Fortune 500 firms are located, followed by Latin America and the Asia-Pacific respectively. The prospect of easy money prompted many gangs to join the Big Game Hunting. State-sponsored threat actors seen carrying out financially motivated attacks were not long in coming.


NHS datacentre transformation projects continue apace during pandemic

Ratcliffe suggests a shift away from a top-down approach across NHS IT, recognising the beneficial modularity that enables individual working parts such as hospitals or trusts to become more nimble – a whole that is greater than the sum of its parts. He says NHS Digital, the NHS’s digital arm, has partly succeeded in moving away from a more traditional “command and control” approach, shifting emphasis more to the empowerment of clinicians and the individual healthcare bodies for which they work. “It is amazing what a crisis will do,” says Ratcliffe, pointing to the fast-moving, NHS-driven, coronavirus vaccination roll-out across England. “There weren’t really rules before, but habits that became rules by repetition. I genuinely don’t think we will all suddenly go, ‘oh wow, we’ve got to get back to normal’ either – ‘normal’ doesn’t exist any more.” The NHS has been able to move forward – for example, in its Covid-19 vaccine roll-out and, to a lesser degree, in contact tracing – in ways that NHS Digital chief Sarah Wilkinson admitted, in an IBM presentation, “no one would have thought possible”. Ratcliffe says more productive provision might involve slicing resources by disease or by discipline, horizontally or vertically, instead of NHS-wide or even trust-wide.


Look to Banking as a Model for Stopping Crime-as-a-Service

We have seen good examples of how cybersecurity teams are working more closely with other internal parties, especially in the banking sector. Some of the major UK and European banks have been operating with an organizational structure where financial crime and cybersecurity teams have been part of the same business unit for over 10 years, driven by the natural synergy between these functions. This has driven significant progress. With the convergence of cyber and financial crime teams, the industry has seen the emergence of the fusion center, which can be thought of as an advanced version of the security operations center (SOC) management model, unifying several different teams within an organization, such as fraud, financial crime, and cyber. By bringing together these units, organizations can increase situational awareness, share analytics and threat intelligence more easily, attract talent more readily, and follow a standard framework for procedures. Combating cybercrime and disrupting the illegal economy can then be done more effectively, through more transparent management, an end-to-end operating model, and easier collaboration and consolidation on relevant threats and actions.



Quote for the day:

"Every leader needs to look back once in a while to make sure he has followers." -- Kouzes and Posner