Daily Tech Digest - January 31, 2021

How retailers can manage data loss threats during remote work

While an e-commerce store often relies on many software tools to help make day-to-day operations a little easier, it's likely that the number of apps being used has gone up with the increase in remote work. However, separate software tools don't always play nice together, and the level of access and control they have over your data might surprise you. Some even have the ability to delete your data without warning. At least once a year, e-commerce merchants should audit all the applications connected to their online store. Terms and conditions can change, so it's best to understand any changes made in the last 365 days. List all the pros and cons of each integration and decide if any tradeoffs are worth it. SaaS doesn't save everything. Software-as-a-service (SaaS) tools will always ensure the nuts and bolts of the platform work. However, protecting all the data stored inside a SaaS or cloud solution like BigCommerce or Shopify rests on the shoulders of users. If you don't fully back up all the content and information in your store, there's no guarantee it will be there the next time you log in. This model isn't limited to e-commerce platforms: accounting software like QuickBooks, productivity tools like Trello and even code repositories like GitHub all follow the same model.
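The backup responsibility described here can be automated with a small script. The sketch below is a hypothetical illustration, not any platform's real API: the `fetch_page` callable, resource names, and file layout are all assumptions, with a stub standing in for the store's endpoint.

```python
import json
import pathlib

def backup_resources(fetch_page, resources, out_dir="store-backup"):
    """Pull every page of each resource via the injected fetch_page
    callable and write the records to local JSON files.

    fetch_page(resource, page) should return a list of records,
    or an empty list once the resource is exhausted.
    """
    out = pathlib.Path(out_dir)
    out.mkdir(exist_ok=True)
    counts = {}
    for resource in resources:
        records, page = [], 1
        while True:
            batch = fetch_page(resource, page)
            if not batch:
                break
            records.extend(batch)
            page += 1
        (out / f"{resource}.json").write_text(json.dumps(records, indent=2))
        counts[resource] = len(records)
    return counts

# Stubbed fetcher standing in for a real store API, for demonstration:
_fake_db = {"products": [{"id": 1}, {"id": 2}, {"id": 3}], "orders": [{"id": 9}]}

def fake_fetch(resource, page, page_size=2):
    start = (page - 1) * page_size
    return _fake_db[resource][start:start + page_size]

print(backup_resources(fake_fetch, ["products", "orders"]))
```

In a real deployment, `fetch_page` would wrap the platform's paginated export endpoint and the output would go to versioned, off-platform storage.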

Don't make these cyber resiliency mistakes

Manea begins by sharing the well-worn axiom that defenders must protect every possible opening, while attackers only need one way in. If accurate, that truism alone should be enough to replace a prevention attitude with one based on resilience. Manea then suggests caution. "Make sure you understand your organizational constraints—be they technological, budgetary, or even political—and work to minimize risk with the resources that you're given. Think of it as a game of economic optimization." ... Put simply, a digital threat-risk assessment is required. Manea suggests that a team including representatives from the IT department, business units, and upper management work together to create a security-threat model of the organization, keeping three questions in mind: What would an attacker want to achieve? What is the easiest way for an attacker to achieve it? And what are the risks, their severity, and their likelihood? An accurate threat model allows IT-department personnel to implement security measures where they are most needed and not waste resources. "Once you've identified your crown jewels and the path of least resistance, focus on adding obstacles to that path," he said.
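The three threat-model questions can be captured in a simple risk register. The sketch below is illustrative only; the threats and the likelihood/severity scale are invented:

```python
# Minimal risk register: rank threats by expected impact so mitigation
# budget goes to the path of least resistance first.
threats = [
    # (threat, likelihood 0-1, severity 1-10)
    ("Phishing of finance staff",      0.6,  8),
    ("Unpatched VPN appliance",        0.3,  9),
    ("Physical intrusion into office", 0.05, 7),
]

def risk_score(likelihood, severity):
    return likelihood * severity  # expected impact

ranked = sorted(threats, key=lambda t: risk_score(t[1], t[2]), reverse=True)
for name, p, s in ranked:
    print(f"{risk_score(p, s):5.2f}  {name}")
```

Real threat models weigh far more than two factors, but even this crude ranking makes the "economic optimization" framing concrete: obstacles go on the highest-scoring path first.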

Researchers have developed a deep-learning algorithm that can de-noise images

In conventional deep-learning-based image processing techniques, the number of layers and the connections between them decide how many pixels in the input image contribute to the value of a single pixel in the output image. This value is fixed after the deep-learning algorithm has been trained and is ready to de-noise new images. However, Ji says fixing the number of input pixels, technically called the receptive field, limits the performance of the algorithm. “Imagine a specimen having a repeating motif, like a honeycomb pattern. Most deep-learning algorithms only use local information to fill in the gaps in the image created by the noise,” Ji says. “But this is inefficient because the algorithm is, in essence, blind to the repeating pattern within the image since the receptive field is fixed. Instead, deep-learning algorithms need to have adaptive receptive fields that can capture the information in the overall image structure.” To overcome this hurdle, Ji and his students developed another deep-learning algorithm that can dynamically change the size of the receptive field. In other words, unlike earlier algorithms that can only aggregate information from a small number of pixels, their new algorithm, called global voxel transformer networks (GVTNets), can pool information from a larger area of the image if required.
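The idea of a global, adaptive receptive field can be illustrated with a toy attention-style filter. This is a simplified sketch of the general principle, not the GVTNet architecture: each output value is a weighted average of every input value, so a repeating motif anywhere in the signal contributes, which a fixed local window cannot do.

```python
import math

def global_attention_denoise(signal, temperature=1.0):
    """Toy global receptive field over a 1-D signal: each output value
    is a similarity-weighted average of ALL input values (a softmax
    over value differences), so repeating structure anywhere in the
    signal can contribute. Illustrative only, not GVTNets."""
    out = []
    for xi in signal:
        # Weight every position j by how similar its value is to xi.
        weights = [math.exp(-abs(xi - xj) / temperature) for xj in signal]
        total = sum(weights)
        out.append(sum(w * xj for w, xj in zip(weights, signal)) / total)
    return out

# A noisy repeating motif: values near 0 and near 1 alternate.
noisy = [0.05, 0.95, 0.0, 1.1, -0.1, 1.0, 0.08, 0.9]
print([round(v, 2) for v in global_attention_denoise(noisy, 0.2)])
```

Each noisy sample is pulled toward the other samples of the same repeating motif, no matter how far away they sit, which is the behaviour a fixed local filter misses.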

Manufacturers Take the Initiative in Home IoT Security

Although ensuring basic connectivity between endpoint devices and the many virtual assistants they connect to would seem to be a basic necessity, many consumers have encountered issues getting their devices to work together effectively. While interoperability and security standards exist, there are none in place that provide consumers the assurance their smart home device will seamlessly and securely connect. To respond to consumer concerns, “Project Connected Home over IP” was launched in December 2019. Initiated by Amazon, Apple, Google and the Zigbee Alliance, this working group focuses on developing and promoting a standard for interoperability that emphasizes security. The project aims to enable communication across mobile apps, smart home devices and cloud services, defining a specific set of IP-based networking technologies for device certification. The goal is not only to improve compatibility but to ensure that all data is collected and managed safely. Dozens of smart home manufacturers, chip manufacturers and security experts are participating in the project. Since security is one of the key pillars of the group’s objectives, DigiCert was invited to provide security recommendations to help ensure devices are properly authenticated and communication is handled confidentially.

Has 5G made telecommunications sustainable again?

The state of the personal communications market as we enter 2021 bears undeniable similarity to that of the PC market (personal computer, if you've forgotten) in the 1980s. When the era of graphical computing began in earnest, the major players at that time (e.g., Microsoft, Apple, IBM, Commodore) tried to leverage the clout they had built up to that point among consumers, to help them make the transition away from 8-bit command lines and into graphical environments. Some of those key players tried to leverage more than just their market positions; they sought to apply technological advantages as well — in one very notable instance, even if it meant contriving that advantage artificially. Consumers are always smarter than marketing professionals presume they are. Two years ago, one carrier in particular (which shall remain nameless, in deference to folks who complain I tend to jump on AT&T's case) pulled the proverbial wool in a direction that was supposed to cover consumers' eyes. The "5G+" campaign divebombed, and as a result, there's no way any carrier can cosmetically alter the appearance of existing smartphones, to give their users the feeling of standing on the threshold of a new and forthcoming sea change.

Learn SAML: The Language You Don't Know You're Already Speaking

SAML streamlines the authentication process for signing into SAML-supported websites and applications, and it's the most popular underlying protocol for Web-based SSO. An organization has one login page and can configure any Web app, or service provider (SP), supporting SAML so its users only have to authenticate once to log into all its Web apps (more on this process later). The protocol has recently made headlines due to the "Golden SAML" attack vector, which was leveraged in the SolarWinds security incident. This technique enables the attacker to gain access to any service or asset that uses the SAML authentication standard. Its use in the wild underscores the importance of following best practices for privileged access management. A need for a standard like SAML emerged in the late 1990s with the proliferation of merchant websites, says Thomas Hardjono, CTO of Connection Science and Engineering at the Massachusetts Institute of Technology and chair of OASIS Security Services, where the SAML protocol was developed. Each merchant wanted to own the authentication of each customer, which led to the issue of people maintaining usernames and passwords for dozens of accounts.
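At its core, a SAML assertion is XML issued by the identity provider. The toy example below is heavily truncated and unsigned (a real response carries a signature, time stamps, and validity conditions), but it shows the shape a service provider parses:

```python
import xml.etree.ElementTree as ET

# A stripped-down, UNSIGNED toy assertion for illustration only; real
# SAML responses are signed and carry audience/validity conditions.
ASSERTION = """\
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Issuer>https://idp.example.org/metadata</saml:Issuer>
  <saml:Subject>
    <saml:NameID Format="urn:oasis:names:tc:SAML:2.0:nameid-format:persistent">
      alice@example.org
    </saml:NameID>
  </saml:Subject>
</saml:Assertion>
"""

NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

def extract_subject(assertion_xml):
    """Pull the issuing IdP and the authenticated subject out of an
    assertion (signature verification deliberately omitted here)."""
    root = ET.fromstring(assertion_xml)
    issuer = root.find("saml:Issuer", NS).text.strip()
    name_id = root.find("saml:Subject/saml:NameID", NS).text.strip()
    return issuer, name_id

print(extract_subject(ASSERTION))
```

The Golden SAML technique works precisely because a service provider trusts whatever a correctly signed assertion says; an attacker holding the signing key can mint assertions like this for any identity.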

Biometrics ethics group addresses public-private use of facial recognition

“To maintain public confidence, the BFEG recommends that oversight mechanisms should be put in place,” it said. “The BFEG suggests that an independent ethics group should be tasked to oversee individual deployments of biometric recognition technologies by the police and the use of biometric recognition technologies in public-private collaborations (P-PCs). “This independent ethics group would require that any proposed deployments and P-PCs are reviewed when they are established and monitored at regular intervals during their operation.” Other recommendations included that police should only be able to share data with “trustworthy private organisations”, specific members of which should also be thoroughly vetted; that data should only be shared with, or accessible to, the absolute minimum number of people; and that arrangements should be made for the safe and secure sharing and storage of biometric data. The BFEG’s note also made clear that any public-private collaborations must be able to demonstrate that they are necessary, and that the data sharing between the organisations is proportionate.

Security Threats to Machine Learning Systems

The collection of good and relevant data is a very important task. For the development of a real-world application, data is collected from various sources. This is where an attacker can insert fraudulent and inaccurate data, thus compromising the machine learning system. So, even before a model has been created, the whole system can be compromised by an attacker inserting a very large chunk of fraudulent data; this is a stealthy channel attack. This is the reason why data collectors should be very diligent while collecting data for machine learning systems. ... Data poisoning directly affects two important aspects of data: data confidentiality and data trustworthiness. Often, the data used for training a system contains confidential and sensitive information. Through a poisoning attack, the confidentiality of the data is lost. Maintaining data confidentiality is a challenging area of study by itself; the added dimension of machine learning makes the task of securing that confidentiality even more important. Another important aspect affected by data poisoning is data trustworthiness.
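A toy experiment makes the poisoning risk concrete. In the sketch below (the detector and all numbers are invented for illustration), flooding the "benign" training pool with mislabelled points shifts a nearest-centroid detector until a real attack is classified as benign:

```python
def centroid(points):
    return sum(points) / len(points)

def train(benign, malicious):
    """Toy 1-D nearest-centroid detector over a single behaviour score."""
    return centroid(benign), centroid(malicious)

def classify(score, c_benign, c_malicious):
    return "malicious" if abs(score - c_malicious) < abs(score - c_benign) else "benign"

benign = [1.0, 1.2, 0.9, 1.1]
malicious = [5.0, 5.2, 4.8, 5.1]

c_b, c_m = train(benign, malicious)
print(classify(4.9, c_b, c_m))   # correctly flagged as malicious

# Poisoning: the attacker floods the "benign" training pool with
# mislabelled points that look exactly like attack traffic.
c_b, c_m = train(benign + [5.0] * 100, malicious)
print(classify(4.9, c_b, c_m))   # the attack now slips through as benign
```

The attacker never touches the model itself; corrupting the training data at collection time is enough, which is why diligence at that stage matters so much.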

Fuzzing (fuzz testing) tutorial: What it is and how it can improve application security

We know when a programmer is developing code, they have different computations depending upon what the user gives them. So here the program is the maze and then we have, let's just pretend, a little robot up here, and input to the program is going to be directions for our robot through the maze. So for example, we can give the robot the directions, I'm going to write it up here: down, left, down, right. And he's going to take two rights, just meaning he's going to go to the right twice. And then he's going to go down a bunch of times. So you can think about giving our little robot this input, and the robot is going to take that as directions and take this path through the program. He's going to go down, left, down, first right, second right, then a bunch of downs. And when you look at this, we had a little bug here. We can verify that this is actually okay: there's no actual bug here. And this is what's happening when a developer writes a unit test. So what they're doing is they're coming up with an input and they're making sure that it gets the right output. Now, a problem is, if you think about this maze, we've only checked one path through this maze, and there are other potential lurking bugs out there.
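The transcript's point, that a unit test checks one path while many remain untried, is exactly what a fuzzer automates. Below is a minimal random fuzzer; the buggy `parse_move` target is invented for illustration:

```python
import random

def parse_move(cmd):
    """Toy target: parses a robot move like 'down:3'. It has a lurking
    bug: a zero step count triggers a division by zero."""
    direction, _, count = cmd.partition(":")
    steps = int(count) if count else 1
    return direction, 100 // steps  # bug: crashes when steps == 0

def fuzz(target, trials=1000, seed=7):
    """Throw randomly generated inputs at target, collecting any that
    crash it, i.e. explore many paths a single unit test would miss."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(trials):
        direction = rng.choice(["up", "down", "left", "right"])
        cmd = f"{direction}:{rng.randint(0, 5)}" if rng.random() < 0.9 else direction
        try:
            target(cmd)
        except Exception as exc:
            crashes.append((cmd, type(exc).__name__))
    return crashes

found = fuzz(parse_move)
print(f"{len(found)} crashing inputs, e.g. {found[0]}")
```

A handcrafted unit test for `parse_move("down:3")` passes; the fuzzer finds the `:0` inputs nobody thought to write a test for. Production fuzzers add coverage feedback and input mutation on top of this basic loop.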

The three steps for smart cities to unlock their full IoT potential

In theory, if a city applied uniform standards across all of its IoT-connected devices, it could achieve full interoperability. Nevertheless, we believe that cities and regulators should focus on defining common communication standards to support technical interoperability. The reason: Although different versions exist, communications standards are generally mature and widely used by IoT players. In contrast, the standards that apply to messaging and data formats—and are needed for syntactic interoperability—are less mature, and semantic standards remain in the early stages of development and are highly fragmented. Some messaging and data format standards are starting to gain broad acceptance, and it shouldn’t be long before policymakers can prudently adopt the leading ones. With that scenario in mind, planners should ignore semantic standards until clear favorites emerge. Building a platform that works across use cases can improve interoperability. The platform effectively acts as an orchestrator, translating interactions between devices so that they can share data and work. In a city context, a cross-vertical platform offers significant benefits over standardization.
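The orchestrator role described above amounts to per-vendor adapters that translate device payloads into one common format. The sketch below is purely illustrative; the vendor payload shapes and field names are assumptions, not real schemas:

```python
# Illustrative cross-vertical platform: per-vendor adapters normalize
# device payloads (shapes invented here) into one city-wide format.
ADAPTERS = {
    "vendor_a": lambda p: {"sensor": p["id"],
                           "metric": p["type"],
                           "value": p["val"]},
    "vendor_b": lambda p: {"sensor": p["deviceId"],
                           "metric": p["measurement"]["name"],
                           "value": p["measurement"]["reading"]},
}

def normalize(vendor, payload):
    """Translate a vendor-specific reading into the common format."""
    return ADAPTERS[vendor](payload)

print(normalize("vendor_a", {"id": "lamp-17", "type": "luminosity", "val": 82}))
print(normalize("vendor_b", {"deviceId": "bin-04",
                             "measurement": {"name": "fill_level", "reading": 0.6}}))
```

Once every reading lands in the common shape, applications in any vertical can consume it, which is the interoperability benefit the platform delivers without waiting for semantic standards to mature.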

Quote for the day:

"Education makes a people difficult to drive, but easy to lead; impossible to enslave, but easy to govern." -- Lorn Brougham

Daily Tech Digest - January 30, 2021

Internet of Cars: A driver-side primer on IoT implementation

There are millions of internet-connected cars already on the road, albeit mostly with crude subscription services for music and weather apps. With further advances, connection will be much more encompassing, with the average connected car having up to 200 sensors installed, each recording a point of data, minute by minute. The numbers quickly become staggering, and in emergency situations, the need for data agility is apparent. Picture driving on a highway in medium traffic. If someone’s tire blows out half a mile ahead, this information could be quickly conveyed to surrounding cars, warning of the potential for emergency braking. Any DLT solution would have to include a very nimble verification process for all these new packets of information to be brought into and carried by the network. Additionally, because of the computational complexity involved, almost all DLTs today charge a fee for each new transaction brought into the network. In fact, the fee is an integral part of the structure of many of these computational models. This is obviously not going to be workable in a system like urban traffic that would be generating billions of “transactions” every day. The truth is that decentralized data networks were never designed to handle these kinds of massive use-case scenarios.
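The fee objection is easy to quantify with back-of-envelope arithmetic; both figures below are illustrative assumptions, not measurements:

```python
# Back-of-envelope cost of per-transaction fees at urban-traffic scale
# (both numbers are illustrative assumptions, not measurements).
transactions_per_day = 2_000_000_000   # "billions" of sensor events daily
fee_per_transaction = 0.001            # even a tenth of a cent per event

daily_cost = transactions_per_day * fee_per_transaction
print(f"${daily_cost:,.0f} per day")   # $2,000,000 per day
```

Even at a fraction of typical DLT fees, the per-event charge compounds into millions of dollars a day for a single city, which is why the fee-per-transaction model breaks down for traffic data.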

AI vendors may have to prove systems don't discriminate

Providing proof that AI models are non-discriminatory means AI vendors would have to become much more transparent about how AI models were trained and developed, according to Purcell. "In the bill, it talks about the necessity of understanding what the training data was that went into creating the model," he said. "That's a big deal because today, a lot of AI vendors can just build a model kind of in secret or in the shadows and then put it on the market. Unless the model is being used for a highly regulated use case like credit determination or something like that, very few people ask questions." That could be easier for the biggest AI vendors, including Google and Microsoft, which have invested heavily in explainable AI for years. Purcell said that investment in transparency serves as a differentiator for them now. In general, bias in an AI system largely results from the data the system is trained on. The model itself "does not come with built-in discrimination, it comes as a blank canvas of sorts that learns from and with you," said Alan Pelz-Sharpe, founder and principal analyst at Deep Analysis. Yet, many vendors sell pre-trained models as a way to save their clients the time and know-how it normally takes to train a model. That's ordinarily uncontroversial if the model is used to, say, detect the difference between an invoice and a purchase order, Pelz-Sharpe continued.
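One concrete check a vendor could disclose is the disparate impact ratio (the "four-fifths rule" used in fairness auditing). The groups and outcomes below are invented for illustration:

```python
def disparate_impact(outcomes, group_key, positive="approved"):
    """Ratio of each group's positive-outcome rate to the best-off
    group's rate; values below 0.8 are a common red flag."""
    rates = {}
    for g in set(r[group_key] for r in outcomes):
        rows = [r for r in outcomes if r[group_key] == g]
        rates[g] = sum(r["outcome"] == positive for r in rows) / len(rows)
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Invented toy data: group A approved 80% of the time, group B 50%.
records = (
    [{"group": "A", "outcome": "approved"}] * 80 +
    [{"group": "A", "outcome": "denied"}] * 20 +
    [{"group": "B", "outcome": "approved"}] * 50 +
    [{"group": "B", "outcome": "denied"}] * 50
)
print(disparate_impact(records, "group"))   # B: 0.5 / 0.8 = 0.625, below 0.8
```

A single ratio is no substitute for auditing the training data itself, but publishing this kind of measurement is one way vendors could make the required transparency verifiable.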

Microsoft releases Application Guard for Office to M365 customers

Application Guard for Office isolates certain files opened in the suite's three primary applications: Word, Excel and PowerPoint. Documents obtained from untrusted Internet or intranet domains, files pulled from potentially unsafe areas, and attachments received through the Outlook email client, are opened in a virtualized environment, or sandbox, where malicious code can't wreak havoc. Unlike the much older Protected View, another Office defensive feature — it opens potentially dangerous documents as read-only — files opened in Application Guard can be manipulated. They can be printed, edited and saved. When saved, they remain in the isolation container and when reopened later, again are quarantined in the sandbox. Outdated file types — which can be set by administrators in the File Block feature within Word, Excel and PowerPoint — are also shunted into Application Guard's virtual machine. Application Guard for Office will be available to customers licensing Microsoft 365 E5 or Microsoft 365 E5 Security, and for now, only to those on either the Current Channel or Monthly Enterprise Channel. (Those are the Microsoft 365 update channels that deliver the most frequent refreshes.)

Digital nomads and "bleisure" define the new high-tech take on work trips

Many organizations have adopted remote work policies amid a modern plague. While some companies have brought telecommuters back to the traditional office, others have made long-term commitments to remote work. Ernest Lee, managing director of development and investments, Americas, with citizenM hotels, similarly alluded to remote work-enabled "nomadic behavior" among professionals. The company recently announced a global passport: a subscription service allowing remote workers with a penchant for frequent traveling to stay in any of citizenM's 21 hotels around the globe. "We certainly think that this new sort of lifestyle will attract a certain type of person that wants to also blend in their personal interests and passions [with] not having to be tied down so much to a fixed location," Lee said. The company also offers a corporate subscription providing organizations with access to hotel rooms and meeting room spaces at a fixed price. Lee explained that this package is designed for remote teams who are no longer sharing "the same co-located space." To enhance the traditional business travel experience, hotels are incorporating a wide range of technologies, in-app features, Internet of Things (IoT) capabilities, and more.

'Clone Firm' Fraudsters Stealing Millions From UK Investors

A clone firm is a fake entity created by fraudsters that uses the name, address and Firm Reference Number - a unique identifier assigned to every financial or investment firm in the U.K. and issued by the Financial Conduct Authority - of a legitimate organization, according to the alert. In some cases, the scammers will clone or spoof the entire website of a legitimate firm. Once these fake and spoofed websites are created, the fraudsters then send sales and marketing materials to would-be investors that appear to originate from legitimate firms. The scammers also advertise on social media, according to the alert. The fraudsters use phishing emails and social engineering techniques to lure victims, and their use of the legitimate sales materials gives the scheme a sheen of authenticity. Once a connection is established, the fraudsters attempt to get victims to send money to the cloned firm, the NCA notes. "Fraudsters use literature and websites that mirror those of legitimate firms, as well as encouraging investors to check the Firm Reference Number on the FCA Register to sound as convincing as possible," says Mark Steward, executive director of enforcement and market oversight for the Financial Conduct Authority.

DDoS Attacks Reach Over 10 Million in 2020

Richard Hummel, threat intelligence lead at NETSCOUT, said, “It is no coincidence that this milestone number of global attacks comes at a time when businesses have relied so heavily on online services to survive. Threat actors have focused their efforts on targeting crucial online platforms and services such as healthcare, education, financial services and e-commerce that we all rely on in our daily lives. As the COVID-19 pandemic continues to present challenges to businesses and societies around the world, it is imperative that defenders and security professionals remain vigilant to protect the critical infrastructure that connects and enables the modern world.” DDoS attack count, bandwidth, and throughput all saw significant increases since the start of the global COVID-19 pandemic. For instance, attack frequency rose 20% year over year, but that includes the pre-pandemic months of January, February, and most of March. For the second half of 2020, which was entirely pandemic-ridden, attacks rose 22% year over year. As cybercriminals quickly exploited pandemic-driven opportunities, we saw another kind of ‘new normal.’ Monthly DDoS attacks regularly exceeded 800,000 starting in March, as the pandemic lockdown took effect. 

IoT at the edge: magic won’t happen automatically

Creating more value at the edge: Dheeraj Remella, Chief Product Officer at VoltDB, notes the uncertainty around many edge and IoT business cases. He argues, “Telcos spend a lot of time talking about moving up the value chain beyond connectivity, and this is a great opportunity. Differentiation is based on sets of complementary features, contributed by an ecosystem, that create capabilities rather than individual features, which as stand-alones are not compelling. The owner of the platform that delivers that joint capability holds the keys to the digital kingdom.” As Remella points out, decisioning at low-millisecond speed is one thing on a private network within an industrial plant, but another ball game when the edge is hugely distributed, such as a wind farm over hundreds or thousands of acres, or for smart agriculture or an electricity grid. He says that often, to cut down processing times at the edge, companies take what he calls a “hyper-contextualised” approach – automating decisions based on data about a single entity or an isolated set of events. This limits its usefulness, just making existing processes digital (digitising), rather than using advances in technology to do things we’ve never been able to do before (digitalising), which means doing things differently – changing processes.

Sorry, Data Lakes Are Not “Legacy”

From a technical perspective, compute and storage are intended to be loosely coupled in a modern architecture. As a result, this is a benefit for warehouses. However, the benefit is not just for warehouses. Any modern data architecture, by design, depends on a loosely coupled separation of compute and storage to deliver an efficient, scalable, and flexible solution. The fact that data warehouse vendors are introducing separate compute and storage is not innovation compared to data lakes; it is achieving parity with data lakes. The evolution of separate compute and storage in warehouses brings them in line with the architecture employed by productive data lakes via on-demand SQL query services. In a post called When to Adopt a Data Lake — and When Not to, a dig at data lakes was that they could not scale compute easily or on-demand: "Some solutions architects have proposed data lakes to 'separate compute from storage' in a traditional data warehouse. But they're missing the point: You want the ability to scale compute easily and on-demand. A data lake isn't going to give you this; what you need is a data warehouse that can provision and suspend capacity whenever you need it."

AI, machine learning effective in cyber defence, but can also present challenges

"Antivirus technology, for example, operates a strict ‘yes or no’ policy as to whether a file is potentially malicious or not. It’s not subjective, through a strict level of parameters, something is either considered a threat, or not." he says. "The AI can quickly determine whether it’s going to crash the device, lock the machine, take down the network and as such, it is either removed or allowed. "It is important to note that VIPRE uses AI and ML as key components in their email and endpoint security services for example as part of their email security attachment sandboxing solution where an email attachment is opened and tested by AI in an isolated environment away from a customer’s network," Paterson adds. "So while AI might not be an ideal method for preventing accidental data leakage through email, it does have an important part to play in specific areas such as virus detection, sandboxing and threat analysis." Paterson says with so much reliance on email within business practices, accidental data leakage is an inevitable risk. "The implications of reputational impact, compliance breach and associated financial damage can be devastating. A cyber-aware culture with continuous training is essential, and so is the right technology," he says.

Does CI/CD impact telecom operations?

In the standard microservice code model that underpins cloud-native software, every time a common code software component is improved, it will change all network systems that use that standard code. This approach can bring lightning-fast agility and innovation but leaves today's legacy bi-annual software test and validate processes entirely unfit for purpose. The telecom CI/CD philosophy means that software is developed, delivered, tested, accepted, and brought into operation incrementally at a far higher cadence than previously in a traditional service provider environment. Further, it creates a significant software development volume that needs validation on an increasingly dynamic network. This approach implies that continuous software validation and continuous testing must accompany continuous software delivery and deployment. These requirements demand a new agile way of working between the network operator, its software suppliers, and vendors. Essentially, the merging of Dev and Ops as in the IT world is now a must for the telecom context where the 'Dev' from vendors needs to seamlessly merge and receive feedback from the 'Ops' on the operator side of the firewall. This evolution requires a transformation on both the vendor side as well as the operator side.
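The incremental validate-and-accept loop can be sketched as a staged gate; the stage names and stub runner below are illustrative assumptions, not any operator's actual pipeline:

```python
# Sketch of an incremental-delivery gate: every vendor software drop
# must pass each validation stage before reaching the live network.
STAGES = ["unit", "integration", "network-emulation", "canary-cell"]

def validate_drop(drop, run_stage):
    """run_stage(drop, stage) -> bool. Returns (accepted, results),
    failing fast so the vendor gets feedback at the first broken stage."""
    results = {}
    for stage in STAGES:
        results[stage] = run_stage(drop, stage)
        if not results[stage]:
            return False, results
    return True, results

# Stubbed stage runner: pretend the integration tests catch a regression.
accepted, results = validate_drop("vnf-2.4.1",
                                  lambda drop, stage: stage != "integration")
print(accepted, results)
```

Running this gate on every incremental drop, rather than in bi-annual test campaigns, is the mechanical core of the telecom CI/CD cadence the excerpt describes; the fail-fast result is the "Ops" feedback flowing back to the vendor's "Dev" side.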

Quote for the day:

"Entrepreneurship is the last refuge of the trouble making individual." -- Natalie Barney

Daily Tech Digest - January 29, 2021

Expert: Agile data-driven decision-making key to growth

"You can't achieve agility, and you can't be adaptive unless you empower your business users with as much self-service analytics and business intelligence and reporting as they can consume," Evelson said. "Self-service is really the only way to become agile and adaptive." That, however, is linked to data governance, which is also imperative to agile data-driven decision-making. "There is a very fine line between too much self-service and not enough governance, versus too much governance and not enough self-service," Evelson added. "Hopefully, there is a middle ground between the two, which we call Goldilocks data governance." All of the competencies together, meanwhile, enable an organization to be agile through what Evelson terms multi-modal analytics and reporting. They empower organizations to do descriptive analytics through dashboards and reports, diagnostic and predictive analytics to get insights, and ultimately prescriptive and actionable analytics to make decisions and trigger actions. And should organizations fail to become agile and adapt to constant change, they risk irrelevancy and ultimately insolvency. Forty years ago, the average lifespan of companies in the S&P 500 was about 30 years, Evelson said.

The Brain Is Neither a Neural Network nor a Computer

Autonomy is the idea that the brain is self-governing, receptive to the environment, but always in control. Somatic disorders ranging from improper sugar levels and hormone imbalances to diseases such as malaria or syphilis can cause mental dysfunction. Some individuals are placed in mental hospitals when correcting an underlying disorder would actually fix the problem. At the simplest level, no amount of mental determination would make you a world-class athlete if you did not have the right type of muscle fibers or hand-eye coordination. You cannot flap your arms and fly—the aerodynamics does not allow it. Paganini could only be the legendary violinist he was because of his flexibility. No amount of musicianship could provide that ability. Cognitive processes are embodied. They emerge from the interaction between physical organisms and their environment, not just their brains. For example, there is evidence that the nature of your gut bacteria can cause anxiety, stress, and even depression. Replacing a diseased organ with a healthy one can increase mental functioning. A kidney transplant will help remove poisons from the blood such as urea or ammonia which will increase brain health.

The state of corporate legal departments and the role of the Chief Legal Officer

The survey affirms we are in the “age of the CLO.” With 78 percent of respondents reporting to the CEO, the overall trend remains very positive. Further, while CLOs still spend around one quarter of their time providing legal advice, they also spend a significant amount of time on board matters and governance issues, contributing to strategy development, and advising other executives on non-legal issues. The survey found that 46 percent of CLOs are responsible for their company’s data privacy function, reflecting the growing integration of legal in business strategy and technology policy. In the order of functions reporting to the Chief Legal Officer, only compliance (74 percent) outranks privacy. CLOs are also increasingly engaging with environmental, social, and governance issues. This includes diversity and inclusion (D&I). A full 72.7 percent of CLOs expect diversity and inclusion specifically to accelerate in 2021. Encouragingly, even despite COVID-19, 32 percent of law departments plan to take on more lawyers in 2021, a slight increase over 30 percent from 2020.

Defense Against Vulnerabilities in the Cloud – Is It Too Late?

Apart from the traditional challenges around access management, data pilferage and threats from data communication with third-party applications are gaining prominence. Communication with third-party applications has found increased traction through APIs, which are increasingly being targeted by threat actors. Further, misconfigurations and policy violations in cloud assets create potential vulnerabilities and backdoors, leading to the risk of compromise. This is primarily due to the policies of some companies to not change the default security settings on their cloud workloads. These cloud vulnerabilities are accentuated by the increasing number of connected systems and their dependencies. The genesis of many vulnerabilities boils down to access and privilege management. Organizations need to plan for a deep inspection and vulnerability management system as part of their DevSecOps pipeline for building scalable cloud-native applications. A comprehensive vulnerability management system goes a long way toward enabling organizations to effectively manage and minimize their attack surface.
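A minimal version of such a check is a policy-driven configuration audit. The sketch below is illustrative only; the fields and policy values are invented, not any cloud provider's schema:

```python
# Illustrative scan for default/dangerous settings in cloud workload
# configs (field names and policy values are invented for this sketch).
POLICY = {
    "public_access": False,        # nothing should be world-readable
    "encryption_at_rest": True,
    "default_credentials": False,  # default settings left unchanged
}

def audit(workloads):
    """Return (workload, setting) pairs that violate the policy,
    treating a missing field as a violation too."""
    findings = []
    for w in workloads:
        for key, required in POLICY.items():
            if w.get(key, None) != required:
                findings.append((w["name"], key))
    return findings

workloads = [
    {"name": "billing-db", "public_access": False,
     "encryption_at_rest": True, "default_credentials": False},
    {"name": "dev-bucket", "public_access": True,
     "encryption_at_rest": True, "default_credentials": True},
]
print(audit(workloads))
```

Run on every commit in the DevSecOps pipeline, a check like this catches the unchanged-default misconfigurations before a workload ever reaches production.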

How to build a trustworthy and connected future

More broadly, big(ger) data from personal, commercial and government sources has the potential to address various challenges related to the Sustainable Development Goals. For instance, the Humanitarian and Resilience Investing Initiative aims to fill critical gaps in the available data that are preventing investors from accessing more humanitarian and resilience investing (HRI) opportunities. The pandemic has exposed and exacerbated existing gaps and inequalities; notably, almost half of the global population remains offline, and broadband services are too expensive for 50% of the population in developed countries. These “connectivity deserts” hamper access to health, education and economic inclusion. In a bid to improve access to the digital economy, during The Davos Agenda, the Forum launched the Essential Digital Infrastructure and Services Network, or EDISON Alliance, tasked with working to accelerate digital inclusion. Meanwhile, in metropolises around the globe, which account for nearly two-thirds of CO2 emissions, smart energy infrastructure connected through data and digitalization is central to transitioning to “net zero” cities.

2020 Marked a Renaissance in DDoS Attacks

The sheer quantity of attacks in 2020 was surprising, Kaczmarek says. "We always expect the number of attacks to increase year over year and quarter over quarter, but we didn't expect that the quantity would increase by over 150%," he says. "This truly reflects the impact of the pandemic and the challenging precedent the 'new normal' has set for cybersecurity." The number of DDoS attacks that involved two or more vectors increased from 40% in 2019 to 72% in 2020, Kaczmarek adds. "This means that the attackers as well as the tools they are using are improving," he says. According to Neustar, while the use of DDoS to try to extort ransoms is not new, these attacks grew in persistence, sophistication, and targeting in 2020. Cyber extortionists purporting to belong to well-known nation-state groups went after organizations in industries they have not regularly targeted previously, such as financial services, government, and telecommunications. "RDDoS attacks surged in Q4 2020 as groups claiming to be Fancy Bear, Cozy Bear, and the Lazarus Group attempted to extort organizations around the world," says Omer Yoachimik, product manager, DDoS protection at Cloudflare, another vendor that observed the same trend.

A better kind of cybersecurity strategy

The core of the matter involves deterrence and retaliation. In conventional warfare, deterrence usually consists of potential retaliatory military strikes against enemies. But in cybersecurity, this is more complicated. If identifying cyberattackers is difficult, then retaliating too quickly or too often, on the basis of limited information such as the location of certain IP addresses, can be counterproductive. Indeed, it can embolden other countries to launch their own attacks, by leading them to think they will not be blamed. “If one country becomes more aggressive, then the equilibrium response is that all countries are going to end up becoming more aggressive,” says Alexander Wolitzky, an MIT economist who specializes in game theory. “If after every cyberattack my first instinct is to retaliate against Russia and China, this gives North Korea and Iran impunity to engage in cyberattacks.” But Wolitzky and his colleagues do think there is a viable new approach, involving a more judicious and well-informed use of selective retaliation. “Imperfect attribution makes deterrence multilateral,” Wolitzky says. “You have to think about everybody’s incentives together. Focusing your attention on the most likely culprits could be a big mistake.”

US, China or Europe? Here's who is really winning the global race for AI

On almost all metrics, therefore, the EU seems to be taking a backseat; and according to the researchers, there is no doubt that this is due to stringent regulations that are in place within the bloc. "Many in Europe do not trust AI and see it as technology to be feared and constrained, rather than welcomed and promoted," concludes the report, recommending that the EU change its regulatory system to be "more innovation-friendly". The General Data Protection Regulation (GDPR), say the researchers, limits the collection and use of data that can foster developments in AI. Proposals for a Data Governance Act, while encouraging the re-use of public sector data, also restrain the transfer of some information; and by creating European data spaces, the regulation could inhibit global partnerships. Recent reports show that the last year has seen almost a 40% increase in GDPR fines issued by the EU compared to the previous 20 months, reaching a total of $332 million in fines since the new laws started applying. In that context, it is not uncommon for firms to be deterred from developing AI systems altogether, out of fear of receiving a fine – even for the most well-intentioned innovations.

A Guide to Find the Right IoT Module for Your Project

As more small and new module providers enter the IoT market, many cheaper IoT modules are becoming available to customers at extremely attractive price tags. Looking only at the initial deployment cost, cheaper modules might appear to save customers a lot of money. But is the quality of these modules guaranteed? The process of developing a new product and bringing it to market is long and costly. Low-quality modules always carry a higher risk of malfunction and, in the worst case, can cause the failure of the whole project. Rather than helping IoT companies generate the expected project income, this causes a greater loss on the investment. From a long-term perspective, even if the product is launched to the market, unstable module performance is likely to cause unwanted surprises and require frequent maintenance. This means not simply a higher operating cost to the business; it will also harm the reputation of the brand and damage customers’ loyalty. For the long-term growth of the business, choosing a reliable partner and quality-assured module products is wise and worthwhile.

Researchers: Beware of 10-Year-Old Linux Vulnerability

The vulnerability, called "Baron Samedit" by the researchers and officially tracked as CVE-2021-3156, is a heap-based buffer overflow in the Sudo utility, which is found in most Unix and Linux operating systems. Sudo is a utility included in open-source operating systems that enables users to run programs with the security privileges of another user, which can give them administrative – or superuser – privileges. The bug, which appears to have been added into the Sudo source code in July 2011, was not detected until earlier this month, Qualys says. "Qualys security researchers have been able to independently verify the vulnerability and develop multiple variants of exploits and obtain full root privileges on Ubuntu 20.04 (Sudo 1.8.31), Debian 10 (Sudo 1.8.27), and Fedora 33 (Sudo 1.9.2). Other operating systems and distributions are also likely to be exploitable," the researchers say. After Qualys notified the authors of Sudo, a patch was included in version 1.9.5p2, published this week. Qualys and the Sudo authors are urging Linux and Unix users to immediately patch systems. Rob Joyce, who was recently named director of the National Security Agency's Cybersecurity Directorate, also flagged the alert on Twitter.
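As a rough triage aid, a version string can be compared against the fixed release. The sketch below is a hypothetical helper, not official tooling, and is deliberately conservative: distributions often backport fixes without changing the upstream version, so a package-level check is the authoritative answer.

```python
import re

FIXED = (1, 9, 5, 2)  # sudo 1.9.5p2 contains the fix for CVE-2021-3156

def parse_sudo_version(v):
    """Parse a sudo version string like '1.8.31' or '1.9.5p2' into a tuple."""
    m = re.match(r"(\d+)\.(\d+)\.(\d+)(?:p(\d+))?", v)
    if not m:
        raise ValueError(f"unrecognised version string: {v}")
    major, minor, patch, p = m.groups()
    return (int(major), int(minor), int(patch), int(p or 0))

def possibly_vulnerable(v):
    # Per the Qualys advisory, versions from 1.8.2 up to (but not
    # including) the fixed 1.9.5p2 were in the affected range.
    return (1, 8, 2, 0) <= parse_sudo_version(v) < FIXED

print(possibly_vulnerable("1.8.31"))   # True
print(possibly_vulnerable("1.9.5p2"))  # False
```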

Quote for the day:

"Believe those who are seeking the truth. Doubt those who find it." -- Andre Gide

Daily Tech Digest - January 28, 2021

Engaging Employees to Accelerate Digital Banking Transformation

Many financial institutions are investing heavily in new technologies and processes to support their digital banking transformation goals. Research by the Digital Banking Report has found that banks and credit unions have increased investment in digital transformation in each of the past four years. There is no doubt that these investments are justified given the flight to digital by consumers and the game-changing technology that can support digital customer experience improvements. Unfortunately, with such a focus on data, analytics, technology and systems, most firms ignore the need to invest in employees to make sure they maximize the value of the new tools being deployed. Beyond open communication around how employees can be a part of the digital banking transformation process, it is important to invest in training the people to ensure that the digital banking transformation efforts succeed. If you don’t, it’s like buying a new car but failing to fill the gas tank (or charge the batteries). To respond to the need to reskill and upskill current employees, new models of managing learning and development have emerged. Rather than simply replicating legacy training methods, organizations have created new learning officer positions responsible not only for creating ongoing learning opportunities, but also for supporting cultural transformation.

Here’s why upskilling is crucial to drive the post-COVID recovery

We have a pressing societal problem: how to equip people with the skills they need to participate in the economy – now and in the future. As outlined in the World Economic Forum’s latest Future of Jobs Report, half of all employees around the world will need reskilling by 2025 – and that number doesn’t include all the people who are currently not in employment. If we don’t act now, this skills gap will only widen. With challenges come opportunities. Crisis events, like the pandemic, can and should shape economic thinking and represent a rare but narrow window of opportunity to reflect, reimagine, and reset priorities. So let’s seize this opportunity. We’re calling on governments, business leaders, and educators to join us in a global movement for upskilling. As you’ll see in our new report – Upskilling for Shared Prosperity – published as part of Davos Agenda Week to mark the first anniversary of the World Economic Forum’s Reskilling Revolution Platform, there’s a clear social and economic case for upskilling. If we commit to giving all people opportunities to build the skills they will need to fully participate in the future workplace, it will, in turn, lead to a prosperity dividend.

Law enforcement takes over Emotet, one of the biggest botnets

According to Europol, Emotet's infrastructure consisted of several hundred servers located across the world and serving different purposes, including making the botnet more resilient to takeover attempts. Law enforcement agencies had to work together to develop a strategy that involved gaining control of the infrastructure from the inside and redirecting victims to servers under their own control. As part of the investigation, the Dutch National Police seized data from the servers used by Emotet, including a list of stolen email credentials abused by the botnet. The agency set up a web page where users can check if their email address was among those affected. The information about infected computers that was gathered during the operation was also shared with national CERTs so the victims can be identified and contacted. "Only time will tell if the takedown will have long-term impact to Emotet operations," Jason Passwaters, COO of security firm Intel 471, tells CSO. "These groups are sophisticated and will have baked in some sort of recovery. Emotet itself does not appear to have any sort of inherent recovery mechanism, but a lot of the infected machines will have other malware installed as well, such as Qbot, Trickbot or something else. ..."

Top 5 Evolving Cybersecurity Threats to Cloud Computing in 2021

According to the Sophos Threat Report of 2020, misconfigurations can drive numerous data breach incidents. As businesses integrate themselves with cloud computing, the possibility of cloud jacking grows. Trend Micro predicts that code injection attacks can be used to attack cloud platforms. These attacks can be carried out through third-party libraries, SQL injection, and cross-site scripting. Attackers inject malicious code through third-party libraries and ensure that the code is downloaded and executed by individuals unintentionally. Typical public cloud vendors are only responsible for the security of their infrastructure; individuals are responsible for protecting their own data. ... Social engineering uses phishing scams to steal user credentials for both cloud-service and on-premises attacks. Did you know that 78% of data breach incidents that occurred during 2019 were related to phishing? This percentage increased in 2020. Innovative phishing attempts are launched through cloud applications rather than traditional emails. Phishing kits make it easier for cybercriminals to carry out illicit activities, requiring very little technical skill to run a phishing operation.

What Is Robomorphic Computing?

A robot’s operation is a three-step process: gathering data using sensors or cameras; using mapping and localisation techniques to understand the environment; and plotting a course of action. Advances in embedded vision and SLAM technology make data gathering and localisation easy. However, all these steps take a lot of time, especially when calculations are done on CPUs. Previously, the researchers investigated the software side, developing efficient algorithms to speed up robots. The MIT team concluded it’s time to look beyond software. Hardware acceleration is the use of a specialised hardware unit to do certain computing tasks more efficiently. While graphics processing units (GPUs) have been used for such tasks, their application is limited since use cases differ from robot to robot. Hence, the researchers at MIT developed robomorphic computing to devise a customised hardware unit for individual robots. It considers the physical parameters of the robot and the tasks it needs to perform, and translates them into mathematical matrices to design a specialised hardware architecture. The resulting chip design is unique to the robot and maximises its efficiency.
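The core idea, that a robot's fixed morphology fixes the sparsity pattern of its matrices and that this pattern can be baked into specialised computation, can be illustrated in software. The matrix and pattern below are illustrative, not a real robot model:

```python
# Sketch of the idea behind robomorphic computing: a robot's structure
# determines, once and for all, which matrix entries are zero, so a
# specialised unit can skip them. Here we compare a naive dense
# matrix-vector product with one that exploits a known sparsity pattern.

def dense_matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def sparse_matvec(pattern, M, x):
    # 'pattern' lists, per row, the column indices known to be nonzero;
    # it is derived once from the robot's kinematic structure.
    return [sum(M[i][j] * x[j] for j in pattern[i]) for i in range(len(M))]

M = [[2.0, 0.0, 0.0],
     [1.0, 3.0, 0.0],
     [0.0, 0.5, 4.0]]            # chain-like structure -> near-triangular
pattern = [[0], [0, 1], [1, 2]]  # fixed by morphology, not by runtime data
x = [1.0, 2.0, 3.0]

assert sparse_matvec(pattern, M, x) == dense_matvec(M, x)
```

In hardware, the same pattern translates into multipliers that are simply never instantiated for the structurally zero entries.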

Digital Identity Is the New Security Control Plane

Digital identity — in the form of trusted contextual data defining who is accessing a system and how — provides this control plane. Users are already providing identity (and likely at multiple points). Systems are already consuming it — in the case of software-as-a-service (SaaS) environments, it may be one of the few configurable security controls available — but the decoupling of security from location and IP address is present in many other solutions. It can be tailored to an organization's needs and be risk-sensitive, with different methods and phases required, depending on the resource accessed. Even better, it's a control plane that can and should be implemented in a phased approach and provides a path to a zero-trust network architecture. The steps to building this are conceptually simple, and we can do extensive preparation. First, ensure even before you implement that the technologies you are investing in are identity-aware and able to make differentiated security decisions in the data plane based on that identity. This must extend to SaaS applications — one of the largest benefits of using identity as your control plane is the ability to bring these into the fold, as it were, and to match them to your security model. Second, consolidate identity to a single "source of trust" — that is, a single secure, consistent, and accurate repository for identity.
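As a sketch of what a risk-sensitive, identity-keyed decision might look like, the following toy policy combines identity context into an access decision. The attribute names and thresholds are illustrative assumptions, not any vendor's API:

```python
# Hypothetical sketch of a risk-sensitive access decision keyed on
# identity context rather than network location. Higher-sensitivity
# resources and weaker identity signals push the decision toward
# step-up authentication or denial.

def access_decision(identity):
    risk = 0
    if not identity.get("mfa"):
        risk += 2          # no multi-factor authentication
    if not identity.get("managed_device"):
        risk += 1          # unmanaged endpoint
    if identity.get("resource_sensitivity") == "high":
        risk += 2          # crown-jewel resource
    if risk >= 4:
        return "deny"
    if risk >= 2:
        return "step-up-auth"   # e.g. require an additional factor
    return "allow"

print(access_decision({"mfa": True, "managed_device": True,
                       "resource_sensitivity": "high"}))  # step-up-auth
```

The point of the sketch is the shape of the control plane: the same decision logic applies whether the resource is on-premises or SaaS, because it keys on identity context rather than IP address.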

Data Privacy Day 2021: What to consider in the wake of Covid-19

The exit of the UK from the EU means that companies across the country that deal with Europe need to take extra steps to ensure correct compliance. According to Rich Vibert, CEO and co-founder of Metomic, this can be aided by considering this aspect at the start of any deployment. “This Data Privacy Day, we must confront the fact that UK companies aren’t equipped to protect their data now that we’ve Brexited,” said Vibert. “A large proportion of the responsibility for this lies with the UK government, whose failure to deliver guidance during the transition period resulted in businesses adopting a ‘wait and see’ approach. “Businesses need to take charge; proactively adapting compliance to UK-GDPR and analysing how a lack of adequacy could impact them and their customers. Only by doing so will they avoid the financial and reputational damage caused by non-compliance. “Regardless of whether the government holds the blame for the current status quo or not, leaders must see this as an opportunity to reset their approach to data protection. This means putting the privacy, compliance and security of data at the heart of their business strategy and using technology to facilitate this.”

Marry IGA with ITSM to avoid the pitfalls of Identity 2.0

IAM solutions are too coarse-grained to handle such moves, in my experience. That forces admins to do IGA the hard way – taking care of onboarding, job changes, terminations, and so forth by hand. In addition to being a time- and labor-intensive hassle, manual IGA leads to numerous identity management errors. All too often, manual IGA grants access to new applications or information sources but doesn’t take away old ones, which exposes companies to security and compliance risks. Manual processes for managing patches, password resets, software updates, and more also increase risks. You don’t want an executive accessing highly confidential information from an app that doesn’t require two-factor authentication on a laptop that hasn’t been updated. But if IGA is managed from a spreadsheet, that’s exactly what happens. The employee lifecycle is only one of the IGA challenges that Identity 2.0 systems are not well-positioned to address. Take for example the expense and integration hassle of onboarding traditional IAM into manual IGA systems. The typical IGA system, like most enterprise systems, exists in a silo. Implementing manual IGA on systems such as HR, CRM, finance, and operations means writing numerous custom integrations.
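The "mover" scenario described above can be sketched as a set difference between a user's current entitlements and those of the new role, so that old access is revoked in the same step that new access is granted. The roles and entitlements below are hypothetical:

```python
# Sketch of automated entitlement reconciliation on a job change: the
# grants and revocations fall out of a set difference, so old access is
# never silently left behind. Role mappings are illustrative.

ROLE_ENTITLEMENTS = {
    "sales":   {"crm", "email"},
    "finance": {"ledger", "email", "expense"},
}

def role_change(current_entitlements, new_role):
    target = ROLE_ENTITLEMENTS[new_role]
    grants = target - current_entitlements       # access to add
    revocations = current_entitlements - target  # access to remove
    return grants, revocations

grants, revocations = role_change({"crm", "email"}, "finance")
print(sorted(grants))       # ['expense', 'ledger']
print(sorted(revocations))  # ['crm']
```

A spreadsheet-driven process typically captures only the grants; the revocation half is exactly what manual IGA tends to drop.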

What Happens If a Cloud Provider Shuts You Out?

There are other reasons, such as sudden outages or the shutdown of a cloud provider, for organizations to create plans to salvage their code and get back online quickly, Valentine says. Heikki Nousiainen, CTO at Aiven, also says the threat of getting cut off by all three major cloud providers is very low for most other businesses -- yet companies may want to maintain the ability to move code around for disaster recovery needs. “They are rare, but we sometimes see these big outages touch Google, AWS, or Azure in one or more regions,” he says. Companies with very time-sensitive online business needs, for example, may want to maintain the ability to roll over to a backup elsewhere, Nousiainen says. He recommends exploring true multi-cloud options where companies can select providers freely without being locked in, and also going with open source technology because that lets the same set of services run in different clouds. Some of these options can come at a bit of a premium, though Nousiainen says the overall benefits may be worth it. “There are costs associated but typically when that investment goes into preparing infrastructure as code it also helps for many other problems such as disaster recovery.”

Dead System Admin's Credentials Used for Ransomware Attack

In a case study published Tuesday, the researchers say the system administrator had died three months previously, but the account remained active. The researchers note that there are numerous reasons why the account could have been left open, including the possibility that the system admin had helped with the initial setup of the targeted firm's services. "Closing down the account would have stopped those services working, so keeping the account going was, we'd imagine, a convenient way of letting the dead person's work live on," according to the report. The Sophos report also notes that these types of "ghost" accounts are an increasing problem for security teams, especially if other parts of the company forget that they remain active after an employee has left or died. "In this case, the active use of the account of a recently deceased colleague ought to have raised suspicions immediately - except that the account was deliberately and knowingly kept going, making its abuse look perfectly normal and therefore unexceptionable, rather than making it seem weirdly paranormal and therefore raising an alarm," according to Sophos.
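A periodic sweep for such ghost accounts can be sketched as a comparison between the directory and an authoritative HR roster. The data shapes here are hypothetical:

```python
# Sketch of a "ghost account" sweep: flag enabled directory accounts
# that have no matching current employee in the HR roster.

def ghost_accounts(directory_accounts, hr_roster):
    active = {a["user"] for a in directory_accounts if a["enabled"]}
    employed = {p["user"] for p in hr_roster if p["status"] == "employed"}
    return sorted(active - employed)

directory = [
    {"user": "alice", "enabled": True},
    {"user": "bob",   "enabled": True},   # former employee, still active
    {"user": "carol", "enabled": False},
]
roster = [
    {"user": "alice", "status": "employed"},
    {"user": "bob",   "status": "terminated"},
]
print(ghost_accounts(directory, roster))  # ['bob']
```

If a service genuinely depends on a departed person's account, the sweep at least forces that dependency into the open, where it can be migrated to a service account instead.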

Quote for the day:

"The leadership team is the most important asset of the company and can be its worst liability." -- Med Jones

Daily Tech Digest - January 27, 2021

When Kubernetes is not the solution

Automation and orchestration are frequent reasons to leverage Kubernetes. Keep in mind that automation and orchestration often get confused, and for good reason. Automation can help make a business process more efficient by reducing or removing human involvement with software or hardware that performs specific tasks. For example, automation can launch a process to reorder raw materials automatically when other processes notice that supplies are below a specific level. In short, a single task is automated. Orchestration, in contrast, allows you to automate a workflow. Orchestration can keep track of sequence and activities, and can even invoke many single-task automations that are part of the workflow. Orchestration is a powerful Kubernetes tool that also allows you to invoke services such as database access across disparate systems. What's happening now is that many developers and architects choose Kubernetes to automate processes using the orchestration engine. That’s like hitting a thumbtack with a sledgehammer. You’ll end up spending way too many dollars on development and cloud resources to solve a simple, specific problem. Another fact that often gets overlooked is that Kubernetes is a complex system itself; it requires special expertise and at times can increase risk.
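The distinction can be sketched in a few lines: each function below is a single-task automation, while the workflow function orchestrates them in sequence. The task names are illustrative, echoing the reordering example above:

```python
# Sketch of automation vs. orchestration. Each small function automates
# one task; the workflow function sequences them, tracks what happened,
# and is the part an orchestration engine would own.

def check_stock(level, threshold):        # automation: a single task
    return level < threshold

def place_reorder(item):                  # automation: a single task
    return f"reorder placed for {item}"

def restock_workflow(item, level, threshold):   # orchestration: a workflow
    steps = []
    if check_stock(level, threshold):
        steps.append(place_reorder(item))
    steps.append("notify warehouse")
    return steps

print(restock_workflow("widgets", 5, 10))
# ['reorder placed for widgets', 'notify warehouse']
```

If all you need is the workflow above, a lightweight scheduler or function service does the job; reaching for Kubernetes here is the sledgehammer the article warns about.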

Learning from Incidents

When we use language that wraps up complexity in a neat parcel like “human error,” or we make counterfactual assertions (“system X should be able to detect this scenario,”) we give participants in our investigation the opportunity to agree with something that might be true given what we know in hindsight, but which does not help us understand the behaviour of the people or systems during the incident. Everyone in the room can nod and acknowledge that the human did indeed make a mistake, or that system “X” really should have detected the issue. Have you understood anything new about how your system really works? Unlikely. Secondly, when we ignore power structures and the social dynamics of the organizations we work in, we risk learning less. Asking “why” questions can put people on the defensive, which might make them less likely to speak frankly about their own experience. This is especially important when the person being asked is relatively less powerful in the organisation. “Why did you deploy that version of the code?” can be seen as accusatory. If the person being asked is already worried about how their actions will be judged, it can close down the conversation. “During this incident you deployed version XYZ. 

4 Ways Blockchain Could Catapult Into the Mainstream

We are used to storing valuables at home such as money, jewelry or art. However, when the value of these goods exceed what we can insure, or what we feel safe in keeping at home, we usually turn to banks or special custodians as more convenient safeguards for storing our liquid assets. Cryptocurrency offers alternative storage options via personal wallets or easy on-ramps to exchanges or a new category of crypto custodians that possess their own secure vaults. Today, many self-custody wallets already exist, allowing users to experience the self-service option for assets storage. Those same wallets also enable the storage of another blockchain novelty: “digitally unique” artifacts also known as non-fungible tokens (or NFTs; think CryptoKitties). In the long term, banks and old-style physical storage services may not be the most popular or safest storage methods anymore. Being your own custodian is an attractive value proposition that comes with a degree of freedom and efficiency, as long as its relative ease of use and trust levels continue to improve. Many users will gradually de-bank their assets and move them into self-custody to take advantage of new services that are only available in the blockchain world. 

Security's Inevitable Shift to the Edge

Many security architects are initially attracted to the SASE model as it helps them apply security controls at the optimal location in their rapidly changing architecture. That optimal location is the edge of the Internet, which will be close to any infrastructure-as-a-service (IaaS) or co-location facility that the business uses today or in the future. The edge deployment model provides agility for hybrid multicloud organizations and is well suited to changes to IaaS vendor or new locations from mergers and acquisitions. The flexibility of deploying security inspection at the edge means that, regardless of shifts in the location of compute, security inspection can be performed at a local edge node. This provides for optimized routing of traffic and avoids what Gartner describes as the unnecessary "tromboning of traffic to inspection engines entombed in enterprise data centers." Furthermore, since multicloud is the predominant architecture, deploying security at a homogeneous edge makes more sense than trying to engineer consistent controls using the heterogeneous capabilities available at various cloud service providers (CSPs). Another driver for SASE is the migration of users outside of the traditional corporate offices.

Cisco bolsters edge networking family with expanded SD-WAN, security options

Among the four new models is a low-end box – the Cisco Catalyst 8500L – that's aimed at entry-level 1G/10G aggregation use cases, Cisco stated. The 1RU form factor 8500L is powered by 12 x86 cores and features up to 64GB memory to support secure connectivity for thousands of remote sites and millions of stateful NAT and firewall sessions, wrote Archana Khetan, senior director of product management for Enterprise Routing and SD-WAN Infrastructure at Cisco, in a blog about the new boxes. Businesses find that establishing aggregation sites at either core locations or colocations helps them own the first mile on their branch and remote-worker connectivity to the internet and other software-defined cloud interconnects, Khetan stated. "The Catalyst 8500L provides ultra-fast IPsec crypto performance and advanced flow-based forwarding to keep up with the demands of today's high-speed, secure connectivity," Khetan stated. Targeting the branch, Cisco added the Catalyst 8200, which supports eight CPU cores for high-performance packet forwarding and 8GB of default RAM to run the latest security services, Khetan stated. The Catalyst 8200 Series supports up to 1Gbps of aggregate forwarding throughput, which is double the performance of its ISR 4300 predecessor, according to Khetan.

Ransomware: Should Governments Hack Cybercrime Cartels?

One proposal has been to ban all ransom payments. Whether such bans could be enforced is not clear. Also, organizations that did their best to safeguard themselves, but still saw their systems get crypto-locked, could go out of business or suffer devastating interruptions due to a ban. Short of a ban, Ciaran Martin, an Oxford University professor of practice in the management of public organizations who until last August served as the British government's cybersecurity chief, says governments should at least crack down on insurers being able to help victims funnel payoffs to attackers. "I see this as so avoidable. At the moment, companies have incentives to pay ransoms to make sure this all goes away," Martin tells The Guardian, expanding on suggestions he's previously made. "You have to look seriously [at] changing the law on insurance and banning these payments, or at the very least, having a major consultation with the industry." Responding to suggestions that ransom payments be banned, a spokesman for the Association of British Insurers tells Information Security Media Group: "Insurance is not an alternative to managing the cyber ransomware risk; it is part of a toolkit to combat this crime." The spokesman also notes that policyholders must have all "reasonable precautions" in place.

Experts predict hot enterprise architecture trends for 2021

There is increasing competition in enterprise architecture tools, with a lot of new players. There's going to be more investing in R&D. Hopefully, that means customers will get better tools for their EA initiatives. We'll see tools going in different directions and having different focuses. The newer generation of tools is typically data-driven. You don't draw your architecture. It is basically derived from the data you put into the tools. That opens up different uses for data analytics to create future-state scenarios, quantify the benefits to the business and use that to make strategic decisions. You can do organizational modeling. It's difficult to do that unless you have a data-driven approach, because you would have to create every single future-state scenario. The entire delivery vehicle for the newer tools is cloud only, so you can deploy more rapidly. Companies that have moved to the cloud over the last couple of years realize that you can't be in one cloud anymore. You have to be in multiple clouds in order to ensure redundancy. That's another area where EA tools are focusing, creating native integration with these modern-day cloud environments and using enterprise architecture practices to manage and model them.

Streamlining cloud compliance through automation

The first is inherent in compliance with any cybersecurity and privacy requirement, and the cloud doesn’t make it go away (in fact, it arguably makes it worse) – and that’s the time it takes to audit. Companies preparing for audits must sink significant time and effort (hundreds of hours, every audit, across multiple requirements) into collecting a vast amount of technical data on information security controls and processes. Manually collecting data, taking screenshots, and organizing evidence takes that time away from cloud and DevOps teams that could otherwise be spent building new products or services. ... Second, security capabilities meant for on-premises environments no longer apply when companies begin migrating to the cloud, making evidence gathering all the more complicated. Quite simply, the cloud creates a new paradigm, forcing companies to re-architect the best security practices they have spent years perfecting, i.e., to fundamentally start from scratch. Third, software development and change management in the cloud moves at light speed compared to more traditional monolithic application updates, and it can be difficult for companies to keep up with the security and privacy implications of that ever-changing cloud environment.
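A sketch of the automated alternative to manual evidence gathering: snapshot each control artefact with a content hash and timestamp rather than collecting screenshots by hand. The control ID and content below are illustrative:

```python
# Sketch of automated audit-evidence collection: each control artefact
# (a policy export, a config dump, etc.) is recorded with a tamper-
# evident hash and a collection timestamp, ready for the auditor.

import hashlib
import json
from datetime import datetime, timezone

def collect_evidence(control_id, content: bytes):
    return {
        "control": control_id,
        "sha256": hashlib.sha256(content).hexdigest(),
        "collected_at": datetime.now(timezone.utc).isoformat(),
    }

record = collect_evidence("AC-2", b"example IAM policy export")
print(json.dumps(record, indent=2))
```

Run on a schedule against live cloud configuration, this turns hundreds of hours of screenshot-taking into a reviewable evidence log.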

How to deliver an effective technology strategy in 2021

Technology strategies, like data strategies and digital transformations, can no longer be considered in isolation. Having the right technology platform is just one of a number of critical enablers to being competitive, agile and innovative in the 2020s. The growing trend for business transformation is a holistic approach which recognises that, to succeed, technology, data and digital transformations need to be tackled together, or at least in parallel. In the 2020s businesses can be divided between those who are disrupting and those being disrupted. Disruptors enter categories with a transformative new product, service or customer experience — posing an existential threat to the existing players. Disruptors are digital, data and technology first companies, leveraging these as assets in the battleground of customer experience. Any technology strategy should be intertwined with a data strategy. It should be focused on delivering the customer approach to serve the overall business plan. I appreciate that sounds a lot harder than focusing just on technology, but the alignment needs to be embraced rather than avoided if the desired outcomes are to be achieved. The world is littered with technology that’s easy to buy, more challenging to implement and often only partially used or not used at all.

Cybersecurity, Modernization Top Priorities for Federal CIOs

One significant focus not covered by the first 100 day plan but indicated in the proposed stimulus package is a response to something more recent -- the SolarWinds hack, which has impacted both government and commercial IT organizations. In response the new administration is putting a new focus on cybersecurity, adding provisions that cover this area to the COVID-19 stimulus package. While it needs to go through Congress, the American Rescue Plan from the administration calls for a total of more than $10 billion for cybersecurity and IT modernization efforts, plus some other IT-related areas. "In addition to the COVID-19 crisis, we also face a crisis when it comes to the nation's cybersecurity," a brief of the plan says. "The recent cybersecurity breaches of federal government data systems underscore the importance and urgency of strengthening US cybersecurity capabilities. President-elect Biden is calling on Congress to launch the most ambitious effort ever to modernize and secure federal IT and networks." Even if it doesn't remain in the stimulus package that Congress ultimately passes, the Biden administration's inclusion of funding for cybersecurity highlights just what a priority this area is for the administration going forward.

Quote for the day:

"If we were a bit more tolerant of each other's weaknesses we'd be less alone." -- Juliette Binoche

Daily Tech Digest - January 26, 2021

How to unleash creative thinking

When you are stuck on a problem, take the time to step away and relax. When your mind is clear, it will look to combine what is on its memory shelves and then...aha! You will have a flash of insight — a combination of examples from history that form an idea. It may not be one big Eureka moment. Instead, it may be a series of smaller insights that you hardly feel as discrete cognitive events. Regardless, the mental mechanism is the same for large and small epiphanies — it is a feeling of excitement as the idea forms. Here is what presence of mind looks like. Let’s say two family members who are both very picky eaters are spending the night with you. You can’t decide what to make for dinner that both will like. As you go up and down the aisles of the supermarket, the contents of your cart keep changing, but each time you look at the combination, you know that your guests will not be happy with the dinner it would make. ... When you need a new idea, throughout the workday try to take in as many examples from history as possible that might relate to your problem. Don’t work late: Spend the evening on something that gives your mind a rest. Go to the gym, have dinner with friends, take a long shower, and above all get a good night’s sleep. This greatly increases your chances of a flash of insight to solve your problem.

Enhancing Email Security with MTA-STS and SMTP TLS Reporting

The primary goal is to improve transport-level security during SMTP communication, ensuring the privacy of email traffic. Moreover, encryption of inbound messages addressed to your domain enhances information security, using cryptography to safeguard electronic information. Furthermore, man-in-the-middle (MITM) attacks such as SMTP downgrade and DNS spoofing have been gaining popularity in recent times and have become common practice among cybercriminals; they can be evaded by enforcing TLS encryption and extending support to secure protocols. ... Since encryption had to be retrofitted into the SMTP protocol, the upgrade to encrypted delivery has to rely on the STARTTLS command. A MITM attacker can easily exploit this by tampering with the upgrade command, replacing or deleting it and forcing the client to fall back to sending the email in plaintext. After intercepting the communication, the attacker can then read the unencrypted traffic and access the content of the email. This is possible because SMTP, the industry standard for mail transfer, uses opportunistic encryption, which means that encryption is optional and emails can still be delivered in cleartext.
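Concretely, MTA-STS and TLS-RPT are deployed through DNS TXT records plus a small policy file served over HTTPS. Below is a minimal sketch using the placeholder domain example.com; the record names and policy fields follow RFC 8461 (MTA-STS) and RFC 8460 (TLS-RPT), but the domain, mail host and report address are illustrative only:

```
# DNS TXT record announcing an MTA-STS policy exists:
_mta-sts.example.com.    IN TXT "v=STSv1; id=20210131T000000;"

# Policy file served at https://mta-sts.example.com/.well-known/mta-sts.txt:
version: STSv1
mode: enforce
mx: mail.example.com
max_age: 86400

# DNS TXT record requesting TLS failure reports (TLS-RPT):
_smtp._tls.example.com.  IN TXT "v=TLSRPTv1; rua=mailto:tlsrpt@example.com"
```

With mode set to enforce, sending servers that support MTA-STS refuse to deliver mail to example.com over an unencrypted connection, closing the SMTP downgrade path; the TLS-RPT record asks senders to report delivery failures so misconfigurations surface quickly.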

LAMBDA: The ultimate Excel worksheet function

Researchers have known since the 1960s that Church's lambda notation is a foundation for a wide range of programming languages and hence is a highly expressive programming construct in its own right. Its incorporation into Excel represents a qualitative shift, not just an incremental change. To illustrate the power of LAMBDA, here's a function written using the notation to compute the length of the hypotenuse of a right-angled triangle: =LAMBDA( X, Y, SQRT( X*X + Y*Y ) ). LAMBDA complements the March 2020 release of LET, which allows us to structure the same example like this: =LAMBDA( X, Y, LET( XS, X*X, YS, Y*Y, SQRT( XS+YS ) ) ). The function takes two arguments named X and Y, binds the value of X*X to the name XS, binds the value of Y*Y to YS, and returns SQRT( XS+YS ) as its result. The existing Name Manager in Excel allows any formula to be given a name. If we name our function PYTHAGORAS, then a formula such as PYTHAGORAS(3,4) evaluates to 5. Once named, you call the function by name, eliminating the need to repeat entire formulas when you want to use them. Moreover, LAMBDA is the true lambda that we know and love: a lambda can be an argument to another lambda or its result; you can define the Church numerals; lambdas can return lambdas, so you can do currying; you can define a fixed-point combinator using LAMBDA and hence write recursive functions; and so on.
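The lambda-calculus properties listed above (currying and fixed-point recursion) are easier to see in a language many readers can run directly. This is not Excel syntax, just an illustrative Python sketch of the same two ideas, using the hypotenuse example from the article:

```python
# Currying: a lambda that returns a lambda, so arguments can be supplied
# one at a time, mirroring LAMBDA(X, LAMBDA(Y, ...)).
pythagoras = lambda x: lambda y: (x * x + y * y) ** 0.5
print(pythagoras(3)(4))  # 5.0

# A strict fixed-point combinator (the Z combinator): it lets an anonymous
# lambda call itself, which is how LAMBDA supports recursion without a
# built-in self-reference.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))
factorial = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
print(factorial(5))  # 120
```

In Excel itself, recursion is more conveniently expressed by naming a LAMBDA in the Name Manager and calling it by that name, but the combinator shows that naming is a convenience rather than a theoretical requirement.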

North Korean hackers have targeted security researchers via social media

Google said the blog hosted malicious code that infected security researchers' computers after they accessed the site. "A malicious service was installed on the researcher's system and an in-memory backdoor would begin beaconing to an actor-owned command and control server," Weidemann said. But Google TAG also noted that many victims who accessed the site were running "fully patched and up-to-date Windows 10 and Chrome browser versions" and still got infected. Details about the browser-based attacks are still scant, but some security researchers believe the North Korean group most likely used a combination of Chrome and Windows 10 zero-day vulnerabilities to deploy its malicious code. As a result, the Google TAG team is asking the cyber-security community to share more details about the attacks if any researchers believe they were infected. The Google TAG report includes a list of links to the fake social media profiles that the North Korean actor used to lure and trick members of the infosec community. Security researchers are advised to review their browsing histories to see if they interacted with any of these profiles or accessed the malicious blog.br0vvnn.io domain.

Open source magic solves a months-long problem in 20 minutes

Every industry is trying to get to the future as fast as possible, and telecommunications is no different. As Iain Morris called out in a Light Reading article, in 2018 France's Orange estimated that a third of its global workforce--more than 50,000 employees--needed reskilling if the company hoped to keep up with cloud vendors. In that same article, Morris pointed out that Spain's Telefonica figured it would need nearly $2 billion in staff training and early retirement buyouts to bring in new talent with new skills to be competitive. Such telcos often turn to systems integrators (SIs), like UK-based Capventis, which bring domain expertise and work primarily with clients in the Business Intelligence (BI), Customer Relationship Management (CRM), and Customer Experience (CX) fields. These areas haven't traditionally been ripe for open source, but even SIs with these focus areas rely on open source software to help their clients. It's hard for even the best proprietary software vendors to keep pace with the innovation cycles of successful open source projects, so these SIs partner with companies like Alteryx, Qlik, Qualtrics, and Zendesk, augmenting their proprietary software with open source expertise.

Artificial Intelligence And The Power Sector: A Promising Future

AI powers electrical grids that allow two-way communication between utilities and consumers. Smart grids are embedded with an information layer that allows communication between their various components so they can better respond to quick changes in energy demand or urgent situations. This information layer, created through widespread installation of smart meters and sensors, allows for data collection, storage, and analysis. Given the large volume and diverse structure of such data sets, techniques such as machine learning and the Internet of Things are best suited for their analysis and use. This analysis can serve a variety of purposes, including seamless fault detection in meters, predictive maintenance, quality monitoring of sustainable energy, and renewable energy forecasting, along with the latest innovations in information and communications technology (ICT). The power sector in developed countries has already started using AI, data analytics, the Internet of Things (IoT), and related technologies that allow for communication between smart grids, smart meters, and computer devices. These technologies help prevent power mismanagement, inefficiency, and lack of transparency, while increasing the use of renewable energy sources.
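To make the fault-detection and forecasting ideas above concrete, here is a deliberately minimal Python sketch over hypothetical hourly smart-meter readings. Real deployments use trained ML models over far richer features (weather, tariffs, seasonality); the function names, window size and 25% tolerance here are illustrative assumptions, not an industry standard:

```python
# Hypothetical smart-meter sketch: forecast the next hourly reading and
# flag a possible meter fault when the actual reading deviates sharply.

def moving_average_forecast(readings, window=3):
    """Forecast the next reading as the mean of the last `window` readings."""
    recent = readings[-window:]
    return sum(recent) / len(recent)

def detect_anomaly(readings, forecast, tolerance=0.25):
    """Flag the latest reading if it deviates from the forecast by more
    than `tolerance` (25% by default)."""
    actual = readings[-1]
    return abs(actual - forecast) / forecast > tolerance

hourly_kwh = [1.2, 1.3, 1.1, 1.4, 2.9]            # final value spikes
forecast = moving_average_forecast(hourly_kwh[:-1])
print(round(forecast, 2))                          # mean of 1.3, 1.1, 1.4
print(detect_anomaly(hourly_kwh, forecast))        # spike is flagged
```

The same forecast-then-compare pattern, with a learned model in place of the moving average, underlies both predictive maintenance alerts and short-term renewable output forecasting.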

Operations Model: DevOps, NoOps, AIOps or None of the Above Ops?

Traditional IT operations (ITOps) is the process of designing, developing, deploying and maintaining the infrastructure and software components of a specific product or application. It also ensures the customer experience is delivered through traditional means of support, such as tickets and escalation paths for resolution. DevOps is a process for accelerating the delivery of features to the application or product infrastructure in a consistent model with less human intervention, while also providing a better quality product through automation in the software development life cycle. An AIOps process introduces data science into the operating model by learning the behavior of the systems and scaling according to the needs of the platform (both infrastructure and customer usage). It expands the horizon of DevOps through the introduction of machine learning, focusing on data generated from the hardware and software systems, and allows organizations to grow organically based on demand. NoOps, although the name suggests otherwise, is an advanced approach to managing IT operations through the mindset that everything is derived as development. This model is best suited to startups and companies with high technological maturity.

How To Become A Cybersecurity Analyst

Cybersecurity analysts are the frontline warriors of an enterprise's cyber defense. The role demands keeping a constant tab on threats and monitoring the company's network for potential vulnerabilities. A cybersecurity analyst lives by the adage 'a company's security is only as good as its weakest link', and is always on the lookout for any untoward event across the network. The major responsibilities of a cybersecurity analyst include: maintaining a firewall to protect confidential information and encrypting data transmission; monitoring the entire network for any attacks, intrusions or unauthorised activity; determining emerging threat patterns and vulnerabilities using advanced analytics tools; generating reports for all the stakeholders involved, both technical and non-technical; carrying out risk assessments to ensure best security practices are in place; helping develop cybersecurity awareness training for colleagues; and educating users about threats and vulnerabilities.

7 digital transformation leadership sins – and what to do instead

While it’s true that organizations across industries are under enormous pressure to transform, it is still incumbent upon digital leaders to convince the organization to buy into the effort. “The typical operating budget is under constant pressure, driving the need to maximize efficiency,” says Greg Bentham, vice president of cloud Infrastructure Services at Capgemini North America. “As with any sales transaction, a level of trust needs to be established before the buyer is inclined to buy.” ... Businesses tend to lowball the effort required to plan and execute a successful digital transformation. “Many organizations believe that they can layer transformation on top of their normal activities without dedicated resources,” says Greg Stam, managing principal in the CIO advisory at digital business consultancy AHEAD. “They pull together matrix committees to discuss the problems of the day and how they might attack them.” They ultimately decide they need to modernize applications, upgrade technology, or retrain staff, but perhaps without a cohesive strategy. “All these things are good but will not produce digital transformation,” Stam says. ... It’s lonely at the top – particularly when it comes to digital transformation. What’s more, old-school hierarchical leadership is ineffective and often counterproductive to these efforts.

Why the first five minutes of a meeting shape its outcome

Many people arrive at meetings prepared to be disengaged. Whether it is a recurring team call, a project team update, or a longer strategy retreat, participants often lack a clear sense of why the meeting is necessary. And people are distracted. Their minds may still be focused on their last call or an upcoming deadline. These days, they may have kids at home learning remotely or a relative to care for; they may be anxious about economic upheaval and societal uncertainty. Facilitators clearly can’t resolve all these issues, but they can help people to be more present and productive while in a meeting. In most cases, lack of engagement stems from the mistaken assumption that meetings are time sinks. But leaders who routinely host dynamic, high-engagement meetings set up conversations as opportunities for real work — regardless of the specific purpose. They approach and design them with this premise (and cancel them if there is no real work to be done). And, with this simple shift, they tap into one of the biggest day-to-day sources of team motivation: a sense of progress toward a worthwhile goal. With this lens, leaders can engage any group more actively and productively. The most important moment, other than crafting your original invitation, is when you begin.

Quote for the day:

"Any one can hold the helm when the sea is calm." -- Publilius Syrus