Daily Tech Digest - March 27, 2021

A Day in the Life of a DevSecOps Manager

The goal of a DevSecOps team, in my view, is embedding application security into development through enablement, iteration, and continuous feedback – also sometimes called "shifting security left." This requires talking to other folks and making sure you can offer them something that solves your problem while enabling them to solve theirs. No one wants to "stop" producing value to take care of security concerns, which can often be how it feels to interact with security teams. Everyone already has a full roadmap. Why does this security concern need to be addressed now? Through a DevSecOps philosophy, which mostly means taking agile principles from engineering and applying them to security work, I use those aforementioned days of meetings to determine how a particular security concern can be mitigated or eradicated without adding friction to the development pipeline. ... Our DevSecOps team, for example, can write a cryptography library for engineering that uses standard libraries in an appropriate manner, avoiding common implementation mistakes that could lead to data exposure. Sometimes we may mandate a particular approach, but typically we offer a library like this to engineering and sell it as saving them development time.
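A library like the one described might, for example, wrap password hashing so that application teams never choose salts, work factors, or comparison functions themselves. A minimal sketch using only Python's standard library — the function names and parameters are illustrative, not the author's actual API:

```python
import hashlib
import hmac
import os

# Illustrative internal helper: do password hashing once, correctly, so
# application teams never hand-roll salts or string comparisons.
_SCRYPT_PARAMS = dict(n=2**14, r=8, p=1)

def hash_password(password: str) -> bytes:
    """Return salt || scrypt(password); a fresh random salt per call."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, **_SCRYPT_PARAMS)
    return salt + digest

def verify_password(password: str, stored: bytes) -> bool:
    salt, digest = stored[:16], stored[16:]
    candidate = hashlib.scrypt(password.encode(), salt=salt, **_SCRYPT_PARAMS)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate, digest)
```

The selling point matches the excerpt's pitch: engineers call two functions and get the salt handling, parameter choices, and constant-time comparison for free.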


Artificial Intelligence is the Key to Economic Recovery

Artificial Intelligence technologies already have tremendous economic potential in the private and business sector. The value of the global AI market in 2019 was estimated by Gartner and McKinsey at USD 1.9 trillion, and the forecast for 2022 is USD 3.9 trillion. ... There are reasons to believe it will be even more so in the post-Corona era. About two years ago, before the outbreak of the pandemic, Prime Minister Netanyahu asked me and my colleague Professor Eviatar Matanya to lead a national initiative in the field of intelligent systems that would make Israel one of the top five countries in the world in this technology within five years. ... AI has a much wider spectrum than cyber technology. Its applications have far-reaching implications in most areas of our lives, including security, medicine, transportation, automation, retail, sales, customer service and virtually every field relevant to modern life. The various learning algorithms, along with the tremendous increase in computing power, are already beginning to penetrate all areas of our lives, and understanding them requires mastery not only of the “natural” technological disciplines – such as computer science, mathematics and engineering – but also of social, legal, business and even philosophical aspects.


The war against the virus also fueling a war against digital fraud

The study also found that, as of March 16, 2021, 36% of consumers said they had been targeted by digital fraud related to COVID-19 in the last three months – higher than approximately one year earlier, when, in April 2020, 29% said they had been targeted. In the U.S., this percentage increased from 26% to 38% over the same timeframe. Gen Z, those born 1995 to 2002, is currently the most targeted of any generation at 42%, followed by Millennials (37%). Similar patterns were observed in the U.S., where Gen Z was most targeted at 53%, followed by Millennials at 40%. “TransUnion documented a 21% increase in reported phishing attacks among consumers who were globally targeted with COVID-19-related digital fraud just from November 2020 to recently,” said Melissa Gaddis, senior director of customer success, Global Fraud Solutions at TransUnion. “This revelation shows just how essential acquiring personal credentials is for carrying out any type of digital fraud. Consumers must be vigilant, and businesses should assume all consumer information is available on the dark web and have alternatives to traditional password verification in place.”


‘Hacktivism’ adds twist to cybersecurity woes

Earlier waves of hacktivism, notably by the amorphous collective known as Anonymous in the early 2010s, largely faded away under law enforcement pressure. But now a new generation of youthful hackers, many angry about how the cybersecurity world operates and upset about the role of tech companies in spreading propaganda, is joining the fray. And some former Anonymous members are returning to the field, including Aubrey Cottle, who helped revive the group’s Twitter presence last year in support of the Black Lives Matter protests. Anonymous followers drew attention for disrupting an app that the Dallas police department was using to field complaints about protesters by flooding it with nonsense traffic. They also wrested control of Twitter hashtags promoted by police supporters. “What’s interesting about the current wave of the Parler archive and Gab hack and leak is that the hacktivism is supporting antiracist politics or antifascism politics,” said Gabriella Coleman, an anthropologist at McGill University, Montreal, who wrote a book on Anonymous.


Sweden’s Fastest Supercomputer for AI Now Online

“Research in machine learning requires enormous quantities of data that must be stored, transported and processed during the training phase. Berzelius is a resource of a completely new order of magnitude in Sweden for this purpose, and it will make it possible for Swedish researchers to compete among the global vanguard in AI,” said Ynnerman. Berzelius will initially be equipped with 60 of the latest and fastest AI systems from Nvidia, with eight graphics processing units and Nvidia Networking in each. Jensen Huang is Nvidia’s CEO and founder. “In every phase of science, there has been an instrument that was essential to its advancement, and today, the most important instrument of science is the supercomputer. With Berzelius, Marcus and the Wallenberg Foundation have created the conditions so that Sweden can be at the forefront of discovery and science. The researchers that will be attracted to this system will enable the nation to transform itself from an industrial technology leader to a global technology leader,” said Huang. The facility has networks from Nvidia, application tools from Atos, and storage capacity from DDN. The machine has been delivered and installed by Atos. Pierre BarnabĂ© is Senior Executive Vice-President and Head of the Big Data and Cybersecurity Division at Atos.


Why data classification should be every organisation’s first step on the path to effective protection

The value of classification was once limited to protection from insider threats. However, with the growth in outsider threats, classification takes on a new importance. It provides the guidance for information security pros to allocate resources towards defending the crown jewels against all threats. Internal actors cause both malicious and unintentional data loss. With a classification program in place, the mistyped email address in a message with sensitive data is flagged. Files that are intentionally being leaked are classified as sensitive and get the attention of security solutions, such as Data Loss Prevention (DLP). On the other hand, external threat actors seek data that can be monetised. Understanding which data within your organisation has the greatest value, and the greatest risk for theft, is where classification delivers value. By understanding the greater potential impact of an attack on sensitive data, advanced threat detection tools escalate alarms accordingly to allow more immediate response. Organisations generate data every day. This comes as no surprise. However, what might be surprising is the accelerating volume at which the data is being created.
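As a toy illustration of how labels drive enforcement (not any real DLP product's API), an outbound-mail check might compare attachment classifications against a sensitivity threshold; every name and label below is invented:

```python
from dataclasses import dataclass

# Ordered sensitivity labels: higher number means more sensitive.
SENSITIVITY = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

@dataclass
class Document:
    name: str
    label: str  # one of SENSITIVITY's keys

def should_flag(recipient_domain, attachments, trusted_domains,
                threshold="confidential"):
    """Flag mail leaving trusted domains with attachments at/above threshold."""
    if recipient_domain in trusted_domains:
        return False
    limit = SENSITIVITY[threshold]
    return any(SENSITIVITY[d.label] >= limit for d in attachments)
```

The point of the excerpt is exactly this dependency: without the `label` field populated by a classification program, the check has nothing to act on.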


6 Principles for Hybrid Work Wellbeing

Wellbeing is both an individual and a team sport. Everyone’s individual circumstances are unique—from caring for a sick parent to juggling the demands of remote learning to struggling with racial injustice. Each of us needs to define our boundaries based on what we can and can’t do—and own them. In practice, this means deciding what time you start work, deciding what time you finish work, and sticking to those commitments while communicating them to your team, whether you’re working remotely or in person. Technology can be your friend here. For example, set your status message in Teams to indicate when you're prioritizing family time. When we all own and respect boundaries, we create a culture of mutual support that promotes everyone’s wellbeing. ... Meeting bloat is one of remote work’s most counterproductive trends, though the reasons for it aren’t hard to understand. Without well-defined ways to indicate progress and participation, showing up to a meeting has become the signal of doing work. It’s the 21st-century version of punching the clock. This helps neither employees nor employers. Organizations can undercut this expectation—and the drain on wellbeing that comes from too many meetings—by fostering a meeting culture centered on preparation and purpose.


Remote working burn-out a factor in security risk

“Lockdown has been a stressful time for everyone, and while employers have admirably supported remote working with technology and connectivity, the human factor must not be overlooked,” said Margaret Cunningham, Forcepoint’s principal research scientist. “Interruptions, distractions and split attention can be physically and emotionally draining and, as such, it’s unsurprising that decision fatigue and motivated reasoning continue to grow. “Companies and business leaders need to take into account the unique psychological and physical situation of their home workers when it comes to effective IT protection. “They need to make their employees feel comfortable in their home offices, raise their awareness of IT security and also model positive behaviours. Knowing the rules, both written and implied, and then designing behaviour-centric metrics surrounding the rules can help us mitigate the negative impact of these risky behaviours.” Cunningham said that although both older and younger employees tended to report they were receiving similar levels of organisational support while working remotely, the emotional experience, and how different generations use technology, was markedly different.


Impact of Big Data on Innovation, Competitive Advantage, Productivity, and Decision Making

Advances in the field of technology have enabled individuals and businesses to collect large amounts of data (structured and unstructured) from various sources like never before. Data from social media, user-generated content, the internet, health care, manufacturing, supply chains, financial institutions, and sensors has grown exponentially. This paper’s objective is to review how big data drives and impacts innovation, competitive advantage, productivity, and decision support. Methodology: A comprehensive literature review on big data was conducted, identifying the impact of big data analytics on innovation, competitive advantage, productivity, and decision support. The reviewed literature created the foundation for the study; a model was developed based on an extensive review of the literature as well as case studies and future forecasts by market leaders. Big data is the latest buzzword among businesses. A new model is suggested identifying the correlation between big data and innovation, competitive advantage, productivity, and decision support. Findings: A review of scholarly literature and existing case studies finds that there is a gap between existing frameworks and the integration of big data into various business and management functions and objectives.


Rethinking data strategies: Shifting the focus from technology to insights

We need to redefine data strategy. Businesses need to move away from collecting data for data’s sake. Instead, we need to focus on data-driven technological innovation that delivers meaningful customer experiences, using targeted data to provide the right insights about customers. Today, businesses are collecting data en masse. But what are the benefits of collecting this data? What insight does it provide about customers or competitors? Most businesses believe they know their customer profile, and acquire more technology and data to meet this perceived customer profile. By rethinking data strategies, however, and exploring the value of the data being collected and how it is being collected, businesses will understand their customers’ wants and needs more effectively. Indeed, knowing your customer is not only about tracking and tracing their behaviour digitally; you first need to define what kind of data insight you want to learn from your customer. Then you can work out how to leverage new data insights amassed through targeted data collection to deliver tailored features back to the customer quickly and easily – engaging customers in a product or service when they need it most.



Quote for the day:

"The leader has to be practical and a realist, yet must talk the language of the visionary and the idealist." -- Eric Hoffer

Daily Tech Digest - March 26, 2021

Text authentication is even worse than almost anyone thought

For years, the key argument against relying on text message confirmations is that they are susceptible to man-in-the-middle attacks, which is still true. But this peek into the authorized infrastructure for text messages means that text takeovers can happen far more simply. There are plenty of easily accessed apps that make text-like authentication far more secure, including Google Authenticator, Symantec's VIP Access, Adobe Authenticator, and Signal. Why risk unencrypted, easily stolen texts for account access or anything else? For the moment, let's set aside how relatively easy and low-cost it is to move to a more secure version of text confirmations. Let's also, for the moment, set aside the compliance and operational risks your team is taking by letting the enterprise grant account access via unencrypted texts. How about solely looking at the risk and compliance implications of offering third-party access via unencrypted text authentications? Remember this from the Vice piece: "The (attacker) sent login requests to Bumble, WhatsApp, and Postmates, and easily accessed the accounts." Once a bad guy takes control of a customer's texts, a vast domino effect kicks in, where lots of businesses can be improperly accessed.
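Authenticator apps like those named above generate codes locally with the standard TOTP algorithm (RFC 6238), so no code ever crosses the carrier network. A minimal sketch using only Python's standard library — the Base32 secret is the kind a provisioning QR code encodes:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 time-based one-time password (HMAC-SHA1, 30-second steps)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 6238 test secret (the Base32 encoding of "12345678901234567890") and a timestamp of 59 seconds, this reproduces the published 8-digit test vector 94287082 — nothing here depends on SMS delivery at all.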


How to Mitigate Low-Code Security Risks

On the low-code spectrum, there is the attitude that “people aren’t really doing mission-critical applications — they’re mainly doing prototyping, or back office automations,” Wysopal said. Or, since low-code applications often do not publicly expose an application, they are deemed low-risk and fall to the bottom of the security team’s inbox and list of to-dos. “We’re still in a world where people don’t secure all applications,” he said. However, these attitudes are a bit short-sighted, since “applications aren’t islands,” Wysopal said. Even if applications are intended for internal use only, phishing attempts could expose credentials and gain access to the network. Here, attackers could leverage sensitive resources or steal infrastructure for compute power. Thus, all low-code users should be aware of potential threats and change habits to address these risks accordingly. However, the onus is also on low-code vendors to sufficiently arm their systems. Having a vulnerability disclosure, bounty program and an easy means to accept bug reports from white hat security researchers will be necessary to continually patch issues. Low-code continues to permeate more and more digital operations, opening up novel potential for citizen developers.


Smart cities are built on data

Cities must understand the full set of stakeholders who should be involved in setting data governance policies. They include civic authorities, the public, the private sector, technology providers and academic experts. Then they need to look at smart city projects as an “integrated ecosystem,” which means using the data for the collective benefit of the city, public and private sector, Chiasson said. “In a lot of cases, the costs are concentrated, but the benefits are diffuse,” he said. For instance, improving traffic flow would benefit many stakeholders, but the cost tends to be concentrated with the agency paying for sensors and algorithms. “Holistically tying that together … is the way you start to have data that’s good and data that’s valuable,” Chiasson said. And valuable means data that can be turned into usable information. For instance, a city may have ridership and traffic data, but if they can’t use it to reduce emissions, it’s not helpful. “Data is not information,” Dennis said. “Oftentimes, what we’ll see is that there’s too much data and not enough ability for the city to convert it into information holistically.” Urban SDK has a data specification for smart cities that lets them aggregate data from a range of sources and normalize it into one database. From there, the data can be analyzed into information.
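The aggregate-and-normalize step described for Urban SDK can be sketched as a simple field-mapping layer over differently shaped source records. The source names and field maps below are invented for illustration, not Urban SDK's actual specification:

```python
# Map each source's field names onto one common schema before loading
# records into a shared database.
COMMON_FIELDS = ("source", "timestamp", "location", "value")

FIELD_MAPS = {
    "traffic": {"ts": "timestamp", "road_id": "location", "vehicle_count": "value"},
    "transit": {"time": "timestamp", "stop": "location", "riders": "value"},
}

def normalize(source, record):
    """Rename a raw record's fields into the common schema."""
    row = {"source": source}
    for src_key, dst_key in FIELD_MAPS[source].items():
        row[dst_key] = record[src_key]
    return row
```

Once every feed lands in the same shape, the "data into information" step the excerpt calls for (e.g., joining ridership against traffic counts) becomes a query rather than a per-source integration project.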


GAO warns on cyber risks to power grid

The country's electrical systems are increasingly susceptible to cyberattacks, according to government auditors, and there is uncertainty about the extent to which a localized attack might cascade through power distribution systems. A new report from the Government Accountability Office examines the vulnerabilities of electricity grid distribution systems, how some states and industry actions have hardened those systems and the extent to which the Department of Energy has addressed risks by implementing the national cybersecurity strategy. Government and industry officials told GAO that a cyberattack on a grid distribution system would likely have localized effects, but a coordinated attack could have widespread consequences. However, the officials conceded that assumption is based on their professional experience, GAO noted, and none of them were aware of an assessment that confirmed their claims. "Moreover, three federal and national laboratory officials told us that even if a cyberattack on the grid's distribution systems was localized, such an attack could still have significant national consequences, depending on the specific distribution systems that were targeted and the severity of the attack's effects," according to the report.


Vaccinated Employees Returning with Un-Vaccinated Devices

With vaccines making their way throughout the country and lockdown restrictions loosening up, a debate surrounding the office return emerges. Hybrid vs. remote vs. full-time – every scenario is different, as well as the rules and accommodations to make the return safe and productive for all. User systems will be coming back as well, and their absence from the network could present new challenges. In the vast majority of cases, computers and mobile devices have not been on-premise for close to a year. From a security perspective, the best way to approach the “repatriation” of these systems onto the office network is to regard them as potentially infected. At the very least, consider that these devices are not in the same state as when they left. During this time, the company office extended into the homes of employees, and the line separating home from work was essentially erased. ... Even with device management in place, the potential for unaccounted change from field devices is significant. Consider this both a threat and an opportunity. The first goal is to ensure that workstations have been adequately patched and updated. Devices, security software and applications must be validated and brought up to date before these devices actively work on networks that hold sensitive, critical data.


Cyber threats, ongoing war for talent, biggest concerns for tech leaders

When it comes to talent, 44% of respondents said that finding enough qualified employees to fill open positions is the biggest risk they face over the coming year. An almost equal percentage say the task is just as difficult as it was a year ago. To close the skills gap, companies said they are trying a variety of strategies, including building flexible, on-the-job training opportunities (61%); rewriting job descriptions or job titles (42%); creating an apprenticeship program (39%); and eliminating requirements that applicants have certain types of academic degrees (24%). Some other pathways to finding the right talent: easing location restrictions; gamification of training; and prioritizing the search for candidates with a diversity of career backgrounds. In fact, nearly three-quarters of the respondents said that in the past year they’ve filled open technology positions with candidates with liberal arts degrees. An equal percentage have taken internal candidates from non-tech teams. About half of tech leaders have hired people with no college degree at all. A little over 60% of survey participants said their company is ahead of the curve when it comes to investing in new technology, and 35% said they’re about average. 


When Every Millisecond Matters in IoT

It starts with a network of seismic sensors, which are used to detect the P-waves, providing a ton of information that can be used to calculate the size and location of the damaging earthquake. The data is distributed in real-time to every subscribed party: emergency response, infrastructure and everyday users who have the app installed. The next critical piece of technology is a real-time network: a super-fast, low-latency and reliable infrastructure that is optimized to broadcast small amounts of data to huge audiences of subscribers. This may include both the earthquake data itself and push notifications or alerts specified by the app developer. This is where every millisecond matters, so ensuring reliability at scale, even in unreliable environments, is mission critical. When selecting a real-time network, whether you go with a hosted service or build it yourself, app developers need to understand the underlying technology, real-time protocols and other indicators of scalability. Lastly, you need the application that connects your real-time IoT network to the deployed sensors, where notifications are transmitted and the response is automated based on incoming data.
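The broadcast pattern described here is publish/subscribe. A minimal in-process sketch — a real alert network would use a distributed broker with retries and delivery guarantees, but the fan-out shape is the same:

```python
from collections import defaultdict

class Broker:
    """Toy pub/sub broker: one small message fans out to every subscriber."""

    def __init__(self):
        self._subs = defaultdict(list)  # topic -> list of handler callables

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, message):
        # Every subscriber on the topic receives the same small payload.
        for handler in self._subs[topic]:
            handler(message)
```

The design choice the excerpt hints at is that payloads stay tiny (magnitude, location, ETA of shaking) precisely so that fanning them out to millions of subscribers inside the latency budget is feasible.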


What businesses need to know to evaluate partner cyber resilience

Protecting customer data is vital and now regulated in certain geographies with the introduction and implementation of privacy laws like the GDPR and the CCPA. Non-compliance with either of these regulations may result in large fines that can pose a serious threat to business continuity depending on the size of the company and violation. While the GDPR and the CCPA are two of the most well-known regulations, at least 25 U.S. states have data protection laws, with Virginia being the most recent to enact legislation. Legislation aside, organizations must protect data and be able to recover it in the event of any loss. Not being able to recover data, albeit through the fault of a partner, can quickly propel an organization toward financial setbacks, damaged relationships and diminished reputation. When it comes to evaluating a partner, ask them to detail their backup strategy and policies. Regular infection simulations and backup procedure tests are crucial in making sure you are prepared for a real DEFCON scenario. Businesses must have endpoint security in place as cybercriminals are constantly developing new ways to attack networks, take advantage of employee trust and steal data. In traditional office building settings, employees were better protected within the corporate network.


How one data scientist is pioneering techniques to detect security threats

It was all pretty accidental, not something I had planned. I did really well in college—I was first in my class. And I finished in 2010, in the middle of the Great Recession, which hit the Spanish labor market horribly. At that time, the unemployment rate was 25 percent. The lucky ones, like people in engineering, were getting job offers. But when you’re in technology, the only options in Spain are to work for a consulting company or to do support or sales. There weren’t any entry-level jobs in research and development. So, I started a master’s with a group doing research on biometrics. The master’s was also in computer science and very related to artificial intelligence and a lot of interconnected fields like multimedia signal processing, computer vision, and natural language processing. I did my thesis on statistics around forensic fingerprints, and the probability of a random match between a latent fingerprint found at a crime scene and a random person that could have been wrongly convicted of that crime. ... One good approach is to find an internship that has some connection between doing data science and security and fraud, even if it’s just loosely related.


What To Expect From The New US-India Artificial Intelligence Initiative

India and the US can complement each other in this collaborative effort to ensure equitable progress. “For the US, India represents a massive consumer market – and one of the world’s largest troves of data. Technology firms in the US accessing this data will be like energy firms finding oil in the Middle East,” said Prakash. “For India, the US algorithms are solutions to a variety of development challenges India faces, from bringing banking to hundreds of millions of people to modernising the Indian military to offering healthcare to the masses. At the same time, for US technology firms, India churns out massive amounts of engineers and computer scientists – critical talent that these firms need.” Another major reason for a partnership between India and the US is the new geopolitical realities. China’s growing influence in the field of AI is a pressing concern. “What India and the US bring to the table is a supposedly democratic governance model of emerging technology,” said Basu. “Despite the change in administration from Trump to Biden, there are certain things where there is continuity – like distrust in China and Chinese technology....”



Quote for the day:

"Leadership is a dynamic process that expresses our skill, our aspirations, and our essence as human beings." -- Catherine Robinson-Walker

Daily Tech Digest - March 25, 2021

Generation Z majority left cold by data literacy

Helena Schwenk, analyst relations and market intelligence lead at Exasol, said: “Regardless of job descriptions, the ability to work with data is becoming increasingly crucial in the workplace. In theory, D/Natives should have developed the data literacy skills necessary for effective data analysis, storytelling and visualisations. Their untapped potential could spur a revolution in the way we use data to transform business and improve our daily lives. “But our survey highlights two issues: a genuine skills shortage when it comes to the more complex data skills gained through the education system, and a clear miscommunication between the language D/Natives use and the business jargon used by employers. There is work for educators, business leaders and the young people themselves to do to bridge the data literacy gap – to create not just a productive workforce, but also a richer society.” Schwenk, a former analyst at IDC and Ovum, has recently been joined at Exasol by Peter Jackson as its chief data and analytics officer. Jackson also has a high profile in the UK data community, as the co-author, with Caroline Carruthers, of The chief data officer’s playbook and a former data leader at the Pensions Regulator, Southern Water and Legal & General.


Strategies to Modernize, Maintain, and Future-Proof Systems

We tend to think about technology advancing in a straight line, with each iteration better and more sophisticated than what came before. The reality is a little more complicated than that because there are no one-size-fits-all solutions. As we make incremental improvements to technology, we are only really optimizing for a specific set of use cases. Those same improvements might make other uses more difficult. Over time what tends to happen is as one technology gets more and more optimized, the group of people for whom things are moving in the opposite direction of what they actually need gets larger and larger, until finally there are enough people to establish a market for a “new” technology to shift things back in the opposite direction. My favorite example of this is cell phone size: for a while cell phones were about staying connected to the office on the go, so each more advanced version was smaller and thinner. Then the emphasis shifted from work functions to entertainment functions, and suddenly cell phones started to get bigger and bigger. Technology is filled with these kinds of cycles where it feels like we’re reinventing or repackaging old solutions.


5 Web Application Security Threats and 7 Measures to Protect Against Them

Broken authentication is another common vulnerability that is caused by poorly implemented authentication and session management controls. If an attacker is successful in identifying and exploiting authentication-related vulnerabilities, they can gain direct access to sensitive data and functionality. The attackers’ goal in exploiting authentication vulnerabilities is to impersonate a legitimate user of the application. Attackers employ a wide variety of techniques, such as credential stuffing, session hijacking, password brute forcing, session ID URL rewriting, etc., to leverage these weaknesses. These attacks can be prevented by implementing strong session management controls, multi-factor authentication, and restricting and monitoring failed login attempts. For more details on prevention, refer to this article. Sensitive data exposure occurs when the web application does not sufficiently safeguard sensitive information such as session IDs, passwords, financial information, client data, etc. The most common flaw of organizations resulting in data exposure is not encrypting sensitive data.
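One of the controls listed above, restricting and monitoring failed login attempts, can be sketched as a rolling-window lockout. The class name and thresholds below are illustrative, not from any particular framework:

```python
import time
from collections import defaultdict

class LoginThrottle:
    """Lock an account after too many failures within a rolling window."""

    def __init__(self, max_failures=5, window_s=300.0):
        self.max_failures = max_failures
        self.window_s = window_s
        self._failures = defaultdict(list)  # user -> failure timestamps

    def record_failure(self, user, now=None):
        self._failures[user].append(time.monotonic() if now is None else now)

    def is_locked(self, user, now=None):
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window.
        recent = [t for t in self._failures[user] if now - t < self.window_s]
        self._failures[user] = recent
        return len(recent) >= self.max_failures
```

A real deployment would also alert on lockouts and key the window by IP as well as account, since credential stuffing spreads attempts across many usernames.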


Hidden areas of security and the future of hybrid working

Businesses should think carefully about how they utilize these platforms – starting with security. Many of the platforms, such as Microsoft Teams, do not come with built-in cybersecurity features, and don’t provide a way for data to be easily archived. In fact, Microsoft does not provide any guarantee of restorability – if a file is accidentally deleted, it’s gone forever. This leaves a big gap for operations that need to ensure that they have a strong archiving strategy in place. Additionally, IT and security teams must be aware of the vulnerability of these tools to phishing or social engineering attacks. Unlike email, files shared via collaboration platforms cannot be scanned for malicious links or other content. A good example of this is a Microsoft Teams phishing campaign recently discovered by Mimecast which consisted of 772 emails and targeted recipients mainly based in the US. Those targeted were sent fake email notifications asking them to verify their password or telling them they had been added to a project via their Teams account. Similarly, another Teams attack discovered late last year was estimated to have targeted 15,000-50,000 people by the time it was detected, showing how widespread the problem can get.


Hybrid workers are stressed out, but "empathy-based management" could help

As the remote-work landscape has blurred the lines between work and personal life, workers struggle to put up boundaries, and many stay connected long after the work day is done. According to the research, workers in the hybrid world are "1.27 times more likely to struggle to disconnect from work than employees in the on-site world." And "40% of hybrid or remote employees [are] reporting an increase in the length of their workday in the past 12 months." This kind of fatigue caused by the longer workday is a main concern for 92% of HR leaders. Leaders should stop expecting employees to be always "on." The very tools that are used to ensure the smooth transition to a hybrid work model are also its Achilles' heel. "Organizations have inadvertently been making the fatigue worse," Cambon said. There have actually been more check-ins (78%) between managers and workers, and 84% more virtual meetings with teams, for instance. According to Gartner, "HR leaders must lead and support the creation of a hybrid model that mitigates the adverse impacts of digital distraction, virtual overload and the always-on mindset. Ironically, many of the actions that organizations are taking to improve the hybrid employee experience are actually exacerbating the fatigue these hybrid realities are creating."


From Digital To Physical: The Ultimate Challenge For AI

By crossing the digital/physical barrier and implementing AI-powered visual quality inspections, the industry can mitigate the crisis and labor shortage. The use of AI removes the barriers that typically slow technology adoption in that it is cost effective, easy to integrate and doesn't need specially trained staff to operate. AI-based visual inspections are used today to inspect for defects in metal engine parts, check integrity of rugs/carpet, assess whether raw material (such as meat) has foreign contaminants (e.g., plastic particles), check plastic food trays for the right item, inspect quality of baked goods (e.g., bread), determine integrity of vaccine vials and more. These are all real-world, often mission-critical applications of AI technology in challenging physical settings. The value of digital-to-physical applications of AI is clear, as well as how they can be applied in the manufacturing industry—so what's next? For anyone looking to implement AI across their organization, the next steps are simple. First, you need to take a look at your specific workflows and determine what processes could benefit from AI: Is it a quality inspection, is it predictive maintenance or is it something else?


Working with Secure Enclaves in Azure SQL Database

Encryption has always been challenging to implement, but where it is not implemented, data breaches become much more damaging: a trove of encrypted data, if breached, is not useful to anyone. If we think back to database encryption in SQL Server, until Always Encrypted was introduced, anyone who was a system administrator had access to the encryption keys, allowing them to view decrypted data. Always Encrypted changed that paradigm. Instead of storing encryption keys in the database, the keys that can decrypt data are stored in the client application. This means administrators can only view the ciphertext (the encrypted value) and not the plain-text value. Always Encrypted supports two types of encryption: deterministic (in which the ciphertext will always be the same for the same plain-text value) and randomized (which produces a unique encrypted value for each record). ... The key difference here is that with secure enclaves in place, the database engine can send encrypted results into the secure enclave, where data operations can take place. Then the data is returned to the database engine, and in turn to the client application, in encrypted format. While the enclave is shown in its container, it is part of the SQL Server process on the server.
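The deterministic-versus-randomized distinction can be illustrated with a toy sketch. This is emphatically not the real Always Encrypted scheme (which uses AES-256 under client-managed column encryption keys); the HMAC-based stream construction below exists only to demonstrate the property, and the key is a made-up placeholder.

```python
import hashlib
import hmac
import os

KEY = b"column-encryption-key"  # hypothetical placeholder; real CEKs live client-side

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a keystream by chaining HMAC-SHA256 blocks (toy construction).
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(plaintext: bytes, deterministic: bool) -> bytes:
    # Deterministic mode derives the nonce from the plaintext itself, so equal
    # plaintexts yield equal ciphertexts (enabling equality lookups, but leaking
    # repetition). Randomized mode uses a fresh random nonce every time.
    nonce = (hmac.new(KEY, plaintext, hashlib.sha256).digest()[:16]
             if deterministic else os.urandom(16))
    stream = _keystream(KEY, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

ssn = b"123-45-6789"
# Same input, same ciphertext -> searchable, but leaks equality:
assert encrypt(ssn, deterministic=True) == encrypt(ssn, deterministic=True)
# Same input, different ciphertext each time -> stronger, but not searchable:
assert encrypt(ssn, deterministic=False) != encrypt(ssn, deterministic=False)
```

This trade-off is exactly why deterministic columns can be used in equality joins and lookups, while randomized columns historically could not be computed over at all outside the client; the secure enclave exists to lift that restriction without exposing plain text to the server.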


The unique opportunity for Fintech in the payments space

As a society we’re becoming disengaged from the cumbersome process of card payments and more conscious than ever about typing PIN codes into public machines, with antibacterial gel on stand-by. With today’s available technology there is just no need to queue, swipe, PIN and collect paper receipts. We’re moving into an age of completely contactless spending, one where people can exit a taxi without “paying”, leave a shop without visiting the till, and get instant credit at a digital checkout. E-wallets already account for 8%-10% year-on-year growth in ecommerce transactions, with no sign of slowing down. We’re moving into a digital-first generation that is used to buying things with the tap of a phone screen or a scan of their face. So much so that physical wallets are becoming obsolete as phones stay glued to hands. Although as a society we’re engaging less and less in person or making payments over a counter, fintechs are leading the way with technology to verify, digitally, that customers are who they say they are, so that they can access frictionless payment experiences without merchants incurring the risk of fraud.


What IT Leaders Can Do To Diminish Fear Within Their Teams

First, I take personal responsibility for team progress on the project. I do this visibly and deflect criticism of the team. I make it clear within the team that only the complete team can succeed. As a group, we will work to balance the assignments so no one person feels like the single point of failure. To the sponsors of the project, I am clear about our status and what we need from senior leadership. Knowing that we are all on the same journey keeps the team together. Eventually, all businesses run into budget problems. IT spending is seen as a necessary evil, even though businesses depend on mission-critical applications. But employees fear that they themselves may not seem as necessary. The threat of possible downsizing casts an enormous shadow and can make it debilitating to concentrate on complex mental work. How do I keep our focus amidst layoff rumors? My communication stresses our value. I ask the team to show our company that we are going to continue to strive for excellence. I pose this to my team: “Let’s continue to do great things. Will the company value us more if we slip on quality, complain about our situation, or spread layoff rumors?”


The Evolution of Distributed Systems on Kubernetes

If you look at how a microservice looks on Kubernetes, you will need to use some platform functionality, primarily Kubernetes features for lifecycle management. Then, most likely, your service will transparently use some service mesh, something like Envoy, to get enhanced networking capabilities, whether that's traffic routing, resilience, enhanced security, or even monitoring. On top of that, depending on your use case and workloads, you may need Dapr or Knative. All of these represent out-of-process, additional capabilities. What's left to you is to write your business logic, not on top, but as a separate runtime. Most likely, future microservices will be this multi-runtime composed of multiple containers. Some of those are transparent, and some you use very explicitly. ... All the interactions of your business logic with the external world happen through the sidecar, which integrates with the platform and handles lifecycle management. It provides networking abstractions for external systems, and gives you advanced binding capabilities and state abstraction.
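A multi-runtime pod of this kind might be sketched as follows. The service name and images are hypothetical, and in practice a mesh such as Istio would usually inject the Envoy sidecar automatically rather than it being declared by hand:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: orders            # hypothetical service
spec:
  containers:
  - name: business-logic  # your code: nothing but domain logic
    image: example.com/orders:1.0
    ports:
    - containerPort: 8080
  - name: envoy-sidecar   # out-of-process capabilities: routing, retries, mTLS, metrics
    image: envoyproxy/envoy:v1.17.1
    ports:
    - containerPort: 15001
```

The business-logic container never implements retries, TLS, or metrics itself; all traffic in and out passes through the sidecar container, which is what "out-of-process, additional capabilities" means in concrete terms.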



Quote for the day:

"Leadership is, among other things, the ability to inflict pain and get away with it - short-term pain for long-term gain." -- George Will

Daily Tech Digest - March 24, 2021

How Machine Learning Enables Clinical Forecasting, Visualization

“The main problem with using machine learning in clinical care – and being able to make changes therein – is that there are many preprocessing design decisions that will affect the performance of the model. With this tool, healthcare experts are able to select those at their own location so that when they go to train these models, they're focused on the very specific task at hand,” said Weiss. By seeing the impact of their design choices, users can understand their data more completely and adjust machine learning settings for their analysis. The tool allows healthcare experts to develop algorithms tailored to their patients and organizations. “If you were to use a risk scoring system from another site, they might have defined the population based on the patient data that were available at entry and at the beginning of the hospitalization. But then the physician might want to have a risk score for a little bit later in treatment, maybe the first or second day after they've entered and they've already been stabilized,” Weiss explained. “The outside model will not be tailored to that population and could give misleading predictions. Using TL-Lite, the physician can quickly train a model with the risk profiles for the particular population they’re interested in evaluating.”
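The site-specific tailoring Weiss describes can be sketched in miniature. The patient records, features, and the from-scratch logistic model below are hypothetical stand-ins, not TL-Lite's actual pipeline; the point is only that the cohort filter is a preprocessing design decision made before training.

```python
import math
import random

random.seed(0)

# Hypothetical records: (hours_since_admission, heart_rate, deteriorated?)
patients = [(random.uniform(0, 72), random.gauss(85, 15), random.random() < 0.2)
            for _ in range(500)]

# Site-specific design choice: model only patients already stabilised,
# i.e. at least 24h after admission -- not the at-entry population.
cohort = [p for p in patients if p[0] >= 24]

def train(data, steps=2000, lr=0.01):
    # Minimal one-feature logistic regression (heart rate), from scratch.
    w, b = 0.0, 0.0
    for _ in range(steps):
        gw = gb = 0.0
        for _, hr, y in data:
            pred = 1 / (1 + math.exp(-(w * (hr - 85) / 15 + b)))
            gw += (pred - y) * (hr - 85) / 15
            gb += pred - y
        w -= lr * gw / len(data)
        b -= lr * gb / len(data)
    return w, b

w, b = train(cohort)
risk = 1 / (1 + math.exp(-(w * (110 - 85) / 15 + b)))
print(f"risk for HR 110 in the >=24h cohort: {risk:.2f}")
```

A model trained on the full at-entry population would see a different feature distribution than this post-stabilisation cohort, which is precisely the mismatch Weiss warns produces misleading predictions when a risk score is imported from another site.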


On the Road to Good Cloud Security: Are We There Yet?

Although most IT security teams are well past being the department of no when it comes to cloud initiatives, many are still struggling with how to best secure those cloud-based assets — at least when they are tasked with doing so ... The research also uncovered a disconnect that raises the question: Is that confidence misplaced? When asked to rate the level of visibility the security team had into their organization's use of specific cloud service types, including software-as-a-service (SaaS), platform-as-a-service (PaaS), and infrastructure-as-a-service (IaaS), that same level of confidence faltered. For example, when asked to rate the security team's level of visibility into their organization's SaaS usage on a five-point scale, with 1 being the highest level, only 18% gave it a 1 and 27% gave it a 2. Visibility into PaaS and IaaS was rated as only slightly better. At the same time, respondents' knowledge of the shared responsibility model was found to be lacking. When asked to indicate whether the customer or cloud provider was responsible for securing a list of seven different elements that make up an IaaS account, around half of respondents gave the wrong answer.


Digital Identity: Fulfilling Consumer Cravings for Elevated ‘Digital Experience’

Whilst some organisations have embraced this potential to strengthen their bond with consumers, others have not been as future-forward, even though 82% of business executives recognise that customer experience is directly intertwined with revenue growth, according to Forrester. It is no longer sufficient for organisations to ‘digitise’ by simply hosting existing products and services on online platforms. Consumer delight is won through the ability to identify market gaps, capitalise on the latest technological capabilities and improve upon the standards and quality of life already on offer. If it is not clear how a product, service or experience adds to consumers’ existing digital portfolio, it is not pushing market boundaries or entertaining consumer curiosities. The sensitivity of this digital shift is clear; companies must ensure that throughout their digital strategy they consider consumer experience as the key driver for change. This means listening to consumer wants, needs and trends, working with the tide of consumer behaviour to ensure their business remains attractive, socially relevant, and profitable.


How agile can power frontline excellence

The strategic choices that companies make often don’t filter down to the hearts and minds of frontline workers. But what if sales employees could exercise informed judgment, become entrepreneurs within the enterprise, and conduct short-term experiments and share ideas on what works? Magic can happen if frontline employees understand how their targets link to strategic objectives and how their work contributes to wider company success. In agile sales organizations, the average frontline employee receives more information and is included in communications about the purpose of, and strategic choices for, the organization as a whole. Communication is more inclusive and interactive. These agile organizations foster dialogue and understand how sales functions can drive the strategic agenda using customer feedback. They operate from the belief that empowered employees will make more and better emotional connections with customers, leading to greater engagement on both ends and a stronger, longer, and broader relationship as a result. In addition, in agile sales organizations, the number of performance indicators is drastically reduced to a set of clear outcomes to focus energy on the things that matter most through the lens of the strategic aspiration.


Open Source vs. Proprietary DataOps

One advantage of open source is in its flexibility and availability. Open source licenses, excluding the SSPL, give users incredible freedom over what they can do with the software. If you have the skill, you can compose a DataOps pipeline that can take any data, enrich it and route it to the right place. That flexibility, though, is also a downside. While you can do anything you want, you also have to do it. Open source projects like Kafka, Pulsar, Spark, Airflow and Flink don’t know anything about the data they’re handling. That’s up to the developer. This may not sound like a problem, but today’s data engineers are handling dozens of data types in hundreds – or even thousands – of different formats. If you add in operational data, you’re also looking at data flooding in from firewalls, containers, SNMP traps and HTTP sources. And that’s just what’s coming at you. You also need to fetch data from object stores, multiple activity hubs and other messaging sources. No open source project natively supports the variety and volumes of data required in a modern DataOps pipeline.


What’s the Difference Between Solution Architecture and Design?

As a Technical Lead, I was the communication point for my team as well as leading the actual solution delivery, getting my hands dirty. As a Solution Architect, I became untethered from delivery, sitting outside of the teams doing the actual work. But in both roles, I was producing designs. So is the distinction down to whether or not you write code yourself? Is the answer that Architects don’t get their hands dirty? Absolutely not. That’s just a feature of the organisations you work with, and what they expect from their Solution Architects. My experiences as a Solution Architect just so happen to be with organisations where the role doesn’t touch code — either as a result of outsourcing Delivery or having internal Product Engineers to do the build. Other organisations will have Solution Architects more embedded within Delivery teams, t-shaping to provide additional value. ... In the Agile organisations I’ve worked for over the last few years, Solution Architecture has been best deployed in the early stages of a change to produce a vision of the solution and how it fits into the existing landscape, identifying impacts, opportunities and risks associated with the change.


4 Ways Your Small Business Can Benefit From Blockchain

The first thing a business can do to adopt blockchain technology is to simply accept cryptocurrency as a method of payment. What signals more of a commitment to blockchain than allowing customers to pay with bitcoin or other cryptocurrencies? The rollout will require a lot of planning and testing, as traditional merchant services are not set up to accept bitcoin. As such, a small business will need to evaluate and spend money on a digital wallet, a merchant gateway or a combination of services needed to accept the cryptocurrency from customers. ... Businesses can use blockchain for smart contracts, which are basically self-verifying, self-enforcing contracts. Stored within a blockchain ledger, the contract is recorded in a way that cannot be changed or manipulated. Smart contract examples include commercial leases, agreements with vendors or suppliers and even employee contracts. Smart contracts offer small businesses a level of protection they would otherwise never be able to afford. The middleman — usually an attorney — would not be needed in a smart contract, and as such, a business would have lower costs.


What’s limiting digital transformation initiatives?

CXOs are aware of the need to adopt a cloud-first approach and change the way IT is delivered in response to the digital acceleration brought about by COVID-19. Many have already done so, with 91% increasing their cloud services usage in the first months of the pandemic, and the majority will continue to do so, with 60% planning to add more cloud services to their IT delivery strategy. However, while businesses recognize the need to accelerate their DX journeys over the next 12 months, 40% acknowledge that economic uncertainty poses a threat to their DX initiatives. As organizations increasingly adopt modern IT services at a rapid pace, inadequate data protection capabilities and resources will lead to DX initiatives faltering, even failing. CXOs already feel the impact, with 30% admitting that their DX initiatives have slowed or halted in the past 12 months. The impediments to transformation are multi-faceted, including IT teams being too focused on maintaining operations during the pandemic (53%), a dependency on legacy IT systems (51%) and a lack of IT staff skills to implement new technology (49%).


Apple’s iPhone factories are Industry 4.0 rock stars

Apple being Apple, we don’t know too much about how the company and its manufacturing partners are making use of AI, Internet of Things and connectivity on the factory floor, but we have seen a few examples, such as its Daisy recycling robot. We do know that Foxconn’s state-of-the-art "lights off" Shenzhen factory is highly automated, with robots deployed across the production line, reducing its reliance on human workers. The WEF has praised that factory, noting a 30% increase in production efficiency and a 15% lower inventory cycle. Broadening our understanding a little, it claims the factory "utilizes a fully automated manufacturing process," and has an "automated optimization system for Machine Learning and AI devices, an intelligent self-maintenance system, and an intelligent real-time monitoring system.” Foxconn’s Chengdu plant has seen efficiency increase by 200% through the adoption of mixed reality, AI, and IoT technologies. Foxconn says it put these technologies in place to cope with rapid business growth when it faced a lack of skilled workers, presumably on the iPhone production line.


Remote work, one year in: 5 ways to boost mental health

Research consistently shows that social interaction plays an essential role in well-being, which in turn has a positive impact on employee engagement and performance. Building social connections is much easier when you’re in the office; chats at the coffee machine or catch-ups over lunch are all part of normal working life. If someone is stressed, you can usually pick up on the signs. However, opportunities to communicate diminish when you’re working from home, and it can be difficult to know how people are really feeling. Make a conscious effort to encourage personal connections to help prevent people from feeling isolated. This is even more important given social distancing measures, which have left many without their usual support network. Check in regularly with your team members on an individual basis, especially those with heavy workloads or who live alone. Build in time at the start of calls for a general catch-up. Not everyone is comfortable chatting on the phone, so also consider using instant messaging to keep the channels of communication open.



Quote for the day:

“Successful people are not gifted; they just work hard, then succeed on purpose.” -- G.K. Nielson

Daily Tech Digest - March 23, 2021

How Synthetic Data Levels The Playing Field

Synthetic data can be defined as data not collected from real-world events. Today, specific algorithms are available to generate realistic synthetic data for use as a training dataset. Deep Generative Networks/Models can learn the distribution of training data to generate new data points with some variations. While it is not always possible to learn the training data’s exact distribution, algorithms can come close. ... The big players already have a stronghold on data and have created monopolies or ‘data-opolies’. Synthetic data generation models can address this power imbalance. Secondly, the rising number of cyberattacks, especially after the pandemic, has raised privacy and security concerns. The situation is especially worrying when huge amounts of data are stored in one place. By creating synthetic data, organisations can mitigate this risk. Thirdly, whenever datasets are created, they reflect real-world biases, resulting in the over-representation or under-representation of certain sections of society. The machine learning algorithms based on such datasets amplify such biases, resulting in further discrimination. Synthetic data generation can fill in the holes and help in creating unbiased datasets.
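As a minimal illustration of "learning the distribution of training data and sampling new points", here is a stdlib-only sketch in which a single Gaussian stands in for what a deep generative model does with far richer distributions; the "customer ages" data is invented for the example.

```python
import random
import statistics

random.seed(42)

# "Real" data we are not allowed to share: e.g. customer ages (hypothetical).
real = [random.gauss(41, 9) for _ in range(1000)]

# Step 1: learn the distribution (here, just its mean and spread).
mu, sigma = statistics.mean(real), statistics.stdev(real)

# Step 2: sample new, synthetic points from the learned distribution.
synthetic = [random.gauss(mu, sigma) for _ in range(1000)]

# The synthetic set mimics the real one statistically, but contains
# no actual record from it.
print(f"real:      mean={statistics.mean(real):.1f} sd={statistics.stdev(real):.1f}")
print(f"synthetic: mean={statistics.mean(synthetic):.1f} sd={statistics.stdev(synthetic):.1f}")
```

Replacing the Gaussian with a learned generative network is what lets the same two-step recipe work for images, text, or tabular data, and deliberately re-weighting step 2 is how synthetic data can correct the under-representation the excerpt describes.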


Researchers Discover Two Dozen Malicious Chrome Extensions

While malicious extensions are an issue with all browsers, it's especially significant with Chrome because of how widely used the browser is, Maor says. It's hard to say what proportion of the overall Chrome extensions currently available are malicious. It's important to note that just a relatively small number of malicious extensions are needed to infect millions of Internet users, he says. One case in point was Awake Security's discovery last June of over 100 malicious Google Chrome extensions that were being used as part of a massive global campaign to steal credentials, take screenshots, and carry out other malicious activity. Awake Security estimated that there were at least 32 million downloads of the malicious extensions. In February 2020, Google removed some 500 problematic Chrome extensions from its official Chrome Web Store after being tipped off to the problem by security researchers. Some 1.7 million users were believed affected in that incident. In a soon-to-be-released report, Cato says it analyzed five days of network data collected from customer networks to see if it could identify evidence of extensions communicating with command-and-control servers. 


What IT Leaders Need to Know About Open Source Software

Despite conventional wisdom, open-source solutions are, by their nature, neither more nor less secure than proprietary third-party solutions. Instead, a combination of factors, such as license selection, developer best practices and project management rigor, establish a unique risk profile for each OSS solution. The core risks related to open source include: Technical risks, including general quality of service defects and security vulnerabilities; Legal risks, including factors related to OSS license compliance as well as potential intellectual property infringements; Security risks, which begin with the nature of OSS acquisition costs. The total cost of acquisition for open source is virtually zero, as open-source adopters are never compelled to pay for the privilege of using it. Unfortunately, one critical side effect of this low burden of acquisition is that many open-source assets are either undermanaged or altogether unmanaged once established in an IT portfolio. This undermanagement can easily expose both quality and security risks because these assets are not patched and updated as frequently as they should be. Finally, vendor lock-in can still be a risk factor, given the trend among vendors to add proprietary extensions on top of an open-source foundation (open core).


Applying Stoicism in Testing

To consider and look for the unknown information about a system, you need to have justice. In Stoicism, this stands for “showing kindness and generosity in our relationships with others”. And because you don’t know everything, you need other people to help you out. Gathering information is also about creativity, so you have to gather inspiration from past experience, and together with your colleagues you must be able to connect dots that weren’t connected before. Once I even stated, “The knowledge (and information) you gather as a tester about the software can be an interesting input for new software products and innovations”. But as a Stoic, stay humble ;-). After gathering all the information you need, you should use your wisdom (“based on reasoning and judgment”) to come to conclusions so that you can answer the question “Is this software ready to be used?” Although our customers are the best testers, we as testers are (or at least should be) in a position to answer the question at every step of software development: if the software goes to production, what can happen? The information you put on the table for your stakeholders should be based on facts.


Data Analyst vs. Data Scientist

The typical data analyst role is consulting-centric, as can be seen from the Indeed job spec example. What they are preoccupied with for the most part is wrangling data from Excel spreadsheets and SQL databases, extracting insightful conclusions via retrospective analyses, A/B tests, and generally providing evidence-based business advice. The last point illustrates why reporting routines with visualisation tools such as Tableau are as pivotal as pivoted tables. Data modelling, on the other hand, is often limited to basic supervised learning or its stats equivalent: regression analysis. ... To be fair, data scientists are for that reason expected to be more than analytical wizards. They are supposed to be builders who employ advanced programming to create pipelines that predict and recommend in production environments with near-perfect accuracy. Compared with analysts, who are like investigative reporters, they are a lot more product-development than consulting oriented. Although it’s also required of a data scientist to provide data-led commercial advice. Some say the title was coined to manifest that the role was a confluence of three fields: maths/statistics, computer science and domain expertise.
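The regression analysis cited as the analyst's staple fits in a few lines of plain Python; the ad-spend numbers below are made up for illustration.

```python
# Ordinary least squares for one predictor: ad spend (k$) vs revenue (k$).
spend = [1.0, 2.0, 3.0, 4.0, 5.0]
revenue = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(spend)
mean_x = sum(spend) / n
mean_y = sum(revenue) / n

# slope = cov(x, y) / var(x); the intercept pins the line through the means.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(spend, revenue))
         / sum((x - mean_x) ** 2 for x in spend))
intercept = mean_y - slope * mean_x

print(f"revenue ~ {slope:.2f} * spend + {intercept:.2f}")
# -> revenue ~ 1.99 * spend + 0.05
```

Roughly speaking, the analyst stops here with the insight ("each extra £1k of spend returns about £2k"), while the data scientist wraps the same fit in a pipeline that retrains and predicts in production, which is the role distinction the excerpt is drawing.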


Tech projects for IT leaders: How to build a home lab

If you're like most technology leaders, the closest you get to the actual technology you select and manage is creating PowerPoint decks that tell others about the performance, maintenance and updating of that technology. There's nothing fundamentally wrong with this of course; you can be a fantastic leader of a construction firm without having swung a hammer, or a cunning military strategist who has never rucked over a hill or fired a weapon. However, hands-on time with the fundamental building blocks of your domain can make you a better leader, just as spending time in the field and understanding the materials and building process makes an architect more effective at creating better structures. ... Think of a home lab as the technology equivalent of the scientist's laboratory. It's a place where you can experiment with new technologies, attempt to interconnect various services in novel ways and quickly clean things up when you're done. While you might be picturing a huge rack of howling servers, fortunately for us you can now create the equivalent of a small data center on a single piece of physical equipment.


IOTA still wants to build a better blockchain, and get it right this time

What went wrong then, and how is IOTA going to fix it -- besides introducing a new wallet? Schiener focused on some key technical decisions that proved wrong, and are being retracted. IOTA wanted to be quantum-proof, and that's why it used a "special cryptography," as Schiener put it. IOTA's cryptography only allowed an address to be utilized once, for example; reusing an address could lead to a loss of funds. Another questionable decision was choosing to use ternary, rather than binary, encoding for data. That was because, according to Schiener, the hypothesis at the time was that ternary is a much better and more efficient way to encode data. The problem is, as he went on to add, that this also needs ternary hardware to work. There are more examples, having to do with the way the ledger is created. It's still a DAG, but it has different algorithms. Schiener said that over the last one and a half years, IOTA has been reinvented and rewritten from the ground up. This new phase of the project is Chrysalis, which is this new network upgrade. With Chrysalis, IOTA is also moving toward what it calls Coordicide.
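For readers unfamiliar with ternary encoding: IOTA's "trits" are balanced ternary digits taking the values -1, 0 and 1. A short sketch shows how integers map to trits and back, and hints at why commodity binary hardware has to emulate the scheme.

```python
def to_balanced_ternary(n: int) -> list[int]:
    # Each digit ("trit") is -1, 0 or +1; no separate sign bit is needed.
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:          # a "2" digit becomes -1 with a carry into the next trit
            r = -1
        n = (n - r) // 3
        trits.append(r)
    return trits[::-1]      # most significant trit first

def from_balanced_ternary(trits: list[int]) -> int:
    value = 0
    for t in trits:
        value = value * 3 + t
    return value

assert to_balanced_ternary(8) == [1, 0, -1]   # 9 + 0 - 1 = 8
for n in range(-40, 41):
    assert from_balanced_ternary(to_balanced_ternary(n)) == n
```

On binary hardware each trit still has to be packed into bits (a trit carries about 1.58 bits), so every ternary operation costs extra conversion work; that emulation overhead, absent native ternary chips, is part of what made the design decision questionable in retrospect.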


Browser powered scanning in Burp Suite

One of the main guiding principles behind Burp’s new crawler is that we should always attempt to navigate around web applications behaving as much as possible like a human user. This means only clicking on links that can actually be seen in the DOM and strictly following the navigational paths around the application (not randomly jumping from page to page). Before we had browser-powered scanning, the crawler (and the old spider) essentially pretended to be a web browser. It was responsible for constructing the requests that are sent to the server, parsing the HTML in the responses, and looking for new links that could be followed in the raw response we observed being sent to the client. This model worked well in the Web 1.0 world, where generally the DOM would be fully constructed on the server before being sent over the wire to the browser. You could be fairly certain that the DOM observed in an HTTP response was almost exactly as it would be displayed in the browser. As such, it was relatively easy to observe all the possible new links that could be followed from there. Things start to break down with this approach in the modern world.
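The Web 1.0 workflow described here, parse the raw HTML response and harvest the anchors, can be sketched with the standard library. This is a simplified illustration of the idea, not Burp's actual crawler code.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Harvest href targets from server-rendered HTML, the way a
    pre-browser-powered crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Works when the DOM arrives fully formed from the server...
static_page = '<html><body><a href="/about">About</a><a href="/login">Log in</a></body></html>'
collector = LinkCollector()
collector.feed(static_page)
print(collector.links)  # -> ['/about', '/login']

# ...but sees nothing when JavaScript builds the navigation client-side,
# which is exactly why the crawler now drives a real browser instead.
spa_page = '<html><body><div id="app"></div><script src="/bundle.js"></script></body></html>'
collector2 = LinkCollector()
collector2.feed(spa_page)
print(collector2.links)  # -> []
```

The second page is the failure mode of the old model: the raw response contains no anchors at all until a browser executes the script and mutates the DOM.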


Fintech disruption of the banking industry: innovation vs tradition?

The first was the rise of the internet. Constantly improving speeds and widespread access meant hundreds of millions of consumers were suddenly able to access digital services. The second was the rise of the smartphone. This hardware transformed consumer behaviour beyond recognition. Apps and other software products providing significant upfront value made smartphones indispensable — just think of Shopify, Google Maps and Uber. The third driver which paved the way for fintech providers’ success was the financial crisis in 2008. Not only did this bring the traditional banking system to the brink of collapse, but consumers were far less trusting of the big banks thereafter. The new breed of financial services providers were not tied down by legacy infrastructures and, with smaller teams and flexible IT infrastructures, they were more agile. And this allowed them to easily circumnavigate the new regulatory and compliance requirements that were introduced in the wake of the financial downturn. Fintech providers sought to solve problems the banks could not. Or at least to do what the banks do, but better.


Top 3 Cybersecurity Lessons Learned From the Pandemic

As the world began relying on these new digital capabilities, new risks and challenges were introduced. Organizations that were well-equipped to extend visibility and control to this new way of working found themselves in a far better situation than those that were scrambling to completely reengineer their security capabilities. The ones that had built an empowered and proactive security team, backed by robust processes and supported by effective technology, were able to adapt and overcome. Organizations that were locked into a rigid operational model, overly reliant on vendor platforms or lacking a defined set of processes to support their new reality, struggled to keep pace. ... Since the pandemic began, we have seen an increased emphasis and shift toward zero trust and secure access service edge (SASE) principles. With strong identity and access management capabilities, insights into services and APIs, and visibility into remote endpoint devices, security teams can put themselves in position for rapid and effective responses — even within this unique virtual setting. Access to sensitive and confidential data is the new perimeter for an organization's cybersecurity posture.



Quote for the day:

"A tough hide with a tender heart is a goal that all leaders must have." -- Wayde Goodall

Daily Tech Digest - March 22, 2021

Bitcoin’s Greatest Feature Is Also Its Existential Threat

The botnet’s designers are using this idea to create an unblockable means of coordination, but the implications are much greater. Imagine someone using this idea to evade government censorship. Most Bitcoin mining happens in China. What if someone added a bunch of Chinese-censored Falun Gong texts to the blockchain? What if someone added a type of political speech that Singapore routinely censors? Or cartoons that Disney holds the copyright to? In Bitcoin and most other public blockchains, there are no central, trusted authorities. Anyone in the world can perform transactions or become a miner. Everyone is equal to the extent that they have the hardware and electricity to perform cryptographic computations. This openness is also a vulnerability, one that opens the door to asymmetric threats and small-time malicious actors. Anyone can put information in the one and only Bitcoin blockchain. Again, that’s how the system works. Over the last three decades, the world has witnessed the power of open networks: blockchains, social media, the very web itself. What makes them so powerful is that their value is related not just to the number of users, but the number of potential links between users.


India’s Quest Towards Quantum Supremacy

The digital partnership between the Indian Institute of Science Education and Research (IISER) at Pune and Finland’s Aalto University has created a high probability of India getting its first quantum computer. ... Talking about the partnership, Neeta Bhushan, the joint secretary (Central Europe), external affairs ministry, stated that the idea of jointly developing a quantum computer with the use of AI and 5G technology is an important area of collaboration for both countries. Considering that Nokia and other Finnish companies are leading the world in mobile technology growth, this partnership will see the two countries collaborating on quantum technologies and computing, with the leverage to deploy the latest technologies available in both countries. ... The partnership can lead us towards a new ecosystem altogether, and much can be expected from it. The post-COVID changes in global power-sharing and the recent technological developments to handle the crisis have brought India to the centre stage. Quantum encryption is one of the first applications expected to emerge from this collaboration.


Remote working still isn't perfect. These are the things that need fixing

A new report from O2 Business explores these insights in greater depth. The UK mobile operator surveyed 2,099 workers who had previously been office-based to understand how their needs and expectations of work had changed. It found that the majority of employees welcomed the notion of splitting their time between the office and home-working going forward, but also called for a closer alignment of operations, IT and HR in order to support individual work choices and maximize workplace productivity. Generally, employees are satisfied with their organization's response to the pandemic, O2 found: 69% of workers felt that their employers had supported them during the pandemic, with just 11% disagreeing with this statement. But fewer than two-thirds (65%) of employees felt confident that their organization was prepared for the future world of work. O2 said this indicated some businesses would struggle to adapt to the more flexible working arrangements that many are planning to adopt post-pandemic. The mad scramble to remote working has been one of the most trying aspects for businesses over the past year.


Fight microservices complexity with low-code development

A low-code platform takes care of nearly everything that conventionally is coded for an application. Most of the low-level programming and integration work is taken care of via tool configurations, which saves developers a lot of time and headaches. However, think carefully about where you apply low-code in a microservices architecture. As long as the app is simple, clean and doesn't require many integration points, low-code development might be the right alternative to more manual and complex microservices projects. Low-code builds are an easy choice for applications that don't need to integrate with other databases or only rely on a series of small tables. Short-lived conference apps or marketing promotions that run with user ID information are good examples of this. However, a low-code approach does not replace large-scale microservices development. Once you need to share information between applications in real time, the tools and programming techniques involved become much more sophisticated. While the low-code approach helps developers steer clear of over-engineering apps that don't need it, low-code likely won't provide the database integration, messaging or customization capabilities needed for an enterprise-level microservices architecture.


Edge Computing Growth Drives New Cybersecurity Concerns

Effectively protecting the edge means understanding how cybersecurity protection schemas work in an enterprise that uses not only edge computing, but also the cloud and traditional resources. Most enterprises are clearly focused on data security and application security, and are using tools such as web application firewalls (WAF), runtime application self-protection (RASP), data exfiltration protection and, of course, endpoint protection. Since the edge has the ability to “touch” data and applications, as well as use identity to connect and determine entitlements, a great deal of potentially sensitive information passes through the edge. Much, if not all, of that traffic moves through a content delivery network (CDN), where hosts provide the connectivity and, hopefully, wrap encryption around that traffic to protect it from interception. However, intrusion and data exfiltration still happen. “Digital transformation is driving more and more applications to the edge, and with that movement, businesses are losing visibility into what is actually happening on the network, especially where edge operation occurs,” Hathaway said. “Gaining visibility allows cybersecurity professionals to get a better understanding of what is actually happening at the edge,” he said.


Move Your Automation Efforts From Pilot To Reality

Talent is another crucial part of the equation that not enough customers take into account. I’ve worked with many customers that don’t have dedicated automation centers of excellence, or specific in-house expertise to tackle automation the right way. An enterprise with multiple technologies in place must ensure that those technologies are communicating with each other. By bringing together technical experts, your processes can be better visualized and monitored end-to-end across the organization, leading to a higher chance of success. The complexity and effort involved in this kind of endeavour can be off-putting, but it’s worth the reward. Nor is it truly as complicated as it sounds — execution management systems, for example, already bring together technologies like process mining, automation and AI into a seamless, intelligent execution layer. Bring in or train the right people to champion it, and you’ve got a head start on the next step of the journey. So while many companies haven’t been able to bring the full promise of automation to bear at scale just yet, that promise is getting closer to becoming a reality every day.


HowTo: Optimize Certificate Management to Identify and Control Risk

End-to-end certificate management gives businesses complete visibility and lifecycle control over any certificate in their environment, helping them reduce risk and control operational costs. Even in the most complex enterprise environments, certificate automation offers speed, flexibility and scale. Full visibility over all digital certificates and keys means that even the largest enterprises can have a centralized view of digital identities and security processes. Security leaders can then access expiration dates and maintain cryptographic strength while avoiding the time-consuming, demanding, and risky task of manually discovering, supervising, and renewing certificates. As organizations continue to grow and evolve, so does the range of certificates deployed and the set of people deploying them, which increases the potential for certificates to be installed in your environment that are out of sight of IT security teams and left unmanaged. To avoid being blindsided by these “rogue” certificates, enterprises are turning toward automated universal discovery.
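The "time-consuming, demanding, and risky task of manually discovering, supervising, and renewing certificates" the excerpt mentions can be partly automated even without a full lifecycle platform. As a minimal sketch (the function names are mine), this checks how many days remain before a host's TLS certificate expires, using only the Python standard library:

```python
# Sketch: report days until a host's TLS certificate expires.
# A scheduled job running this across an inventory of hostnames is a
# simple first step toward the automated discovery the article describes.
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after: str) -> int:
    """Parse the OpenSSL-style notAfter string returned by getpeercert(),
    e.g. 'Jun  1 12:00:00 2030 GMT', and return whole days until expiry
    (negative if already expired)."""
    expires = datetime.strptime(
        not_after, "%b %d %H:%M:%S %Y %Z"
    ).replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

def check_host(hostname: str, port: int = 443) -> int:
    """Connect, complete the TLS handshake, and return days until the
    served certificate expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    return days_until_expiry(cert["notAfter"])
```

For example, `check_host("example.com")` returns an integer day count; alerting when it drops below a threshold (say 30 days) catches expirations before they cause outages. This covers known hosts only — the "rogue" certificates the article warns about still require network-wide discovery scanning.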


On the Road to Good Cloud Security: Are We There Yet?

The research also uncovered a disconnect that raises the question: is that confidence misplaced? When asked to rate the level of visibility the security team had into their organization's use of specific cloud service types, including software-as-a-service (SaaS), platform-as-a-service (PaaS), and infrastructure-as-a-service (IaaS), that same level of confidence faltered. For example, when asked to rate the security team's level of visibility into their organization's SaaS usage on a five-point scale, with 1 being the highest level, only 18% gave it a 1 and 27% gave it a 2. Visibility into PaaS and IaaS was rated as only slightly better. At the same time, respondents' knowledge of the shared responsibility model was found to be lacking. When asked to indicate whether the customer or cloud provider was responsible for securing a list of seven different elements that make up an IaaS account, around half of respondents gave the wrong answer. Specifically, 63% erroneously indicated that the cloud provider was responsible for securing virtual network connections, 55% erroneously indicated that the cloud provider was responsible for securing applications, and 50% got it wrong when they said the cloud provider was responsible for securing users who were accessing cloud data and applications.


5 AI-for-Industry Myths Debunked

Up until, and during, the AI hype in the nineties, artificial intelligence was a scientific discipline that almost exclusively dealt with data and algorithms. Over the past decades however, the field has matured, and AI has become an integral part of automated decisioning systems that are at the heart of what we do as individuals and organizations. Consequently, a large portion of AI research, development, and implementation encompasses people and processes. I remember having a business conversation with a large energy provider in which we were talking about automated systems and data-driven methods that, driven by customer data and smart meters, could enhance their customers’ experience. One hour into the meeting, they suddenly asked: “This all looks very promising, but shouldn’t we also do something with AI?” ... If you have the combined luck and skills, you can probably cook a decent meal with ingredients that come from a randomly filled refrigerator. The real question, however, is: “What do you want to achieve?” In the example of the refrigerator, it might occasionally be an effective solution if you need to quickly fill stomachs and don’t have time to go shopping. 


Cloudflare wants to be your corporate network backbone

With Magic WAN, Cloudflare aims to simplify that. Cloudflare's global Anycast network is already built for high performance and availability to serve its core CDN business. The company has data centers in more than 200 cities across over 100 countries with local peering at internet exchange points. Regardless of where branch offices or employees are located, chances are high they'll always connect to a server close to them, and the traffic will then be routed efficiently through Cloudflare's private network, benefiting from its performance optimizations, smart routing and security. With Magic WAN, organizations only need to set up Anycast GRE tunnels from their offices or datacenters to Cloudflare, and they can then define their private networks and routing rules in a central dashboard. Cloudflare's existing Argo Tunnel, Network Interconnect and soon IPsec can also be used to connect datacenters and VPCs to its network, while roaming employees will connect using Cloudflare WARP, a secure tunneling solution built around the highly performant WireGuard VPN protocol. This also solves the scalability and performance issues that organizations faced with traditional VPN gateways and concentrators when they were suddenly confronted with a large remote workforce due to the pandemic.
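To make the GRE-tunnel setup concrete, here is a hypothetical sketch of what the office-router side might look like on a Linux router using iproute2. All addresses and prefixes below are placeholders, and the actual Magic WAN configuration (tunnel endpoints, health checks, routing policy) is done in Cloudflare's dashboard, not by hand like this:

```shell
# Hypothetical GRE tunnel from an office router to a Cloudflare Anycast
# endpoint (all addresses are illustrative placeholders).
ip tunnel add cf-magic mode gre local 198.51.100.10 remote 203.0.113.1 ttl 255
ip addr add 10.0.0.2/31 dev cf-magic      # interior point-to-point addressing
ip link set cf-magic up
ip route add 10.100.0.0/16 dev cf-magic   # steer private WAN prefixes into the tunnel
```

Because the remote endpoint is an Anycast address, the same configuration reaches the nearest Cloudflare data center from any office, which is what removes the need for per-region tunnel endpoints.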



Quote for the day:

"A true dreamer is one who knows how to navigate in the dark" -- John Paul Warren