Daily Tech Digest - December 02, 2021

Web 3.0: The New Internet Is About to Arrive

Some experts believe this decentralized Web, which is also referred to as Web 3.0, will bring more transparency and democratization to the digital world. Web 3.0 may establish a decentralized digital ecosystem where users will be able to own and control every aspect of their digital presence. Some hope that it will put an end to the existing centralized systems that encourage data exploitation and privacy violation. ... As a user, you will have a unique identity on Web 3.0 that will enable you to access and control all your assets, data, and services without logging in on a platform or seeking permission from a particular service provider. You will be able to access the internet from anywhere for free, and you will be the only owner of your digital assets. Apart from experiencing the internet on a screen in 2D, users will also get to participate in a larger variety of 3D environments. From anywhere, you could visit the 3D VR version of any historical place you search for, play games while being in the game as a 3D player, or try clothing on your virtual self before you buy.


Report: Aberebot-2.0 Hits Banking Apps and Crypto Wallets

Based on the Aberebot-2 creator's claim and Cyble's findings, the banking malware's new variant appears to have multiple capabilities. It can steal information such as SMS messages, contact lists and device IPs, and it can also perform keylogging and detection evasion by disabling Play Protect - Google's safety check that is designed to detect harmful apps, according to the researchers. Cyble says the "new and improved" version of the banking Trojan can steal messages from messaging apps and Gmail, inject values into financial applications, collect files on the victim's device and inject URLs to steal cookies. Medhe says that Aberebot-2.0 has 18 different permissions, including internet permission, and 11 of the permissions are dangerous. One key difference between the earlier and the latest version of the Aberebot malware, he says, is the use of the Telegram API. "In the newer version, the malware author has included features such as the ability to inject or modify values in application forms, such as receiver details or the amount during financial transactions."


New Ransomware Variant Could Become Next Big Threat

Symantec's investigation of Yanluowang activity showed the former Thieflock affiliate is using a variety of legitimate and open source tools in its campaign to distribute the ransomware. This has included the use of PowerShell to download a backdoor called BazarLoader for assisting with initial reconnaissance and the subsequent delivery of a legitimate remote access tool called ConnectWise. To move laterally and identify high-value targets, such as an organization's Active Directory server, the threat actor has used tools such as SoftPerfect Network Scanner and AdFind, a free tool for querying AD. "The tool is frequently abused by threat actors to find critical servers within organizations," Neville says. "The tool can be used to extract information pertaining to machines on the network, user account information, and more." Other tools the attacker is using in Yanluowang attacks include several for credential theft, such as GrabFF for dumping passwords from Firefox, a similar tool for Chrome called GrabChrome, and one for Internet Explorer and other browsers called BrowserPassView.


Cloud computing is evolving: Here's where it's going next

"The era of multi-cloud is here, driven by digital transformation, cost concerns and organizations wanting to avoid vendor lock-in. Incredibly, more than half of the respondents of our survey have already experienced business value from a multi-cloud strategy," said Armon Dadgar, co-founder and CTO, HashiCorp in a statement. "However, not all organizations have been able to operationalize multi-cloud, as a result of skills shortages, inconsistent workflows across cloud environments, and teams working in silos." ... The focus is now on overcoming the various barriers to successful multi-cloud deployment, which include skills shortages and workflow differences between cloud environments. Cloud spend management is a continuing issue, while infrastructure automation tools are becoming increasingly important, particularly when it comes to provisioning and application deployment. In five years' time, we won't be talking about the pros and cons of hybrid/multi-cloud architecture. Instead, the discussion will be all about enterprises as efficient developers of industry-specific cloud-native apps, and automatic, optimised and AI-driven workload deployment.


Recovering from ransomware: One organisation’s inside story

As far as the ransom demand itself was concerned, the service provider warned that it was important that Manutan not respond and, above all, that it not pay. In the case of this particular gang, as soon as the victim shows up to negotiate, the criminals activate a three-week timer at the end of which – if there is no resolution – they make good on a series of threats, disclosing the victim’s sensitive information and irreparably destroying the data. Therefore, to pretend that Manutan had not yet realised it had been attacked – in effect, to play dead – would serve to buy it valuable time. As for actually paying, doing so could prompt the gang to ask for more and would provide no guarantee that the data would be recovered. “We spent time determining what data they had recovered and the risk it posed. We concluded that it was not critical – for example, they did not access our contracts with suppliers. Then we evaluated our ability to put a functioning IT system back together, which we could do, and we decided that we would not pay,” says Marchandiau.


How Decryption of Network Traffic Can Improve Security

Today, it’s nearly impossible to tell the good from the bad without the ability to decrypt traffic securely. The ability to remain invisible has given cyberattackers the upper hand. Encrypted traffic has been exploited in some of the biggest cyberattacks and exploit techniques of the past year, from Sunburst and Kaseya to PrintNightmare and ProxyLogon. Attack techniques such as living-off-the-land and Active Directory Golden Ticket are only successful because attackers can exploit organizations’ encrypted traffic. Ransomware is also top of mind for enterprises right now, yet many are crippled by the fact that they cannot see what is happening laterally within the east-west traffic corridor. Organizations have been wary of embracing decryption due to concerns around compliance, privacy and security, as well as performance impacts and high compute costs. But there are ways to decrypt traffic without compromising compliance, security, privacy or performance. Let’s debunk some of the common myths and misconceptions.


5 (more) Common Misconceptions about Scrum

Many people think that Scrum Team members shouldn’t be assigned to a team part-time. However, there is nothing in the Scrum Guide prohibiting it. There are, of course, trade-offs for part-time Scrum Team members. If too many individuals are part-time, the team may not accomplish as much meaningful work during a Sprint. Additionally, with part-time members it can be more difficult for the team to learn how much work they can achieve during a Sprint, particularly if a member’s part-time status fluctuates. Moreover, if the part-time members support multiple Scrum Teams, they can feel exhausted attending numerous Daily Scrum meetings and splitting their focus. The Scrum Team should consider these trade-offs when self-organizing into teams that include part-time members. ... Timeboxes are an essential part of all Scrum events because they help limit waste and support empiricism, making decisions based on what is known. For example, the result of the Sprint Planning event should be enough of a plan for the team to get started. 



What Will AI Bring to the Cybersecurity Space in 2022

When you deploy AI to monitor your company network, for example, it creates an activity profile for every user in that network: what files they access, what apps they use, when, and where. If that behavior suddenly changes, the user is flagged for a deep scan. This is a vast improvement in threat detection. Currently, a lot of time is lost before an attack is even noticed. According to IBM’s 2020 Cost of a Data Breach Report, businesses take 280 days on average to detect and contain a breach. That’s plenty of time for hackers to cause massive damage. AI cuts that time short. It instantly spotlights irregularities, allowing businesses to contain breaches fast. One major issue, however, is the persistent risk of false positives: clean behavior can appear problematic when it is not. Current-generation ML-based threat detection algorithms rely almost exclusively on adaptations of neural networks that more or less replicate the perceived functioning of human thought patterns. These systems use validation subroutines that cross-check behavior patterns against previous behaviors.
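The per-user profiling idea described above can be sketched in a few lines. This is an illustrative toy assuming a simple z-score baseline, not any vendor's actual neural-network approach; the activity counts and threshold are invented:

```python
from statistics import mean, stdev

def build_profile(history):
    """Build a per-user baseline from a history of daily file-access counts."""
    return {"mean": mean(history), "stdev": stdev(history)}

def is_anomalous(profile, observed, threshold=3.0):
    """Flag the user for a deep scan when today's activity deviates more
    than `threshold` standard deviations from the learned baseline."""
    if profile["stdev"] == 0:
        return observed != profile["mean"]
    z = abs(observed - profile["mean"]) / profile["stdev"]
    return z > threshold

# A user who normally touches ~20 files a day suddenly reads 500.
baseline = build_profile([18, 22, 19, 21, 20, 23, 17])
print(is_anomalous(baseline, 500))  # True
print(is_anomalous(baseline, 21))   # False
```

Real systems profile many dimensions at once (apps, times, locations), but the principle is the same: learn what is normal per user, then score deviations.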
So far, only 9 countries have commercialized 5G mmWave. However, this is not surprising given that the main restriction of mmWave transmissions is their low propagation range. Telecom companies would not employ the mmWave frequency band for national coverage. Looking at telecom operators’ deployment strategies, we can see that low-frequency bands (for example, 700 MHz) are used for national coverage, whereas sub-6 GHz bands are utilized for city coverage, and mmWave is used for megacity hotspots. ... One crucial part of deploying a large-scale 5G network employing massive MIMO gear is that the radio must be lightweight and have a compact footprint, as these characteristics will help operators save significant money on overall deployment. This is where silicon comes in. Si’s performance will have a huge influence on a radio’s essential aspects, such as connection, capacity, power consumption, product size and weight, and, ultimately, cost. In the 5G system sector, all of these are critical.


7 ways to balance agility and planning

By building learning and development (L&D) into planning, your organization can enhance employee engagement and investment in strategic goals. A Quantum Workplace trend report found employee engagement was at its peak in 2020 (up 3 percent from 2019), with 77 percent of employees reporting high engagement. Spring and fall of 2020 indicated the greatest engagement levels at 80 percent, with a 7 percent drop by the summer of 2021. Leadership communication has also tapered off since the emergence of COVID, creating a downward trend in employees’ perceptions of transparency, communication, and leadership trust. Consequently, many employees felt their career paths were stunted or unclear. These findings underscore the importance of L&D in keeping employees engaged and motivated and in fostering more consistent communication between managers and their teams. From the organization’s perspective, employees are encouraged to flex their adaptability muscles as they learn, galvanizing them to become more agile and enabling the organization to pivot efficiently.



Quote for the day:

"It is, after all, the responsibility of the expert to operate the familiar and that of the leader to transcend it." -- Henry A. Kissinger

Daily Tech Digest - December 01, 2021

Does Your Organization Need a Data Diet?

The scenario is all-too-familiar: There’s a security breach, and afterward, the affected organization asks what it must do to better protect its data. But what if that organization never collected and stored that sensitive information in the first place? Often, the best defense against an embarrassing and costly breach is to collect only data that is essential to an entity’s mission. Some who work in computer privacy circles refer to this as “going on a data diet.” They know the temptation is great for organizations to bulk up on data of all kinds. After all, storage costs are low. With the move to the cloud, an organization doesn’t even need to invest in hardware and upkeep to store the information it collects. So why not grab whatever data a customer or client is willing to provide? Because we live in a time when the only sensible approach to computer security is to wonder when your entity might be breached—not if. That’s why it pays to not only go on a data diet, but to adopt a regimen for keeping your organization’s databases lean and, as a result, your customer relationships healthy.


DeFi Opens New Possibilities for Banks Willing to Embrace Change

Banks have already shown that they are aware of the growing urgency to pivot fully into the digital age. FinTechs have been key drivers in the development of banking alternatives, offering customers new ways to pay and manage their money, and urgency is building for banks to adapt. Authentication plays a role in the customer experience, and DeFi can help improve trust between financial institutions (FIs) and the customers who trust them with their money, the panelists said. Vicandi told PYMNTS that the growth in DeFi is shaking the very core of financial services. ... “They’ve produced their own products, sell those products though their networks, to their own end customers.” But in the current environment and with the emergence of blockchains, Vicandi said that DeFi exists as a threat not only to the regional banks which tend to lack the scale of their larger national and international brethren, but as an existential and cultural change to finance as we know it.


Solutions against overfitting for machine learning on tabular data

The simplest way to detect overfitting is to compare performance on your train and test data. If your model performs very well on the training data and poorly on the test data, it is probably overfitting. ... Cross-validation is an often-used alternative to a single train/test split. The idea is to fit the model multiple times, each time training on one part of the data and testing on the remainder, so that by the end every part of the data has been used for both training and testing. The cross-validation score is the average of all the test evaluations made throughout the process. Because it is based only on test data, it gives you one metric instead of separate train and test metrics; in effect, the cross-validation score plays the role of the model's test score. Using the cross-validation score alone will not help you detect overfitting, because it does not show you the training performance. So it is not a way to detect overfitting, but it can tell you that your test performance is bad (without telling you why).
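To make the mechanics concrete, here is a minimal, dependency-free sketch of k-fold cross-validation. The "model" is a deliberately naive mean predictor, chosen only so the example is self-contained; in practice you would plug in a real estimator (scikit-learn's `cross_val_score`, for instance, performs this averaging for you):

```python
def k_fold_indices(n, k):
    """Yield (train, test) index pairs so every point is tested exactly once."""
    fold = n // k
    for i in range(k):
        test = list(range(i * fold, (i + 1) * fold if i < k - 1 else n))
        train = [j for j in range(n) if j not in test]
        yield train, test

def mse(model, xs, ys):
    """Mean squared error of a prediction function over a dataset."""
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def fit_mean(xs, ys):
    """Naive 'model': always predict the mean of the training targets."""
    m = sum(ys) / len(ys)
    return lambda x: m

def cross_val_score(xs, ys, k=5):
    """Average the k held-out test scores; no training score enters the result."""
    scores = []
    for train, test in k_fold_indices(len(xs), k):
        model = fit_mean([xs[i] for i in train], [ys[i] for i in train])
        scores.append(mse(model, [xs[i] for i in test], [ys[i] for i in test]))
    return sum(scores) / len(scores)

xs = list(range(10))
ys = [2.0 * x for x in xs]
print(cross_val_score(xs, ys, k=5))
```

Note that only held-out folds are scored, which is exactly why the resulting number reflects test performance and says nothing about overfitting on the training folds.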


Working Together as Embedded Engineering Teams

Part of the value of these long-term pairings is you build relationships and context. It is expensive to move people because it breaks these connections. Most embedded organizations tend to move people more than they should. It can be frustrating to have embeds on your team if you can’t rely on them to stay. Yet it is tempting for the embed’s manager to move people around. They want to react to changing needs, and sometimes there aren’t enough people to go around. This can be a source of friction. A new embedded person has a more complex situation than new employees do. They have another manager and things outside your team they may be paying attention to. Kick off the relationship with explicit conversations. Spend twice the care you would with onboarding a generalist employee. Gus Shaffer offers this advice: “One thing that I found helpful when embedding staff engineers was to conduct a standard Kick-off process with a Statement of Work as output. Early on I learned the hard way that leaving success criteria loose leads to lingering engagements/disappointment/confusion.”


Microsoft under fire in Europe for OneDrive bundling; legal fight brewing

Led by Nextcloud, a coalition of European Union (EU) software and cloud organizations and companies formed the “Coalition for a Level Playing Field.” “Microsoft’s combination of the dominant Windows (operating system) with the OneDrive (cloud) offering makes it nearly impossible to compete with their SaaS services,” Nextcloud said in a blog post. “It illustrates anti-competitive practices such as ‘self-preferencing’ on the basis of the market dominance of Windows.” Frank Karlitschek, CEO and founder of Nextcloud, also pointed out in his blog that over the last several years, Microsoft, Google, and Amazon have grown their market shares to 66% of the total European market, while local European software and service providers have declined from 26% to 16%. “Behavior as described above is at the core of this dramatic level of growth of the global tech giants in Europe,” Karlitschek said. “This should be addressed without any further delay. There are deliberate, abusive practices and those practices are no accidents. Other Big Tech firms are showing similar conduct.”


RansomOps: Detecting Complex Ransomware Operations

It’s possible for organizations to defend themselves at each stage of a ransomware attack. In the delivery stage, for instance, they can block suspicious emails that contain malicious links or documents with malicious macros attached. The installation stage gives security teams the opportunity to detect files that are attempting to create new registry values and to spot suspicious activity on endpoint devices. When the ransomware attempts to establish command and control, security teams can block outbound connection attempts to known malicious infrastructure. They can then use threat indicators to tie account compromise and credential access attempts to familiar attack campaigns, and investigate network mapping and discovery attempts launched from unexpected accounts and devices. Defenders can flag resources that are attempting to gain access to other network resources with which they don’t normally interact, and discover attempts to exfiltrate data as well as encrypt files.


A unique quantum-mechanical interaction between electrons and topological defects in layered materials

"Once we first identified the anomaly in electronic conductivity, we remained very puzzled," says Edoardo Martino, the study's first author. "The material was behaving like a pretty standard metal whose electrons move along the plane, but when forced to move between planes its behavior became that of neither a metal nor an insulator, and was unclear what else to expect. It was thanks to a discussion with our fellow colleagues and theoretical physicists that we were pushed in the right direction: just apply a magnetic field and see what happens." After applying the magnetic field, the EPFL scientists realized that the more powerful the magnet, the more exotic the material's behavior becomes. They started experimenting with 14 Tesla superconducting magnets available at EPFL, but soon they realized they needed more. Working with the Laboratoire National des Champs Magnétiques Intenses in Grenoble and Toulouse, they accessed some of the world's most powerful magnets. 


The purpose of “purpose”

It starts with a goal that feels directly connected to the business, rather than a lofty statement that could be used by dozens or hundreds of other organizations. “It has to be real and tangible and live,” said author Margaret Heffernan when I interviewed her about her most recent book, Uncharted: How to Navigate the Future. “It has to be something people feel that they can do.” These are the difficult questions that every leader must wrestle with, even though they may seem philosophical and not directly relevant to the bottom line: Why do you matter? How do you make a difference? What would be lost if your organization went out of business? Healthcare businesses can make a credible case that they are improving or saving people’s lives. A nonprofit is often founded with a clear idea of the impact it wants to have. But the job can seem trickier if you are in a kind of commodity business. Imagine for a second that you run, say, a company that processes beets for sugar. How do you build a purpose around that? Paul Kenward took up that challenge. As managing director of British Sugar, which is based in the east of England, he faced the task of defining a sense of purpose for the company.


Report: No Patch for Microsoft Privilege Escalation Zero-Day

The flaw found under the "Access work or school" settings can only be triggered by clicking on "export your management log files" and confirming by pressing "export," he says. "At that point, the Device Management Enrollment Service is triggered, running as Local System. This service first copies some log files to the MDM Diagnostics folder, and then packages them into a CAB file whereby they're temporarily copied to Temp folder. The resulting CAB file is then stored in the MDM Diagnostics folder, where the user can freely access it," Kolsek notes. He highlights that while copying the CAB file to the Temp folder is vulnerable, a local attacker could create a soft link with a predictable file name used in routine export processes, directing to some file or folder that the attacker would want to have copied, in a location accessible to the attacker. "Since the Device Management Enrollment Service runs as Local System, it can read any system file that the attacker can't," Kolsek notes.


Design Patterns for Serverless Systems

In agile programming, as well as in a microservice-friendly environment, the general approach to designing and coding has changed from the monolith era. Instead of stuffing all the logic into a single functional unit, agile and microservice developers prefer more granular services or tasks obeying the single responsibility principle (SRP). Keeping this in mind, developers can decompose complex functionality into a series of separately manageable tasks. Each task gets some input from the client, executes its specific responsibility consuming that input, and generates some output, which is then transferred to the next task. Following this principle, multiple tasks constitute a chain of tasks. Each task transforms input data into the required output, which is an input for the next task. These transformers are traditionally known as filters, and the connector that passes data from one filter to the next is known as a pipe. A very common usage of the pipes and filters pattern is the following: when a client request arrives at the server, the request payload must go through a filtering and authentication process.
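At its core, the pattern is function composition: each filter does one thing and hands its output to the next pipe. The filter names and the `token` check in this sketch are hypothetical, purely to illustrate the request-filtering example above:

```python
from functools import reduce

def pipeline(*filters):
    """Compose single-responsibility filters; each output feeds the next pipe."""
    return lambda payload: reduce(lambda data, f: f(data), filters, payload)

# Hypothetical filters for an incoming request payload.
def parse(raw):
    """Filter 1: turn a raw query string into a dict."""
    return dict(kv.split("=") for kv in raw.split("&"))

def authenticate(req):
    """Filter 2: reject requests without a valid token."""
    if req.get("token") != "secret":
        raise PermissionError("authentication failed")
    return req

def handle(req):
    """Filter 3: produce the response from the authenticated request."""
    return {"status": "ok", "user": req["user"]}

process = pipeline(parse, authenticate, handle)
print(process("user=alice&token=secret"))  # {'status': 'ok', 'user': 'alice'}
```

Because each filter is independent, filters can be reordered, reused, or, in a serverless setting, deployed as separate functions wired together by a queue or step orchestrator.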



Quote for the day:

"Strategy is not really a solo sport _ even if you_re the CEO." -- Max McKeown

Daily Tech Digest - November 30, 2021

Alation: How to develop a data governance framework

According to Alation, there are seven key steps to building a successful data governance framework: Establish a mission and vision and create a set of policies, standards and glossaries; Populate a data catalog with metadata that shows data lineage and analyze that metadata to discover what data is most popular and who are the top users of data; Recognize and assign data stewards and empower those stewards to govern the organization's data; Curate data assets by describing different data sets and applying quality flags to the data sets so users can easily find the data they'll find most useful; Apply policies and controls so that not all data can be accessed by everyone within an organization and organizations can remain compliant with applicable regulations; Drive community and collaboration to promote trusted data use; and Monitor and measure the entire data governance framework to determine policy conformance, create curation analysis, measure the usage and creation of data assets, and determine the quality of data. "If you're going to really do a process, you need to find out where the gaps are and where you need to make course corrections," says Myles Suer.


Get to Know EF Core 6

EF Core 6.0 is a modern, cloud-native-friendly data access API that supports multiple backends. Get up and running with a document-based Azure Cosmos DB container using only a few lines of code, or use your LINQ query skills to extract the data you need from relational databases like SQL Server, MySQL, and PostgreSQL. EF Core is a cross-platform solution that runs on mobile devices, works with data-binding in client WinForms and WPF apps “out of the box”, and can even run inside your browser! Did you know it is possible to embed a SQLite database in a web app and use Blazor WebAssembly to access it using EF Core? Check out Steven Sanderson’s excellent video to see how this is possible. Combined with the power of .NET 6, EF Core 6.0 delivers major performance improvements and even scored 92% better on the industry-standard TechEmpower Fortunes benchmark compared to EF Core 5.0 running on .NET 5. The EF Core team and global OSS community have built and published many resources to help you get started.


Digital transformation: 4 questions CIOs should ask now

The heart of transformation is really the culture, and that requires commitment from the entire leadership team. In 2016, CarMax began a significant transformation effort. “We wanted to make sure that our organization – not just technology, our entire organization – was ready for this change. And that required a significant commitment not only from me, but also from the CEO. The two other partners I worked closely with were the chief marketing officer and the chief operations officer. So between the leadership team support, we were able to demonstrate and articulate to the whole company that this is a change for the entire company, not just a technology initiative,” says Mohammad. “We put cross-functional teams together to show that we’re serious about this change and we do have to transform ourselves to this digital way of working,” he says. “It is difficult to change the fabric and core operating system of an organization, so the leadership team needs to be there all the way. It is a journey that never ends, so the support cannot subside. The support has to be there all the time.”


Why Machine Learning Engineers are Replacing Data Scientists

Not many people actually talk about ML engineering — at least compared to the number of people talking about data science — and yet I believe the demand for ML engineers might surpass the one for data scientists. We can see the number of data scientists soaring all over the world, within companies of all sizes, while most of these people aren’t actually doing data science at all, just analytics. And many of those who are actually doing data science probably didn’t have to. That means many organisations are hiring people to solve basically the same types of problems, over and over again, in parallel. There is just a lot of redundancy, and the quality of the people doing it varies substantially. At the same time, we see companies like Google and Amazon, who have some of the best data scientists in the world, working on “ready-to-use” ML systems on their cloud platforms (GCP and AWS, respectively). This means you can plug your data into their systems to benefit from all that knowledge, and all you need is someone who knows how to make that connection and the necessary tuning — someone like an ML engineer.


How to combat ransomware with visibility

The recovery process is often the last thing anyone thinks about. Disaster recovery and business continuity (DRBC) is probably the toughest piece to solve and, often, the most ignored. But if your organization is in healthcare or part of critical infrastructure like utilities, there can be life-and-death consequences to service interruptions. Ensuring business continuity might mean the ability to keep working to save lives, which means that immediate time-to-recovery is going to be very important. In the past, we used to have to go and pull tapes from an archive at some off-site place to restore systems—and that could take days. A few years ago, many businesses had backup systems inside a hosted data center, allowing them to restore from another server by replicating data across the pipe. That was a lot quicker than tape backups, but it still had limitations. Today, cloud-hosted solutions make things much easier because they take snapshots in time of your data. For this reason, cloud storage makes DRBC much faster than legacy solutions that are still stuck in a physical-servers-and-appliances frame of mind.


Lessons Learned from Self-Selection Reteaming at Redgate

At Redgate we believe the best way to make software products is by engaging small teams empowered with clear purpose, freedom to act and a drive to learn. We believe this because we’ve seen it; teams who have had laser-like focus on an aim, decision-making authority and the space to get better at what they do, were the most engaged and effective. If you have read Dan Pink’s seminal book Drive: The Surprising Truth About What Motivates Us, you might recognise that our beliefs echo what the author demonstrates are key to an engaged and motivated workforce — autonomy, mastery and purpose. To remain true to our beliefs, Redgate needs to ensure that the goals and expectations of our teams are crystal clear, that we push authority to teams as much as we can, and we encourage people to grow. We also recognise that different people have different ambitions, preferences for work and views on what counts as personal development. We have a large portfolio of products, written in a variety of languages, structured in a variety of ways and that exist at various stages of the product life cycle.


Global Tech Policy Briefing for November 2021: Banking, Broadband, & Big Tech

It’s not a secret that cryptocurrencies make central banks nervous: Bitcoin and the like exist to flout regulation and control. So far, few national governments have dared ban cryptocurrency outright; we’ve seen a few years of cold war, with the American Securities and Exchange Commission publicly sniping at Terraform Labs, for example, and many governments mulling a ban on mixers. Turkey, whose lira is in freefall, is moving toward an outright ban. Nigeria attempted a ban, but remains the second-largest Bitcoin market in the world. In Russia, the Kremlin’s stance is ambiguous, as rumors of a CryptoRuble make the rounds. But China is the only major international power to successfully outlaw crypto transactions by its citizens, full stop. Now, India may be joining them. A new bill, the Cryptocurrency and Regulation of Official Digital Currency Bill, 2021, “seeks to prohibit all private cryptocurrencies in India, however, it allows for certain exceptions to promote the underlying technology of cryptocurrency and its uses.”


The future of hyperautomation in 2022

Hyperautomation has come to prominence as a trend for 2021 thanks to the maturity of digitalisation and of data management tools. The aforementioned digital upskilling that many organisations have committed to during the pandemic, combined with these tools, form the basis for hyperautomation to take place in the right environment. But we’re only at the start of a long journey. The digital recreation of your business, warts and all, can prove a gruelling exercise in self-examination given the speed at which systems are recreated, and resultant insights are amalgamated. Companies will need to invest a lot of time and energy in order to create long-term adoption of hyperautomation. Turning theory into action is a big challenge to take on, and preparation is key. That means that the value of hyperautomation will only start to materialise for the pioneers that stay focused. Organisations need to stay on the ball and avoid slipping back into old, stagnant processes driven by more operational, tactical initiatives.


Sneaky New Magecart Malware Hides in Cron Jobs

Dubbed CronRAT, the malware hides in the Linux calendar subsystem as a task scheduled for a nonexistent date, such as Feb. 31. It goes largely undetected by security vendors and enables server-side Magecart data theft that bypasses browser-based security solutions, according to researchers at Dutch security firm Sansec. "This is very concerning, having been discovered just after Black Friday and Cyber Monday, as well as before the upcoming busy Christmas shopping period, where many unsuspecting shoppers will likely move to online shopping due to the new variant of COVID-19, which may result in further restrictions limiting in-person shopping," says Joseph Carson, chief security scientist and advisory CISO at enterprise security firm ThycoticCentrify. So far, Sansec has not directly tied this recently uncovered RAT to one particular Magecart group. And while it’s not clear who exactly is behind this malware, the report notes that its operators have created an unusual and sophisticated threat packed with never-before-seen stealth techniques.
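The nonexistent-date trick is easy to illustrate. In the sketch below (illustrative only; the cron line, payload path, and function name are invented, not taken from the Sansec report), a day/month pair is checked against the real calendar: day 31 of month 2 falls within cron's accepted field ranges, so the entry is syntactically valid, yet no date ever matches it, leaving the task inert while its command field stores attacker-controlled data.

```python
import calendar

# A crontab-style line scheduled for "Feb. 31": every field is in range
# (day-of-month 1-31, month 1-12), so cron accepts it, but the scheduler
# can never fire it.
hidden_task = "52 23 31 2 3 /tmp/.payload"  # min hour dom mon dow command
_, _, dom, mon, _, _ = hidden_task.split(maxsplit=5)

def date_can_occur(day: int, month: int) -> bool:
    """True if some year (leap or otherwise) contains this calendar date."""
    # February gains a 29th day in leap years, but never a 30th or 31st;
    # other months have a fixed length, read from a non-leap year.
    max_dom = 29 if month == 2 else calendar.monthrange(2001, month)[1]
    return 1 <= day <= max_dom

print(date_can_occur(int(dom), int(mon)))  # False: the task never runs
```

A scanner looking for this class of hiding spot could apply the same check to every crontab entry and flag those whose schedule can never occur.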


Digital Resilience Requires Changes In The Taxonomy Of Business IT Systems

Enterprises need to ensure they leverage data as an asset and implement Systems of Insights to support data-driven decision-making across the entire supply chain and the broad spectrum of business processes and functions. Today enterprises are under immense pressure from regulatory authorities, cyberattacks, and the pivot in customer buying patterns towards trusted, responsible and sustainable products. This effectively means enterprises need to look at security and compliance by design, which is best implemented by transitioning to Systems of Insights and Compliance. Skills and talent are the new currency of business. It is critical to capture knowledge and experience across the company, both to improve productivity and to shorten time to market. Many enterprises have implemented Learning Management Systems (LMS) in some shape or form, but these were seen as secondary systems for talent retention and tracking employee training.



Quote for the day:

"The final test of a leader is that he leaves behind him in other men, the conviction and the will to carry on." -- Walter Lippmann

Daily Tech Digest - November 29, 2021

The Next Evolutions of DevOps

The old adage is that complexity is like an abacus: You can shift complexity around, but it never really goes away. With the movement to shift responsibility left to development teams, this also means that associated complexity is shifting to the development teams. Modern platform engineering teams provide the infrastructure (compliant Kubernetes clusters) to teams, and any workload run on those clusters is up to the development team that owns it. Typically, development teams then focus on features and functionality. ... If you are a DevOps or platform engineer, making your internal customers—your development teams—successful is a great goal to work toward. Crucial to this is disseminating expertise. This can be in the form of automation and education. A common practice with the DevSecOps movement is to have some sort of scanning step as part of the build or deployment process, disseminating the internals: how the scan is performed, what happens if something is found, and so on.


Fast-paced dash to digital leaves many public services exposed

When organisations introduce new solutions to their technology stack, protection capabilities need to be extended to cover it. But faced with a global pandemic that no one could’ve seen coming, businesses needed to innovate fast, and their security measures failed to keep pace. This created a vulnerability lag, where systems and data have been left unprotected and open to attack. Veritas’ Vulnerability Lag Report explores how this gap between innovation and protection is affecting a variety of organisations, public and private; only three-fifths (61%) believe their organisation’s security measures have fully kept up since the implementation of COVID-led digital transformation initiatives. This means 39% are experiencing some form of security deficit. While such swift digital transformation has delivered a wealth of benefits for public sector organisations, there is a dark side to this accelerated innovation. In the rush to digitally transform, security has taken a back seat. As a result, there may be significant gaps just waiting for cyber criminals to exploit for their own gain.


Towards Better Data Engineering: Mostly People, But Also Process and Technology

Traditional software engineering practices involve designing, programming, and developing software that is largely stateless. On the other hand, data engineering practices focus on scaling stateful data systems and dealing with different levels of complexity. ... Setting up a data engineering culture is therefore crucial for companies to aim for long-term success. “At Sigmoid, these are the problems that we’re trying to tackle with our expertise in data engineering and help companies build a strong data culture,” said Mayur. With expertise in tools such as Spark, Kafka, Hive, Presto, MLflow, visualization tools, SQL, and open source technologies, the data engineering team at Sigmoid helps companies with building scalable data pipelines and data platforms. It allows customers to build data lakes, cloud data warehouses and set up DataOps and MLOps practices to operationalize the data pipelines and analytical model management. Transitioning from a software engineering environment to data engineering is a significant ‘cultural change’ for most companies. 


Performing Under Pressure

Regardless of the task, pressure ruthlessly diminishes our judgment, decision-making, focus, and performance. Pressure moments can disrupt our thoughts, prevent us from thinking clearly, leave us frustrated, and make us act in undesirable ways. The adverse impact of pressure on our cognitive skills can downgrade our performance, making us perform below our capability, commit more errors, and fail more often. Pressure can even make us feel embarrassed and ashamed when we do fail, because it can make us act in ways we otherwise would not and say or do unusual things. Consider these pressure moments: stepping out of an important client meeting and wondering, “Why did I make that joke? I was so stupid,” or failing to share your opinion in a critical decision meeting and thinking afterward, “Why didn’t I speak up? We could have made a better decision.” Pressure can result in either wrongful action or inaction. Such events make it much more difficult to deal with pressure the next time. But there are things you can do to diminish the effects of pressure on your performance.
 

Behavioral biometrics: A promising tool for enhancing public safety

There are several promising applications in the field of behavioral biometrics. For computer-based identity verification, there are solutions that allow identification based on keystrokes, whose frequency and patterns prove individual enough to recognize identity. Due to the nature of typing, the models can also improve over time, because they can continuously monitor and analyze keystroke data. Software developers also tend to customize confidence thresholds depending on the use case. However, in some cases the reliability of this behavioral biometric factor depends on circumstances: on a different keyboard, individual patterns may differ, and physical conditions like carpal tunnel syndrome or arthritis may alter a person's typing. The lack of benchmarks makes it difficult to compare different providers’ trained algorithms in these cases, leaving room for false marketing claims. Image analysis for image recognition can provide more data for behavioral research. Gait and posture biometrics are rapidly becoming useful tools, even if they do not yet match the accuracy and robustness of traditional biometric approaches.
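The keystroke-timing idea described above can be sketched as a distance check between timing profiles. This is a bare illustration with invented numbers and an arbitrary confidence threshold; real products model far more (dwell and flight times per key pair, continuous re-scoring), as the passage notes.

```python
import statistics

def timing_profile(keystroke_times: list[float]) -> list[float]:
    """Intervals between successive key presses, in seconds."""
    return [b - a for a, b in zip(keystroke_times, keystroke_times[1:])]

def matches(enrolled: list[float], sample: list[float],
            threshold: float = 0.05) -> bool:
    """Accept when the mean deviation from the enrolled profile is small.

    The threshold stands in for the customizable confidence thresholds
    mentioned in the article; its value here is purely illustrative.
    """
    deviations = [abs(e - s) for e, s in zip(enrolled, sample)]
    return statistics.mean(deviations) < threshold

enrolled = timing_profile([0.00, 0.18, 0.33, 0.55, 0.70])
same_user = timing_profile([0.00, 0.17, 0.34, 0.54, 0.71])
impostor = timing_profile([0.00, 0.40, 0.62, 1.10, 1.25])
print(matches(enrolled, same_user), matches(enrolled, impostor))  # True False
```

The keyboard-dependence caveat in the passage shows up directly here: the same user on different hardware would produce shifted intervals, inflating the deviation and risking a false reject.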


Privacy in Decentralized Finance: Should We Be Concerned?

It is alarming how fast DeFi’s influence is growing, because many of the issues it presents have not been addressed or examined in sufficient depth. People are investing in all sorts of cryptocurrency before they even educate themselves on how to manage private keys properly. Coupled with the lag in robust protective regulation, the general lack of awareness of DeFi’s threats to privacy inevitably results in large populations of users who are vulnerable to attack. Though some progress has been made at the state level to set standards for blockchain, there is a greater need for industry standardization at the international level. Additionally, the rapid expansion of blockchain technology in many industries is not met with sufficient safety protocols. As such, cybercriminals are aggressively taking action to target both users and exchanges of cryptocurrency in its under-secured state. On the flip side, there are some aspects of DeFi that are directly beneficial to protecting the privacy of users. When comparing the decentralized network that DeFi uses to a centralized one, DeFi’s “peer-to-peer” model is preferable because it prevents a “single source of failure”.


Hackers Exploit MS Browser Engine Flaw Where Unpatched

The modus operandi of these attackers parallels that of the Iranian attackers, in that it follows the same execution steps. But the researchers did not specify whether the intent of this campaign appeared to be data exfiltration. AhnLab did not respond to Information Security Media Group's request for additional information. With multiple attackers actively exploiting CVE-2021-40444, firms using Microsoft Office should immediately update their software to the latest version as a prevention measure, say researchers from EST Security, which discovered yet another campaign targeting the vulnerability. In this case, the campaign used communications that attempted to impersonate the president of North Korea's Pyongyang University of Science and Technology. "The North Korean cyberthreat organization identified as the perpetrator behind this campaign is actively introducing document-based security vulnerabilities such as PDF and DOC files to customized targeted attacks such as CVE-2020-9715 and CVE-2021-40444," the EST Security researchers say. CVE-2020-9715 is a vulnerability that allows remote attackers to execute arbitrary code on affected installations of Adobe Acrobat Reader DC.


Data Mesh: an Architectural Deep Dive

Data mesh is a paradigm shift in managing and accessing analytical data at scale. Some of the words I highlighted here are really important, first of all, is the shift. I will justify why that's the case. Second is an analytical data solution. The word scale really matters here. What do we mean by analytical data? Analytical data is an aggregation of the data that gets generated running the business. It's the data that fuels our machine learning models. It's the data that fuels our reports, and the data that gives us an historical perspective. We can look backward and see how our business or services or products have been performing, and then be able to look forward and be able to predict, what is the next thing that a customer wants? Make recommendations and personalizations. All of those machine learning models can be fueled by analytical data. What does it look like? Today we are in this world with a great divide of data. The operational data is the data that sits in the databases of your applications, your legacy systems, microservices, and they keep the current state. 


Google Data Studio Vs Tableau: A Comparison Of Data Visualization Tools

Business analysts and data scientists rely on numerous tools like PowerBI, Google Data Studio, Tableau, and SAP BI, among others, to decipher information from data and make business decisions. Coming from one of the best-known companies in the world, Google Data Studio, launched in 2016, is a data visualisation platform for creating reports using charts and dashboards. Tableau, on the other hand, was founded in 2003, more than a decade before Google Data Studio, by Chris Stolte, Pat Hanrahan, and Christian Chabot. Tableau Software is one of the most popular visual analytics platforms, with very strong business intelligence capabilities. Google Data Studio, by contrast, is free, and users can log in with their Google credentials. Over the years, it has become a popular tool to visualise trends in businesses, keep track of client metrics, compare the time-based performance of teams, and more. It is part of the Google Marketing Platform and pulls data from Google’s marketing tools to create reports and charts. Recently, Google announced that users can now include Google Maps in embedded reports in Google Data Studio.


5 Trends Increasing the Pressure on Test Data Provisioning

Not only is the pace of system change growing; the magnitude of changes being made to complex systems today can be greater than ever. This presents a challenge to slow and overly manual data provisioning, as a substantial chunk of data might need updating or replacing based on rapid system changes. A range of practices in development have increased the rate and scale of system change. The adoption of containerization, source control, and easily reusable code libraries allow parallelized developers to rip and replace code at lightning speed. They can easily deploy new tools and technologies, developing systems that are now intricately woven webs of fast-shifting components. A test data solution today must be capable of providing consistent test data “journeys” based on the sizeable impact of these changes across interrelated system components. Data allocation must occur at the pace with which developers chop-and-change reusable and containerized components. 



Quote for the day:

"One must be convinced to convince, to have enthusiasm to stimulate the others." -- Stefan Zweig

Daily Tech Digest - November 28, 2021

Government must prove its plans to police encryption work, says ex-cyber security chief

Technology companies and cryptographers claim that the government’s demands are simply not possible - the government is, in effect, trying to argue against the laws of mathematics. If the UK and US governments can read encrypted messages, so potentially can criminals, or hostile nation states such as North Korea or Russia. Extensively researched proposals to find a compromise, including a suggestion by Ian Levy, technical director of the National Cyber Security Centre, to use “virtual crocodile clips” to listen in to encrypted communications, have failed to convince sceptics, said Martin. Plans by Apple to introduce “client-side scanning” technology to detect child abuse images before they are encrypted provoked a backlash from the world’s top cryptographic experts and internet pioneers and have now been suspended. An expert report identified over 15 ways in which states, malicious actors, and targeted abusers could turn the technology around to cause harm to others or society. 


India: One Law To Rule Them All: On NFTs And India's Prospective Cryptocurrency Law

It is not the case that NFTs do not pose any risks. Like traditional art, which has always had a money laundering problem, NFTs pose the same (or even greater) money laundering risks. Greater, because the prices of NFTs are determined in private, in one-to-one trades. As with art or real estate, the value attributed to a trade cannot be questioned, and hence these assets can be sold at any price and the balance settled in cash. One thing that works in favour of NFTs, though, is that if they are on a public blockchain such as Ethereum and the user purchases them through a centralised platform, transactions are traceable. Other than the money laundering risks, NFTs pose neither the same category of risks nor the same degree of risk as cryptocurrencies. NFTs are non-fungible and cannot be used as a medium of exchange, as opposed to several cryptocurrencies that can be. This alleviates central bankers' concerns around monetary policy and control of cross-border payments. 


Design Pattern vs Anti Pattern in Microservices

“An anti-pattern is a common response to a recurring problem that is usually ineffective and risks being highly counterproductive.” Note the reference to “a common response.” Anti-patterns are not occasional mistakes, they are common ones, and are nearly always followed with good intentions ... Ambiguous service: an operation’s name can be too long, or a generic message’s name can be vague. It’s possible to limit element length and restrict vague phrases in certain instances. API versioning: an external service request’s API version can change in code, and delays in processing that change can lead to resource problems later, which is why APIs need semantically consistent version descriptions. Bad API names are difficult to discover, but the remedy is simple and can be improved over time. Hardcoded endpoints: some services may have hard-coded IP addresses and ports, causing similar concerns; replacing an address means manually editing files one by one, and current detection only recognizes hard-coded IP addresses without context. Bottleneck service: a single service that many consumers depend on, where one flaw affects them all. 
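The hardcoded-endpoints item above lends itself to simple static detection. The sketch below is illustrative only (the regex and function name are invented, not taken from any tool the article mentions): it flags IPv4 literals, with or without a port, in source text. Like the context-free detection the passage describes, it will also flag legitimate occurrences, such as addresses in configuration files.

```python
import re

# IPv4 literal, optionally followed by :port. Deliberately permissive:
# it does not validate octet ranges, mirroring context-free detection.
IP_PORT = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}(?::\d{1,5})?\b")

def find_hardcoded_endpoints(source: str) -> list[str]:
    """Return every IP[:port] literal embedded in the given source text."""
    return IP_PORT.findall(source)

snippet = 'client = PaymentClient(host="10.0.4.17:8443", retries=3)'
print(find_hardcoded_endpoints(snippet))  # ['10.0.4.17:8443']
```

Run against a repository, matches like this point at endpoints that should instead come from configuration or service discovery.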


Designing Resilient Microservices — Part 1

The more interesting question is — What do you do when you detect a dependency failure (partial or full). The obvious answer is to return an appropriate HTTP or gRPC error code to your caller, but depending on your business logic/content, you should explore a graceful degradation. For example, if your application is enabling users to track the status of the order, and the exact location of the delivery agent (which is served by a dependency) is unavailable, you could choose to use extrapolation to compute an approximate location. This is further subject to a timing threshold so that if the dependency recovers, we could pivot back to providing the most recent/accurate response. Another solution often suggested for handling of faults is retries. While the principle is simple, the more critical question is how many times should I retry and how long should I wait between retries. A misconfigured retry logic can actually take a service under stress (in brownout) to a blackout. Consider, for example, a service that has N callers and each of whom have M callers. 
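The retry questions raised above (how many attempts, how long to wait) are commonly answered with a capped attempt budget and exponential backoff with jitter, which keeps N callers, each with M callers of their own, from hammering a browning-out dependency in lockstep. The sketch below is illustrative, not from the article; the function name and defaults are invented.

```python
import random
import time

def call_with_retries(fn, max_attempts=3, base_delay=0.1, max_delay=2.0,
                      sleep=time.sleep, rng=random.random):
    """Call fn, retrying on failure with capped, jittered backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # budget exhausted: surface the failure to the caller
            # Full jitter: sleep a random fraction of the capped backoff,
            # spreading retries out instead of synchronizing them.
            backoff = min(max_delay, base_delay * 2 ** (attempt - 1))
            sleep(rng() * backoff)

attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionError("dependency in brownout")
    return "ok"

print(call_with_retries(flaky, sleep=lambda _: None))  # ok
```

The capped budget is what prevents the brownout-to-blackout spiral the passage warns about: once attempts are exhausted, the failure propagates instead of adding yet more load.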


DeFi Lending: When Will It Threaten Traditional Lenders?

In our view, DeFi will be disruptive for financial-services companies even if almost all applications currently relate to digital assets. Banks, insurance companies and other traditional financial firms are considering the advantages of DLT solutions and monitoring developments in the DeFi market. Ignoring this trend might lead to a wake-up call in the future, although we think this is a few years off, given that DeFi is still in its infancy. DeFi lending could improve the liquidity of certain digital assets. Holders of better-established digital assets can diversify their portfolios by pledging existing digital assets for the purchase of other types. DeFi lending can, therefore, improve liquidity within the overall digital-assets ecosystem. That said, it does not come without risk. Given the typically collateralized nature of the activities, we believe that volatility in the valuations of the digital assets posted as collateral could translate into volatility in the valuations of the digital assets acquired. The volume of activities remains relatively low, but greater DeFi-lending volumes could ultimately lead to increased contagion risks between digital assets.
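The collateral-volatility point above can be made concrete with a toy calculation (all numbers and the threshold are invented for illustration): a DeFi loan is typically overcollateralized, and a fall in the pledged asset's price can push the position below its liquidation threshold.

```python
def collateral_ratio(collateral_units: float, collateral_price: float,
                     debt_value: float) -> float:
    """Value of pledged collateral relative to the outstanding debt."""
    return collateral_units * collateral_price / debt_value

LIQUIDATION_THRESHOLD = 1.5  # assumed protocol parameter, not a real one

# 10 units pledged against a 1,500-unit debt: safe at a price of 300,
# undercollateralized (and subject to liquidation) after a drop to 200.
ratio_before = collateral_ratio(10, 300.0, 1500.0)   # 2.0
ratio_after = collateral_ratio(10, 200.0, 1500.0)    # ~1.33
print(ratio_before >= LIQUIDATION_THRESHOLD,
      ratio_after >= LIQUIDATION_THRESHOLD)  # True False
```

Forced liquidations of positions like the second one are one channel through which volatility in the collateral asset transmits to the assets acquired against it.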


The Evolution of Enterprise Architecture in an Increasingly Digital World

EA talent is hard to find. They must be comfortable with both business strategy and with the digital technologies necessary to implement the strategies. To better understand the key role played by EA teams in their companies, McKinsey conducted a survey that received over 150 responses from a variety of countries and industries. Respondents who described their companies as “digital leaders” said that EA teams add value by following several best practices, including: Engage top executives in key decisions. The most effective EA teams invest their time in understanding their company’s business needs. 60% of enterprise architects at companies considered digital leaders said they interacted most with C-suite executives and strategy departments, compared with just 24% of those in other companies. Digital transformations are more likely to succeed when a company’s senior leaders understand the impact of technology on the business “and commit their time to making decisions that seem technical but ultimately influence the success or failure of the company’s business aims.”


Turning up the scale knob on threat intelligence operations

The only way to harness the true potential of threat intelligence is to gain maximum benefit by fully leveraging that intelligence to facilitate rapid detection of and response to emerging threats. The need of the hour is modern-day threat intelligence platform (TIP) capabilities that come integrated within a comprehensive cyber fusion center that can drive the entire threat intelligence lifecycle management from ingestion to actioning and response in a fully automated way. Modern-day TIPs integrate frameworks like MITRE ATT&CK Navigator that enable you to gain insights into adversaries’ TTPs to identify trends across the kill chain and produce contextualized intelligence. Such TIPs have made operationalization of different types of threat intelligence—strategic, tactical, technical, and operational—possible for security teams. As threat intelligence continues to be the central theme in today’s cybersecurity programs, the need to scale threat intelligence capabilities has become vital for business and operational success.


Executive Q&A: The Value of Improved Data Management

There are three main challenges that enterprises face in achieving the maximum benefit from their data. First, the compounding effect of continually adding new data sources, and thus more data, dilutes the value of data under analysis. Adding demographic data enriches the data set, which is like adding electrolytes to tap water -- it is good and can be done easily. The challenge we face today is that we also have many new sources for the transaction data (e.g., from online purchases, business partners, and mobile apps). We suddenly have data for every page visit, every click, and every location. This is like upgrading a faucet to a fire hose in your kitchen. In theory you have access to a lot of water, but how much of it will be wasted if you don't have the right tool or technology to process it? Second, the increasing reliance on data captured or purchased in the cloud raises questions about how to rationalize on-premises data as part of an analytics strategy. For many organizations, data generated on premises cannot leave the confines of its firewall. This complicates the creation of a complete picture of the truth.


4 Ways Data Governance Can Improve Business Intelligence

Data is the lifeblood of all operational processes. Data is an asset that needs to be managed so that it is highly accessible, easily usable and reusable, and highly secure. Developing effective data governance can help business owners streamline all operational processes and improve decision-making, so any potential efficiency gaps are easily mitigated. When properly implemented, it can reduce data inconsistencies to a minimum and remove the risk of human error from the equation. According to Statista, the US alone saw over 1,000 data breach cases with over 150 million records exposed to cybercriminals. Granted, this is lower than back in 2018, when 471 million records were exposed, and these attacks seem to be decreasing lately, but the overarching trend since 2005 is alarming. We also need to address the insight provided by an Osterman Research study stating that companies typically move, store, and archive 75% of their critical data and intellectual property within their complex ecosystems of communication channels.


13 Areas Where NFTs Have Huge Potential!

Tokenization offers more transparency, and the transactions involved are easy to execute and, most importantly, cost-effective. Tokenization is also making its way into the patent system: IP-based NFTs are one way to deal with intellectual property. The IPwe platform allows patents to be represented by storing and sharing NFTs on the platform. The platform is hosted on the IBM Cloud and is supported by IBM Blockchain. Clients can also trade, buy, license, finance, sell, research, and market patents there. The patent marketplace is the first of its kind, and companies benefit from treating and showcasing their patents as digital assets, for security or to secure the value of their business. The freely accessible registry is supported by IBM AI and will be further expanded in the coming months. The registry features current, active, and historical patent records that can be tokenized through NFTs.



Quote for the day:

"Supreme leaders determine where generations are going and develop outstanding leaders they pass the baton to." -- Anyaele Sam Chiyson

Daily Tech Digest - November 27, 2021

Enhancing zero trust access through a context-aware security posture

A policy engine is the “brain” of a ZTA-based architecture, which dictates the level of scrutiny applied to human and machine network agents as they attempt to authenticate themselves and gain access to resources. These engines make decisions about whether to approve or deny access—or demand additional authentication factors—based on different factors including implied geolocation, time of day, threat intelligence indicators, and sensitivity of data being accessed. ZTA does not merely facilitate heightened scrutiny of network actors that behave suspiciously. It also allows for streamlined access by bona fide users to enhance productivity and reduce business interruptions resulting from security measures. Thus, properly implemented zero-trust systems achieve the best of both worlds: enhanced cybersecurity and more rapid generation and delivery of business value. To make this model even more powerful in the face of the evolving ransomware threat, I would suggest that ZTA systems incorporate additional factors—in concert with the aforementioned ones—to allow organizations to assume a context-aware security posture.
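The factors the passage lists (implied geolocation, time of day, threat intelligence indicators, data sensitivity) can be illustrated with a toy policy engine. This is an invented sketch, not any vendor's actual engine; the field names, weights, and thresholds are all assumptions made for the example.

```python
def evaluate_access(request: dict) -> str:
    """Map contextual risk signals to allow / step-up / deny decisions."""
    risk = 0
    if request.get("geo_country") not in {"US", "CA"}:   # assumed allow-list
        risk += 2
    if not 8 <= request.get("hour", 12) <= 18:           # off-hours access
        risk += 1
    if request.get("ip_on_threat_feed", False):          # threat intel hit
        risk += 3
    if request.get("data_sensitivity") == "high":        # sensitive resource
        risk += 2
    if risk >= 4:
        return "deny"
    if risk >= 2:
        return "mfa_required"  # demand an additional authentication factor
    return "allow"

print(evaluate_access({"geo_country": "US", "hour": 10,
                       "data_sensitivity": "low"}))  # allow
```

The middle tier is what gives zero trust its "best of both worlds" character: low-risk requests flow through unimpeded, while suspicious context triggers step-up authentication rather than an outright block.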


Key trends driving the workforce transformation in 2022

As employers look for ways to drive inclusion amidst new work models, connection will become a measurement of workforce culture. ADP Research Institute found that U.S. workers who feel they are Strongly Connected to their employer are 75 times more likely to be Fully Engaged than those who do not feel connected. With connection driving engagement, employers will need to heighten their focus on their people and reflect on the larger purpose that unites their workforce. Workforce flexibility will stretch beyond perceived limits and employers will embrace people-centered initiatives to build a workplace where everyone can thrive. Diversity, equity, and inclusion strategies will additionally evolve to drive true, measurable progress. ADP data shows more than 50 percent of companies that leveraged ADP DataCloud’s DEI analytics capabilities have taken action and realized positive impact on their DEI measures. With employees remaining remote and hybrid, operational and compliance considerations will grow, adding to an already complex regulatory environment. In fact, the survey found nearly 20 percent of U.S.


AI Weekly: UN recommendations point to need for AI ethics guidelines

While the policy is nonbinding, China’s support is significant because of the country’s historical — and current — stance on the use of AI surveillance technologies. According to the New York Times, the Chinese government — which has installed hundreds of millions of cameras across the country’s mainland — has piloted the use of predictive technology to sweep a person’s transaction data, location history, and social connections to determine whether they’re violent. ... Regardless of their impact, the UNESCO recommendations signal growing recognition on the part of policymakers of the need for AI ethics guidelines. The U.S. Department of Defense earlier this month published a whitepaper — circulated among National Oceanic and Atmospheric Administration, the Department of Transportation, ethics groups at the Department of Justice, the General Services Administration, and the Internal Revenue Service — outlining “responsible … guidelines” that establish processes intended to “avoid unintended consequences” in AI systems. NATO recently released an AI strategy listing the organization’s principles for “responsible use [of] AI.” 


From Naked Objects to Naked Functions

Naked Functions runs on .NET 6.0 and you can write your domain code in either C# or F# – I’ll use the former in the following code examples. The persistence layer is managed via Entity Framework Core, either relying on code conventions or explicit mapping. Naked Functions reflects over your domain code to generate a complete RESTful API – not just to the data but to all the functions too – and this RESTful API may be consumed via a Single Page Application (SPA) client. We provide a generic implementation of such a client, written in Angular. But where in Naked Objects you write only behaviourally complete domain objects, with Naked Functions you define only immutable domain types and pure side-effect free domain functions. You do not typically need to write any I/O at all, because the Naked Functions framework handles I/O with the client and the database transparently. Critically, your domain functions never make calls into the Naked Functions framework – it is entirely the other way around.
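Naked Functions targets .NET, but the core discipline it describes, immutable domain types plus pure, side-effect-free domain functions, can be sketched in any language. The Python analogue below is illustrative only; the `Customer` type and `rename` function are invented for the example and are not part of the framework.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)  # immutable domain type: fields cannot be mutated
class Customer:
    id: int
    name: str

def rename(customer: Customer, new_name: str) -> Customer:
    """Pure domain function: no I/O, returns a new value, mutates nothing."""
    return replace(customer, name=new_name)

before = Customer(1, "Ada")
after = rename(before, "Ada Lovelace")
print(before.name, after.name)  # Ada Ada Lovelace
```

In the framework's model, a reflection layer would discover functions like this and expose them over a generated RESTful API, handling persistence of the returned values; the domain code itself never calls into the framework.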


Introducing the KivaKit Framework

KivaKit is an Apache-licensed open source Java framework designed for implementing microservices. KivaKit requires a Java 11+ virtual machine, but is source-compatible with Java 8 and 9 projects. KivaKit is composed of a set of carefully integrated mini-frameworks. Each mini-framework has a consistent design and its own focus, and can be used in concert with other mini-frameworks or on its own. ... Each mini-framework addresses a different issue that is commonly encountered when developing microservices. This article provides a brief overview of the mini-frameworks in the diagram above, and a sketch of how they can be used. ... In KivaKit, there are two ways to implement Repeater. The first is by simply extending BaseRepeater. The second is to use a stateful trait or Mixin. Implementing the RepeaterMixin interface is the same as extending BaseRepeater, but the repeater mixin can be used in a class that already has a base class. Note that the same pattern is used for the Component interface discussed below.


Your supply chain: How and why network security and infrastructure matter

Threats to the supply chain can take many forms, including malware attacks, piracy, unauthorized access to enterprise resources and data, and unintentional or maliciously injected backdoors in software source code. In addition to these threats, the hyper-connected structure of global supply chains creates additional complexity for organizations to manage and protect. Although one organization may have a strong security infrastructure in place, other firms, suppliers, and resellers they are in close communication with may not. As vendor networks become interconnected, the sharing of information (both intentional and unintentional) will occur. An accidental data leak indicates a weak spot in an organization’s network, giving the green light to malicious actors looking for a way into it. Attacks can happen at any tier of a supply chain, but most attackers will look for weaker spots to exploit, which then impacts the entire operation. Having a security-first mindset will help businesses stay ahead of threats. This means putting security at the center of the supply chain and making it a foundational element.


From digital transformation to work-life balance for talent, how the future of management consulting looks

As digital acceleration becomes widespread, a consulting firm will be expected to provide services spanning cyber security, design thinking, user-interface design, digital transformation and M&A deal-making. There will be greater expectations from clients that consulting firms own a part of the transformation and become private equity-oriented partners. Many consulting firms are likely to embrace this route, much like Bain Capital today. As geopolitical complexities mount, supply chain re-alignment for risk hedging is likely to emerge as a key piece of work. Emerging countries are also likely to drive disproportionate growth for the industry. Another big change is likely to be that all consulting firms will offer the same services of strategy, design, implementation, cyber and M&A. The concept of the Big 3 (McKinsey, BCG, Bain) or Big 4 (PwC, EY, Deloitte, KPMG) will become outdated, since every consulting firm will compete on every deal. No case will ever be called a strategic piece of work.


What Makes A Good Product Owner?

There are many opinions about this in our community. For example, there are supposedly eight stances for Product Owners. Others argue that Product Owners are great when their team doesn’t need them. A common opinion is that Product Owners should actively experiment and test hypotheses. I gladly support these opinions. At the same time, I wonder what a scientific perspective has to offer. From our own quantitative research with 1,200 Scrum Teams, we know that teams are more effective when they are more aware of the needs of their stakeholders. And Product Owners certainly seem to play a role there. But as Unger-Windeler and her colleagues write (2019): “While [the] role is supposed to maximize the value of the product under development, there seemed to be several scattered results on how the Product Owner achieve[s] this, as well as what actually constitutes this role in practice.” In this post, I explore scientific research that addresses the role of the Product Owner. So I opened Google Scholar and searched for all academic publications containing the term “Product Owner”.


Emerging tech in security and risk management to better protect the modern enterprise

When it comes to emerging technologies in security and risk management, Contu focused on eight areas: confidential computing; decentralized identity; passwordless authentication; secure access service edge (SASE); cloud infrastructure entitlement management (CIEM); cyber-physical systems security; digital risk protection services; and external attack surface management. Many of these technologies are geared toward meeting the new requirements of multicloud and hybrid computing, Contu said. These emerging technologies also align with what Gartner has termed the “security mesh architecture,” where security is more dynamic, adaptable, and integrated to serve the needs of digitally transformed enterprises, he said. ... While still relatively new, secure access service edge (SASE) has gained significant traction in the market because it’s a “very powerful” approach to improving security, Contu said. The term was first coined by Gartner analysts in 2019. SASE offers a more dynamic and decentralized security architecture than existing network security architectures, and it accounts for the increasing number of users, devices, applications, and data that are located outside the enterprise perimeter.


UK Legislation Seeks Mandatory Security Standards for IoT

Introduced to Parliament on Wednesday, the bill seeks to allow "the government to ban universal default passwords, force firms to be transparent to customers about what they are doing to fix security flaws in connectable products, and create a better public reporting system for vulnerabilities found in those products," according to the government's Department for Digital, Culture, Media & Sport. The bill was developed by DCMS together with Britain's national incident response team, the National Cyber Security Centre, which is part of intelligence agency GCHQ. The bill also includes a proposal to appoint a regulator to oversee compliance with the standards, backed by the ability to fine violators up to 10 million pounds ($13.3 million) or up to 4% of a firm's global revenue, whichever is greater. "The regulator will also be able to issue notices to companies, requiring that they comply with the security requirements, recall their products, or stop selling or supplying them altogether."
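The "whichever is greater" penalty cap described above can be made concrete with a small illustrative calculation. This is only a sketch of the arithmetic in the article, not a statement of how the regulator would actually compute a fine; the method name and pence-based units are assumptions for the example.

```java
// Illustrative only: the maximum penalty described in the bill is the
// greater of a flat 10 million pounds or 4% of global revenue.
public class MaxPenalty {

    /** Maximum fine in pence for a firm with the given global revenue in pence. */
    static long maxFinePence(long globalRevenuePence) {
        long flatCap = 10_000_000L * 100;               // 10M pounds, in pence
        long revenueCap = globalRevenuePence * 4 / 100; // 4% of global revenue
        return Math.max(flatCap, revenueCap);
    }

    public static void main(String[] args) {
        // A firm with 500M pounds of global revenue: 4% is 20M pounds,
        // which exceeds the flat 10M-pound cap.
        long finePounds = maxFinePence(500_000_000L * 100) / 100;
        System.out.println(finePounds);
    }
}
```

For small firms the flat cap dominates; the 4% term only takes over once global revenue exceeds 250 million pounds.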



Quote for the day:

"I think the greater responsibility, in terms of morality, is where leadership begins." -- Norman Lear