Daily Tech Digest - April 24, 2022

Zero-Trust For All: A Practical Guide

In general, zero-trust initiatives have two goals: reducing the attack surface and increasing visibility. To demonstrate this, consider the (common) scenario of a ransomware gang buying initial access to a company’s cloud through an underground initial-access broker and then attempting to mount an attack. In terms of visibility, “zero trust should stop that attack, or make it so difficult that it will be spotted much earlier,” said Greg Young, vice president of cybersecurity at Trend Micro. “If companies know the postures of their identities, applications, cloud workloads, data sources and containers involved in the cloud, it should make it exceedingly hard for attackers. Knowing what is unpatched, what is an untrusted lateral movement, and continuously monitoring the posture of identities really limits the attack surface available to them.” And on the attack-surface front, Malik noted that if the gang used a zero-day or unpatched vulnerability to gain access, zero trust will box the attackers in. “First, at some point the attackers will cause a trusted user or process to begin misbehaving,” he explained.


Web3 Security: Attack Types and Lessons Learned

Expert adversaries, often called Advanced Persistent Threats (APTs), are the boogeymen of security. Their motivations and capabilities vary widely, but they tend to be well-heeled and, as the moniker suggests, persistent; unfortunately, it’s likely they will always be around. Different APTs run many different types of operations, but these threat actors tend to be the likeliest to attack the network layer of companies directly to accomplish their goals. We know some advanced groups are actively targeting web3 projects, and we suspect there are others who have yet to be identified. ... One of the most well-known APTs is Lazarus, a North Korean group which the FBI recently identified as responsible for the largest crypto hack to date. ... Now that web3 lets people directly trade assets, such as tokens or NFTs, with almost instant finality, phishing campaigns are targeting its users. These attacks are the easiest way for people with little knowledge or technical expertise to make money stealing crypto. Even so, they remain a valuable method for organized groups to go after high-value targets, or for advanced groups to wage broad-based, wallet-draining attacks through, for example, website takeovers.


The New Face of Data Governance

In light of the changes in the nature of data, the level of data regulation, and the data democratization trend, it’s safe to say that the traditional, old, boring data governance is dead. We can’t leave it in the grave, as we need data governance more than ever today. Our job is thus to resurrect it, and give it a new face. ... Data governance should embrace the trends of operational analytics and data democratization, and ensure that anybody can use data at any time to make decisions with no barrier to access or understanding. Data democratization means that there are no gatekeepers creating a bottleneck at the gateway to data. This is worth mentioning, as the need for data governance to be secure and compliant often leads programs to create bottlenecks at the gateway to data, as the IT team is usually put in charge of granting access to data. Operational people can end up waiting hours until they manage to get access to a dataset. By then, they have already given up on their analysis. It’s important to have security and control, but not at the expense of the agility that data offers.


It’s time for businesses to embrace the immersive metaverse

Companies need to understand what’s possible in the metaverse, what’s already in use and what customers or employees will expect as more organizations create immersive experiences to differentiate their products and services. The possibilities may include improvements in what companies are doing now as well as revolutionary changes in the way companies operate, connect and engage with customers and employees to increase loyalty. How can leaders start to identify opportunities in the metaverse? Start, as always, with low-hanging fruit, like commerce and brand experiences that can benefit from immersive support. Also consider the technology that can enable what you need. From an architectural standpoint, it’s helpful to think of immersive experiences as a three-layer cake. The top layer is where users get access via systems of engagement. The middle layer is where messages are sent, received and routed to the right people via systems of integration. The bottom layer comprises the databases and transactions — the systems of record. 


Why it’s so damn hard to make AI fair and unbiased

The problem is that if there’s a predictable difference between two groups on average, then these two definitions will be at odds. If you design your search engine to make statistically unbiased predictions about the gender breakdown among CEOs, then it will necessarily be biased in the second sense of the word. And if you design it not to have its predictions correlate with gender, it will necessarily be biased in the statistical sense. So, what should you do? How would you resolve the trade-off? Hold this question in your mind, because we’ll come back to it later. While you’re chewing on that, consider the fact that just as there’s no one definition of bias, there is no one definition of fairness. Fairness can have many different meanings — at least 21 different ones, by one computer scientist’s count — and those meanings are sometimes in tension with each other. “We’re currently in a crisis period, where we lack the ethical capacity to solve this problem,” said John Basl, a Northeastern University philosopher who specializes in emerging technologies. So what do big players in the tech space mean, really, when they say they care about making AI that’s fair and unbiased? 
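
The trade-off described here can be made concrete with a toy calculation. A sketch in Python; the 8% base rate below is an illustrative number, not a figure from the article:

```python
# Hypothetical base rate: 8% of CEOs are women (illustrative number).
base_rate_women = 0.08

def statistical_bias(shown_fraction, true_fraction):
    """How far the shown results deviate from the real-world statistic."""
    return abs(shown_fraction - true_fraction)

def gender_correlation(shown_fraction):
    """How far the shown results deviate from a gender-neutral 50/50 split
    (a crude proxy for how strongly results correlate with gender)."""
    return abs(shown_fraction - 0.5)

# Option A: mirror the statistics -> zero statistical bias, high correlation.
a_stat = statistical_bias(0.08, base_rate_women)   # 0.0
a_corr = gender_correlation(0.08)                  # ~0.42

# Option B: force a 50/50 split -> zero correlation, high statistical bias.
b_stat = statistical_bias(0.5, base_rate_women)    # ~0.42
b_corr = gender_correlation(0.5)                   # 0.0

# Unless the true base rate is exactly 0.5, no choice drives both measures
# to zero: the two definitions of "unbiased" are mathematically at odds.
```

Whatever fraction the search engine shows, at most one of the two measures can be zero, which is the trade-off the article asks you to hold in mind.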


Quantum computing to run economic models on crypto adoption

Indeed, QC makes use of an uncanny quality of quantum mechanics whereby an electron or atomic particle can be in two states at the same time. In classical computing, an electric charge represents information as either a 0 or a 1, and that state is fixed; in quantum computing, an atomic particle can be both a 0 and a 1 at once, and a pair of particles can simultaneously represent 00, 01, 10 and 11. If this unique quality can be harnessed, computing power explodes manyfold, and QC’s development, paired with Shor’s algorithm — first described in 1994 as a theoretical possibility, but soon to be a wide-reaching reality, many believe — also threatens to burst apart RSA encryption, which is used in much of the internet including websites and email. “Yes, it’s a very tough and exciting weapons race,” Miyano told Cointelegraph. “Attacks — including side-channel attacks — to cryptosystems are becoming more and more powerful, owing to the progress in computers and mathematical algorithms running on the machines. Any cryptosystem could be broken suddenly because of the emergence of an incredibly powerful algorithm.”
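
The "both 0 and 1" idea can be sketched with a two-amplitude state vector. A minimal Python illustration, not tied to any real quantum hardware or library:

```python
import math

# A qubit's state is a pair of amplitudes (alpha, beta) over the basis
# states |0> and |1>; measuring yields 0 with probability |alpha|^2.
def hadamard(state):
    """Apply the Hadamard gate, which puts |0> into an equal superposition."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

zero = (1.0, 0.0)            # a classical-like, definite 0
superposed = hadamard(zero)  # equal parts |0> and |1>

probs = [abs(a) ** 2 for a in superposed]
# probs is ~[0.5, 0.5]: until measured, the qubit is "both a 0 and a 1".

# With n qubits the state vector spans 2**n basis states at once, which is
# where the "explodes manyfold" claim about computing power comes from.
```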


Cybercriminals are finding new ways to target cloud environments

Criminals have also shifted their focus from Docker to Kubernetes. Attacks against vulnerable Kubernetes deployments and applications increased to 19% of attacks in 2021, up from 9% in 2020. Kubernetes environments are a tempting target, as once an attacker gains initial access, they can easily move laterally to expand their presence. Attacks that affect an entire supply chain have increased over the past few years, and that has been felt across the software supply chain as well. In 2021, attackers aiming at software suppliers as well as their customers and partners employed a variety of tactics, including exploiting open source vulnerabilities, infecting popular open source packages, compromising CI/CD tools and code integrity, and manipulating the build process. Last year, supply-chain attacks accounted for 14.3% of the samples seen from public image libraries. “These findings underscore the reality that cloud native environments now represent a target for attackers, and that the techniques are always evolving,” said Assaf Morag, threat intelligence and data analyst lead for Aqua’s Team Nautilus.


Addressing the last mile problem with MySQL high availability

Because a single database server is shared between a variety of client applications, a single rogue transaction from an unoptimized query could potentially modify millions of rows in one of the databases on the server, degrading performance for the other databases. These transactions have the potential to overload the I/O subsystem and stall the database server. In this situation, the Orchestrator is unable to get a response from the primary node, and the replicas also face issues in connecting to the primary. This causes the Orchestrator to initiate a failover. The problem is compounded by the application retrying these transactions upon failure, stalling database operations repeatedly. These transactions halt the database for many seconds, and the Orchestrator is quick to catch the stalled state and initiate a failover, impacting the general availability of the MySQL platform. We knew that MySQL stores the number of rows modified by any running transaction, and that this number can be obtained by querying the trx_rows_modified column of the innodb_trx table in the information_schema database.
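
The guardrail this last sentence points toward can be sketched as follows. The SQL string mirrors the information_schema.innodb_trx query the article mentions; the threshold value, the helper function, and the mocked snapshot rows are illustrative, not from the article:

```python
# Poll information_schema.innodb_trx and flag transactions whose
# trx_rows_modified exceeds a threshold, so a watchdog can kill them
# before they stall the server and trigger an Orchestrator failover.

RUNAWAY_QUERY = """
    SELECT trx_id, trx_mysql_thread_id, trx_rows_modified
    FROM information_schema.innodb_trx
    WHERE trx_rows_modified > %s
"""

ROWS_MODIFIED_LIMIT = 1_000_000  # illustrative threshold

def find_runaway_transactions(trx_rows, limit=ROWS_MODIFIED_LIMIT):
    """Return transactions that have modified more rows than `limit`.

    `trx_rows` stands in for the result of querying innodb_trx -- in
    production it would come from a MySQL cursor, not a literal list.
    """
    return [t for t in trx_rows if t["trx_rows_modified"] > limit]

# Mocked snapshot of innodb_trx for illustration:
snapshot = [
    {"trx_id": "421", "trx_mysql_thread_id": 7, "trx_rows_modified": 120},
    {"trx_id": "422", "trx_mysql_thread_id": 9, "trx_rows_modified": 4_500_000},
]

runaways = find_runaway_transactions(snapshot)
# Only the transaction that touched 4.5M rows is flagged; a watchdog could
# now kill thread 9 before the Orchestrator ever sees a stalled primary.
```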


California eyes law to protect workers from digital surveillance

The bill would “establish much needed, yet reasonable, limitations on how employers use data-driven technology at work,” Kalra told the Assembly Labor and Employment Committee on Wednesday. “The time is now to address the increasing use of unregulated data-driven technologies in the workplace and give workers — and the state — the necessary tools to mitigate any insidious impacts caused by them.” The use of digital surveillance software grew during the pandemic as employers sought to track employees’ productivity and activity when working from home, installing software that uses techniques such as keystroke logging and webcam monitoring. Digital monitoring and management are being used across a variety of sectors, with warehouse staff, truck drivers and ride-hailing drivers subject to movement and location tracking, for example, and with decisions around promotions, hiring and even firing made by algorithms in some cases. The bill, which was approved by the committee on a 5-2 vote and now moves to the Appropriations Committee for more debate, makes three core proposals.


Data privacy: 5 mistakes you are probably making

It is a mistake to act only on the laws that apply in the geographic location of business operations. There might be privacy regulations/compliance issues that apply to a company beyond those that exist where the company is located – for example, a company headquartered in New York might have customers in Europe, and some European data privacy regulations likely would apply beyond any U.S.-based regulations. This is a significant problem with breach response laws. A large number of U.S. organizations follow the requirements only for their own state or territory. There are at least 54 U.S. state/territory breach laws, so this practice could be very costly. Privacy management programs should apply to all laws and regulations applicable to the associated individuals, and also synthesize all requirements so that one set of procedures can be followed to address the common requirements, in addition to meeting unique requirements for specific laws. Many organizations are also overconfident that they will not experience a privacy breach, which leaves them unable to respond effectively, efficiently, and fully when a breach does happen.
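
The "synthesize all requirements" advice reduces to simple set operations: one shared procedure for the requirements all applicable laws have in common, plus a law-specific addendum for each. A sketch with invented law names and requirement labels, not a legal inventory:

```python
# Illustrative requirement sets per law (hypothetical labels, not legal advice).
laws = {
    "NY SHIELD": {"notify_affected", "reasonable_safeguards", "notify_ag"},
    "GDPR": {"notify_affected", "72h_regulator_notice", "dpo_required"},
    "CCPA": {"notify_affected", "reasonable_safeguards", "opt_out_of_sale"},
}

# Requirements shared by every applicable law -> one common procedure.
common = set.intersection(*laws.values())

# Requirements unique to each law -> law-specific addenda.
unique = {name: reqs - common for name, reqs in laws.items()}

# `common` drives the single shared breach-response procedure;
# `unique[name]` drives the extra steps taken only when that law applies.
```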



Quote for the day:

"Leadership is just another word for training." -- Lance Secretan

Daily Tech Digest - April 23, 2022

Return on CI/CD Is Larger than the Business Outcome

In very simple terms, when you adopt CI/CT/CD, every piece of dev work — a new feature, a bug fix, an improvement — is continuously tested and integrated into your “ready to ship” branch and is, well, ready to be released to your customers based on your criteria for delivery. Since new dev work is continuously tested for quality and regressions, you have high confidence to release more frequently. I used to work at a company where, when a critical patch was needed, we just triggered our pipeline, which performed extensive validations involving just a handful of people and, after a short time, we were ready to cut a release. However, for a software organization looking into adopting effective CI/CD, the return on investment (ROI) should not be purely focused on measuring its business outcomes. The DORA metrics can give you a measure of the positive business outcomes from adopting an effective CI/CD process — that is, more frequent releases, faster delivery of changes to customers, fewer bugs and incidents, faster recovery from incidents. On the other hand, and equally important, adopting effective CI/CD has positive outcomes for the development teams as well — that is, it leads to higher innovation, higher throughput, a quality and automation mindset, and higher team morale.
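
The DORA metrics mentioned above boil down to simple ratios over delivery data. A toy sketch with made-up deployment records (field names are my own, not a DORA standard schema):

```python
# Hypothetical deployment log over a ten-day window.
deployments = [
    {"day": 1, "caused_incident": False},
    {"day": 3, "caused_incident": True},
    {"day": 5, "caused_incident": False},
    {"day": 8, "caused_incident": False},
]
period_days = 10

# Deployment frequency: releases per day over the window.
deployment_frequency = len(deployments) / period_days

# Change failure rate: share of deployments that caused an incident.
change_failure_rate = (
    sum(d["caused_incident"] for d in deployments) / len(deployments)
)

# 0.4 deployments/day and a 25% change failure rate; tracked over time,
# these ratios show whether the CI/CD investment is actually paying off.
```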


Customer experience and data privacy need to go hand-in-hand

Not only are new data privacy laws impacting the future of marketing and advertising to consumers, but new approaches as a means to adhere to data privacy laws from Google and Apple are having an impact as well. However, while these steps are thinly veiled attempts to make it look like data privacy is the concern, it’s yet another attempt by big tech to distract from the issue at hand where the consumer no longer has the say. Tracking customers’ page views, serving up ideas of what they might like in the future and just forgetting to ask what they prefer has become the norm. Brands have a real opportunity to adapt their current infrastructures to build privacy-safe data stores that adhere to compliance and regulations as part of the platform or ecosystem. This allows them to keep using their (first-party) data-driven approach, while allowing consumers to feel assured their data is being protected and they have a voice. It’s the same problem all over again — brands getting excited to capitalize on the latest trends and, in their frenzy, pushing consumer data privacy concerns aside to get there first. 


Why So Many Security Experts Are Concerned About Low-Code/No-Code Apps

Since low-code/no-code platforms often find their way into the enterprise through business units rather than top-down through IT, they can easily slip through the cracks and be missed by security and IT teams. While security teams are in most cases part of the procurement process, it's easy to treat a low-code/no-code platform as just another SaaS application used by the business, not realizing that the result of adopting this platform would be empowering a whole array of new citizen-developers in the business. In one large organization, citizen-developers in the finance team built an expense management application to replace a manual process filled with back-and-forth emails. Employees quickly adopted the application since it made it easier for them to get reimbursed. The finance team was happy because it automated part of its repetitive work. But IT and security were not in the loop. It took some time for them to notice the application, understand that it was built outside of IT, and reach out to the finance team to bring the app under the IT umbrella. Security and IT teams are always in a state where the backlog of concerns is much larger than their ability to invest. 


‘Decentralized’ web3 startups find out the hard way there’s no safety net

The problem, he explains, is that the policies “very specifically do not include digital assets, meaning if the hackers had gotten in and stolen cash [from Axie], it would have been squarely covered by a crime policy.” Since they didn’t, it wasn’t. The challenge for insurers largely ties to the lack of protections that digital assets currently receive from banking regulators. As Wallace explains it, “Some [insurance] markets are open to making some modifications, but I wouldn’t say it’s mainstream at this point” largely because there is no kind of equivalent to the FDIC or the Securities Investor Protection Corporation (SIPC), which partly protect financial institutions in the event that money deposited in a bank or with a broker-dealer is stolen. “That concept does not yet exist in digital,” Wallace says, adding that it’s “probably the most common point of interest of web3 companies.” Insurers hoping for protections to emerge could be waiting a while, given the way things are trending. Consider that earlier this month, the FDIC issued a “financial institution letter” (or FIL) that suggests the agency is still evaluating — and concerned by — the risk posed by crypto assets and that it wants more information about how the institutions it covers can conduct crypto-related activities in a safe and sound manner.


How To Automate Training Programs To Develop Employees' Leadership Skills

When investing in corporate learning, companies expect to make a real impact on business outcomes. Nevertheless, only 1 in 4 senior managers reports that leadership training tangibly influences a company's outcomes. Corporations spend plenty of resources on traditional employee training based on out-of-date methods. Many courses are considered to be successfully finished without any feedback or post-training assessment. They provide little or no real knowledge and skills, turning the investment into a hemorrhage of time and money. But combining elaborate assessment with any development program boosts bench strength by an average of 30%. The issue is quite hard to address because organizations lack the people to nurture a leadership mindset and supervise development. For instance, consider using video courses with personal feedback for each student from a coach. This approach is hard to scale because the trainer's time is limited. The problem can be solved by automating the personal leadership program so that a script carries out the role of trainers and their assistants.


The Role of DevOps in Cloud Security Management

Security on the cloud vs. security of the cloud always needs to be top of mind. Don’t forget that you are responsible for securing your own applications, data, OS, user access, and virtual network traffic. Beyond these, brush up on your configuration basics. More than 5 percent of AWS S3 buckets are misconfigured to be publicly readable. Recently, a simple misconfiguration in Kafdrop revealed the Apache Kafka stacks of some of the world’s largest businesses. While the big three clouds have invested millions to secure their stacks, the PaaS companies don’t have those budgets – so check, check, and double check. There’s a reason it’s called “zero trust.” With SaaS and web security, again, credential protection is key. Each architecture type requires its own type of security – be diligent. For example, a hybrid cloud infrastructure needs a “triple whammy” of security: the on-prem environment needs to be highly secure with all the ports closed, surface area tracked, and a highly active Security Operations Center (SOC). The public cloud aspect needs to be secured using the latest and greatest security tech available with that public cloud stack.


Hidden Interfaces for Ambient Computing

While many of today’s consumer devices employ active-matrix organic light-emitting diode (AMOLED) displays, their cost and manufacturing complexity is prohibitive for ambient computing. Yet other display technologies, such as E-ink and LCD, do not have sufficient brightness to penetrate materials. To address this gap, we explore the potential of passive-matrix OLEDs (PMOLEDs), which are based on a simple design that significantly reduces cost and complexity. However, PMOLEDs typically use scanline rendering, where active display driver circuitry sequentially activates one row at a time, a process that limits display brightness and introduces flicker. Instead, we propose a system that uses parallel rendering, where as many rows as possible are activated simultaneously in each operation by grouping rectilinear shapes of horizontal and vertical lines. For example, a square can be shown with just two operations, in contrast to traditional scanline rendering that needs as many operations as there are rows. With fewer operations, parallel rendering can output significantly more light in each instant to boost brightness and eliminate flicker.
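
The operation-count argument can be modeled directly: under parallel rendering, rows whose lit columns are identical can be selected together and driven by one column pattern, so a hollow square needs two operations regardless of its size. A small sketch (my own simplification of the scheme described above, not Google's actual driver logic):

```python
def square_outline(n):
    """Pixel coordinates of the outline of an n x n square."""
    return {(x, y) for x in range(n) for y in range(n)
            if x in (0, n - 1) or y in (0, n - 1)}

def scanline_ops(pixels):
    """Scanline rendering: one operation per display row with lit pixels."""
    return len({y for _, y in pixels})

def parallel_ops(pixels):
    """Parallel rendering: rows with identical lit-column sets share one
    operation -- the same column pattern is applied while all those rows
    are selected -- so the cost is the number of distinct row patterns."""
    rows = {}
    for x, y in pixels:
        rows.setdefault(y, set()).add(x)
    return len({frozenset(cols) for cols in rows.values()})

square = square_outline(64)
# Scanline needs one operation per row (64 here); parallel rendering needs
# two: one for the full top/bottom rows, one for the two side columns.
# Each operation can therefore stay lit far longer, boosting brightness.
```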


Flooded by Event Data? Here’s How to Keep Working

Today’s digital-first organizations need to create superb experiences for their customers — or risk irrelevance. Ideally, this requires resolving any operational issues before the end user has realized there’s something wrong. However, for most organizations, it’s not that easy. Digital operations teams are drowning in a tsunami of events. Existing tooling is unable to cope; manual processes and multiple point solutions translate into interruptions and escalations for overburdened responders. Solving the issues above is where event orchestration can help. Event orchestration enables users to route events toward the most appropriate set of actions. PagerDuty’s event orchestration functionality, for example, analyzes, enriches, determines logic for and automatically acts on events as they occur in real time, within microseconds. This enables our customers to take all the events coming in from 650+ integrations and apply logic and automation to figure out what should be done with each one — what the next best action is — at machine speed. Because we’re able to nest automation together, users can have one automated action, start a diagnostic process, learn more about the event and then use this information to figure out what to do next.
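
At its core, event routing of this kind is first-match rule evaluation: each incoming event is checked against ordered conditions, and the first match determines the next best action. A minimal sketch; the field names and action labels are illustrative, not PagerDuty's actual API:

```python
# Ordered orchestration rules: (predicate, action). The first predicate
# an event satisfies decides what happens to it.
RULES = [
    (lambda e: e["severity"] == "critical", "page_on_call"),
    (lambda e: e["source"] == "disk-monitor", "run_disk_diagnostics"),
    (lambda e: True, "suppress"),  # default: drop the noise
]

def route(event):
    """Return the action of the first rule the event matches."""
    for matches, action in RULES:
        if matches(event):
            return action

urgent = route({"severity": "critical", "source": "api"})       # "page_on_call"
disk = route({"severity": "warning", "source": "disk-monitor"}) # diagnostics
noise = route({"severity": "info", "source": "cron"})           # "suppress"
```

In a real system the "diagnostics" action would itself enrich the event, feeding the result back into further rules, which is the nesting of automation the excerpt describes.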


EdgeDB wants to modernize databases for cutting-edge apps

Unsurprisingly, companies are increasingly embracing alternatives to relational databases, like NoSQL. Driven by a lack of scalability with legacy solutions, they’re looking for modern systems — including cloud-based systems — that support scaling while reducing costs and accelerating development. Gartner predicts that 75% of all databases will be migrated to a cloud service by 2022 — highlighting the shift. “The database industry is facing a major shift to a new business model,” Yury Selivanov, the CEO of EdgeDB, a startup creating a next-gen database architecture, told TechCrunch via email. “It’s clear that there is a long tail of small- and medium-sized businesses that need to build software fast and then host their data in the cloud, preferably in a convenient and economical way.” Selivanov touts EdgeDB, which he co-founded in 2019 with Elvis Pranskevichus, as one of the solutions to the legacy database problem. EdgeDB’s open source architecture is relational, but Selivanov says that it’s engineered to solve some fundamental design flaws that make working with databases — both relational and NoSQL — unnecessarily onerous for enterprises.


Overcoming the biggest cyber security staff challenges

Too many people perceive cybersecurity as a complex, technical world dominated by geeks in hoodies. They cannot see the vast opportunity for them to add value with their own skill sets. We need to broaden the vision so that every employee can become a partner in the security family and enrich it with their own talents. Marketers, lawyers, crisis leaders, authors, and game designers can all be part of a holistic security strategy, adding value and reducing risk, without stepping away from their primary passion. ... Too many senior staff are leaving the industry due to stress and overwork. The security leadership role has become incredibly broad, having accountability to protect against risks and threats across the entire business, and yet the team remains a pyramid with a narrow base. By clearly pushing accountability back to the business units to adhere to standards and holding them (rather than the security team) accountable when they fall short, we can free the leadership from much of the stress, minimising staff turnover. 



Quote for the day:

"A tough hide with a tender heart is a goal that all leaders must have." -- Wayde Goodall

Daily Tech Digest - April 21, 2022

7 ways to avoid a cloud misconfiguration attack

We tend to focus a lot on avoiding misconfiguration for individual cloud resources such as object storage services (e.g., Amazon S3, Azure Blob) and virtual networks (e.g., AWS VPC, Azure VNet), and it’s absolutely critical to do so. But it’s also important to recognize that cloud security hinges on identity. In the cloud, many services connect to each other via API calls, requiring IAM services for security rather than IP-based network rules, firewalls, etc. For instance, a connection from an AWS Lambda function to an Amazon S3 bucket is accomplished using a policy attached to a role that the Lambda function takes on—its service identity. IAM and similar services are complex and feature rich, and it’s easy to be overly permissive just to get things to work, which means that overly permissive (and often dangerous) IAM configurations are the norm. Cloud IAM is the new network, but because cloud IAM services are created and managed with configuration, cloud security is still all about configuration—and avoiding misconfiguration.
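
The policy-attached-to-a-role pattern looks roughly like this. A hedged sketch of a least-privilege IAM policy document for a Lambda function's role; the bucket name is hypothetical, and only well-known IAM policy fields are used:

```python
import json

# Least-privilege policy: the Lambda role may read objects from one
# specific bucket, and nothing else.
lambda_s3_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],  # not "s3:*"
            "Resource": "arn:aws:s3:::example-reports-bucket/*",
        }
    ],
}

# The "just get things to work" version the article warns about widens
# both the action and the resource -- and becomes the norm by accident:
overly_permissive = {"Effect": "Allow", "Action": "s3:*", "Resource": "*"}

print(json.dumps(lambda_s3_policy, indent=2))
```

Reviewing for wildcards in `Action` and `Resource` is one concrete way to catch the overly permissive IAM configurations described above.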


A.I. Is Mastering Language. Should We Trust What It Says?

So far, the experiments with large language models have been mostly that: experiments probing the model for signs of true intelligence, exploring its creative uses, exposing its biases. But the ultimate commercial potential is enormous. If the existing trajectory continues, software like GPT-3 could revolutionize how we search for information in the next few years. Today, if you have a complicated question about something — how to set up your home theater system, say, or what the options are for creating a 529 education fund for your children — you most likely type a few keywords into Google and then scan through a list of links or suggested videos on YouTube, skimming through everything to get to the exact information you seek. (Needless to say, you wouldn’t even think of asking Siri or Alexa to walk you through something this complex.) But if the GPT-3 true believers are correct, in the near future you’ll just ask an L.L.M. the question and get the answer fed back to you, cogently and accurately. Customer service could be utterly transformed: Any company with a product that currently requires a human tech-support team might be able to train an L.L.M. to replace them.


NSO Group faces court action after Pegasus spyware used against targets in UK

NSO argued in a response to the legal letters that UK courts have no jurisdiction over NSO, which is based in Israel, and that legal action is barred by “state immunity”. The company also argued that there was no proper basis for showing that NSO acted as a “data controller or a data processor” under UK data protection law. There is no basis to claim that NSO joined in a “common design” with Saudi Arabia or the UAE that would make it “jointly liable” with the two countries, it said. NSO said it provides surveillance software for the “exclusive use” of state governments and their intelligence services. It claimed to pride itself on being the only company in this field “operating under an ethical governance framework that is robust and transparent”. The company said it had policies in place to ensure its “products would not be used to violate human rights”. It claimed that the legal letters repeated “misinformation” from reports and statements by non-governmental organisations, including Citizen Lab, Amnesty International and Forbidden Stories.


IP addressing could support effective network security, but would it be worth it?

VPNs actually provide pretty good protection against outside intrusion, but they have one problem—the small sites. MPLS VPNs are expensive and not always available in remote locations. Those sites often have to use the internet, and that can mean exposing applications, which means increasing the risk of hacking. SD-WAN, by adding any site with internet access to the corporate VPN, reduces that risk. Or rather it reduces that particular risk. But hacking in from the outside isn’t the only risk. These days, most security problems come from malware planted on a computer inside the company. There, from a place that’s already on whatever VPN the company might use, the malware is free to work its evil will. One thing that can help is private IP addresses. We use private IP addresses literally every moment of every day, because virtually all home networking and a lot of branch-office networking are based on them. There are ranges of IPv4 and IPv6 addresses set aside for use within private subnetworks, like your home. Within the private subnet, these addresses work like any IP address, but they can’t be routed on the internet.
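
Python's standard library encodes these reserved ranges (RFC 1918 for IPv4, RFC 4193 unique-local addresses for IPv6), which makes the private/public distinction easy to demonstrate:

```python
import ipaddress

# ipaddress knows which ranges are set aside for private subnetworks,
# so checking internet routability is a single attribute lookup.
results = {
    addr: ipaddress.ip_address(addr).is_private
    for addr in ["192.168.1.10", "10.0.0.7", "8.8.8.8", "fd12:3456::1"]
}
# {'192.168.1.10': True, '10.0.0.7': True,
#  '8.8.8.8': False, 'fd12:3456::1': True}
```

The two `True` IPv4 entries fall in the RFC 1918 ranges (192.168.0.0/16 and 10.0.0.0/8); the IPv6 address falls in the fc00::/7 unique-local range; 8.8.8.8 is publicly routable.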


CIOs increasingly tap contract IT as talent gap filler

CIOs also typically find they can more easily get workers with high-demand skills on a temporary basis, Mok says. Markow agrees, adding: “You can really bring in new skill sets and capabilities more quickly in some cases.” But CIOs must be careful not to misclassify workers as contract when they should be staff, a legal distinction that could run the company afoul of labor laws, Mok says. In addition to potentially misclassifying these workers, Mok says CIOs who use contract and freelance workers too heavily or for extremely long stretches are often also operating on a reactive rather than strategic basis, which can translate to missed opportunities, higher costs, and poor morale. “Using contract workers can be more cost effective when you have ad-hoc needs that need to be addressed,” Markow adds. “But that said, it can be more costly if those projects run far longer than anticipated or roll into other needs or if there are unintended requirements that come about and those workers need to stay on.”


Cybersecurity litigation risks: 4 top concerns for CISOs

The risk of litigation is not limited to corporations. CISOs themselves face being subject to legal action for breach of duty where insufficient steps were taken to prevent a breach, or the aftermath of the breach was handled badly, says Simon Fawell, partner at Signature Litigation LLP. Jinivizian agrees: “The role of the CISO has never been more critical for mid/large enterprises, and potentially more in the crosshairs and held accountable for security incidents and data breaches, as illustrated by the ongoing class action against SolarWinds’ CISO and other executives following the devastating supply chain attack in 2020,” he states. This is also evidenced by the charges against Uber’s CSO for allegedly trying to cover up a ransomware payment relating to the 2016 attack that compromised data of millions of users and drivers, Armstrong adds. If a CISO acts as a company director, then they could face shareholder actions for breach of duty following data and privacy breaches based on damage to company value, says Fawell. “Shareholder actions against directors have been on the rise in the UK and, where a data breach has led to a drop in value for shareholders, claims against directors are increasingly being considered.”


The Cybersecurity Threats Facing Smart Buildings

IoT devices are common appliances that you might even find around your home but that are connected to the internet. Examples of IoT devices include doorbell cameras, smart meters, fitness trackers, smart speakers, and connected cars. An unprotected device is like leaving the back door open or a key under the mat. There were even security doubts raised about whether Joe Biden could bring his Peloton bike into the White House when he became president. Ensuring that every single connected device in a smart building has adequate security is a must if companies want to avoid data breaches. While as much autonomy as possible is better for smart buildings, there ultimately must be some human input. Often, the people using the systems are the ones who can leave them the most vulnerable. Everyone makes mistakes, but when it comes to cybersecurity, one mistake is all it takes for a network to be breached and data to be mined. Human error in this instance might be accidentally downloading malware or clicking a link to it, or using an old password and not changing it.


How IT departments enable analytics operations

Modern IT organizations need a mix of infrastructure and data skills because both enable an insight-driven enterprise. Infrastructure skills are necessary to achieve an architecture that can support data analytics requirements, including scalability, data governance and data security. Today, building, operating and innovating on AI-driven value propositions is getting simpler with the aid of advanced AI cloud platforms. "This means we will need business, product and technology teams who bring skills with deep experience in leveraging data and to offer comprehensive products and capabilities that are aligned to the business context," said Rizwan Akhtar, executive vice president and CTO of business technology at real estate services company Realogy Holdings. Fundamentally, IT needs stronger math skills, including linear algebra, statistics, calculus and maybe inferential geometry -- skills which data scientists tend to have. However, given the mainstream adoption of AI and machine learning (ML), there are now tools that make it easier for non-data scientists to do more.


EF Core 7 Finally Divorces Old .NET Framework

In fact, it was only last summer that we reported the dev team was still playing catch-up to the old version of EF (which stopped at version 6) in the article "EF Core 6 Dev Team Plays Catch-Up with EF6." While playing catch-up, however, the team has also been introducing new goodies not found in the .NET Framework version. Microsoft guidance says: "EF Core has always supported many scenarios not covered by the legacy EF6 stack, as well as being generally much higher performing. However, EF6 has likewise supported scenarios not covered by EF Core. EF7 will add support for many of these scenarios, allowing more applications to port from legacy EF6 to EF7. At the same time, we are planning a comprehensive porting guide for applications moving from legacy EF6 to EF Core." And in the first preview release of EF Core 7.0, announced last week, an important milestone was reached. "EF7 will not run on .NET Framework," said Microsoft senior program manager Jeremy Likness in a Feb. 17 blog post.


Dynamic Value Stream Mapping to Help Increase Developer Productivity

In other industries such as manufacturing, supply chain, or distribution, we find wide adoption of process analysis and lean practices for efficiency gains. Consider Value Stream Mapping, the key practice facilitating process analysis and further improvement. A Value Stream Map depicts every step in the process and categorizes each step in terms of the value it adds and the value that is wasted. Although the software industry has been adopting lean principles [Lean Software Development Principles (Poppendieck and Poppendieck, 2003) and The Principles of Product Development Flow (Reinertsen, 2009)], the technique of value stream mapping has not gained much traction there. So what prevents software engineering organizations from adopting value stream mapping as a foundation for optimizing software delivery capability? Even though the practice has existed for a long time and is well known among agile and lean coaches, knowledge and adoption among enterprises lag behind.
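The arithmetic behind a value stream map is simple enough to sketch. Below is a minimal, hypothetical Python example (the step names and times are invented for illustration) that computes lead time and process cycle efficiency, the ratio of value-adding time to total elapsed time:

```python
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    process_time_h: float  # time actively adding value
    wait_time_h: float     # idle/queue time before the step (waste)

# Hypothetical delivery pipeline; names and times are illustrative only.
stream = [
    Step("code review",    process_time_h=1.0, wait_time_h=8.0),
    Step("build & test",   process_time_h=0.5, wait_time_h=0.2),
    Step("staging deploy", process_time_h=0.3, wait_time_h=24.0),
    Step("prod release",   process_time_h=0.2, wait_time_h=40.0),
]

lead_time = sum(s.process_time_h + s.wait_time_h for s in stream)
value_time = sum(s.process_time_h for s in stream)
efficiency = value_time / lead_time  # "process cycle efficiency"

print(f"lead time: {lead_time:.1f}h, value-adding: {value_time:.1f}h, "
      f"efficiency: {efficiency:.1%}")
```

Even toy numbers like these make the usual finding visible: almost all of the lead time is queueing between steps, not the work itself, which is exactly what a value stream map is meant to expose.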



Quote for the day:

"Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has." -- Margaret Mead

Daily Tech Digest - April 20, 2022

Security-as-Code Gains More Support, but Still Nascent

"The great thing about security-as-code is that you know the configuration that you have deployed exactly corresponds to what you had specified and analyzed as meeting your security requirements," he says. "Many breaches out there are not necessarily the result of an unknown risk, but are usually the result of some control that the organization thought they had not being deployed and operating when they needed it the most." Security-as-code is an extension of the infrastructure-as-code movement that has come about as software-defined networks and systems have become more popular. DevOps teams have adopted infrastructure-as-code as the de facto standard for building and deploying software, containers, and virtual machines, but now companies are betting that the shift to cloud-native infrastructure will make security-as-code a key part of a sustainable approach to security. ... Google is joined by others blazing a trail into the security-as-code arena. The growing movement to encode security as a configuration file that can be incrementally improved led security firm Tenable Network Security to acquire Accurics, a maker of security-as-code technology.
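As a hedged illustration of the idea (not any particular vendor's product), security requirements can be expressed as data and evaluated mechanically against a declared configuration; the policy names and resource schema below are invented for the sketch:

```python
# Minimal policy-as-code check: each policy is a named predicate evaluated
# against every declared resource; violations are reported, not guessed at.
POLICIES = [
    ("no public storage buckets",
     lambda r: not (r["type"] == "bucket" and r.get("public", False))),
    ("databases must encrypt at rest",
     lambda r: not (r["type"] == "database" and not r.get("encrypted", False))),
]

def audit(resources):
    """Return (resource_name, violated_policy) pairs for every failed check."""
    return [(r["name"], name)
            for r in resources
            for name, ok in POLICIES
            if not ok(r)]

config = [  # would normally be parsed from Terraform/CloudFormation output
    {"type": "bucket",   "name": "logs",  "public": True},
    {"type": "database", "name": "users", "encrypted": True},
]

print(audit(config))  # → [('logs', 'no public storage buckets')]
```

Because the policies and the configuration are both code, the audit runs on every change, which is the property the article highlights: the deployed configuration provably corresponds to what was analyzed.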


The evolving role of the lawyer in cybersecurity

On the purely defensive side, advanced warning of a forthcoming attack can be the difference between a successful defensive posture and a damaging and costly incident. With such intelligence in hand, organizations can craft rules on an email gateway or firewall to effectively prevent attackers’ phishing email from reaching employee inboxes or to block the ability of an employee to navigate to a malicious link. Indeed, one of our attorney colleagues in New York used a bespoke threat intelligence system that he developed to identify and help to neutralize the domains that hosted a forthcoming cyberattack on the World Health Organization at the outset of the Coronavirus crisis in March 2020. That bespoke intelligence identified a domain and subdomain combination that allowed our colleague to (i) validate the data and establish that he had indeed identified an active threat to the WHO and (ii) communicate that data to trusted parties, including law enforcement, enabling defensive countermeasures to be put in place. This was a bright-line example of how counsel can play a role in the critical day-to-day functions of security operations.


Disruptive Innovation: The emerging sectors applying digital technologies

Property technology, or Proptech, has been disrupting the real estate space, allowing for the buying, renting and selling of properties online. Like other industries on this list, Proptech companies have been vital for maintaining operations during lockdown as company headquarters closed their doors. Landlords and homeowners can securely list properties on an online market via a website or app, and in more recent times have been able to upload virtual viewings. Meanwhile, data analytics and algorithms powered by AI have allowed users to see what properties would be best for them. Additionally, platforms such as PlanetRent offer centralised portals for hosting relevant documents and details in one place. Proptech, and indeed real estate, also encompasses offices, and companies in this area of the sector have been in high demand due to the need for organisations to move to smaller spaces, or leave the office altogether. Here, users can view office listings and make transactions online.


Flexible return-to-office policies are hammering employee experiences

The latest report observed that just over a third of knowledge workers (34%) have reverted to working from the office five days a week, the greatest share since surveying began in June 2020. But as this was happening, employee experience scores were plummeting for knowledge workers asked to return to the office full-time and for those without the flexibility to set their own work schedules: 28% worse scores on work-related stress and anxiety and 17% worse scores on work-life balance compared with the previous quarter. Moreover, the study warned there were signs that employers will pay a price for this discontent: workers who say they are unsatisfied with their current level of flexibility – both in where and when they work – were now three times as likely to look for a new job in the coming year. The data showed that non-executives were facing far more strain during the return-to-office era than leaders in the C-suite, further widening an existing executive-employee disconnect on key job satisfaction measures revealed in October 2021.


IoT Is a Breakthrough for Automation Systems, But What About Security?

Because IoT and smart devices collect heaps of consumer data, store it in the cloud, and are managed in real time, leveraging traditional security tools and practices no longer makes sense. As far as the automation industry is concerned, providing real-time access to applications and devices while ensuring robust security without human intervention becomes an uphill battle. Moreover, cybercriminals are always looking for a loophole in networks that provide frequent access to resources, devices, and applications that they can exploit to sneak in. Hence, stringent security mechanisms become crucial for businesses leveraging IoT to automate their processes. Here's where a centralized data infrastructure comes in: it seamlessly routes all IoT devices through an API that offers real-time insight into machine-to-machine access and user access. Through zero-trust principles and machine-to-machine access management, businesses can deliver a seamless, secure, and monitored experience to their customers without compromising their identities and personal information.
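A zero-trust gateway for machine-to-machine access can be sketched in a few lines: every device call is authenticated and checked against per-device scopes, with nothing trusted by network location. All names, tokens, and scopes below are hypothetical stand-ins, and the token check is a placeholder for real cryptographic verification (e.g., mTLS or signed JWTs):

```python
# Scopes granted at device registration time (illustrative only).
DEVICE_SCOPES = {
    "thermostat-17": {"telemetry:write"},
    "door-cam-03":   {"telemetry:write", "video:stream"},
}

def verify_token(device_id: str, token: str) -> bool:
    # Stand-in for real verification of an mTLS certificate or signed JWT.
    return token == f"secret-{device_id}"

def authorize(device_id: str, token: str, scope: str) -> bool:
    """Allow the call only if the token is valid AND the scope was granted."""
    if not verify_token(device_id, token):
        return False
    return scope in DEVICE_SCOPES.get(device_id, set())

# Every request is evaluated independently; a valid device still cannot
# exercise a scope it was never granted.
assert authorize("door-cam-03", "secret-door-cam-03", "video:stream")
assert not authorize("thermostat-17", "secret-thermostat-17", "video:stream")
```

The design point is that authorization happens per call at the API layer, which is what gives the gateway its real-time view of machine-to-machine and user access.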


Interim CIOs Favored as Organizations Seek Digitalization Push

This is happening because of an overall talent shortage juxtaposed with talent growth in the on-demand talent pool, coupled with more demand for the CIO role because of digital transformation. “Companies need more from the CIO; it’s harder to get this talent, they need to move faster, and there's fantastic talent that can work remotely,” he explained. Neil Price, head of practice, CIO and executive technology leadership for Harvey Nash Group, pointed out the concept of an interim CIO is not new. However, it is becoming ever more attractive due to the volume of technology transformation that businesses are looking to execute to keep pace in the post-pandemic digital environment. “They need accelerated change, and for this, an interim CIO who comes in with a very specific brief and a clear deadline or end-date can bring the concentrated focus and impetus needed,” he said. “It’s easier to deliver change with a fresh set of eyes, and this is something an interim CIO offers.” He added the concept of an interim CIO can apply equally across all types of business, and any organization that wants to change at pace can see the benefits, whether it’s a small or medium-sized business or a large enterprise.


Artificial Intelligence (AI) strategy: 4 priorities for CIOs

One of the most critical priorities is identifying high-impact areas with opportunities to embed AI-based, real-time decisions in business processes. The ability to process contextual information in real time to make on-the-fly decisions is a powerful way to differentiate products, services, and experiences in the crowded marketplace. For example, insurance firms can automate claims processing for real-time approvals based on pictures and videos provided by the claimant right from the place and time of the incident. Lenders can analyze risks in real time based on collateral and background information to offer on-the-spot loan approvals. Organizations can personalize and customize products and services across a broad array of use cases through the judicious injection of AI in their business processes. ... Since AI engineering differs from “traditional” software engineering, CIOs must establish a strategy to institutionalize AI and ML methodologies. Many enterprises have found that the most effective way to do this is to establish a robust platform supported by a governance model.


Securely Scaling the Myriad APIs in Real-World Backend Platforms

Most real-world software platforms will also interact with external APIs from business partners or third-party providers. A good authorization server and an understanding of OAuth standards will also improve your capabilities for this type of “federation.” One interesting use case is shown below, where a partner user is authorized to sign in to the company’s app. The partner authorization server could then act as an identity provider to authenticate the user according to the partner’s security policy and with familiar credentials. An embedded token from the business partner could then be used by the company’s APIs to call partner APIs. ... JWT libraries will allow APIs to use a clock skew when validating access tokens, and it is possible to configure this slightly higher for downstream APIs. It is not recommended to rely on expiry times, however, since there can be multiple reasons for a JWT to fail validation. In some setups, this might be caused by an infrastructure event such as a load balancing failover. Instead, it is recommended to implement standard expiry handling. In this case, retrying from the client is often the most resilient option.


AWS Serverless Lambda Resiliency: Part 1

This is critical, as serverless Lambda functions are charged based on memory limits (which determine the CPU allocation) along with the duration for which the functions run. Without appropriate resiliency, our Lambda services could execute for longer than required, could overwhelm backend services that are unavailable or in a degraded state, and clients that invoke our Lambda functions would not get an immediate fallback response. ... AWS Lambda serverless capabilities have multiple use cases when invoked synchronously or asynchronously, so how they can be made resilient differs between the invocation types. The approaches also change based on whether the solution is deployed in a single region or multiple regions, and there is a further dependency on where the provider service is deployed, e.g., on AWS (same or another region) or outside AWS. We will identify ways to ensure that warm-start Lambdas do not carry forward issues that occurred during cold-start initialization. The overall objective is to reduce Lambda functions' execution time and memory consumption by optimizing them before deployment.
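One pattern this implies, capping how long a function waits on a degraded backend and returning an immediate fallback rather than burning billed duration, can be sketched as follows; the handler shape matches Lambda's, but `fetch_recommendations` and the fallback payload are invented for illustration:

```python
import concurrent.futures

FALLBACK = ["bestsellers"]  # cheap static response when the backend is slow

def fetch_recommendations(user_id: str) -> list[str]:
    ...  # slow downstream call in a real deployment

def handler(event, context, fetch=fetch_recommendations, timeout_s=0.5):
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(fetch, event["user_id"])
    try:
        items = future.result(timeout=timeout_s)
    except Exception:
        items = FALLBACK  # timeout or backend error: degrade gracefully
    finally:
        # Don't block on the slow call; abandon it and return immediately,
        # so billed duration stays bounded by timeout_s, not by the backend.
        pool.shutdown(wait=False, cancel_futures=True)
    return {"statusCode": 200, "items": items}
```

In production this would typically be paired with a circuit breaker so that a backend known to be degraded is not even called, but the bounded wait plus fallback already addresses both costs the article names: long billed executions and clients left without an immediate response.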


Cross-platform UIs ‘go live’ with .NET MAUI

You can best think of MAUI as a way to unify the various platform-specific .NET APIs so that C# and XAML code can be written once and run everywhere, with the option of providing platform-specific code to avoid a lowest common denominator approach. MAUI sits above both native code and the common base-class libraries. Your code calls MAUI APIs, which then call the requisite platform APIs. If you prefer to have native-specific features, you can call platform APIs directly if they don’t have MAUI coverage. This approach gives you a base set of common controls, much like those used by Xamarin Forms, with a layout engine that allows UI code to scale between different device form factors and screen sizes. It’s important to be aware of the capabilities of your target devices and, at the same time, come up with UI designs that can support the shift between landscape PC and Mac experiences and portrait mobile screens. Much of MAUI is the familiar XAML design experience, with a page description and code-behind to manage interactions with the rest of your application, as well as a canvas for displaying and interacting with custom graphic elements.



Quote for the day:

"The great leaders are like best conductors. They reach beyond the notes to reach the magic in the players." -- Blaine Lee

Daily Tech Digest - April 19, 2022

So you're thinking about migrating to Linux? Here's what you need to know

The Linux desktop is so easy. It really is. Developers and designers of most distributions have gone out of their way to ensure the desktop operating system is easy to use. During those early years of using Linux, the command line was an absolute necessity. Today? Not so much. In fact, Linux has become so easy and user-friendly that you could go your entire career on the desktop and never touch the terminal window. That's right, Linux of today is all about the GUI, and the GUIs are good. If you can use macOS or Windows, you can use Linux. It doesn't matter how skilled you are with a computer, Linux is a viable option. In fact, I'd go so far as to say that the less skill you have with a computer, the better off you are with Linux. Why? Linux is far less "breakable" than Windows. You really need to know what you're doing to break a Linux system. One very quick way to start an argument within the Linux community is to say Linux isn't just a kernel. In a similar vein, a very quick way to confuse a new user is to tell them that Linux is only the kernel. ... Yes, Linux uses the Linux kernel. All operating systems have a kernel, but you don't ever hear Windows or macOS users talk about which kernel they use.


Purpose is a two-way street

There’s a broader redefinition of purpose that’s underway both for organizations and individuals. Today, people don’t have just one single career in a lifetime but five or six—and their goals and purpose vary at each stage. At the same time, organizations can’t address or engage with the broad range of stakeholders they deal with through just one single purpose. In combination, these shifts are ushering in the concept of purpose as a “cluster” of goals and experiences, with different aspects resonating with different stakeholders at different times. The same cluster concept holds true for career paths. It is vital to expand the conversation about the varied, unique options people have to fulfill their goals. Companies must strive to make those options more transparent, more individualized, and more flexible, and less linear. For today’s employees, the point of a career path is not necessarily to climb a ladder with a particular end-state in mind but to gain experience and pursue the individual’s purpose—a purpose that may shift and evolve over time. To that end, it may make sense for organizations to create paths that allow employees to move within and across, and even outside, an organization—not just up—to achieve their goals.


How algorithmic automation could manage workers ethically

Mewies says bias in automated systems generates significant risks for employers that use them to select people for jobs or promotion, because it may contravene anti-discrimination law. For projects involving systemic or potentially harmful processing of personal data, organisations have to carry out a privacy impact assessment, she says. “You have to satisfy yourself that where you were using algorithms and artificial intelligence in that way, there was going to be no adverse impact on individuals.” But even when not required, undertaking a privacy impact assessment is a good idea, says Mewies, adding: “If there was any follow-up criticism of how a technology had been deployed, you would have some evidence that you had taken steps to ensure transparency and fairness.” ... Antony Heljula, innovation director at Chesterfield-based data science consultancy Peak Indicators, says data models can exclude sensitive attributes such as race, but this is far from foolproof, as Amazon showed a few years ago when it built an AI CV-rating system trained on a decade of applications, only to find that it discriminated against women.


The changing role of the CCO: Champion of innovation and business continuity

The best CCOs partner with the business to really understand how to place gates and controls that mitigate risk, while still allowing the business to operate at maximum efficiency. One area of the business that is particularly valuable is the IT department, which can help CCOs to maintain and provide systematic proof of adherence both to internal policies and to the external laws, guidelines or regulations imposed upon the company. By having a dedicated IT resource, CCOs do not have to wait for the next programme increment (PI), sprint planning or IT resourcing availability. Instead, they can be agile and proactive when it comes to meeting business growth and revenue objectives. Technical resourcing can be utilised for project governance, systems review, data science, AML and operational analytics, as well as to support audit / reporting with internal / external stakeholders, investors, regulators, creditors and partners. Ultimately this partnership between IT and CCOs will allow a business to make data-driven decisions that meet compliance as well as corporate growth mandates.


IT Admins Need a Vacation

An unhappy sysadmin can breed apathy, and an apathetic attitude is especially problematic when sysadmins are responsible for cybersecurity. Even in organizations where cybersecurity and IT are separate, sysadmins affect cybersecurity in some way, whether it’s through patching, performing data backups, or reviewing logs. This problem is industry-wide, and it will take more than just one person to solve it, but I’m in a unique position to talk about it. I’ve held sysadmin roles, and I’m the co-founder and CTO of a threat detection and response company in which I oversee technical operations. One of my top priorities is building solutions that won’t tip over and require significant on-call support. The tendency to paper over a problem with human effort 24/7 is a tragedy in the IT space and should be solved with technology wherever possible. As someone who manages employees that are on-call and is still on-call myself, I need to be in tune with the mental health of my team members and support them to prevent burnout. I need to advocate for my employees to be compensated generously, and to appreciate and reward them for a job well done.


The steady march of general-purpose databases

Brian Goetz has a funny way of explaining the phenomenon, called Goetz’s Law: “Every declarative language slowly slides towards being a terrible general-purpose language.” Perhaps a more useful explanation comes from Stephen Kell, who argues that “the endurance of C is down to its extreme openness to interaction with other systems via foreign memory, FFI, dynamic linking, etc.” In other words, C endures because it takes on more functionality, allowing developers to use it for more tasks. That’s good, but I like Timothy Wolodzko’s explanation even more: “As an industry, we're biased toward general-purpose tools [because it’s] easier to hire devs, they are already widely adopted (because being general purpose), often have better documentation, are better maintained, and can be expected to live longer.” Some of this merely describes the results of network effects, but how general purpose enables those network effects is the more interesting observation. Similarly, one commenter on Bernhardsson’s post suggests, “It's not about general versus specialized” but rather “about what tool has the ability to evolve.”


Open-Source NLP Is A Gift From God For Tech Start-ups

Of late, however, open research efforts like EleutherAI have lowered the barriers to entry. A grassroots collective of AI researchers, EleutherAI aims to eventually deliver the code and datasets needed to run a model similar (though not identical) to GPT-3. The group has already released a dataset called ‘The Pile’, designed to train large language models to complete text, write code, and more. (Megatron 530B, as it happens, was trained in part along the lines of The Pile.) And in June, EleutherAI made available under the Apache 2.0 license GPT-Neo and its successor, GPT-J, a language model that performs nearly on par with a GPT-3 model of equivalent size. One of the startups offering EleutherAI’s models as a service is NLP Cloud, founded a year earlier by Julien Salinas, a former developer at Hunter.io and the founder of money-lending service StudyLink.fr.


SQL and Complex Queries Are Needed for Real-Time Analytics

While taking the NoSQL road is possible, it’s cumbersome and slow. Take an individual applying for a mortgage. To analyze their creditworthiness, you would create a data application that crunches data, such as the person’s credit history, outstanding loans and repayment history. To do so, you would need to combine several tables of data, some of which might be normalized, some of which are not. You might also analyze current and historical mortgage rates to determine what rate to offer. With SQL, you could simply join tables of credit histories and loan payments together and aggregate large-scale historic data sets, such as daily mortgage rates. However, using something like Python or Java to manually recreate the joins and aggregations would multiply the lines of code in your application by tens or even a hundred compared to SQL. More application code not only takes more time to create, but it almost always results in slower queries. Without access to a SQL-based query optimizer, accelerating queries is difficult and time-consuming because there is no demarcation between the business logic in the application and the query-based data access paths used by the application.
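As a rough sketch of the mortgage example (the table and column names are invented), a single SQL statement can express both the join and the aggregation that would otherwise be hand-coded in application logic; SQLite stands in for the real store:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE credit_history (applicant_id INT, score INT);
    CREATE TABLE loan_payments  (applicant_id INT, amount REAL, on_time INT);
    INSERT INTO credit_history VALUES (1, 710), (2, 640);
    INSERT INTO loan_payments  VALUES (1, 900, 1), (1, 900, 1),
                                      (2, 450, 0), (2, 450, 1);
""")

rows = sorted(db.execute("""
    SELECT c.applicant_id, c.score,
           AVG(p.on_time)  AS on_time_rate,   -- aggregation in the database
           SUM(p.amount)   AS total_repaid
    FROM credit_history c
    JOIN loan_payments p USING (applicant_id) -- the join SQL gives for free
    GROUP BY c.applicant_id
""").fetchall())

print(rows)  # → [(1, 710, 1.0, 1800.0), (2, 640, 0.5, 900.0)]
```

Re-creating the same result without SQL would mean manually bucketing payments by applicant, summing and averaging in loops, and merging with the credit table, exactly the extra application code, and the lost query optimization, that the article describes.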


Lack of expertise hurting UK government’s cyber preparedness

In France, security pros tended to find tender and bidding processes more of an issue, but also cited a lack of trusted partners, budget, and ignorance of cyber among organisational leadership. German respondents also faced problems with tendering, and similar problems to both the British and French. From a technological perspective, UK-based respondents cited endpoint detection and response (EDR), extended detection and response (XDR) and cloud security modernisation as the most mature defensive solutions, with 37% saying they were “fully deployed” in this area. Zero trust trailed with 32%, and multi-factor authentication (MFA) was cited by 31% – Brits tended to think MFA was more difficult than average to implement, as well. The French, on the other hand, are doing much better on MFA, with 47% of respondents claiming full deployment, 35% saying they had fully deployed EDR-XDR, and 33% and 30% saying they had fully implemented cloud security modernisation and zero trust respectively. In contrast, the Germans tended to be better on cloud security modernisation, which 40% claimed to have fully implemented, followed by zero trust at 32%, MFA at 30% and EDR-XDR at 27%.


Scrum Master Anti-Patterns

The reasons Scrum Masters violate the spirit of the Scrum Guide are multi-faceted. They range from ill-suited personal traits, to pursuing their own agendas, to frustration with the Scrum team. Some often-observed reasons are: Ignorance or laziness: One size of Scrum fits every team. Your Scrum Master learned the trade in a specific context and is now rolling out precisely this pattern in whatever organization they are active, no matter the context. Why go through the hassle of teaching, coaching, and mentoring if you can shoehorn the “right way” directly into the Scrum team?; Lack of patience: Patience is a critical resource that a successful Scrum Master needs to field in abundance. But, of course, there is no fun in readdressing the same issue several times, perhaps rephrasing it, if the solution is so obvious—from the Scrum Master’s perspective. So, why not tell them how to do it ‘right’ all the time, thus becoming more efficient? Too bad that Scrum cannot be pushed but needs to be pulled—that’s the essence of self-management; Dogmatism: Some Scrum Masters believe in applying the Scrum Guide literally, which unavoidably will cause friction, as Scrum is a framework, not a methodology.



Quote for the day:

"No organization should be allowed near disaster unless they are willing to cooperate with some level of established leadership." -- Irwin Redlener