Daily Tech Digest - March 26, 2021

Text authentication is even worse than almost anyone thought

For years, the key argument against relying on text message confirmations has been that they are susceptible to man-in-the-middle attacks, which is still true. But this peek into the authorized infrastructure for text messages means that text takeovers can happen far more simply. There are plenty of easily accessed apps that make text-like authentication far more secure, including Google Authenticator, Symantec's VIP Access, Adobe Authenticator, and Signal. Why risk unencrypted, easily stolen texts for account access or anything else? For the moment, let's set aside how relatively easy and low-cost it is to move to a more secure version of text confirmations. Let's also, for the moment, set aside the compliance and operational risks your team is taking by letting the enterprise grant account access via unencrypted texts. How about solely looking at the risk and compliance implications of offering third-party access via unencrypted text authentications? Remember this from the Vice piece: "The (attacker) sent login requests to Bumble, WhatsApp, and Postmates, and easily accessed the accounts." Once a bad guy takes control of a customer's texts, a vast domino effect kicks in, where lots of businesses can be improperly accessed.
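As a rough illustration of why authenticator apps are harder to intercept than texts: the six-digit codes that apps like Google Authenticator display are generated locally from a shared secret using the TOTP algorithm (RFC 6238), so no code ever crosses the carrier network. A minimal stdlib-only sketch; the secret and expected value are the RFC's published test vector, not anything from the article:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second counter."""
    key = base64.b32decode(secret_b32.upper())
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: SHA-1 secret "12345678901234567890", time=59
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, t=59, digits=8))  # "94287082"
```

Because both sides derive the code from the shared secret and the clock, there is nothing for an SMS-rerouting attacker to steal in transit.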


How to Mitigate Low-Code Security Risks

On the low-code spectrum, there is the attitude that “people aren’t really doing mission-critical applications — they’re mainly doing prototyping, or back office automations,” Wysopal said. Or, since low-code applications often do not publicly expose an application, they are deemed low-risk and fall to the bottom of the security team’s inbox and list of to-dos. “We’re still in a world where people don’t secure all applications,” he said. However, these attitudes are a bit short-sighted, since “applications aren’t islands,” Wysopal said. Even if applications are intended for internal use only, a phishing attempt could expose credentials and give an attacker access to the network. From there, attackers could leverage sensitive resources or steal infrastructure for compute power. Thus, all low-code users should be aware of potential threats and change habits to address these risks accordingly. However, the onus is also on low-code vendors to sufficiently arm their systems. Having a vulnerability disclosure policy, a bug bounty program and an easy means to accept reports from white hat security researchers will be necessary to continually patch issues. Low-code continues to permeate more and more digital operations, opening up novel potential for citizen developers.


Smart cities are built on data

Cities must understand the full set of stakeholders who should be involved in setting data governance policies. They include civic authorities, the public, the private sector, technology providers and academic experts. Then they need to look at smart city projects as an “integrated ecosystem,” which means using the data for the collective benefit of the city, public and private sector, Chiasson said. “In a lot of cases, the costs are concentrated, but the benefits are diffuse,” he said. For instance, improving traffic flow would benefit many stakeholders, but the cost tends to be concentrated with the agency paying for sensors and algorithms. “Holistically tying that together … is the way you start to have data that’s good and data that’s valuable,” Chiasson said. And valuable means data that can be turned into usable information. For instance, a city may have ridership and traffic data, but if it can’t use them to reduce emissions, they’re not helpful. “Data is not information,” Dennis said. “Oftentimes, what we’ll see is that there’s too much data and not enough ability for the city to convert it into information holistically.” Urban SDK has a data specification for smart cities that lets them aggregate data from a range of sources and normalize it into one database. From there, the data can be analyzed and converted into information.
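Urban SDK's actual specification is not reproduced here, but the idea of normalizing data from a range of sources into one database can be sketched with hypothetical ridership and traffic records. All field names and values below are invented for illustration:

```python
from datetime import datetime, timezone

# Hypothetical raw records from two city systems with different field names and time formats.
transit = {"stop": "A12", "riders": 340, "ts": "2021-03-26T08:00:00Z"}
traffic = {"sensor_id": "I95-N-04", "veh_per_hr": 1820, "epoch": 1616745600}

def normalize_transit(r):
    return {"source": "transit", "location": r["stop"],
            "metric": "riders_per_hour", "value": r["riders"], "time": r["ts"]}

def normalize_traffic(r):
    # Convert the Unix epoch into the same ISO-8601 form the transit feed uses.
    iso = datetime.fromtimestamp(r["epoch"], tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return {"source": "traffic", "location": r["sensor_id"],
            "metric": "vehicles_per_hour", "value": r["veh_per_hr"], "time": iso}

records = [normalize_transit(transit), normalize_traffic(traffic)]
print(records[1]["time"])  # "2021-03-26T08:00:00Z"
```

Once both feeds share one schema, cross-source questions such as "what is ridership versus traffic volume on this corridor at 8 a.m.?" become simple queries rather than integration projects.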


GAO warns on cyber risks to power grid

The country's electrical systems are increasingly susceptible to cyberattacks, according to government auditors, and there is uncertainty about the extent to which a localized attack might cascade through power distribution systems. A new report from the Government Accountability Office examines the vulnerabilities of electricity grid distribution systems, the actions some states and industry have taken to harden those systems and the extent to which the Department of Energy has addressed risks by implementing the national cybersecurity strategy. Government and industry officials told GAO that a cyberattack on a grid distribution system would likely have localized effects, but a coordinated attack could have widespread consequences. However, the officials conceded that assumption is based on their professional experience, GAO noted, and none of them were aware of an assessment that confirmed their claims. "Moreover, three federal and national laboratory officials told us that even if a cyberattack on the grid's distribution systems was localized, such an attack could still have significant national consequences, depending on the specific distribution systems that were targeted and the severity of the attack's effects," according to the report.


Vaccinated Employees Returning with Un-Vaccinated Devices

With vaccines making their way throughout the country and lockdown restrictions loosening up, a debate surrounding the office return emerges. Hybrid vs. remote vs. full-time – every scenario is different, as well as the rules and accommodations to make the return safe and productive for all. User systems will be coming back as well, and their absence from the network could present new challenges. In the vast majority of cases, computers and mobile devices have not been on-premise for close to a year. From a security perspective, the best way to approach the “repatriation” of these systems onto the office network is to regard them as potentially infected. At the very least, consider that these devices are not in the same state as when they left. During this time, the company office extended into the homes of employees, and the line separating home from work was essentially diminished. ... Even with device management in place, the potential for unaccounted change from field devices is significant. Consider this both a threat and an opportunity. The first goal is to ensure that workstations have been adequately patched and updated. Devices, security software and applications must be validated and brought up-to-date before actively working on networks holding sensitive, critical data.


Cyber threats, ongoing war for talent, biggest concerns for tech leaders

When it comes to talent, 44% of respondents said that finding enough qualified employees to fill open positions is the biggest risk they face over the coming year. An almost equal percentage say the task is just as difficult as it was a year ago. To close the skills gap, companies said they are trying a variety of strategies, including building flexible, on-the-job training opportunities (61%); rewriting job descriptions or job titles (42%); creating an apprenticeship program (39%); and eliminating requirements that applicants have certain types of academic degrees (24%). Some other pathways to finding the right talent: easing location restrictions; gamification of training; and prioritizing the search for candidates with a diversity of career backgrounds. In fact, nearly three-quarters of the respondents said that in the past year they’ve filled open technology positions with candidates with liberal arts degrees. An equal percentage have taken internal candidates from non-tech teams. About half of tech leaders have hired people with no college degree at all. A little over 60% of survey participants said their company is ahead of the curve when it comes to investing in new technology, and 35% said they’re about average. 


When Every Millisecond Matters in IoT

It starts with a network of seismic sensors, which are used to detect the P-waves, providing a ton of information that can be used to calculate the size and location of the damaging earthquake. The data is distributed in real-time to every subscribed party: emergency response, infrastructure and everyday users who have the app installed. The next critical piece of technology is a real-time network: a super-fast, low-latency and reliable infrastructure that is optimized to broadcast small amounts of data to huge audiences of subscribers. This may include both the earthquake data itself and push notifications or alerts specified by the app developer. This is where every millisecond matters, so ensuring reliability at scale, even in unreliable environments, is mission critical. When selecting a real-time network, whether you go with a hosted service or build it yourself, app developers need to understand the underlying technology, real-time protocols and other indicators of scalability. Lastly, you need the application that connects your real-time IoT network to the deployed sensors, where notifications are transmitted and the response is automated based on incoming data.
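A real-time broadcast network is a distributed system, but the fan-out pattern it implements can be sketched in-process. This toy broker (all names invented, no real protocol) shows a small earthquake payload reaching every subscriber on a channel:

```python
class Broker:
    """Toy in-process pub/sub; a real-time network does this across the internet."""
    def __init__(self):
        self.subscribers = {}  # channel name -> list of callbacks

    def subscribe(self, channel, callback):
        self.subscribers.setdefault(channel, []).append(callback)

    def publish(self, channel, message):
        # Fan the small payload out to every subscriber on the channel.
        for cb in self.subscribers.get(channel, []):
            cb(message)

broker = Broker()
received = []
# Stand-ins for emergency response, infrastructure operators and app users.
for i in range(3):
    broker.subscribe("quake-alerts", lambda msg, i=i: received.append((i, msg)))

broker.publish("quake-alerts", {"magnitude": 5.8, "lat": 34.05, "lon": -118.24})
print(len(received))  # 3
```

In production the hard part is exactly what the paragraph says: doing this fan-out to millions of subscribers over unreliable links within milliseconds, which is why the choice of network and protocol matters so much.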


What businesses need to know to evaluate partner cyber resilience

Protecting customer data is vital and now regulated in certain geographies with the introduction and implementation of privacy laws like the GDPR and the CCPA. Non-compliance with either of these regulations may result in large fines that can pose a serious threat to business continuity depending on the size of the company and violation. While the GDPR and the CCPA are two of the most well-known regulations, at least 25 U.S. states have data protection laws, with Virginia being the most recent to enact legislation. Legislation aside, organizations must protect data and be able to recover it in the event of any loss. Not being able to recover data, even when the fault lies with a partner, can quickly propel an organization toward financial setbacks, damaged relationships and diminished reputation. When it comes to evaluating a partner, ask them to detail their backup strategy and policies. Regular infection simulations and backup procedure tests are crucial in making sure you are prepared for a real DEFCON scenario. Businesses must have endpoint security in place as cybercriminals are constantly developing new ways to attack networks, take advantage of employee trust and steal data. In traditional office building settings, employees were better protected within the corporate network.


How one data scientist is pioneering techniques to detect security threats

It was all pretty accidental, not something I had planned. I did really well in college—I was first in my class. And I finished in 2010, in the middle of the Great Recession, which hit the Spanish labor market horribly. At that time, the unemployment rate was 25 percent. The lucky ones, like people in engineering, were getting job offers. But when you’re in technology, the only options in Spain are to work for a consulting company or to do support or sales. There weren’t any entry-level jobs in research and development. So, I started a master’s with a group doing research on biometrics. The master’s was also in computer science and very related to artificial intelligence and a lot of interconnected fields like multimedia signal processing, computer vision, and natural language processing. I did my thesis on statistics around forensic fingerprints, and the probability of a random match between a latent fingerprint found at a crime scene and a random person that could have been wrongly convicted of that crime. ... One good approach is to find an internship that has some connection between doing data science and security and fraud, even if it’s just loosely related.


What To Expect From The New US-India Artificial Intelligence Initiative

India and the US can complement each other in this collaborative effort to ensure equitable progress. “For the US, India represents a massive consumer market – and one of the world’s largest troves of data. Technology firms in the US accessing this data will be like energy firms finding oil in the Middle East,” said Prakash. “For India, the US algorithms are solutions to a variety of development challenges India faces, from bringing banking to hundreds of millions of people to modernising the Indian military to offering healthcare to the masses. At the same time, for US technology firms, India churns out massive amounts of engineers and computer scientists – critical talent that these firms need.” Another major reason for a partnership between India and the US is the new geopolitical realities. China’s growing influence in the field of AI is a pressing concern. “What India and the US bring to the table is a supposedly democratic governance model of emerging technology,” said Basu. “Despite the change in administration from Trump to Biden, there are certain things where there is continuity – like distrust in China and Chinese technology....”



Quote for the day:

"Leadership is a dynamic process that expresses our skill, our aspirations, and our essence as human beings." -- Catherine Robinson-Walker

Daily Tech Digest - March 25, 2021

Generation Z majority left cold by data literacy

Helena Schwenk, analyst relations and market intelligence lead at Exasol, said: “Regardless of job descriptions, the ability to work with data is becoming increasingly crucial in the workplace. In theory, D/Natives should have developed the data literacy skills necessary for effective data analysis, storytelling and visualisations. Their untapped potential could spur a revolution in the way we use data to transform business and improve our daily lives. “But our survey highlights two issues: a genuine skills shortage when it comes to the more complex data skills gained through the education system, and a clear miscommunication between the language D/Natives use and the business jargon used by employers. There is work for educators, business leaders and the young people themselves to do to bridge the data literacy gap – to create not just a productive workforce, but also a richer society.” Schwenk, a former analyst at IDC and Ovum, has recently been joined at Exasol by Peter Jackson as its chief data and analytics officer. Jackson also has a high profile in the UK data community, as the co-author, with Caroline Carruthers, of The Chief Data Officer’s Playbook and a former data leader at the Pensions Regulator, Southern Water and Legal & General.


Strategies to Modernize, Maintain, and Future-Proof Systems

We tend to think about technology advancing in a straight line, with each iteration better and more sophisticated than what came before. The reality is a little more complicated than that because there are no one-size-fits-all solutions. As we make incremental improvements to technology, we are only really optimizing for a specific set of use cases. Those same improvements might make other uses more difficult. Over time what tends to happen is as one technology gets more and more optimized, the group of people for whom things are moving in the opposite direction of what they actually need gets larger and larger, until finally there are enough people to establish a market for a “new” technology to shift things back in the opposite direction. My favorite example of this is cell phone size: for a while cellphones were about staying connected to the office on the go, so each more advanced version was smaller and thinner. Then the emphasis shifted from work functions to entertainment functions, and suddenly cell phones started to get bigger and bigger. Technology is filled with these kinds of cycles where it feels like we’re reinventing or repackaging old solutions.


5 Web Application Security Threats and 7 Measures to Protect Against Them

Broken authentication is another common vulnerability, caused by poorly implemented authentication and session management controls. If attackers succeed in identifying and exploiting authentication-related vulnerabilities, they can gain direct access to sensitive data and functionality. Their goal in exploiting these weaknesses is to impersonate a legitimate user of the application, using a wide variety of techniques such as credential stuffing, session hijacking, password brute force and session ID URL rewriting. These attacks can be prevented by implementing strong session management controls, multi-factor authentication, and restricting and monitoring failed login attempts. Sensitive data exposure occurs when the web application does not sufficiently safeguard sensitive information such as session IDs, passwords, financial information, client data, etc. The most common flaw of organizations resulting in data exposure is not encrypting sensitive data.
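One of the mitigations mentioned, restricting and monitoring failed login attempts, can be sketched as a sliding-window throttle. This is an illustrative in-memory sketch, not production code; a real deployment would persist counters and combine this with MFA and alerting:

```python
import time
from collections import defaultdict, deque

class LoginThrottle:
    """Lock an account after too many failed attempts within a time window."""
    def __init__(self, max_failures=5, window_seconds=300):
        self.max_failures = max_failures
        self.window = window_seconds
        self.failures = defaultdict(deque)  # username -> timestamps of recent failures

    def _prune(self, q, now):
        while q and q[0] < now - self.window:  # drop entries outside the window
            q.popleft()

    def record_failure(self, username, now=None):
        now = time.time() if now is None else now
        q = self.failures[username]
        q.append(now)
        self._prune(q, now)

    def is_locked(self, username, now=None):
        now = time.time() if now is None else now
        q = self.failures[username]
        self._prune(q, now)
        return len(q) >= self.max_failures

throttle = LoginThrottle(max_failures=3, window_seconds=60)
for _ in range(3):
    throttle.record_failure("alice", now=100)
print(throttle.is_locked("alice", now=130))  # True: 3 failures inside the window
```

The same counter that enforces the lockout is also a monitoring signal: a spike in failures across many usernames is the classic fingerprint of credential stuffing.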


Hidden areas of security and the future of hybrid working

Businesses should think carefully about how they utilize these platforms – starting with security. Many of the platforms, such as Microsoft Teams, do not come with built-in cybersecurity features, and don’t provide a way for data to be easily archived. In fact, Microsoft does not provide any guarantee of restorability – if a file is accidentally deleted, it’s gone forever. This leaves a big gap for operations that need to ensure that they have a strong archiving strategy in place. Additionally, IT and security teams must be aware of the vulnerability of these tools to phishing or social engineering attacks. Unlike email, files shared via collaboration platforms cannot be scanned for malicious links or other content. A good example of this is a Microsoft Teams phishing campaign recently discovered by Mimecast which consisted of 772 emails and targeted recipients mainly based in the US. Those targeted were sent fake email notifications asking them to verify their password or telling them they had been added to a project via their Teams account. Similarly, another Teams attack discovered late last year was estimated to have targeted 15,000-50,000 people by the time it was detected, showing how widespread the problem can get.


Hybrid workers are stressed out, but "empathy-based management" could help

As the remote-work landscape has blurred the lines between work and personal life, workers struggle to put up boundaries, and many stay connected long after the work day is done. According to the research, workers in the hybrid world are "1.27 times more likely to struggle to disconnect from work than employees in the on-site world." And "40% of hybrid or remote employees [are] reporting an increase in the length of their workday in the past 12 months." This kind of fatigue caused by the longer workday is a main concern for 92% of HR leaders. Leaders should stop expecting employees to be always "on." The very tools that are used to ensure the smooth transition to a hybrid work model are also its Achilles' heel. "Organizations have inadvertently been making the fatigue worse," Cambon said. There have actually been more check-ins (78%) between managers and workers, and 84% more virtual meetings with teams, for instance. According to Gartner, "HR leaders must lead and support the creation of a hybrid model that mitigates the adverse impacts of digital distraction, virtual overload and the always-on mindset. Ironically, many of the actions that organizations are taking to improve the hybrid employee experience are actually exacerbating the fatigue these hybrid realities are creating."


From Digital To Physical: The Ultimate Challenge For AI

By crossing the digital/physical barrier and implementing AI-powered visual quality inspections, the industry can mitigate the crisis and labor shortage. The use of AI removes the barriers that typically slow technology adoption in that it is cost effective, easy to integrate and doesn't need specially trained staff to operate. AI-based visual inspections are used today to inspect for defects in metal engine parts, check integrity of rugs/carpet, assess whether raw material (such as meat) has foreign contaminants (e.g., plastic particles), check plastic food trays for the right item, inspect quality of baked goods (e.g., bread), determine integrity of vaccine vials and more. These are all real-world, often mission-critical applications of AI technology in challenging physical settings. The value of digital-to-physical applications of AI is clear, as well as how they can be applied in the manufacturing industry—so what's next? For anyone looking to implement AI across their organization, the next steps are simple. First, you need to take a look at your specific workflows and determine what processes could benefit from AI: Is it a quality inspection, is it predictive maintenance or is it something else?


Working with Secure Enclaves in Azure SQL Database

Encryption has always been challenging to implement, but skipping it makes data breaches much more damaging: if a batch of encrypted data is breached, it is not useful to anyone. If we think back to database encryption in SQL Server, until Always Encrypted was introduced, anyone who was a system administrator had access to the encryption keys, allowing them to view decrypted data. Always Encrypted changed that paradigm. Instead of storing encryption keys in the database, the keys that can decrypt data were stored in the client application. This meant administrators could only view the ciphertext (the result of the encrypted value) and not the plain text value. Always Encrypted supports two types of encryption: deterministic (in which the value of the ciphertext will always be the same for the same seed value) and randomized (which provides a unique encrypted value for each record). ... The key difference here is that with secure enclaves in place, the database engine can send encrypted results into the secure enclave, where data operations can take place. Then the data is returned to the database engine, and in turn to the client operation in encrypted format. While the enclave is shown in its container, it is part of the SQL Server process on the server.
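The practical consequence of deterministic versus randomized encryption is easy to demonstrate. Python's standard library has no AES, so the following toy uses keyed HMAC tokens rather than real encryption; it only illustrates the property described above: deterministic schemes map equal plaintexts to equal ciphertexts (so equality lookups on ciphertext still work), while randomized schemes do not:

```python
import hashlib, hmac, os

KEY = b"demo-column-key (illustration only, not a real CEK)"

def deterministic_token(value: bytes) -> bytes:
    # Same input -> same token, so a WHERE col = @param lookup can match ciphertext.
    return hmac.new(KEY, value, hashlib.sha256).digest()

def randomized_token(value: bytes) -> bytes:
    # A fresh random salt each call -> the same input yields a different token every time.
    salt = os.urandom(16)
    return salt + hmac.new(KEY, salt + value, hashlib.sha256).digest()

ssn = b"123-45-6789"
print(deterministic_token(ssn) == deterministic_token(ssn))  # True: equality search possible
print(randomized_token(ssn) == randomized_token(ssn))        # False: stronger, no equality search
```

This is the trade-off Always Encrypted exposes, and it is exactly the gap secure enclaves close: with an enclave, even randomized columns can support rich operations, because the comparison happens on plaintext inside the protected enclave rather than on ciphertext in the engine.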


The unique opportunity for Fintech in the payments space

As a society we’re becoming disengaged with the cumbersome process of card payments and more conscious than ever about typing PIN codes into public machines, with antibacterial gel on stand-by. With today’s available technology there is just no need to queue, swipe, enter a PIN and collect paper receipts. We’re moving into an age of completely contactless spending, one where people can exit a taxi without “paying”, leave a shop without visiting the till, and get instant credit at a digital checkout. E-wallets already account for 8%-10% year-on-year growth of ecommerce transactions, with no sign of slowing down. We’re moving into a digital-first generation that is used to buying things with the tap of a phone screen or a scan of their face. So much so that physical wallets are becoming obsolete as phones stay glued to hands. Although as a society we’re engaging less and less in person or making payments over a counter, fintechs are leading the way with technology to verify that customers are who they say they are, digitally, so that they can access frictionless payment experiences without merchants incurring the risk of fraud.


What IT Leaders Can Do To Diminish Fear Within Their Teams

First, I take personal responsibility for team progress on the project. I do this visibly and deflect criticism of the team. I make it clear within the team that only the complete team can succeed. As a group, we will work to balance the assignments so no one person feels like the single point of failure. To our sponsors of the project, I am clear about our status and needs from senior leadership. Knowing that we are all on the same journey keeps the team together. Eventually, all businesses run into budget problems. IT spending is a necessary evil because businesses leverage mission-critical applications. But the fear within the employees is that people may not seem as necessary. The threat of possible downsizing casts an enormous shadow and can be debilitating in concentrating on complex mental work. How do I keep our focus amidst layoff rumors? My communication stresses our value. I ask the team to show our company that we are going to continue to strive for excellence. I pose this to my team: “Let’s continue to do great things. Will the company value us more if we slip on quality, complain about our situation, or spread layoff rumors?”


The Evolution of Distributed Systems on Kubernetes

If you look at how a microservice runs on Kubernetes, you will need to use some platform functionality. You will need Kubernetes features primarily for lifecycle management. Then, most likely transparently, your service will use a service mesh, something like Envoy, to get enhanced networking capabilities, whether that's traffic routing, resilience, enhanced security, or even monitoring. On top of that, depending on your use case and workloads, you may need Dapr or Knative. All of these represent out-of-process, additional capabilities. What's left to you is to write your business logic, not on top, but as a separate runtime. Most likely, future microservices will be this multi-runtime composed of multiple containers. Some of those are transparent, and some of those you use very explicitly. ... All the interactions of your business logic with the external world happen through the sidecar, which integrates with the platform and handles lifecycle management. It provides networking abstractions for external systems and gives you advanced binding capabilities and state abstraction.



Quote for the day:

"Leadership is, among other things, the ability to inflict pain and get away with it - short-term pain for long-term gain." -- George Will

Daily Tech Digest - March 24, 2021

How Machine Learning Enables Clinical Forecasting, Visualization

“The main problem with using machine learning in clinical care – and being able to make changes therein – is that there are many preprocessing design decisions that will affect the performance of the model. With this tool, healthcare experts are able to select those at their own location so that when they go to train these models, they're focused on the very specific task at hand,” said Weiss. By seeing the impact of their design choices, users can understand their data more completely and adjust machine learning settings for their analysis. The tool allows healthcare experts to develop algorithms tailored to their patients and organizations. “If you were to use a risk scoring system from another site, they might have defined the population based on the patient data that were available at entry and at the beginning of the hospitalization. But then the physician might want to have a risk score for a little bit later in treatment, maybe the first or second day after they've entered and they've already been stabilized,” Weiss explained. “The outside model will not be tailored to that population and could give misleading predictions. Using TL-Lite, the physician can quickly train a model with the risk profiles for the particular population they’re interested in evaluating.”
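Weiss's point about tailoring the training population can be illustrated with a hypothetical cohort filter (records and field names are invented; TL-Lite's actual interface is not shown here): before training, keep only the patients observed at the point in care where the risk score will actually be used, for example a day or more after admission:

```python
# Hypothetical patient records: hours since admission when the features were captured.
patients = [
    {"id": 1, "hours_since_admission": 2,  "stabilized": False},
    {"id": 2, "hours_since_admission": 30, "stabilized": True},
    {"id": 3, "hours_since_admission": 50, "stabilized": True},
    {"id": 4, "hours_since_admission": 6,  "stabilized": False},
]

def cohort(records, min_hours=24):
    """Keep only patients at the point in care where the score will be used."""
    return [r for r in records if r["hours_since_admission"] >= min_hours]

day_two = cohort(patients, min_hours=24)
print([r["id"] for r in day_two])  # [2, 3]
```

Training on the unfiltered list would mix freshly admitted and stabilized patients, which is exactly the population mismatch Weiss warns will make a model from another site give misleading predictions.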


On the Road to Good Cloud Security: Are We There Yet?

Although most IT security teams are well past being the department of no when it comes to cloud initiatives, many are still struggling with how to best secure those cloud-based assets — at least when they are tasked with doing so ... The research also uncovered a disconnect that raises the question: Is that confidence misplaced? When asked to rate the level of visibility the security team had into their organization's use of specific cloud service types, including software-as-a-service (SaaS), platform-as-a-service (PaaS), and infrastructure-as-a-service (IaaS), that same level of confidence faltered. For example, when asked to rate the security team's level of visibility into their organization's SaaS usage on a five-point scale, with 1 being the highest level, only 18% gave it a 1 and 27% gave it a 2. Visibility into PaaS and IaaS was rated as only slightly better. At the same time, respondents' knowledge of the shared responsibility model was found to be lacking. When asked to indicate whether the customer or cloud provider was responsible for securing a list of seven different elements that make up an IaaS account, around half of respondents gave the wrong answer.


Digital Identity: Fulfilling Consumer Cravings for Elevated ‘Digital Experience’

Whilst some organisations have embraced this potential to strengthen their bond with consumers, others have not been as future-forward, even though 82% of business executives recognise that customer experience is directly intertwined with revenue growth, according to Forrester. It is no longer sufficient for organisations to ‘digitise’ through newly hosting existing products and services on online platforms. Consumer delight is won through the ability to identify market gaps, capitalise on the latest technological capabilities and improve upon existing standards and quality of life that is already on offer. If it is not clear how a product, service or experience adds to consumers’ existing digital portfolio, it is not pushing market boundaries or entertaining consumer curiosities. The sensitivity of this digital shift is clear; companies must ensure that throughout their digital strategy they consider consumer experience as the key driver for change. This means listening to the wants and needs of consumer trends, working along the tide of consumer behaviour to ensure their business remains attractive, socially relevant, and profitable.


How agile can power frontline excellence

The strategic choices that companies make often don’t filter down to the hearts and minds of frontline workers. But what if sales employees could exercise informed judgment, become entrepreneurs within the enterprise, and conduct short-term experiments and share ideas on what works? Magic can happen if frontline employees understand how their targets link to strategic objectives and how their work contributes to wider company success. In agile sales organizations, the average frontline employee receives more information and is included in communications about the purpose of, and strategic choices for, the organization as a whole. Communication is more inclusive and interactive. These agile organizations foster dialogue and understand how sales functions can drive the strategic agenda using customer feedback. They operate from the belief that empowered employees will make more and better emotional connections with customers, leading to greater engagement on both ends and a stronger, longer, and broader relationship as a result. In addition, in agile sales organizations, the number of performance indicators is drastically reduced to a set of clear outcomes to focus energy on the things that matter most through the lens of the strategic aspiration.


Open Source vs. Proprietary DataOps

One advantage of open source is in its flexibility and availability. Open source licenses, excluding the SSPL, give users incredible freedom over what they can do with the software. If you have the skill, you can compose a DataOps pipeline that can take any data, enrich it and route it to the right place. That flexibility, though, is also a downside. While you can do anything you want, you also have to do it. Open source projects like Kafka, Pulsar, Spark, Airflow and Flink don’t know anything about the data they’re handling. That’s up to the developer. This may not sound like a problem, but today’s data engineers are handling dozens of data types in hundreds – or even thousands – of different formats. If you add in operational data, you’re also looking at data flooding in from firewalls, containers, SNMP traps and HTTP sources. And that’s just what’s coming at you. You also need to fetch data from object stores, multiple activity hubs and other messaging sources. No open source project natively supports the variety and volumes of data required in a modern DataOps pipeline.
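The burden described above, teaching the pipeline about every format yourself, can be sketched by normalizing two invented operational log formats into one common shape. This is illustrative only, not a real Kafka or Flink integration; in practice you would write one of these parsers per source:

```python
import json

def parse_json_event(line):
    # Structured application event arriving as JSON.
    evt = json.loads(line)
    return {"source": "app", "host": evt["host"], "message": evt["msg"]}

def parse_syslog_like(line):
    # Very loose parse of a "host: message" firewall line (illustration only).
    host, _, message = line.partition(": ")
    return {"source": "syslog", "host": host, "message": message}

raw = [
    ("json",   '{"host": "web-01", "msg": "login ok"}'),
    ("syslog", "fw-03: blocked outbound 10.0.0.7"),
]

PARSERS = {"json": parse_json_event, "syslog": parse_syslog_like}
events = [PARSERS[kind](line) for kind, line in raw]
print(events[1]["host"])  # "fw-03"
```

Multiply this by hundreds of formats and the maintenance cost the paragraph describes becomes concrete: the broker moves the bytes, but every parser is on you.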


What’s the Difference Between Solution Architecture and Design?

As a Technical Lead, I was the communication point for my team as well as leading the actual solution delivery, getting my hands dirty. As a Solution Architect, I became untethered from delivery, sitting outside of the teams doing the actual work. But in both roles, I was producing designs. So is the distinction down to whether or not you write code yourself? Is the answer that Architects don’t get their hands dirty? Absolutely not. That’s just a feature of the organisations you work with, and what they expect from their Solution Architects. My experiences as a Solution Architect just so happen to be with organisations where the role doesn’t touch code — either as a result of outsourcing Delivery or having internal Product Engineers to do the build. Other organisations will have Solution Architects more embedded within Delivery teams, t-shaping to provide additional value. ... In the Agile organisations I’ve worked for over the last few years, Solution Architecture has been best deployed in the early stages of a change to produce a vision of the solution and how it fits into the existing landscape, identifying impacts, opportunities and risks associated with the change.


4 Ways Your Small Business Can Benefit From Blockchain

The first thing a business can do to adopt blockchain technology is to simply accept cryptocurrency as a method of payment. What signals more of a commitment to blockchain than allowing customers to pay with bitcoin or other cryptocurrencies? The rollout will require a lot of planning and testing, as traditional merchant services are not set up to accept bitcoin. As such, a small business will need to evaluate and spend money on a digital wallet, a merchant gateway or a combination of services needed to accept cryptocurrency from customers. ... Businesses can use blockchain for smart contracts, which are basically self-verifying, self-enforcing contracts. Stored within a blockchain ledger, the contract is recorded in a way that cannot be changed or manipulated. Smart contract examples include commercial leases, agreements with vendors or suppliers and even employee contracts. Smart contracts offer small businesses a level of protection they would otherwise never be able to afford. The middleman — usually an attorney — would not be needed in a smart contract, and as such, a business would have lower costs.


What’s limiting digital transformation initiatives?

CXOs are aware of the need to adopt a cloud-first approach and change the way IT is delivered in response to the digital acceleration brought about by COVID-19. Many have already done so, with 91% increasing their cloud services usage in the first months of the pandemic, and the majority will continue to do so, with 60% planning to add more cloud services to their IT delivery strategy. However, while businesses recognize the need to accelerate their DX journeys over the next 12 months, 40% acknowledge that economic uncertainty poses a threat to their DX initiatives. As organizations increasingly adopt modern IT services at rapid pace, inadequate data protection capabilities and resources will lead to DX initiatives faltering, even failing. CXOs already feel the impact, with 30% admitting that their DX initiatives have slowed or halted in the past 12 months. The impediments to transformation are multi-faceted, including IT teams being too focused on maintaining operations during the pandemic (53%), a dependency on legacy IT systems (51%) and a lack of IT staff skills to implement new technology (49%). 


Apple’s iPhone factories are Industry 4.0 rock stars

Apple being Apple, we don’t know too much about how the company and its manufacturing partners are making use of AI, Internet of Things and connectivity on the factory floor, but we have seen a few examples, such as its Daisy recycling robot. We do know that Foxconn’s state-of-the-art "lights off" Shenzhen factory is highly automated, with robots deployed across the production line, reducing its reliance on human workers. The WEF has praised that factory, noting a 30% increase in production efficiency and a 15% lower inventory cycle. Broadening our understanding a little, it claims the factory "utilizes a fully automated manufacturing process," and has an "automated optimization system for Machine Learning and AI devices, an intelligent self-maintenance system, and an intelligent real-time monitoring system." Foxconn’s Chengdu plant has seen efficiency increase by 200% through the adoption of mixed reality, AI, and IoT technologies. Foxconn says it put these technologies in place to handle rapid business growth when it faced a lack of skilled workers, presumably on the iPhone production line.


Remote work, one year in: 5 ways to boost mental health

Research consistently shows that social interaction plays an essential role in well-being, which in turn has a positive impact on employee engagement and performance. Building social connections is much easier when you’re in the office; chats at the coffee machine or catch-ups over lunch are all part of normal working life. If someone is stressed, you can usually pick up on the signs. However, opportunities to communicate diminish when you’re working from home, and it can be difficult to know how people are really feeling. Make a conscious effort to encourage personal connections to help prevent people from feeling isolated. This is even more important given social distancing measures, which have left many without their usual support network. Check in regularly with your team members on an individual basis, especially those with heavy workloads or who live alone. Build in time at the start of calls for a general catch-up. Not everyone is comfortable chatting on the phone, so also consider using instant messaging to keep the channels of communication open.



Quote for the day:

“Successful people are not gifted; they just work hard, then succeed on purpose.” -- G.K. Nielson

Daily Tech Digest - March 23, 2021

How Synthetic Data Levels The Playing Field

Synthetic data can be defined as data not collected from real-world events. Today, specific algorithms are available to generate realistic synthetic data used as a training dataset. Deep Generative Networks/Models can learn the distribution of training data to generate new data points with some variations. While it is not always possible to learn the models’ exact distribution, algorithms can come close. ... The big players already have a stronghold on data and have created monopolies or ‘data-opolies’. Synthetic data generation models can address this power imbalance. Secondly, the rising number of cyberattacks, especially after the pandemic, has raised privacy and security concerns. The situation is especially worrying when huge amounts of data are stored in one place. By creating synthetic data, organisations can mitigate this risk. Thirdly, whenever datasets are created, they reflect real-world biases, resulting in the over-representation or under-representation of certain sections of society. The machine learning algorithms based on such datasets amplify such biases resulting in further discrimination. Synthetic data generation can fill in the holes and help in creating unbiased datasets.
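
The core idea can be sketched without a deep network: fit a distribution to the real data, then sample brand-new points from the fit. This toy uses a single Gaussian feature and simulated "real" data as a stand-in; deep generative models learn far richer distributions, but the principle is the same: synthetic points share the data's statistics, not its records.

```python
import random
import statistics

random.seed(7)

# Stand-in for sensitive "real" data: 1,000 customer ages, never shared.
real_ages = [random.gauss(41.0, 12.0) for _ in range(1000)]

# Minimal stand-in for a generative model: estimate the distribution's
# parameters from the real data, then sample fresh points from the fit.
mu = statistics.fmean(real_ages)
sigma = statistics.stdev(real_ages)
synthetic_ages = [random.gauss(mu, sigma) for _ in range(1000)]

print(round(statistics.fmean(synthetic_ages), 1))  # close to the real mean
print(set(synthetic_ages) & set(real_ages))        # set(): no real record copied
```

Bias correction works the same way in reverse: because the practitioner controls the sampling, under-represented groups can be deliberately over-sampled in the synthetic set.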


Researchers Discover Two Dozen Malicious Chrome Extensions

While malicious extensions are an issue with all browsers, it's especially significant with Chrome because of how widely used the browser is, Maor says. It's hard to say what proportion of the overall Chrome extensions currently available are malicious. It's important to note that just a relatively small number of malicious extensions are needed to infect millions of Internet users, he says. One case in point was Awake Security's discovery last June of over 100 malicious Google Chrome extensions that were being used as part of a massive global campaign to steal credentials, take screenshots, and carry out other malicious activity. Awake Security estimated that there were at least 32 million downloads of the malicious extensions. In February 2020, Google removed some 500 problematic Chrome extensions from its official Chrome Web Store after being tipped off to the problem by security researchers. Some 1.7 million users were believed affected in that incident. In a soon-to-be-released report, Cato says it analyzed five days of network data collected from customer networks to see if it could identify evidence of extensions communicating with command-and-control servers. 


What IT Leaders Need to Know About Open Source Software

Despite conventional wisdom, open-source solutions are, by their nature, neither more nor less secure than proprietary third-party solutions. Instead, a combination of factors, such as license selection, developer best practices and project management rigor, establish a unique risk profile for each OSS solution. The core risks related to open source include: technical risks, including general quality-of-service defects and security vulnerabilities; legal risks, including factors related to OSS license compliance as well as potential intellectual property infringements; and security risks, which begin with the nature of OSS acquisition costs. The total cost of acquisition for open source is virtually zero, as open-source adopters are never compelled to pay for the privilege of using it. Unfortunately, one critical side effect of this low burden of acquisition is that many open-source assets are either undermanaged or altogether unmanaged once established in an IT portfolio. This undermanagement can easily expose both quality and security risks because these assets are not patched and updated as frequently as they should be. Finally, vendor lock-in can still be a risk factor, given the trend among vendors to add proprietary extensions on top of an open-source foundation (open core).


Applying Stoicism in Testing

To consider and look for the unknown information about a system, you need to have justice. In Stoicism, this stands for “showing kindness and generosity in our relationships with others”. And because you don’t know everything, you need other people to help you out. Gathering information is also about creativity, so you have to draw inspiration from past experience and, together with your colleagues, be able to connect dots that weren’t connected before. Once I even stated, “The knowledge (and information) you gather as a tester about the software can be an interesting input for new software products and innovations”. But as a Stoic, stay humble ;-). After gathering all the information you need, you should use your wisdom (“based on reasoning and judgment”) to come to conclusions so that you can answer the question “Is this software ready to be used?” Although our customers are the best testers, we as testers are (or at least should be) in a position to answer the question at every step of software development: if the software goes to production, what can happen? The information you put on the table for your stakeholders should be based on facts.


Data Analyst vs. Data Scientist

The typical data analyst role is consulting-centric, as can be seen from the Indeed job spec example. What they are preoccupied with for the most part is wrangling data from Excel spreadsheets and SQL databases, extracting insightful conclusions via retrospective analyses and A/B tests, and generally providing evidence-based business advice. The last point illustrates why reporting routines with visualisation tools such as Tableau are as pivotal as pivoted tables. Data modelling, on the other hand, is often limited to basic supervised learning or its stats equivalent: regression analysis. ... To be fair, data scientists are for that reason expected to be more than analytical wizards. They are supposed to be builders who employ advanced programming to create pipelines that predict and recommend in production environments with near-perfect accuracy. Compared with analysts, who are like investigative reporters, they are oriented far more toward product development than consulting, although a data scientist is also required to provide data-led commercial advice. Some say the title was coined to manifest that the role was a confluence of three fields: maths/statistics, computer science and domain expertise.


Tech projects for IT leaders: How to build a home lab

If you're like most technology leaders, the closest you get to the actual technology you select and manage is creating PowerPoint decks that tell others about the performance, maintenance and updating of that technology. There's nothing fundamentally wrong with this, of course; you can be a fantastic leader of a construction firm without having swung a hammer, or a cunning military strategist who has never rucked over a hill or fired a weapon. However, hands-on time with the fundamental building blocks of your domain can make you a better leader, just as the architect who spends time in the field and understands the materials and building process is more effective at creating better structures. ... Think of a home lab as the technology equivalent of the scientist's laboratory. It's a place where you can experiment with new technologies, attempt to interconnect various services in novel ways and quickly clean things up when you're done. While you might be picturing a huge rack of howling servers, fortunately for us you can now create the equivalent of a small data center on a single piece of physical equipment.


IOTA still wants to build a better blockchain, and get it right this time

What went wrong then, and how is IOTA going to fix it -- besides introducing a new wallet? Schiener focused on some key technical decisions that proved wrong, and are being retracted. IOTA wanted to be quantum-proof, and that's why it used a "special cryptography," as Schiener put it. IOTA's cryptography allowed an address to be used only once, for example. Reusing an address could lead to a loss of funds. Another questionable decision was choosing to use ternary, rather than binary, encoding for data. That was because, according to Schiener, the hypothesis was that ternary would be a much better and more efficient way to encode data. The problem is, as he went on to add, that this also needs ternary hardware to work. There are more changes, having to do with the way the ledger is created. It's still a DAG, but it has different algorithms. Schiener said that over the last one and a half years, IOTA has been reinvented and rewritten from the ground up. This new phase of the project is Chrysalis, the network upgrade. With Chrysalis, IOTA is also moving toward what it calls Coordicide.
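
For the curious, ternary encoding just means representing numbers with trits instead of bits. A minimal sketch of balanced ternary, the variant IOTA used, with digits drawn from {-1, 0, 1}; IOTA's actual tryte alphabet and ternary hashing are not reproduced here:

```python
def to_balanced_ternary(n: int) -> list:
    """Convert an integer to balanced-ternary digits (trits in {-1, 0, 1}),
    least-significant trit first."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:       # digit 2 becomes -1 with a carry into the next trit
            r = -1
            n += 1
        trits.append(r)
        n //= 3
    return trits

# 5 = (-1 * 1) + (-1 * 3) + (1 * 9)
print(to_balanced_ternary(5))  # → [-1, -1, 1]
```

The catch Schiener describes is visible even here: commodity CPUs store those trits in binary anyway, so without native ternary hardware the theoretical efficiency never materialises.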


Browser powered scanning in Burp Suite

One of the main guiding principles behind Burp's new crawler is that we should always attempt to navigate around the web applications behaving as much as possible like a human user. This means only clicking on links that can actually be seen in the DOM and strictly following the navigational paths around the application (not randomly jumping from page to page). Before we had browser-powered scanning, the crawler (and the old spider) essentially pretended to be a web browser. It was responsible for constructing the requests that are sent to the server, parsing the HTML in the responses, and looking for new links that could be followed in the raw response we observed being sent to the client. This model worked well in the Web 1.0 world, where generally the DOM would be fully constructed on the server before being sent over the wire to the browser. You could be fairly certain that the DOM observed in an HTTP response was almost exactly as it would be displayed in the browser. As such, it was relatively easy to observe all the possible new links that could be followed from there. Things start to break down with this approach in the modern world.
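
The old "pretend to be a browser" model amounts to parsing raw HTML for links. A minimal sketch with Python's standard-library parser; the script-injected link here stands in for the dynamic, JavaScript-built navigation that this static approach cannot see:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Web 1.0-style crawling: scan the raw HTML of an HTTP response for
    anchor hrefs. Links a script would add to the DOM at runtime never
    appear, which is why modern apps need a browser-powered crawler."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = '<a href="/about">About</a><script>addLink("/hidden")</script>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # → ['/about'] — the script-injected link is invisible
```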


Fintech disruption of the banking industry: innovation vs tradition?

The first was the rise of the internet. Constantly improving speeds and widespread access meant hundreds of millions of consumers were suddenly able to access digital services. The second was the rise of the smartphone. This hardware transformed consumer behaviour beyond recognition. Apps and other software products providing significant upfront value made smartphones indispensable — just think of Shopify, Google Maps and Uber. The third driver which paved the way for fintech providers' success was the financial crisis in 2008. Not only did this bring the traditional banking system to the brink of collapse, but consumers were far less trusting of the big banks thereafter. The new breed of financial services providers was not tied down by legacy infrastructures and, with smaller teams and flexible IT infrastructures, they were more agile. And this allowed them to more easily navigate the new regulatory and compliance requirements that were introduced in the wake of the financial downturn. Fintech providers sought to solve problems the banks could not. Or at least to do what the banks do, but better.


Top 3 Cybersecurity Lessons Learned From the Pandemic

As the world began relying on these new digital capabilities, new risks and challenges were introduced. Organizations that were well-equipped to extend visibility and control to this new way of working found themselves in a far better situation than those that were scrambling to completely reengineer their security capabilities. The ones that had built an empowered and proactive security team, backed by robust processes and supported by effective technology, were able to adapt and overcome. Organizations that were locked into a rigid operational model, overly reliant on vendor platforms or lacking a defined set of processes to support their new reality, struggled to keep pace. ... Since the pandemic began, we have seen an increased emphasis and shift toward zero trust and secure access service edge (SASE) principles. With strong identity and access management capabilities, insights into services and APIs, and visibility into remote endpoint devices, security teams can put themselves in position for rapid and effective responses — even within this unique virtual setting. Access to sensitive and confidential data is the new perimeter for an organization's cybersecurity posture.



Quote for the day:

"A tough hide with a tender heart is a goal that all leaders must have." -- Wayde Goodall

Daily Tech Digest - March 22, 2021

Bitcoin’s Greatest Feature Is Also Its Existential Threat

The botnet’s designers are using this idea to create an unblockable means of coordination, but the implications are much greater. Imagine someone using this idea to evade government censorship. Most Bitcoin mining happens in China. What if someone added a bunch of Chinese-censored Falun Gong texts to the blockchain? What if someone added a type of political speech that Singapore routinely censors? Or cartoons that Disney holds the copyright to? In Bitcoin’s and most other public blockchains there are no central, trusted authorities. Anyone in the world can perform transactions or become a miner. Everyone is equal to the extent that they have the hardware and electricity to perform cryptographic computations. This openness is also a vulnerability, one that opens the door to asymmetric threats and small-time malicious actors. Anyone can put information in the one and only Bitcoin blockchain. Again, that’s how the system works. Over the last three decades, the world has witnessed the power of open networks: blockchains, social media, the very web itself. What makes them so powerful is that their value is related not just to the number of users, but the number of potential links between users.


India’s Quest Towards Quantum Supremacy

The digital partnership between the Indian Institute of Science Education and Research (IISER) at Pune and Finland's Aalto University has created a high probability of India getting its first quantum computer. ... Talking about the partnership, Neeta Bhushan, the joint secretary (Central Europe), external affairs ministry, stated that the idea of jointly developing a quantum computer with the use of AI and 5G technology is an important area of collaboration for both countries. Considering that Nokia and other Finnish companies are leading the world in mobile technology growth, this digital collaboration will see the two countries working together on quantum technologies and computing. Hence, the partnership will have the leverage to deploy the latest technologies available in both countries. ... The partnership can lead us towards a new ecosystem altogether, and many things can be expected of it. The post-COVID changes in global power-sharing and the recent technological developments to handle the crisis have brought India to the centre stage. Consequently, quantum encryption is one of the basic applications derived from this collaboration.


Remote working still isn't perfect. These are the things that need fixing

A new report from O2 Business explores these insights in greater depth. The UK mobile operator surveyed 2,099 workers who had previously been office-based to understand how their needs and expectations of work had changed. It found that the majority of employees welcomed the notion of splitting their time between the office and home-working going forward, but also called for a closer alignment of operations, IT and HR in order to support individual work choices and maximize workplace productivity. Generally, employees are satisfied with their organization's response to the pandemic, O2 found: 69% of workers felt that their employers had supported them during the pandemic, with just 11% disagreeing with this statement. But less than two-thirds (65%) of employees felt confident that their organization was prepared for the future world of work. O2 said this indicated some businesses would struggle to adapt to the more flexible working arrangements that many are planning to adopt post-pandemic. The mad scramble to remote working has been one of the most trying aspects for businesses over the past year.


Fight microservices complexity with low-code development

A low-code platform takes care of nearly everything that conventionally is coded for an application. Most of the low-level programming and integration work is taken care of via tool configurations, which saves developers a lot of time and headaches. However, think carefully about where you apply low-code in a microservices architecture. As long as the app is simple, clean and doesn't require many integration points, low-code development might be the right alternative to more manual and complex microservices projects. Low-code builds are an easy choice for applications that don't need to integrate with other databases or only rely on a series of small tables. Short-lived conference apps or marketing promotions that run with user ID information are good examples of this. However, a low-code approach does not replace large-scale microservices development. Once you need to share information between applications in real time, the tools and programming techniques involved become much more sophisticated. While the low-code approach helps developers steer clear of over-engineering apps that don't need it, low-code likely won't provide the database integration, messaging or customization capabilities needed for an enterprise-level microservices architecture.


Edge Computing Growth Drives New Cybersecurity Concerns

Effectively protecting the edge means understanding how cybersecurity protection schemes work in an enterprise that uses not only edge computing, but also the cloud and traditional resources. Most enterprises are clearly focused on data security and application security, and are using tools such as web application firewalls (WAF), runtime application self-protection (RASP), data exfiltration protection and, of course, endpoint protection. Since the edge has the ability to “touch” data and applications, as well as use identity to connect and determine entitlements, a great deal of potentially sensitive information passes through the edge. Much, if not all, of that traffic moves through a content delivery network (CDN), where hosts provide the connectivity and, hopefully, wrap encryption around that traffic to protect it from interception. However, intrusion and data exfiltration still happen. “Digital transformation is driving more and more applications to the edge, and with that movement, businesses are losing visibility into what is actually happening on the network, especially where edge operation occurs,” Hathaway said. “Gaining visibility allows cybersecurity professionals to get a better understanding of what is actually happening at the edge,” he said.


Move Your Automation Efforts From Pilot To Reality

Talent is another crucial part of the equation that not enough customers take into account. I’ve worked with many customers that don’t have dedicated automation centers of excellence, or specific in-house expertise to tackle automation the right way. An enterprise with multiple technologies in place must ensure that those technologies are communicating with each other. By bringing together technical experts, your processes can be better visualized and monitored end-to-end across the organization, leading to a higher chance of success. The complexity and effort involved in this kind of endeavour can be off-putting, but it’s worth the reward. Nor is it truly as complicated as it sounds — execution management systems, for example, already bring together technologies like process mining, automation and AI into a seamless, intelligent execution layer. Bring in or train the right people to champion it, and you’ve got a head start on the next step of the journey. So while many companies haven’t been able to bring the full promise of automation to bear at scale just yet, that promise is getting closer to becoming a reality every day.


HowTo: Optimize Certificate Management to Identify and Control Risk

End-to-end certificate management gives businesses complete visibility and lifecycle control over any certificate in their environment, helping them reduce risk and control operational costs. Even in the most complex enterprise environments, certificate automation offers speed, flexibility and scale. Full visibility over all digital certificates and keys means that even the largest enterprises can have a centralized view of digital identities and security processes. Security leaders can then access expiration dates and maintain cryptographic strength while avoiding the time-consuming, demanding, and risky task of manually discovering, supervising, and renewing certificates. As organizations continue to grow and evolve, so does the range of certificates deployed and the set of people deploying them, which increases the potential for certificates to be installed in your environment that are out of sight of IT security teams and left unmanaged. To avoid being blindsided by these “rogue” certificates, enterprises are turning toward automated universal discovery.
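
One slice of that lifecycle automation, flagging certificates that fall inside a renewal window, can be sketched as follows. The hostnames and dates are hypothetical, and a real system would discover certificates on the network and read expiry from the certificate's own X.509 notAfter field rather than a hand-built inventory:

```python
from datetime import date, timedelta

def certs_needing_renewal(inventory, today, window_days=30):
    """Return, sorted, the names of certificates expiring within the
    renewal window. A toy model of the discovery-and-renewal sweep that
    automated certificate management runs continuously."""
    cutoff = today + timedelta(days=window_days)
    return sorted(name for name, expiry in inventory.items() if expiry <= cutoff)

inventory = {
    "api.example.com": date(2021, 4, 2),    # hypothetical hosts and dates
    "www.example.com": date(2021, 9, 15),
    "mail.example.com": date(2021, 3, 30),
}
print(certs_needing_renewal(inventory, today=date(2021, 3, 26)))
# → ['api.example.com', 'mail.example.com']
```

The "rogue certificate" problem the passage mentions is precisely a certificate that never makes it into this inventory, which is why discovery has to be automated too.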


On the Road to Good Cloud Security: Are We There Yet?

The research also uncovered a disconnect that raises the question: Is that confidence misplaced? When asked to rate the level of visibility the security team had into their organization's use of specific cloud service types, including software-as-a-service (SaaS), platform-as-a-service (PaaS), and infrastructure-as-a-service (IaaS), that same level of confidence faltered. For example, when asked to rate the security team's level of visibility into their organization's SaaS usage on a five-point scale, with 1 being the highest level, only 18% gave it a 1 and 27% gave it a 2. Visibility into PaaS and IaaS was rated as only slightly better. At the same time, respondents' knowledge of the shared responsibility model was found to be lacking. When asked to indicate whether the customer or cloud provider was responsible for securing a list of seven different elements that make up an IaaS account, around half of respondents gave the wrong answer. Specifically, 63% erroneously indicated that the cloud provider was responsible for securing virtual network connections, 55% erroneously indicated that the cloud provider was responsible for securing applications, and 50% got it wrong when they said the cloud provider was responsible for securing users who were accessing cloud data and applications.


5 AI-for-Industry Myths Debunked

Up until, and during, the AI hype in the nineties, artificial intelligence was a scientific discipline that almost exclusively dealt with data and algorithms. Over the past decades however, the field has matured, and AI has become an integral part of automated decisioning systems that are at the heart of what we do as individuals and organizations. Consequently, a large portion of AI research, development, and implementation encompasses people and processes. I remember having a business conversation with a large energy provider in which we were talking about automated systems and data-driven methods that, driven by customer data and smart meters, could enhance their customers’ experience. One hour into the meeting, they suddenly asked: “This all looks very promising, but shouldn’t we also do something with AI?” ... If you have the combined luck and skills, you can probably cook a decent meal with ingredients that come from a randomly filled refrigerator. The real question, however, is: “What do you want to achieve?” In the example of the refrigerator, it might occasionally be an effective solution if you need to quickly fill stomachs and don’t have time to go shopping. 


Cloudflare wants to be your corporate network backbone

With Magic WAN, Cloudflare aims to simplify that. Cloudflare's global Anycast network is already built for high performance and availability to serve its core CDN business. The company has data centers in more than 200 cities across over 100 countries with local peering at internet exchange points. Regardless of where branch offices or employees are located, chances are high they'll always connect to a server close to them and then the traffic will be routed through Cloudflare's private network efficiently, benefiting from its performance optimizations, smart routing and security. With Magic WAN, organizations only need to set up Anycast GRE tunnels from their offices or datacenters to Cloudflare and they can then define their private networks and routing rules in a central dashboard. Cloudflare's existing Argo Tunnel, Network Interconnect and soon IPsec can also be used to connect datacenters and VPCs to its network, while roaming employees will connect using Cloudflare WARP, a secure tunneling solution that's built around the highly performant WireGuard VPN protocol. This also solves the scalability and performance issues that organizations have faced with traditional VPN gateways and concentrators when they were suddenly confronted with a large remote workforce due to the pandemic.



Quote for the day:

"A true dreamer is one who knows how to navigate in the dark" -- John Paul Warren

Daily Tech Digest - March 21, 2021

CompTIA creates blockchain industry group to promote new use cases

"With growing interest in the deployment of blockchain technology in business applications, the time is right for us to expand our offerings related to this emerging technology," said Nancy Hammervik, executive vice president of industry relations and CEO of the CompTIA Tech Careers Academy. Members of the group will gain access to resources and forums where the community will be able to promote different use cases, share ideas, hold in-depth discussions and make connections to network with peers. CompTIA already has the Blockchain Advisory Council, and the new technology interest group will build on the organization's expertise in the space. The council is currently made up of industry leaders and innovators while also finding ways that "technology companies and their customers can leverage blockchain technology in their businesses," according to a statement from the organization. "The Blockchain Technology Interest Group is designed for the curious and the experienced to share ideas and discussions related to blockchain technology," said Kathleen Martin, senior manager for CompTIA's member communities and technology interest groups.


The State of Serverless Computing 2021

Organizations going serverless with their backend services are billed by their serverless vendor based on compute consumption; they do not have to reserve and pay for a fixed amount of bandwidth or a fixed number of servers, because the service auto-scales with incoming demand. In the early days of web development, developers had to own or rent physical hardware to run and test their application code. Running production applications was another headache, because they then had to keep those servers running for the lifetime of the application. Cloud computing and virtualization brought much-needed relief: developers could rent virtual servers from a cloud vendor according to their needs. The problem was that they still had to over-provision to absorb traffic spikes or their application would fall over, yet much of the server capacity they paid for went to waste. Cloud vendors introduced auto-scaling compute models to mitigate the problem. However, auto-scaling in response to an unexpected spike in traffic (such as a DDoS attack) can turn out to be quite expensive.
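As a minimal illustration of the model described above, here is a sketch of a Lambda-style Python function. The `handler(event, context)` shape follows common serverless conventions, and the event fields used are illustrative; the point is that there is no server code at all, only a function the platform invokes and bills per execution.

```python
import json

def handler(event, context=None):
    """Minimal serverless-style HTTP handler.

    The platform invokes this per request and bills only for the
    milliseconds it runs; there is no server process to provision.
    """
    # Query parameters may be absent entirely, so default defensively.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Scaling here is the vendor's problem: a thousand concurrent requests simply mean a thousand concurrent invocations, with no capacity reserved in advance.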


GDPR: Transferring Data Outside The EU

If you want to transfer or store data in other countries, you must ensure that the data is adequately protected. You'll need to complete a risk assessment covering the nature of the personal data being transferred, the adequacy of the other organisation's controls, and the protections offered by its local legal system. For most circumstances, contractual protections will be sufficient; the UK Information Commissioner's Office has guidance on how to approach this. The EU has helpfully provided standard contractual clauses in the form of model contracts (for both Controllers and Processors), but these had not been fully updated for GDPR at the time of writing. Large multinationals have the option of using 'binding corporate rules' for transfers within the organisation, together with a certification mechanism; the law firm Allen & Overy has produced a useful guide to binding corporate rules. There are also exemptions and exceptions, such as specific consent, a contract, substantial public interest, law enforcement, and the exchange of airline passenger data. The UK Information Commissioner's Office has useful guidelines on these as well.


Blockchain Tech: Path Towards Scalable Businesses

If the past of the Indian IT sector is a great story of resilience and growth, the future can be no less bright, though the sectors of growth will have to be identified, capacity-building sustained, and the regulatory framework supportive. That is where blockchain technology becomes a great unfolding story for India. A country that became the back office for global IT giants has a new tale to tell. Millions of skilled yet affordable software developers need only a nudge to churn out new solutions built on shared-ledger architecture. Before we jump into this new era of wealth creation, a quick introduction to blockchain is in order. In simple terms, blockchains are ledgers with the key properties of decentralization, programmability and antifragility. Those properties transform ledger databases into protocols for new, scalable business models, which is why blockchain is called an institutional technology: it can turn a set of fragmented stakeholders into unified business partners. This model can be combined with AI (artificial intelligence) and IoT (internet of things) to create remarkable businesses. India stands at the edge of this era, ideally prepared to embrace it.
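The ledger idea at the core of the paragraph above can be illustrated with a toy hash-chained ledger in Python. This is a deliberately simplified sketch, with no consensus, networking or decentralization, just enough to show why tampering with any past record is detectable, which is the property all the business models above rely on.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first block's predecessor

def _block_hash(record, prev_hash):
    """Deterministic hash of a record plus its link to the previous block."""
    payload = json.dumps({"record": record, "prev_hash": prev_hash},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, record):
    """Append a record, linking it to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS
    chain.append({"record": record,
                  "prev_hash": prev_hash,
                  "hash": _block_hash(record, prev_hash)})
    return chain

def verify(chain):
    """Recompute every link; any edited record breaks the chain."""
    for i, block in enumerate(chain):
        prev_hash = chain[i - 1]["hash"] if i else GENESIS
        if block["prev_hash"] != prev_hash:
            return False
        if block["hash"] != _block_hash(block["record"], prev_hash):
            return False
    return True
```

Because each block's hash covers the previous block's hash, altering one record invalidates every block after it, so a shared ledger's participants can audit history without trusting each other.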


Distributed Ledger Technology

When it comes specifically to MSME financing, DLT brings a holy trinity of value: trust, transparency and traceability. These benefits can make it easier for MSMEs to build a digital credit history and for banks to assess MSME creditworthiness. Some DLT projects, such as the European-based we.trade, focus specifically on serving MSME firms, amplifying the benefits of DLT for smaller firms that can access solutions tailored more closely to their needs. The increased transparency provided by DLT can also make it easier for Tier 2 and lower-tier suppliers, which are often small businesses, to access finance. Common supply chain finance solutions are usually available only to established Tier 1 suppliers, which are able to convince their big corporate buyers of their trustworthiness. By enhancing visibility into deeper-tier suppliers, DLT can ease their access to finance. Various companies, such as Linklogis and Skuchain, are leveraging DLT to this effect. Another potential benefit of DLT for MSMEs is its ability to let traditional processes or sources of finance be bypassed.


Uncertainty around India's crypto policy is making blockchain firms anxious

Optimism started to rebuild, and surging Bitcoin prices began to lure millennials. When it comes to transferring Bitcoin and other digital assets, India has lately been generating more volume than China on popular peer-to-peer platforms. The risk that India would hit back with a new law making criminals of crypto professionals and investors was always present. So practitioners tried to educate policymakers, appealing for sensible regulation that starts with definitions: what counts as a utility token, which digital assets are to be viewed as securities, and which are to be treated as currencies. The trouble is with the bureaucrats. They say they want blockchain but not cryptocurrencies, which is as silly as wanting airports with duty-free shops but no flights. From the Reuters story, it doesn't appear that the final regulation will differ much from what a draft bill recommended in 2019. A government panel report, which provided the backdrop for the draft legislation, said that authorities would be fine with distributed ledger technologies for the delivery of any services, or "for creating value," as long as they do not involve cryptocurrencies "for making or receiving payment."


Technical debt is costing banks innovation and agility. Can cloud help?

The reason banks often do nothing to address it is that they convince themselves they would have to rewrite a 40-year-old platform, and then they're looking at a hundred-million-dollar price tag. So they kick the can down the road. But the longer the debt persists, the greater the consequences. Banks become less agile and less able to innovate. They become more vulnerable to cybersecurity breaches. Addressing those breaches can drain the development budget until all the firm can afford to do is fix emergencies rather than deliver new capabilities, entrenching it in a vicious cycle. Right! That's actually one of the biggest drains of technical debt: not money but people. Technical debt is a talent issue, too. The more antiquated code a company struggles to maintain, the more it inhibits the modern tooling and services that developers want to use to build applications. A good developer can work anywhere. They won't choose to work somewhere the environment is dated when they can go work for ultra-modern companies with an engaging culture. Banks tend to have the mindset of, "It's going to cost a boatload of money that I'll never get approved if I have to replace or rewrite thousands of applications." That's where we come in and say: you don't have to rewrite all of this.


What is Azure Blockchain Service?

Azure Blockchain Service is designed to support multiple ledger protocols. Currently, it supports the Ethereum Quorum ledger using the Istanbul Byzantine Fault Tolerance (IBFT) consensus mechanism. These capabilities require almost no administration and are all provided at no additional cost, so you can focus on app development and business logic rather than allocating time and resources to managing virtual machines and infrastructure. In addition, you can continue to develop your application with the open-source tools and platform of your choice, delivering your solutions without having to learn new skills. Azure Blockchain Service is deployed through the Azure portal, the Azure CLI, or Visual Studio Code with the Azure Blockchain extension. Deployment is simplified, covering the provisioning of both transaction and validator nodes, Azure Virtual Networks for security isolation, and service-managed storage. When deploying a new blockchain member, users also create, or join, a consortium. Consortiums enable multiple parties in different Azure subscriptions to communicate securely with one another on a shared blockchain.
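As a sketch of the CLI deployment path mentioned above, creating a member (and with it a consortium) looked roughly like the following via the Azure CLI's blockchain extension. All names and passwords are placeholders, and the exact flags should be checked against the extension's own help output, since the service's CLI surface was in preview and has evolved.

```shell
# Add the (preview) blockchain extension to the Azure CLI.
az extension add --name blockchain

# Create a blockchain member, which also creates or joins a consortium.
# Every value below is a placeholder, not a real resource.
az blockchain member create \
    --resource-group my-rg \
    --name my-member \
    --location eastus \
    --consortium my-consortium \
    --password "<memberPassword>" \
    --consortium-management-account-password "<consortiumPassword>" \
    --protocol Quorum \
    --sku Basic
```

A second party in a different subscription would then join the same consortium name with its own member, which is what establishes the shared blockchain between them.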


Make Agile a stepping stone toward future fit adaptability

For many tech leaders, being adaptive is the same as business agility. However, the execution engine for adaptability is your software development and delivery capability: your business can't be adaptive if you don't have great software delivery capabilities. That's why "Agile" has become foundational to being adaptive and to achieving business agility. Forrester research shows that over 72% of enterprise development leaders had adopted Agile practices or were planning to become more Agile in 2019–2020. The Agile Manifesto, published almost 20 years ago, is still the cornerstone for any truly Agile organization. The first point is this: being truly Agile goes beyond the Merriam-Webster definition of the adjective, which is "having a quick resourceful and adaptable character" or "marked by ready ability to move with quick easy grace." Agile as defined by the Agile Manifesto carries a broader meaning: the core values and the 12 principles. The Manifesto's definition treats the dictionary meaning as a necessary condition, but not a sufficient one. So, what does it mean for enterprises in 2021 to be truly Agile and therefore adaptive? Start by going beyond merely adopting agile (lowercase), and develop your cultural DNA and organizational strategy around the values and principles established in the Manifesto.


New Malware Hidden in Apple IDE Targets macOS Developers

The malware is executed when a developer using the Trojanized version of the TabBarInteraction Xcode project launches what is known as the build target in Xcode. The XcodeSpy malware then contacts the attacker's command-and-control (C2) server and drops the EggShell backdoor on the development machine, SentinelOne said in a report this week. "An Xcode project is a repository for all the files, resources, and information required to build one or more software products," says Phil Stokes, a macOS threat researcher at SentinelOne. "A project contains all the elements used to build a product and maintain the relationships between those elements." Injecting malware into an Xcode project gives attackers a way to target developers, and potentially to backdoor the developers' apps and the customers of those apps, he says. With XcodeSpy itself, though, the attackers appear to be directly targeting only the developers, according to SentinelOne. The security vendor said a sample of XcodeSpy was found on a US-based victim's Mac in late 2020. The company's report did not disclose the identity of the victim but described the organization as a frequent target of North Korean advanced persistent threat actors.
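Because the infection above triggers from a script embedded in the project's build phases, one defensive habit is to review the `shellScript` entries in a third-party project's `project.pbxproj` before building it. The following is a hypothetical Python sketch of that review step; the regex and the keyword list are illustrative assumptions, not SentinelOne's detection logic.

```python
import re

def find_run_scripts(pbxproj_text):
    """Extract the shellScript bodies of Run Script build phases
    from the text of an Xcode project.pbxproj file."""
    return re.findall(r'shellScript\s*=\s*"((?:[^"\\]|\\.)*)"', pbxproj_text)

# Illustrative tokens that warrant a closer look in a build script.
SUSPICIOUS = ("eval", "base64", "curl", "osascript")

def flag_suspicious(scripts):
    """Return the scripts containing any suspicious token."""
    return [s for s in scripts if any(tok in s for tok in SUSPICIOUS)]
```

A clean project typically has a handful of readable scripts (linting, code generation); an obfuscated one-liner decoding and eval-ing a payload, as in XcodeSpy, stands out immediately under this kind of review.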



Quote for the day:

"Leadership has a harder job to do than just choose sides. It must bring sides together." -- Jesse Jackson