Daily Tech Digest - October 29, 2020

The European startups hacking your brain better than Elon Musk’s Neuralink

Musk may have put a top-notch hardware implant in a pig — but he didn’t mention plans for clinical trials on humans during the event earlier this year, which some expected. BIOS, however, is about to embark on human trials next year. The startup aims to treat diseases for which we don’t currently have effective drugs by rewiring the brain. Part of the problem with conditions such as heart failure, arthritis, diabetes and Crohn’s disease is that the signals between the brain and diseased organs are failing. Fixing these signals could dramatically improve the health and wellbeing of patients. But understanding the complex neural codes that connect the brain with organs — and rewiring them — is a harder problem than anything Neuralink has shown so far. “We are a bit like Linux if Elon Musk is Microsoft,” the cofounder Emil Hewage tells Sifted. Like Neuralink, BIOS has developed its own implant, but it is focusing more on the data extracted from that implant than on making the hardware less clunky. The company was founded by the computational neuroscientist Hewage and the bioengineer Oliver Armitage in Cambridge in 2015 as a way to commercialise all the science that had been achieved in the field in the last 20 years.


'Act of War' Clause Could Nix Cyber Insurance Payouts

To some degree, insurers are making the problem worse. In many ransomware attacks, insurers determine that paying the ransom is the least expensive way for their policyholders to recover. Such payouts, however, also keep extortion rackets in business and fund attacks on other companies. If significant and widespread events become more common, it could have a dramatic impact on the cyber insurance industry, says Chris Kennedy, CISO at AttackIQ, a security-validation firm. "These black-swan events are very costly, and insurance companies are businesses, too," he says. "If we are going to see more and more of these black-swan events, the question is how can insurance companies afford to underwrite these policies? Just like the beaches in Florida or the flooding in Texas — where you can't get insurance anymore — if ransomware continues to be as rampant as it is, cyber insurers are going to back away from covering the damages." The impact of NotPetya on shipping giant A.P. Moller Maersk is a prime example of the risk. The company claimed more than $300 million in damages when the NotPetya worm shut down systems across the company's offices. However, the most significant threat to Maersk's business was that the worm infected and seemingly wiped all of the company's 150-plus domain controllers.


Should Your Enterprise Pick Angular, React or Blazor?

Aside from differences in the languages themselves, there’s also the development environment to consider. It used to be that .NET developers generally used Visual Studio, and anyone building frontend apps with something like React would use a different tool. These days tools like Visual Studio Code have successfully brought the various camps together to varying degrees. That said, if your team is comfortable and familiar with one particular set of tools, it may make sense to keep them there. It’s no small undertaking to switch from one coding environment to another. Over time we tend to get used to the tools and configurations we use all day every day: shortcuts, extensions and themes all help to keep us on track. It’s not an impossible mountain to climb, to switch from one tool or editor to another, but doing so is likely to slow everything down, at least in the short term. If IDEs and text editors are an important factor when it comes to development, how you get your code to production is just as (if not more) important! Angular, React and Blazor all have their own processes for deployment. In general, it’s a matter of running the correct script to package up your apps before deploying them to some form of host.


Overcoming Software Impediments Using Obstacle Boards

The initial accomplishments reaped from the use of our first Obstacle Board were great. However, over time we learned that maintaining the same approach was quite challenging. This particular team actually stopped using the board 3 ½ months after starting the experiment. Reflecting on this stoppage, I would definitely consider changing a few aspects of how we used the board at that time, both to help it become a permanent feature of our practice and to educate others hoping to follow in our footsteps. Firstly, while we could see from the previous burndown illustration that the proportion of completed to committed stories was veering towards 100%, we didn’t reach that point within the experiment timeframe. The most likely reason was that we didn’t get the initial work balance of stories to obstacles right. Just as teams use their prior sprint velocity, or an average of it, in their sprint planning activities, so too should we have tracked the time taken on obstacles to adjust that ratio. Secondly, while in this experiment we fixed the definition of an obstacle to be these data validation issues, this proved to impact the longevity of the board’s usage. As any team grows and develops over time, what causes it to slow down evolves. If you do not regularly revisit the causes of what slows you down, you may not think of those new blockers as obstacles.


How CIOs Can Nurture a Culture of Digital Transformation

Digital transformation projects have traditionally been grounded in the adoption of new business technologies that promise to unlock innovation by streamlining projects and enhancing workflows, but they typically work from the top down in a broad vision. This type of innovation is incapable of keeping up with drastically changing business needs, nor can it compete with today’s rapidly evolving digital landscape where every executive leader is working overtime to stay ahead of market volatility.  Those at the top must focus their attention on high-level initiatives that grow and unite the business. This means that business leaders must shift away from a one-dimensional approach to digital transformation in favor of a modern, hybrid model -- one that engages workers on the frontlines of the business to collaboratively identify lapses in business processes and develop innovative solutions. These are the folks that are closest to the actual work and are best positioned to identify and remediate the problems they face day-to-day. The value these workers can bring to innovation initiatives can be ground-breaking for the business, and in most companies, this potential remains largely untapped.


How Agile Coaching plays a role in unlocking the Future of Work

Now is the time to make empiricism new again. Slow down and bring our community back to the three pillars at the heart of agility: ... Transparency - Continuous attention to revealing the system around us, rather than just the defined processes and procedures. Specific focus and attention to revealing the human and relationship systems within teams and organizations and how they work together to create or impede the delivery of value; Inspection - Two perspectives on inspection are needed for the transition to the future. The first starts with self - how each individual approaches their own personal development & professionalism. The second is systemic development & professionalism - how teams, communities, and cultures collectively pursue mission-driven work. Inquiry should balance deep, exploratory questions with others guided by the pursuit of outcome-oriented ways of creating value with customers and constituents. Adaptation - The cycle to break with adaptation is change for change's sake. There must be a courageous dismantling of self-limiting beliefs, engrained patterns of behavior, and historical non-value-add metrics. Dismantling these creates space to adapt based on the results of inspection, experimentation, and evaluation of evidence that indicates where and how adjustments should be made.


What is Neuralink?

Neuralink is an ambitious neurotechnology company that’s aiming to upgrade nature’s most complex organ – the human brain. Founded by serial entrepreneur Elon Musk, it hopes to surgically implant tiny devices deep inside the skull, offering the potential to treat brain disorders and other medical problems, and give us the power to interact with and control machines using our minds. The idea currently falls quite firmly in the realm of sci-fi and is either utopian or dystopian, depending on who you talk to. Musk refers to it as a “Fitbit in your skull, with tiny wires”, but this is no easy install. The company would need to insert 3,072 electrodes connected to 96 thin, flexible threads into your brain. ... The human brain has 86 billion neurons, which send and receive information through electric signals via synapses. With Neuralink, each individual thread of the device will be connected in the brain, allowing it to monitor the activity of 1,000 brain neurons. Although that sounds like a small sample, amplified signals are recorded and interpreted as digital instructions, and information is sent back to the brain to stimulate electrical spikes. Data in the prototypes has been transmitted via a wired USB-C connection but the goal has been to create a wireless system.


Mitre ATT&CK: How it has evolved and grown

Despite its growing popularity, the data from the joint study found that users continue to have difficulty learning to use the framework. There are two fundamental challenges, Sarukkai said. "A lot of tools didn't have the ability to support it. Enterprises that don't have these products end up doing it manually, which means they aren't fully able to adopt the Mitre ATT&CK framework because they are getting inundated with instances and because they don't have the tooling they need to be effective. That's the biggest reason," he said. The second problem, Sarukkai said, is that organizations want to use ATT&CK to automate remediation and help alleviate the workload on SOC analysts. But such use requires a level of maturity with ATT&CK, and the report found that just 19% of respondents have reached that maturity level. The biggest challenge, according to Pennington, is people being overwhelmed. "We recognize that. ATT&CK for Enterprise, the main knowledge base people are using, is 156 high-level behaviors as of right now. And so, if an organization is going in and trying to just go across and immediately in one pass figure out what their stance is against 156 behaviors, they'll be overwhelmed, and we've seen that," he said.


AIOps, DevSecOps, and Beyond: Exploring New Facets of DevOps

Pushed by the pandemic, many businesses have had no choice but to rely on their digital channels, he says. As organizations focus on building up reliability and put preventive measures in place, the effort becomes data intensive, Gilfix says. “People have to sift through logs that come from applications and network devices. They have to set up monitoring and alert tools,” he says. “They have to leverage all these various forms of data to figure out where the application is working, and they have to have mature abilities to build a development staging pipeline.” That means testing the applications, simulating real world needs, and moving change management into production, Gilfix says. Finding skilled professionals capable of performing those tasks quickly with large-scale applications is a challenge. This is where AIOps, the application of artificial intelligence to make sense of that data for DevOps, comes into play, he says. “Issues can be resolved quicker,” Gilfix says. “You can pinpoint similar issues in your applications and fix them preventatively. You can leverage AI to ensure, in a decentralized manner, you’re compliant and manage risk.” AI can also be used to avoid errors downstream in the development process.
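The log-sifting step Gilfix describes can be illustrated with a toy sketch: collapse the variable parts of log lines (IPs, numbers, hex ids) into templates so that repeated issues surface as counts. Real AIOps platforms use far more sophisticated clustering and machine learning; the regexes and log lines below are purely illustrative.

```python
import re
from collections import Counter

def template(line: str) -> str:
    """Collapse variable parts (IPs, hex ids, numbers) so that log
    lines produced by the same code path map to one template."""
    line = re.sub(r"\b\d{1,3}(\.\d{1,3}){3}\b", "<IP>", line)
    line = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", line)
    line = re.sub(r"\b\d+", "<NUM>", line)
    return line

def top_issues(lines, n=3):
    """Count identical templates and return the most frequent ones."""
    return Counter(template(l) for l in lines).most_common(n)
```

Grouping "timeout connecting to 10.0.0.5 after 30s" and "timeout connecting to 10.0.0.9 after 31s" into one template turns thousands of raw lines into a short list of recurring issues, which is the kind of signal an AIOps tool then prioritizes.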


Data Privacy in a Globally Competitive Reality

At a global level, there is a spectrum of consumer data privacy regulations. On one end, the European Union's GDPR gives individuals complete control over their personal data and who can access it. Enterprises processing such data must have strict technical and organizational measures in place to ensure data protection principles such as de-identification practices or full anonymization. When data is being processed, it must be done for one of six lawful reasons and the data subject is able to revoke permission at any time. Although strict data management protects consumers' privacy, from an artificial intelligence point of view it inadvertently may limit access to critical data elements or reduce the size of the data set which ultimately could affect the ability to create accurate algorithms. Additionally, limited-size data sets can greatly impact progress on research developments. On the other end of the spectrum is China. With the largest population of internet users in the world, organizations can collect an enormous amount of data on customers that can be used in enterprise AI solutions. Because there are fewer restrictions about who can view and leverage personal data, Chinese data scientists are in many cases able to use the country's massive data sets as a competitive advantage in developing new AI algorithms.
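One of the de-identification practices the GDPR passage mentions is pseudonymization, often done with a keyed hash so the same person maps to the same token across records without exposing the identifier. The sketch below is a minimal illustration, not legal guidance; the key and record fields are invented, and under GDPR pseudonymized data (unlike fully anonymized data) still counts as personal data because it is re-identifiable by whoever holds the key.

```python
import hmac
import hashlib

# Illustrative key; in practice it would be rotated and stored
# separately from the data it protects.
SECRET_KEY = b"rotate-and-store-me-separately"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a keyed hash. Deterministic,
    so records for the same person still link up for analytics."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"email": "alice@example.com", "age_band": "30-39", "diagnosis": "T2D"}
safe_record = {**record, "email": pseudonymize(record["email"])}
```

This is exactly the tension the article describes: the analytically useful fields (age band, diagnosis) survive, but the reduced granularity and removed identifiers can limit what an AI model can learn.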



Quote for the day:

"Confident and courageous leaders have no problems pointing out their own weaknesses and ignorance." -- Thom S. Rainer

Daily Tech Digest - October 28, 2020

IT leaders adjusting to expanded role and importance since coronavirus pandemic

"IT had to ensure that their technical environment could handle the increased online demand, as well any downstream impacts to supply chain, logistics and payment applications all connected to the online engine keeping the company operating and in business. IT had to refocus efforts to enable more robust customer engagements remotely via applications and web portals." She said the best examples of this are insurance claims, government services and applications, most of which were not submitted or enabled via an application or web portal before the COVID-19 pandemic. Despite the increase in importance due to the pandemic, IT has been gaining prominence within enterprises for years, Doebel said. IT has long been moving towards the role of business-critical for several years now as technology and innovation have become synonymous with business growth and improved customer experiences.  IT teams rose to the occasion during the COVID-19 breakout and continue to drive innovation and transformation in these challenging times, she added. Important business decisions are now being put in the hands of IT workers who have to think of ways to future-proof their organizations.


5 famous analytics and AI disasters

In October 2020, Public Health England (PHE), the UK government body responsible for tallying new COVID-19 infections, revealed that nearly 16,000 coronavirus cases went unreported between Sept 25 and Oct 2. The culprit? Data limitations in Microsoft Excel. PHE uses an automated process to transfer COVID-19 positive lab results as a CSV file into Excel templates used by reporting dashboards and for contact tracing. Unfortunately, Excel spreadsheets can have a maximum of 1,048,576 rows and 16,384 columns per worksheet. Moreover, PHE was listing cases in columns rather than rows. ... The "glitch" didn't prevent individuals who got tested from receiving their results, but it did stymie contact tracing efforts, making it harder for the UK National Health Service (NHS) to identify and notify individuals who were in close contact with infected patients. In a statement on Oct. 4, Michael Brodie, interim chief executive of PHE, said NHS Test and Trace and PHE resolved the issue quickly and transferred all outstanding cases immediately into the NHS Test and Trace contact tracing system. PHE put in place a "rapid mitigation" that splits large files and has conducted a full end-to-end review of all systems to prevent similar incidents in the future.
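PHE's "rapid mitigation" of splitting large files can be sketched in a few lines. The guard below is a generic illustration, not PHE's actual pipeline; the 1,048,576-row constant is Excel's documented per-worksheet limit, and one row is reserved for the header so each chunk fits in a single sheet.

```python
import csv
import itertools

EXCEL_MAX_ROWS = 1_048_576  # hard per-worksheet row limit in modern .xlsx

def split_csv(path, chunk_size=EXCEL_MAX_ROWS - 1):
    """Split a CSV into chunks that each fit in one Excel worksheet,
    repeating the header row in every output file."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        for i in itertools.count():
            rows = list(itertools.islice(reader, chunk_size))
            if not rows:
                break
            out_path = f"{path}.part{i}.csv"
            with open(out_path, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)
                writer.writerows(rows)
            yield out_path
```

Listing cases one per row (rather than per column, as PHE reportedly did) matters because the column limit is only 16,384, so the column-wise layout overflows roughly 64 times sooner.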


Legal and security risks for businesses unaware of open source implications

The sobering reality is that compliance is not keeping up with usage of open source codebases. In view of this, businesses have to consider the impact of open source software in their operations as they move forward in a digitally connected world. Whether they are developing a product using open source components or involved in mergers and acquisitions activity, they have to conduct due diligence on the security and legal risks involved. One approach that has been proposed is to have a Bill of Materials (BOM) for software. Just like BOM used commonly by manufacturers of hardware, such as smartphones, a BOM for software will list the components and dependencies for each application and offer more visibility. In particular, a BOM generated by an independent software composition analysis (SCA) will offer advanced understanding for businesses seeking to understand the foundation on which they are building so many of their applications. Awareness is key to improvement. For starters, businesses cannot patch what they don't know they have. Patches must match source, so they know their code's origin. Open source is not only about source, either. 
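To make the BOM idea concrete, here is a minimal sketch of what a software bill of materials entry might contain. The field names and `sbom_version` tag are invented for illustration; real SCA tools emit established formats such as SPDX or CycloneDX, and the component inventory below is hypothetical.

```python
import json

def make_sbom(components):
    """Build a minimal software bill of materials: one entry per
    component, with name, version, license, and declared dependencies."""
    return {
        "sbom_version": "1.0",  # illustrative tag, not a real standard
        "components": [
            {
                "name": c["name"],
                "version": c["version"],
                "license": c.get("license", "UNKNOWN"),
                "dependencies": sorted(c.get("dependencies", [])),
            }
            for c in components
        ],
    }

# Hypothetical inventory for an application under review
inventory = [
    {"name": "openssl", "version": "1.1.1h", "license": "Apache-2.0"},
    {"name": "log4j-core", "version": "2.13.3", "license": "Apache-2.0",
     "dependencies": ["log4j-api"]},
]
print(json.dumps(make_sbom(inventory), indent=2))
```

Pinning exact versions is what makes the "patches must match source" point actionable: a security team can diff this inventory against vulnerability advisories only if it records precisely which build of each component shipped.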


Building a hybrid SQL Server infrastructure

The solution to this challenge is to build a SANless failover cluster using SIOS DataKeeper. SIOS DataKeeper performs block-level replication of all the data on your on-prem storage to the local storage attached to your cloud-based VM. If disaster strikes your on-prem infrastructure and the WSFC fails SQL Server over to the cloud-based cluster node, that cloud-based node can access its own copy of your SQL Server databases and can fill in for your on-prem infrastructure for as long as you need it to. One other advantage afforded by the SANless failover cluster approach is that there is no limit on the number of databases you can replicate. Where you would need to upgrade to SQL Server Enterprise Edition to replicate your user databases to a third node in the cloud, the SANless clustering approach works with both the SQL Server Standard and Enterprise editions. While SQL Server Standard Edition is limited to two nodes in the cluster, DataKeeper allows you to replicate to a third node in the cloud with a manual recovery process. With Enterprise Edition the third node in the cloud can simply be part of the same cluster.


Why Enterprises Struggle with Cloud Data Lakes

The success of any cloud data lake project hinges on continual changes to maximize performance, reliability and cost efficiency. Each of these variables requires constant and detailed monitoring and management of end-to-end workloads. Consider the evolution of data processing engines and the importance of leveraging the most advantageous opportunities around price and performance. Managing workload price performance and cloud cost optimization is just as crucial to cloud data lake implementations, where costs can and will quickly get out of hand if proper monitoring and management aren’t in place. ... Public cloud resources aren’t private by default. Securing a production cloud data lake requires extensive configuration and customization efforts–especially for enterprises that must fall in line with specific regulatory compliance oversight and governance mandates (HIPAA, PCI DSS, GDPR, etc). Achieving the requisite data safeguards often means enlisting experienced and dedicated teams who are equipped to lock down cloud resources and restrict access to authorized and credentialed users only.


The No-Code Generation is arriving

Of course, no-code tools often require code, or at least, the sort of deductive logic that is intrinsic to coding. You have to know how to design a pivot table, or understand what machine learning capability is and what it might be useful for. You have to think in terms of data, and about inputs, transformations and outputs. The key here is that no-code tools aren’t successful just because they are easier to use — they are successful because they are connecting with a new generation that understands precisely the sort of logic required by these platforms to function. Today’s students don’t just see their computers and mobile devices as consumption screens and have the ability to turn them on. They are widely using them as tools of self-expression, research and analysis. Take the popularity of platforms like Roblox and Minecraft. Easily derided as just a generation’s obsession with gaming, both platforms teach kids how to build entire worlds using their devices. Even better, as kids push the frontiers of the toolsets offered by these games, they are inspired to build their own tools. There has been a proliferation of guides and online communities to teach kids how to build their own games and plugins for these platforms (Lua has never been so popular).


Digital transformation: 4 contrarian tips for measuring success

A CIO once told me that his employees felt confused about how their transformation progress was going. I asked, “How many transformations are you doing right now?” He started listing and realized that his team had 15 simultaneous ongoing changes. Worse, every change included different touchpoints for every individual end user, which created even more confusion for those who didn’t understand why the change was happening. Every incremental digitalization initiative should have a person or team responsible for it – the CIO, CTO, or CEO, or perhaps the internal services organization if it’s driving internal efficiency. In the cases of disruptive innovation, it should take place where it's easy to let go of the past ways of doing things, typically in a separate innovation unit. Measure the outcomes you’re looking to achieve and communicate from an outcome perspective, often through a story – and if your transformation does not fit into your objectives and key results or KPIs ... However, too much of either can hurt your progress and indicate a wider problem in your organization: Either you sweep negative feedback under the rug and focus only on the positive, which creates a culture of fear, or you focus only on the negative and forget to celebrate the good stuff, which can destroy motivation and cause a complaint culture.


Role Of E-Commerce In Driving Technology Adoption For Indian Warehousing Sector

Global supply chains and logistics sectors have undergone a major disruption during the past few months, thanks to the pandemic. Several first-time users logged on to e-commerce websites to make safe, virtual purchases for essentials and had a contactless delivery experience at their doorstep. The sector also witnessed a major shift in popular categories, from luxury and lifestyle purchases to shopping for basic essentials such as groceries, medicines, office and school supplies, e-learning tools and even food delivery. As per an impact report released by Uni-commerce, titled E-commerce Trends Report 2020, e-commerce has witnessed an order-volume growth of 17 per cent as of June 2020, and about 65 per cent growth in single brand e-commerce platforms. However, in spite of challenges such as manufacturing slowdown, shortage of labour, transportation bottlenecks, and disruption in national and international movement of cargo, the massive rise of e-commerce has brought about faster digital adoption and enhanced the potential for overall growth of the sector. With a focus on meeting consumer expectations for speedy delivery, customization, product availability and easy returns while handling complex globalization of supply chains, warehousing trends have witnessed major shifts.


A robot referee can really keep its ‘eye’ on the ball

Human umps may feel hot or tired. They may have the sun in their eyes or become distracted by a mosquito. They may even unintentionally favor players of certain nationalities, races, ages or backgrounds. A machine will not experience any of these problems. So how does the machine do it? Engineers must first spend several days setting up each stadium that will use the system. They measure the precise position of all the lines and “create a virtual-reality world to mirror what is in the stadium,” explains Hicks. They also set up 12 cameras. These will watch every part of the area where the game takes place. Then the engineers run tests — lots of them — to make sure everything works as it should. During a match, those cameras capture a ball’s flight. Software finds the tennis ball in the video. It can do this in bright, overcast or shadowy conditions. A video camera doesn’t capture every single moment of the ball’s flight, however. It actually takes many still photos very quickly. The number of photos it can take in one second is called the frame rate. In each frame, the ball will be in a new position. The system uses math to calculate a smooth path between all these positions. It also takes wind conditions into account.
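The "math to calculate a smooth path" between frames can be illustrated with the simplest possible version: a ball under gravity follows a parabola, and three consecutive frame samples determine that parabola exactly, so the system can evaluate the ball's position at any instant between frames. The frame rate and sample values below are illustrative, not the real system's specification.

```python
FRAME_RATE = 340.0  # frames per second; illustrative only

def fit_parabola(y0, y1, y2):
    """Coefficients (a, b, c) of the unique parabola y(t) = a*t^2 + b*t + c
    through heights sampled at frame indices t = 0, 1, 2, via finite
    differences."""
    a = (y2 - 2 * y1 + y0) / 2.0
    b = y1 - y0 - a
    c = y0
    return a, b, c

def height_at(coeffs, t_seconds):
    """Evaluate ball height at an arbitrary time between frames."""
    a, b, c = coeffs
    t = t_seconds * FRAME_RATE  # convert seconds to fractional frame index
    return a * t * t + b * t + c
```

In practice the system fuses 12 camera views, wind conditions and many more samples, but the core trick is the same: reconstruct a continuous trajectory from discrete frames, then read off exactly where the ball was when it met the court.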


That dreadful VPN might finally be dead thanks to Twingate

So what does Twingate ultimately do? For corporate IT professionals, it allows them to connect an employee’s device into the corporate network much more flexibly than VPN. For instance, individual services or applications on a device could be set up to securely connect with different servers or data centers. So your Slack application can connect directly to Slack, your JIRA site can connect directly to JIRA’s servers, all without the typical round-trip to a central hub that VPN requires. That flexibility offers two main benefits. First, internet performance should be faster, since traffic is going directly where it needs to rather than bouncing through several relays between an end-user device and the server. Twingate also says that it offers “congestion” technology that can adapt its routing to changing internet conditions to actively increase performance. More importantly, Twingate allows corporate IT staff to carefully calibrate security policies at the network layer to ensure that individual network requests make sense in context. For instance, if you are a salesperson in the field and suddenly start trying to access your company’s code server, Twingate can identify that request as highly unusual and outright block it.
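The "requests make sense in context" idea can be sketched as a tiny policy check: entitlement by role, plus a comparison against the user's usual behavior. Twingate's actual engine is proprietary and weighs far more signals (device posture, location, history); the roles, resources and decisions below are invented for illustration, including the salesperson-hits-the-code-server case from the article.

```python
# Illustrative role-to-resource entitlements
POLICY = {
    "sales": {"crm", "email", "slack"},
    "engineering": {"code-server", "ci", "slack"},
}

def authorize(role: str, resource: str, baseline: set) -> str:
    """Allow a request only if the role is entitled to the resource;
    flag entitled-but-unusual requests (not in the user's observed
    baseline) for step-up verification."""
    allowed = POLICY.get(role, set())
    if resource not in allowed:
        return "block"      # e.g. a salesperson hitting the code server
    if resource not in baseline:
        return "challenge"  # entitled, but new behavior for this user
    return "allow"
```

The per-resource decision is what distinguishes this model from a traditional VPN, which grants a device broad network access once the tunnel is up.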



Quote for the day:

"In simplest terms, a leader is one who knows where he wants to go, and gets up, and goes." -- John Erksine

Daily Tech Digest - October 27, 2020

How realistic is the promise of low-code?

“Grady Booch, one of the fathers of modern computer science, said the whole history of computer science is adding new layers of abstraction on top of existing technology. Low-code is simply a layer of abstraction that makes the process of defining logic far more accessible for most people. “Even children are being taught to code through visual programming languages such as MIT‘s Scratch. Just as humans communicate through both words and pictures, with a picture being worth roughly 1,000 words, developers can develop using both code and low-code or visual programming languages. “Visual language is much more accessible for many people, as well as much safer. So many business users who are great subject matter experts can make small dips into defining logic or user interfaces through low-code systems, without necessarily having to commit hours and days to developing a feature through more sophisticated methods.” ...  Tools that use a visual node editor to create code paths are impressive but the code still exists as a base layer for advanced control. I once built a complete mobile video game using these visual editors. Once workflows get slightly more complex it’s helpful to be able to edit the code these tools generate.
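The "visual node editor" model the author describes can be shown in miniature: the editor serializes boxes and arrows into a graph of nodes, and either generates code from it or interprets it directly. The graph shape and node names below are a toy invention, not any real low-code product's format.

```python
import operator

# Supported operation nodes; a real tool would have hundreds
OPS = {"add": operator.add, "mul": operator.mul}

def evaluate(graph, node_id):
    """Recursively evaluate a node by first evaluating its inputs.
    Constant nodes carry a 'value'; operation nodes name an op and
    list the node ids wired into them."""
    node = graph[node_id]
    if "value" in node:
        return node["value"]
    args = [evaluate(graph, i) for i in node["inputs"]]
    return OPS[node["op"]](*args)

# (price + tax) * quantity, as a user might wire it up visually
graph = {
    "price": {"value": 9.0},
    "tax": {"value": 1.0},
    "qty": {"value": 3},
    "subtotal": {"op": "add", "inputs": ["price", "tax"]},
    "total": {"op": "mul", "inputs": ["subtotal", "qty"]},
}
```

This is the layering Booch describes: the graph is one abstraction above the code, and when workflows outgrow the boxes and arrows, the generated or underlying code is still there to edit.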


“The Surgical Team” in XXI Century

In the surgical team of the XXI century, every artifact shall have a designated owner. With ownership comes responsibility for the quality of the artifact, which is assessed by the people who consume it (for example, consumers of designs are developers, and consumers of code are other developers who need to review it or interface with it). Common ownership as advocated by Extreme Programming can only emerge as the highest form of individual ownership in highly stable teams of competent people who have additionally developed interpersonal relationships (a.k.a. friendship), and feel obligated to support one another. In other situations, collective ownership will end up in a tragedy of the commons caused by social loafing. Each team member will complete his assignments with the least possible effort, pushing the consequences of low quality onto others (the quality of product artifacts becomes "the commons"). This is also the reason why software development outsourcing is not capable of producing quality solutions. The last pillar is respect. It is important for the architect and administrator not to treat developers, testers and automation engineers as replaceable grunts (a.k.a. resources). An architect, being the front-man of the team, needs to be knowledgeable and experienced, but it doesn’t mean that developers or testers aren’t.


The great rebalancing: working from home fuels rise of the 'secondary city'

There are already signs of emerging disparity. Weekday footfall in big urban centres, which plummeted during lockdown, has not bounced back – the latest figures suggest less than one-fifth of UK workers have returned to their physical workplaces – which has led to reductions in public transport. This disadvantages low-income workers and people of colour, and has led to job losses at global chains such as Pret a Manger and major coffee franchises. Meanwhile, house prices in the Hamptons have reached record highs as wealthy New Yorkers have opted to weather the pandemic at the beach. Companies have also started capitalising on reduced occupancy costs – potentially passing them on to workers. The US outdoors retailer REI plans to sell its brand-new Seattle campus, two years in the making, in favour of smaller satellite sites. In the UK, government contractor Capita is to close more than a third of its 250 offices after concluding its 45,000 staff work just as efficiently at home. Not every community will be able to take advantage of the remote working boom, agrees Serafinelli. Those best placed to do so already have – or are prepared to invest in – good-quality schools, healthcare and transport links.


Deno Introduction with Practical Examples

Deno was originally announced in 2018 and reached 1.0 in 2020, created by the original Node.js founder Ryan Dahl and other mindful contributors. The name DE-NO may seem odd until you realize that it is simply the interchange of NO-DE. The Deno runtime: Adopts security by default. Unless explicitly allowed, Deno disallows file, network, or environment access; Includes TypeScript support out-of-the-box; Supports top-level await; Includes built-in unit testing and code formatting (deno fmt); Is compatible with browser JavaScript APIs: Programs authored in JavaScript without the Deno namespace and its internal features should work in all modern browsers; Provides a one-file executable bundler through deno bundle command which lets you share your code for others to run without installing Deno. ... Putting simplicity and security into consideration, Deno ships with some browser-related APIs which allows you to create a web server with little or no difference from a client-side JavaScript application, with APIs including fetch(), Web Worker and WebAssembly. You can create a web server in Deno by importing the http module from the official repo. Although there are already many libraries out there, the Deno system has also provided a straightforward way to accomplish this.


How to Successfully Integrate Security and DevOps

As digitalization transforms industries and business models, organizations increasingly are adopting modern software engineering practices such as DevOps and agile to become competitive in the modern marketplace. DevOps enables organizations to release new products and features faster, but this pace and frequency of application releases can conflict with established practices of handling security and compliance. This leads to the enterprise paradox to go faster and innovate but stay secure by avoiding compromises on controls. However, integrating security into DevOps efforts (DevSecOps) across the whole product life cycle rather than being handled independently or left until the end of the development process after a product is released can help organizations significantly reduce their risk posture, making them more agile and their products more secure and reliable. When properly implemented, DevSecOps offers immense benefits such as easy remediation of vulnerabilities and a tool to mitigate against cost overruns due to delays. It also enables developers to tackle security issues more quickly and effectively.


Forrester: CIOs must prepare for Brexit data transfer

According to the Information Commissioner’s Office (ICO), while the government has said that transfers of data from the UK to the European Economic Area (EEA) will not be restricted, from the end of the transition period GDPR transfer rules will apply to any data coming from the EEA into the UK unless the European Commission makes an adequacy decision. The ICO website recommends that businesses consider what GDPR safeguards they can put in place to ensure that data can continue to flow into the UK. Forrester also highlighted the lack of an adequacy decision, which it said would affect the supply chain of every business that relies on technology infrastructure in the UK when handling European citizens’ personal data. The analyst firm predicted that cloud providers will start to offer ways for their customers to make this transition. The report’s authors recommended that companies focus on assessing compliance with UK data protection requirements, including the UK’s GDPR, determine how the lack of an adequacy decision will affect data transfers, and work on a transition strategy. While the ICO is the UK’s supervisory authority (SA) for the GDPR, in July the European Data Protection Board (EDPB) stated that the ICO will no longer qualify as a competent SA under the GDPR at the end of the transition period.


Ransomware vs WFH: How remote working is making cyberattacks easier to pull off

"You have a much bigger attack surface; not necessarily because you have more employees, but because they're all in different locations, operating from different networks, not working within the organisation's perimeter network, on multiple types of devices. The complexity of the attack surface grows dramatically," says Shimon Oren, VP of research and deep learning at security company Deep Instinct. For many employees, the pandemic could be the first time they have ever worked remotely. And being isolated from the corporate environment – a place where they might see or hear warnings about cybersecurity and staying safe online on a daily basis, and where they can ask for advice directly in person – makes it harder to make good decisions about security. "That background noise of security is kind of gone and that makes it a lot harder, and security teams have to do a lot more on messaging now. People working at home are more insular; they can't lean over and ask 'did you get a weird link?' – you don't have anyone to do that with, and you're making choices yourself," says Sherrod DeGrippo, senior director of threat research at Proofpoint. "And the threat actors know it and love it. We've created a better environment for them," she adds.


Machine learning in network management has promise, challenges

It’s difficult to say how rapidly enterprises are buying AI and ML systems, but analysts say adoption is in the early stages. One sticking point is confusion about what, exactly, AI and ML mean. Those imagining AI as being able to effortlessly identify attempted intruders and to analyze and optimize traffic flows will be disappointed. The use of the term AI to describe what’s really happening with new network management tools is something of an overstatement, according to Mark Leary, research director at IDC. “Vendors, when they talk about their AI/ML capabilities, if you get an honest read from them, they’re talking about machine learning, not AI,” he said. There isn’t a hard-and-fast definitional split between the two terms. Broadly, they both describe the same concept—algorithms that can read data from multiple sources and adjust their outputs accordingly. But AI is more accurately applied to more robust expressions of that idea than to a system that can identify the source of a specific problem in an enterprise computing network, according to experts. “We’re probably overusing the term AI, because some of these things, like predictive maintenance, have been in the field for a while now,” said Jagjeet Gill, a principal in Deloitte’s strategy practice.


The Past and Future of In-Memory Computing

“With the explosion in the adoption of IoT (which is soon to be catalyzed by 5G wireless networking), countless data sources in our daily life now generate continuous streams of data that need to be mined to save lives, improve efficiency, avoid problems and enhance experiences,” Bain says in an email to Datanami. “Now we can track vehicles in real-time to keep drivers safe, ensure the safe and rapid delivery of needed goods, and avoid unexpected mechanical failures. Health-tracking devices can generate telemetry that enables diagnostic algorithms to spot emerging issues, such as heart irregularities, before they become urgent. Websites can track e-commerce shoppers to assist them in finding the best products that meet their needs.” IMDGs aren’t ideal for all streaming or IoT use cases. But when the use case is critical and time is of the essence, IMDGs will have a role in orchestrating the data and providing fast response times. “The combination of memory-based storage, transparent scalability, high availability, and integrated computing offered by IMDGs ensures the most effective use of computing resources and leads to the fastest possible responses,” Bain writes. “Powerful but simple APIs enable application developers to maintain a simplified view of their data and quickly analyze it without bottlenecks. IMDGs offer the combination of power and ease of use that applications managing live data need more than ever before.”
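A rough feel for the pattern Bain describes, memory-based storage with computing brought to the data, can be given in a few lines. The Grid class and its method names below are hypothetical, not the API of any real IMDG product, and a production grid would add partitioning, replication and distribution across nodes.

```typescript
// Hypothetical sketch of the IMDG usage pattern: keyed in-memory
// storage plus "integrated computing", i.e. running analysis next to
// the data instead of shipping it to a separate compute tier.
class Grid<K, V> {
  private store = new Map<K, V>();
  put(key: K, value: V): void { this.store.set(key, value); }
  get(key: K): V | undefined { return this.store.get(key); }
  // Apply a function to the stored value where it lives.
  compute<R>(key: K, fn: (v: V) => R): R | undefined {
    const v = this.store.get(key);
    return v === undefined ? undefined : fn(v);
  }
}

// e.g. live vehicle telemetry keyed by vehicle id:
const telemetry = new Grid<string, { speedKph: number }>();
telemetry.put("truck-7", { speedKph: 92 });
const speeding = telemetry.compute("truck-7", (t) => t.speedKph > 80);
```

The point of the sketch is the `compute` call: the analysis runs against data already resident in memory, which is where the low-latency responses come from.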


Work from home strategies leave many companies in regulatory limbo

A solution for this crucial predicament is a potential temporary regulatory grace period. Regulatory bodies or lawmakers could establish a window of opportunity for organizations to self-identify the type and duration of their non-compliance, what investigations were done to determine that no harm came to pass, and what steps were, or will be, taken to address the issue. Currently, the concept of a regulatory grace period is slowly gaining traction in Washington, but time is of the essence. Middle market companies are quickly approaching the time when they will have to determine just what to disclose during these upcoming attestation periods. Companies understand that mistakes were made, but those issues would not have arisen under normal circumstances. The COVID-19 pandemic is an unprecedented event that companies could never have planned for. Business operations and personal safety initially consumed management’s thought processes as companies scrambled to keep the lights on. Ultimately, many companies made the right decisions from a business perspective to keep people working and avoid suffering a data breach, even in a heightened environment of data security risks. Any grace period would not absolve the organization of responsibility for any regulatory exposures.



Quote for the day:

"Our expectation in ourselves must be higher than our expectation in others." -- Victor Manuel Rivera

Daily Tech Digest - October 26, 2020

How to hold Three Amigos meetings in Agile development

Three Amigos meetings remove uncertainty from development projects, as they provide a specified time for everyone to get on the same page about what to -- or not to -- build. "The meeting exposes any potential assumptions and forces explicit answers," said Jeff Sing, lead software QA engineer at Optimizely, a digital experience optimization platform. "Everyone walks away with crystal-clear guidelines on what will be delivered and gets ahead of any potential scope creep." For example, a new feature entails new business requirements, engineering changes, UX flow and design. Each team faces its own challenges and requirements. The business requirements focus on a broad problem space, and how to monetize the product. The engineering requirements center on the technical solution and hurdles. The UX requirements define product usability. The design requirements ensure the product looks finished. All of these requirements might align -- or they might not. "This is why a formalized meeting needs to occur to hash out how to achieve everyone's goals, or which requirements will not be met and need to be dropped in order to build the right product on the right time schedule," Sing said.


Key success factors behind intelligent automation

For an intelligent automation programme to really deliver, a strategy and purpose are needed. This could be improving data quality, operational efficiency, process quality and employee empowerment, or enhancing stakeholder experiences by providing quicker, more accurate responses. Whatever the rationale, an intelligent automation strategy must be aligned with the wider needs of the business. Ideally, key stakeholders should be involved in creating the vision; if they haven’t been, engage them now. If they see intelligent automation as a strategic business project, they’ll support it and provide the necessary financial and human resources too. Although intelligent automation is usually managed by a business team, it will still be governed by the IT team using existing practices, so they must also be involved from the beginning. IT will support intelligent automation on many critical fronts, such as compliance with IT security, auditability, the supporting infrastructure, its configuration and scalability. To ensure intelligent automation can scale as demand increases, plan where it sits within the business. A centralised approach encompasses the entire organisation, so it may be beneficial to embed this into a ‘centre of excellence’ (CoE) or start moving towards creating this operating environment.


Why Most Organizations’ Investments in AI Fall Flat

A common mistake companies make is creating and deploying AI models using Agile approaches fit for software development, like Scrum or DevOps. These frameworks traditionally require breaking down a large project into small components so that they can be tackled quickly and independently, culminating in iterative yet stable releases, like constructing a building floor by floor. However, AI is more like a science experiment than a building. It is experiment-driven, where the whole model development life cycle needs to be iterated—from data processing to model development and eventually monitoring—and not just built from independent components. These processes feed back into one another; therefore, a model is never quite “done.” ... We know AI requires specialized skill sets—data scientists remain highly sought-after hires in any enterprise. But it’s not just the data scientists who build the models and product owners who manage the functional requirements who are necessary in order for AI to work. The emerging role of machine-learning engineer is required to help scale AI into reusable and stable processes that your business can depend on. Professionals in model operations (model ops) are specialized technicians who manage post-deployment model performance and are ultimately responsible for ongoing stability and continuity of operations.


Cybersecurity as a public good

The necessity to privately provision cyber security has resulted in a significant gap between the demand for cyber security professionals and the supply of professionals with appropriate skills. Multiple studies have identified cyber security as the domain with one of the largest skills gaps. When a significant skills gap occurs in the market, two things follow. The remuneration demanded by professionals skyrockets, since many employers are chasing scarce resources. And professionals who are not so skilled also survive — rather, thrive — since the lack of alternatives means they remain in demand. ... Security as a public good involves trade-offs with privacy. Whether it is police patrols or CCTV cameras — a trade-off with privacy is imperative to make security a public good. The privacy trade-off risks will be higher in the cyber world because technology provides the capability to conduct surveillance at larger scale and greater depth. It is crucial, delicate — and hence difficult — to strike the right balance between security and privacy such that the extent of privacy sacrificed meets the test of proportionality. However, the complexity of the task, and the risks associated with it, should not deter us from attempting to strike that balance.


The Art and Science of Architecting Continuous Intelligence

Loosely defined, machine data is generated by computers rather than individuals. IoT equipment sensors, cloud infrastructure, security firewalls and websites all throw off a blizzard of machine data that measures machine status, performance and usage. In many cases the same math can analyze machine data for distinct domains, identifying patterns, outliers, etc. Enterprises have well-established processes such as security information and event management (SIEM), and IT operations (ITOps), that process machine data. Security administrators, IT managers and other functional specialists use mature SIEM and ITOps processes on a daily basis. Generally, these architectures perform similar functions as in the first approach, although streaming is a more recent addition. Another difference is that many machine-data architectures have more mature search and index capabilities, as well as tighter integration with business tasks and workflow. Data teams typically need to add the same two functions to complete the CI picture. First, they need to integrate doses of contextual data to achieve similar advantages as those outlined above. Second, they need to trigger business processes, which in this case might mean hooking into robotic process automation tools.


Fintech Startups Broke Apart Financial Services. Now The Sector Is Rebundling

When fintech companies began unbundling, the tools got better but consumers ended up with 15 personal finance apps on their phones. Now, a lot of new fintechs are looking at their offerings and figuring out how to manage all of a person’s personal finances so that other products can be enhanced, said Barnes. “We are not trying to be a bunch of products, but more about how each product helps the other,” Barnes said. “If we offer a checking account, we can see income coming in and be able to give you better access to borrowing. That is the rebuild—how does fintech serve all of the needs, and how do we leverage it for others?” Traditional banking revolves around relationships for which banks can sell many products to maximize lifetime value, said Chris Rothstein, co-founder and CEO of San Francisco-based sales engagement platform Groove, in an interview. Rebundling will become a core part of workflow and a way for fintechs to leverage those relationships to then be able to refer them to other products, he said. “It makes sense long-term,” Rothstein said in an interview. “In financial services, many people don’t want all of these organizations to have their sensitive data. Rebundling will also force incumbents to get better.”


Microsoft Glazes 5G Operator Strategy

Microsoft’s 5G strategy links the private Azure Edge Zones service it announced earlier this year, Azure IoT Central, virtualized evolved packet core (vEPC) software it gained by acquiring Affirmed Networks, and cloud-native network functions it brought onboard when it acquired Metaswitch Networks. Combining those services under a broader portfolio allows Microsoft to “deliver virtualized and/or containerized network functions as a service on top of a cloud platform that meets the operators where they are, in a model that is accretive to their business,” Hakl said.  “We want to harness the power of the Azure ecosystem, which means the developer ecosystem, to help [operators] monetize network slicing, IoT, network APIs … [and] use the power of the cloud” to create the same type of elastic and scalable architecture that many enterprises rely on today, he explained. That vision is split into two parts: the Azure Edge Zones, which effectively extends the cloud to a private edge environment, and the various pieces of software that Microsoft has assembled for network operators. On the latter, Hakl said Microsoft “could have gone out and had our customers teach us that over time. Instead, we acquired two companies that brought in hundreds of engineers that have telco DNA and understand the space.”


Artificial intelligence for brain diseases: A systematic review

Among the various ML solutions, Deep Neural Networks (DNNs) are nowadays considered the state-of-the-art solution for many problems, including tasks on brain images. Such human brain-inspired algorithms have been proven capable of extracting highly meaningful statistical patterns from large-scale, high-dimensional datasets. A DNN is a DL algorithm that aims to approximate some function f*. For example, a classifier can be seen as a function y = f*(x; θ) mapping a given input x to a category labeled y, where θ is the vector of parameters that the model learns in order to make the best approximation of f*. Artificial Neural Networks (ANNs) are built out of a densely interconnected set of simple units, where each unit takes a number of real-valued inputs (possibly the outputs of other units) and produces a single real-valued output (which may become the input to many other units). DNNs are called networks because they are typically represented by composing together many functions. The overall length of the chain gives the depth of the model; from this terminology, the name “deep learning” arises.
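The chain-of-functions idea in the last sentences can be written out explicitly; for a network of depth three:

```latex
% A depth-3 network as a composition of per-layer functions
f(\mathbf{x}) = f^{(3)}\!\left(f^{(2)}\!\left(f^{(1)}(\mathbf{x})\right)\right)
```

where f^(1) is the first layer, f^(2) the second and f^(3) the output layer; the length of this chain is the depth of the model, and training adjusts the parameters θ so that f(x; θ) best approximates the target function f*.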


Things to Consider about Brain-Computer Interface Tech

A BCI is a system that provides a direct connection between your brain and an electronic device. Since your brain runs on electrical signals like a computer, it could control electronics if you could connect the two. BCIs attempt to give you that connection. There are two main types of BCI — invasive and non-invasive. Invasive devices, like the Neuralink chip, require surgery to implant them into your brain. Non-invasive BCIs, as you might’ve guessed, use external gear you wear on your head instead. ... A recent study suggested that brain-computer interface technology, and NeuraTech in general, could measure worker comfort levels in response to their environment. They could then automatically adjust the lights and temperature to make workers more comfortable and minimize distractions. Since distractions take up an average of 2.1 hours a day, these BCIs could mean considerable productivity boosts. The Department of Defense is developing BCIs for soldiers in the field. They hope these devices could let troops communicate silently or control drones with their minds. As promising as BCIs may be, there are still some lingering concerns with the technology. While the Neuralink chip may be physically safe, it raised a lot of questions about digital security.


Microsoft did some research. Now it's angry about what it found

A fundamental problem, said Brill, is the lack of trust in society today. In bold letters, she declared: "The United States has fallen far behind the rest of the world in privacy protection." I can't imagine it's fallen behind Russia, but how poetic if that were true. Still, Brill really isn't happy with our government: "In total, over 130 countries and jurisdictions have enacted privacy laws. Yet, one country has not done so yet: the United States." Brill worries our isolation isn't too splendid. She mused: "In contrast to the role our country has traditionally played on global issues, the US is not leading, or even participating in, the discussion over common privacy norms." That's like Microsoft not participating in the creation of excellent smartphones. It's not too smart. Brill fears other parts of the world will continue to lead on privacy, while the US continues to lead in inaction and chaos. It sounds like the whole company is mad as hell and isn't going to take it anymore. Yet it's not as if Microsoft has truly spent the last 20 years championing privacy much more than most other big tech companies. In common with its west coast brethren, it's been too busy making money.



Quote for the day:

"Leadership is about carrying on when everyone else has given up" -- Gordon Tredgold

Daily Tech Digest - October 25, 2020

Meet modern compliance: Using AI and data to manage business risk better

Strong, tech-enabled, third-party risk management capabilities can strengthen corporate governance, which will in turn enhance reputation and build trust. In essence, compliance should no longer be seen simply as a backroom cost center. Rather, it is a means of strengthening the business brand, increasing productivity, and driving growth of market share, with relevance at the C suite and at the board level. ... “By engaging early in the sales contract life cycle and providing compliance oversight and ongoing risk education, we [at Microsoft] have been able to realize better, more compliant deal construction. This is critical at quarter-end when deal volumes spike. Sellers internalize the risk guidance and proactively ensure their contract meets the company’s compliance standards — often reducing monetary concessions that improve margin and profitability.” Four years ago, PwC and Microsoft worked closely together to further develop a tech-enabled compliance analytics suite of tools called Risk Command. “We started the journey to respond to internal and external pressures to embrace a ‘data-driven’ approach,” Gibson recalled. “But it appears to be what regulators are now expecting and serves as a benchmark for what others may want to do.”


Is The Cybersecurity Industry Selling Lemons? Apparently Lots Of Important CISOs Think it Is

If it’s true that poor products have contributed to the success of cyberattacks then something must be wrong, but what? The report’s thesis – which borrows its title from economist George Akerlof’s Nobel Prize-winning 1970 paper on the same topic – doesn’t sugar-coat it: cybersecurity has become an industry that keeps churning out lemons that not enough people complain about. Searing tech skepticism is nothing new, of course – Clifford Stoll’s Silicon Snake Oil or Michael Lewis’s satirical The New New Thing come to mind – but those were about issues people have already processed (the internet going wrong, dotcom excess). Cybersecurity, by contrast, is all that stands between us and a world where criminality contorts the system in ways that cost livelihoods and whole economies. Bad cybersecurity isn’t just inconvenient, it’s dangerous, and somebody needs to say this now. The authors believe the underlying problem is economic rather than technical. Technology doesn’t work as claimed because the market relationship between customers and vendors has broken down. This manifests as an ‘information asymmetry’ where vendors know how good their product is, but their customers not only don’t know but don’t have time to find out.


How advanced AI language tools could change the workplace

Within the last decade, some of the most notable breakthroughs in artificial intelligence (AI) have come in the form of computer vision. Essentially giving robotic systems ‘eyesight’, the ability to identify and classify objects using image or video recognition, the technology has been put to use in everything from facial recognition systems and quality control in manufacturing to spotting anomalies in MRI scans and self-driving vehicle systems. And while computer vision applications are still comparatively nascent, the ‘breakthrough’ AI applications of the decade ahead might well come in the form of advances in language-based applications. AI research and deployment company OpenAI developed the largest language model ever created this year, GPT-3. The software can generate human-like text on demand and is set to be turned into a commercial product later this year, offered as a paid-for subscription via the cloud for businesses. It represents a leap forward from previous language-processing approaches that used hand-coded rules, statistical techniques and, increasingly, artificial neural networks — which can learn from raw data, with less reliance on data labelling — to perform language processing.


Open-source software detects potential collisions in radiotherapy plans

The RadCollision software must be embedded into each TPS database, and a folder of STL files containing the 3D models of the machines must be prepared. RadCollision is currently limited to use with the RayStation TPS, but versions for other commercial TPSs are planned, says first author Fernando Hueso-González. The researchers quantitatively evaluated their software using the RayStation TPS with four patient treatment plans that had been found infeasible during previous collision checks by therapists. The software reported collisions with the couch at angles similar to those reported experimentally. The team also tested the software with a model of a proton treatment room and a robotic patient positioning system. “In one case, we tested in the RadCollision software a beam where the dosimetrist doubted that there was enough clearance with the toe of a patient’s foot. RadCollision predicted that clearance would be very tight, but the irradiation-optimized TPS was feasible,” comments Remillard. “When we performed a dry run, there was no collision.” The team note that the reliability of the collision assessment depends upon the accuracy of the input data.
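To illustrate the kind of geometric test such tools perform (RadCollision itself works on full STL meshes, not the crude bounding spheres used here), a minimal clearance check might look like the following; all coordinates and radii are invented for the example.

```typescript
// Illustrative clearance check: approximate each body (e.g. gantry
// head, couch, patient surface) by a bounding sphere and compare
// centre-to-centre distance against the sum of the radii.
interface Sphere { x: number; y: number; z: number; r: number; }

function clearance(a: Sphere, b: Sphere): number {
  const d = Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
  return d - (a.r + b.r); // negative => the spheres overlap (collision)
}

// Hypothetical gantry head vs. couch edge at one gantry angle (cm):
const gantryHead: Sphere = { x: 0, y: 50, z: 0, r: 30 };
const couchEdge: Sphere = { x: 0, y: 20, z: 0, r: 5 };
const gap = clearance(gantryHead, couchEdge); // negative: flag a collision
```

A real tool repeats a mesh-based version of this test over the full range of gantry and couch angles in a plan, reporting the angles at which clearance goes negative.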


Technology is about to destroy millions of jobs. But, if we're lucky, it will create even more

CIOs are effectively banking on AI systems and machines to carry out tasks that would have previously been taken on manually. For example, the WEF predicts that in 2025, machines will be performing up to 65% of information and data processing and retrieval, leaving only 35% of the job to humans. This means that some roles are set to become increasingly redundant in the next few years. Data entry clerks, accountants and auditors, and factory workers are among the jobs that the WEF expects to be particularly displaced by automation. At the same time, growth in so-called "jobs of tomorrow" will offset the lack of demand for workers in jobs that can be filled by machines. Leading the polls for positions in growing demand are roles linked to the green economy, data, AI, and cloud computing. Think data analysts, machine-learning specialists, robotics engineers or software developers. Jumping from a redundant job to one in high demand is no easy challenge. The "jobs of tomorrow" will require new skills; in fact, the vast majority of employers (94%) surveyed by the WEF said that they expected employees to pick up new skills on the job. The past few months have seen employees and employers alike getting started with tackling the issue. 


How Artificial Intelligence is Transforming the Insurance Space

Although insurance CEOs are conscious of the digital disruption breaking through the industry, keeping up with these revolutionary changes will be a whole new challenge, one that goes beyond the plain integration of modern technology. Intelligent solutions must be innovative enough to foster better customer relationships and deliver customer experience in a way that strikes a much-needed balance between emerging market expectations and cost optimization. Another pressure point comes from emerging InsurTech entrants, who are creating tough competition with affordable solutions to reach and serve customers. The reassuring part is that, to meet this challenge, industry leaders are prepared to embrace new innovative possibilities and appreciate the role of creativity in evolving their processes and becoming beloved brands in the financial marketplace. Over the last two years, we have seen the widespread advent and adoption of AI across multiple industries, be it hospitality or healthcare. The idea of digital technologies ruling the financial market isn’t exactly new: Nasdaq in its early days established a secure connected network of trading desks with integrated customer data records.


Voice Payment in Banking: The New Revolution in Fintech

Voice recognition methods use biometric data to identify who is speaking with virtual assistants. Robotic assistants have gone through so many changes and updates that you may not be able to tell whether a human or an AI is talking to you. However, there are many privacy concerns around smart speakers: 33% of surveyed U.S. adults said they had security concerns that restrained them from purchasing the devices. An estimated 26% of people showed strong concern about speaker privacy risks in January 2019; the number jumped to 30% in January 2020. The reason is exposure of recorded conversations. All of the world’s biggest voice assistant providers (Amazon, Google, Apple, Microsoft, Facebook) listen to some utterances, because machine learning won’t become efficient without improvements drawn from conversations between humans and devices. However, there have been real leaks of consumers’ secret information, which raised many doubts and marked privacy as a key risk in voice assistant technology. AI updates will improve the ability to understand accents, dialects, intonations and more. Fingerprints are unique biometric data and an important security measure; 48% of people have used biometrics to make a payment.


How blockchain is used to transform the lives of people in marginalised communities

A key aim of the Building Blocks project is to provide people in refugee camps with the means of buying food and necessities quickly and securely using direct cash transfers. Another objective is to ensure they no longer have to worry about food vouchers being lost or stolen or about third party organisations, such as banks, having access to their personal data. Direct cash transfers, according to WFP research, are often the most effective and efficient way to distribute humanitarian assistance as well as support local economies. But being able to distribute it relies on the support of local financial institutions, which are not always in a position to do so, not least because many refugees face restrictions in opening bank accounts. To try and address the situation, in early 2017 the WFP introduced a proof-of-concept blockchain-based system to register and authenticate transactions in Sindh province, Pakistan, which did not require a bank to act as an intermediary to connect both parties. The system is now being used to support 106,000 Syrian refugees in the Azraq and Zaatari camps in Jordan and 500,000 Rohingyas in the Cox's Bazar camp in Bangladesh.


Chip industry is going to need a lot more software to catch Nvidia’s lead in AI

"Software is the hardest word," quipped Gwennap, referring to the struggles of competitors. He noted how companies either don't support some aspects of popular AI frameworks, such as TensorFlow, or how some AI applications for competing chips may not even compile properly. "To compete against deep software stacks from companies such as Nvidia and Intel, these vendors must support a broad range of frameworks and development environments with drivers, compilers, and debug tools that deliver full acceleration and optimal performance for a variety of customer workloads." ... The use of AI is spreading from cloud computing data centers where it has traditionally been developed to embedded devices in automobiles and infrastructure. Vendors such as the UK's Imagination and Think Silicon, a division of chip equipment giant Applied Materials, are pushing the boundaries in low-power designs that can go into power-constrained devices, such as battery-powered, microcontroller gadgets.  The stakes seem suddenly higher since Nvidia announced last month that it intends to buy Arm Plc for $40 billion. Arm makes the intellectual property at the heart of all the chips made by all the challengers in the chip industry. Hence, Nvidia's software is poised to gain even greater sway.


JP Morgan Veteran Daniel Masters Explains How Blockchain Will End Commercial Banks

The most interesting aspect of CBDCs is the impact they will have on commercial banks and the financial system as a whole. Today, central banks issue currency to a slew of commercial banks like Chase and Bank of America. These banks do two things—create products and services such as mortgages, and deal with the end users. I think we are going into a new paradigm where central banks issue CBDCs, commercial banks cease to exist and the service layer is filled by crazy new emerging companies like Compound Finance, Uniswap, SushiSwap, and people that are really getting distributed, decentralized finance done today. Then the final interesting layer is who actually faces the consumer. You can already see that there are multiple choices. Coinbase would like to get to all the users, as would Binance though probably not in America. You’ve got wallet infrastructures like Blockchain.com that already have 50 million outstanding wallets. That said, you could get incumbents as well. Samsung is putting chips into phones now, making them essentially hardware wallets. Amazon could come out with a digital wallet. Whoever owns that level at the bottom is critical.



Quote for the day:

"We are drowning in information, but starved for knowledge." -- John Naisbitt