Daily Tech Digest - April 08, 2020

‘Fake Fingerprints’ Bypass Scanners with 3D Printing

The fake fingerprints achieved an 80 percent success rate on average, meaning the sensors were bypassed at least once. Researchers did not have success in defeating the biometrics systems in place on Microsoft Windows 10 devices (though they said this does not necessarily mean those systems are safer; just that this particular approach did not work). However, the bigger takeaway is the sheer amount of time and budget it still takes to bypass fingerprint sensors. At the end of the day, researchers said they had to create more than 50 molds and test them manually, which took months – and they struggled to stay under a self-imposed budget of $2,000. These challenges point to the fact that a scalable, easy attack for bypassing biometrics is not yet possible. “Biometrics are not an Achilles heel,” Craig Williams, director of Cisco Talos Outreach, told Threatpost. “Biometrics are something that makes it very, very easy to use. You don’t have to remember a password. You don’t have to enter a password, which makes it very fast and easy. You don’t have to carry anything around with you. And so I think for most users, it’s still perfectly fine.”


Robotic Process Automation (RPA): 6 open source tools

Open source might sound intimidating to non-developers, but there’s good news on this front: While some open source projects are particularly developer-focused, multiple options stress ease of use and no- or low-code tools, like their commercial counterparts. One reason for this: RPA use cases abound across various business functions, from finance to sales to HR and more. Tool adoption will depend considerably on the ability of these departments to manage their RPA development and ongoing management themselves, ideally in a collaborative manner with IT but not wholly dependent on IT. ... TagUI is a command-line interface for RPA that can run on any of the major OSes. TagUI uses the term and associated concept of “flows” to represent running an automated computer-based process, which can be done on demand or on a fixed schedule. ... Robocorp might have our favorite name of the lot – it kind of conjures up some of the darker, Terminator-esque images of RPA – but that’s a bit beside the point. This is a relatively new entry into the field, and somewhat unique in that it’s a venture-backed startup promising to deliver cloud-based, open source RPA tools for developers.



Inverting a matrix is one of the most common tasks in data science and machine learning. In this article I explain why inverting a matrix is very difficult and present code that you can use as-is, or as a starting point for custom matrix inversion scenarios. Specifically, this article presents an implementation of matrix inversion using Crout's decomposition. There are many different techniques to invert a matrix. The Wikipedia article on matrix inversion lists 10 categories of techniques, and each category has many variations. The fact that there are so many different ways to invert a matrix is an indirect indication of how difficult the problem is. Briefly, relatively simple matrix inversion techniques such as using cofactors and adjugates only work well for small matrices (roughly 10 x 10 or smaller). For larger matrices you should write code that involves a complex technique called matrix decomposition. The code presented in this article will run as a .NET Core console application or as a .NET Framework application. Many of the newer Microsoft technologies, such as the ML.NET code library, specifically target .NET Core so it makes sense to develop most new C# machine learning code in that environment.
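The article's own code targets C#; as a language-agnostic illustration, here is a minimal Python sketch of the same idea: Crout's decomposition (L keeps the full diagonal, U has a unit diagonal) followed by forward and back substitution against each column of the identity. It omits pivoting, so it assumes no zero pivot is encountered; a production implementation would add partial pivoting for numerical stability.

```python
def crout_decompose(A):
    """Crout LU: A = L.U, where U's diagonal is fixed at 1. No pivoting."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for j in range(n):
        U[j][j] = 1.0
        for i in range(j, n):      # column j of L
            L[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(j))
        for i in range(j + 1, n):  # row j of U
            U[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(j))) / L[j][j]
    return L, U

def invert(A):
    """Invert A by solving A.x = e_i for every column e_i of the identity."""
    n = len(A)
    L, U = crout_decompose(A)
    cols = []
    for c in range(n):
        e = [1.0 if i == c else 0.0 for i in range(n)]
        y = [0.0] * n              # forward-solve L.y = e
        for i in range(n):
            y[i] = (e[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
        x = [0.0] * n              # back-solve U.x = y (unit diagonal)
        for i in range(n - 1, -1, -1):
            x[i] = y[i] - sum(U[i][k] * x[k] for k in range(i + 1, n))
        cols.append(x)
    # cols holds the inverse's columns; transpose them into rows
    return [[cols[j][i] for j in range(n)] for i in range(n)]

A = [[4.0, 3.0], [6.0, 3.0]]
A_inv = invert(A)
print(A_inv)  # ≈ [[-0.5, 0.5], [1.0, -0.667]]
```

Even this tiny example shows why decomposition-based methods scale where cofactor expansion does not: each solve reuses the same L and U factors.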



PMI offers free project management courses during COVID-19 quarantines

This is the first time that the group has offered these online training and consulting resources at no charge, said DePrisco. The Project Management for Beginners course introduces participants to the foundational knowledge necessary to join a project team and provides insights into taking steps on the path to a project management career. The Agile in the Project Management course walks participants through their role as a project management office director and introduces a series of scenarios designed to improve their project management office's performance using agile principles and processes. The Business Continuity course offers information and lessons on rethinking work processes, which may be particularly helpful today as companies and their leaders and workers seek ways to cope with continuing their operations during the pandemic. ... Project management skills can be extremely beneficial during times of emergency such as the pandemic, he said. "Project management initiatives play an important role in preparing for these types of disruptions. All work is accomplished through programs and projects, and project managers are used to changing methods and approaches."


These hackers have been quietly targeting Linux servers for years


Linux is not typically a user-facing technology, so security companies tend to focus on it less, he explained. As a result, these hacking groups have zeroed in on that gap in security and leveraged it for their strategic advantage to steal intellectual property from targeted sectors for years without anyone noticing, he said. "It's critical for these servers to be up all the time; so what better place to put a rootkit or a pervasive active tool than on a machine that's going to be turned on all the time?" said Cornelius. The attackers scan for Red Hat Enterprise, CentOS, and Ubuntu Linux environments across a wide range of industries, attempting to identify unpatched servers. From there it's simply a case of establishing persistence on the network with malware. Not only can this provide the attackers the access they need to sensitive information and data, but with the infection on the servers themselves, they can create a persistent back door into the network that provides them with a way back in whenever they like – so long as the compromise isn't uncovered. The attackers are careful to do as little damage as possible to the networks so as to avoid detection – and therefore keep campaigns up and running for as long as possible, which might be years.


Is It Possible To Become A Successful Self-Taught Data Scientist?

Although a university degree is a great accomplishment, self-taught aspirants can rejoice: a degree alone is not what lands a good data science job. While a degree may lay down a foundation for a career in this field – and may get one a job interview – it is not a key qualifying factor when applying for tech positions. Even though you may be competing against applicants who have relevant degrees, you can garner a competitive advantage with upskilling using the world of resources available online. What is more, self-study also signals a candidate’s motivation to succeed. But you need to first narrow down what you need to learn to substitute for your lack of formal training. Data science is a broad discipline and comprises a wide collection of jobs – from statisticians to machine learning (ML) experts, to business analysts to data visualization experts. Since the skills required for each vary, it is important to first narrow down the skill sets you need to acquire, and then create a plan around them.


9 Security Podcasts Worth Tuning In To

The cybersecurity industry changes every day, sometimes multiple times a day, and it can be overwhelming for professionals to keep up with the constant flow of breaking news, new threats, defensive strategies, reports, mergers, valuations, product releases, and trends. Podcasts can help you stay in the loop on security news by hearing the latest updates and analysis from experts across the industry. Some of the best security podcasts offer insight from practitioners, CISOs, analysts, and reporters who take a closer look at industry events and aim to educate their listeners with digestible information and discussions with other security pros. Many cybersecurity podcasts offer informative takes on recent incidents and shed light on how current events, such as COVID-19, are affecting the IT security community. Others discuss specific parts of the industry, like the Dark Web or the relationship between CISOs and vendors. The handy thing about podcasts is they help you stay on top of cybersecurity news and trends, and learn from the pros, when you're not sitting in front of a screen or attending a conference.


How to Integrate Security Into Your Application Infrastructure


Cequence describes the threats they address, stating that the web, mobile, and API-based apps that power organizations are also targets for relentless cyberattacks. These include automated bot attacks focused on business logic abuse (such as credential stuffing, site scraping, fake account creation, and more), as well as targeted attacks designed to exploit both known and unknown application vulnerabilities. Cequence Security stops these attacks with an AI-powered, container-based software platform that can be easily deployed on-premises or in the cloud, wherever your apps need to be protected. Matt told us, “We look at our customer’s web or application traffic and use machine learning algorithms to look for patterns of automation to determine if it is malicious. While doing this, we mustn’t introduce additional friction to the user experience. We collect telemetry and look at the patterns within the traffic. We watch for underlying behavior characteristics that may indicate potentially malicious traffic.”
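The "patterns of automation" idea can be illustrated with a deliberately simplified heuristic: simple bots often fire requests at suspiciously regular intervals, so a low coefficient of variation in inter-request gaps is one weak signal. This is a toy sketch, not Cequence's method; real platforms combine many such features with ML, and the threshold here is an arbitrary assumption.

```python
import statistics

def looks_automated(timestamps, cv_threshold=0.1):
    """Flag traffic whose inter-request gaps are suspiciously regular.

    Uses the coefficient of variation (stdev / mean) of the gaps between
    requests; humans produce bursty, irregular timings, simple bots often don't.
    """
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(gaps) < 3:
        return False  # not enough evidence either way
    mean = statistics.mean(gaps)
    if mean == 0:
        return True   # many requests in the same instant
    return statistics.pstdev(gaps) / mean < cv_threshold

print(looks_automated([0, 1, 2, 3, 4, 5, 6]))         # metronomic timing: True
print(looks_automated([0, 1.7, 2.0, 5.5, 6.1, 9.0]))  # bursty, human-like: False
```

A signal this simple is trivially evaded by adding jitter, which is exactly why production systems layer behavioral telemetry on top of timing.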



Zero-day exploits increasingly commodified, say researchers


In new research published this week, FireEye said it had documented more zero-day exploitations in 2019 than in the previous three years, and although not every attack could be pinned on a known and tracked group, a wider range of tracked actors do seem to have gained access to these capabilities. The researchers said they had seen a significant uptick, over time, in the number of zero-days being leveraged by threat actors who they suspect of being “customers” of private companies that supply offensive cyber capabilities to governments or law enforcement agencies. “We surmise that access to zero-day capabilities is becoming increasingly commodified based on the proportion of zero-days exploited in the wild by suspected customers of private companies,” they said. “Private companies are likely to be creating and supplying a larger proportion of zero-days than they have in the past, resulting in a concentration of zero-day capabilities among highly resourced groups.”


Chrome 81 released with initial support for the Web NFC standard

Plans to remove the TLS 1.0 and TLS 1.1 encryption protocols from Chrome, also initially scheduled for Chrome 81, are now delayed to Chrome 84. The decision to delay removing these two protocols is related to the current COVID-19 outbreak, as removing the two protocols might have prevented some Chrome 81 users from accessing critical government healthcare sites that were still using TLS 1.0 and 1.1 to set up their HTTPS connections. Removing support would have prevented users from accessing those sites altogether, something that Google wanted to avoid. Today's Chrome 81 release marks the most turbulent release in Chrome's history. Because the browser maker had to shift features around from version to version, and because the three-week Chrome 81 delay also disrupted Google's regular six-week release schedule, Google has now taken a first-of-its-kind step to scrap a Chrome version. Google said the next version of Chrome is v83, and that work on v82 has been permanently abandoned.
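Server and application owners do not have to wait for the browser to force the issue: the same floor Chrome is phasing in can be enforced in application code today. A minimal sketch using Python's ssl module, raising a context's minimum protocol version so TLS 1.0/1.1 handshakes are refused (recent Python builds already default to a TLS 1.2 floor, so the explicit setting is belt-and-braces):

```python
import ssl

# Build a client context that refuses TLS 1.0 and 1.1 handshakes,
# matching the policy Chrome is rolling out (now in Chrome 84, not 81).
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version)  # TLSVersion.TLSv1_2
```

Any connection attempt through this context to a server that only speaks TLS 1.0/1.1 will fail during the handshake, which is precisely the breakage Google feared for healthcare sites.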



Quote for the day:


"Every great leader can take you back to a defining moment when they decided to lead." -- John Paul Warren


Daily Tech Digest - April 07, 2020

Hybrid Instead of All-Flash

All-flash array (AFA) vendors claim that because of the continuing decline in flash pricing and because of deduplication, there is no longer a financial reason to choose hybrid instead of all-flash. They claim that the unpredictable performance concerns of hybrid arrays outweigh any remaining cost advantage. AFA vendors, though, ignore the fact that the cost per terabyte of hard disk drives also continues to fall. They also ignore the new reality that hard disk isn’t the only option for the second tier of storage. Deduplication, while bringing down the cost per terabyte of flash, brings a set of “taxes” that make it less cost-efficient than customers are led to believe. First, in primary storage, deduplication is far less efficient than when IT uses the technology for backup storage. Second, there is a performance overhead associated with its use, and all-flash arrays that use deduplication have an inferior cost per IOPS rating. Finally, most all-flash vendors don’t pass the full savings of deduplication on to the customer. The customer receives some of the cost savings value, but not all of it.
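The arithmetic behind these "taxes" can be made concrete with a toy model (all numbers below are hypothetical, not vendor figures): effective cost per terabyte divides raw cost by the dedup ratio only in the best case where the vendor passes on all the savings, while any dedup performance overhead directly inflates cost per IOPS.

```python
def effective_cost_per_tb(raw_cost_per_tb, dedup_ratio):
    """Best case: the vendor passes the full dedup savings to the customer."""
    return raw_cost_per_tb / dedup_ratio

def cost_per_iops(array_cost, rated_iops, dedup_overhead=0.0):
    """Dedup overhead reduces delivered IOPS, which raises cost per IOPS."""
    return array_cost / (rated_iops * (1.0 - dedup_overhead))

# Hypothetical: $400/TB raw flash at a 4:1 dedup ratio
print(effective_cost_per_tb(400.0, 4.0))                      # 100.0
# Hypothetical: $100k array rated at 200k IOPS, 20% dedup overhead
print(cost_per_iops(100_000.0, 200_000.0, dedup_overhead=0.2))  # 0.625
```

The same array without dedup overhead would cost $0.50 per IOPS, which is the "inferior cost per IOPS rating" the passage describes.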


How to stay motivated when you work from home

Those conversations you have with friends and family in restaurants, bars, on the street don’t need to stop when you’re self-isolating; just contact your people on FaceTime. With these suggestions in mind, I’ve identified a selection of iOS tools that may help you take control of working from home while also helping you make the best of the motivation you still have available to you. Don’t be too frustrated if you’re not as motivated as normal – it really isn’t your fault. Things are happening. They are quite frightening. You are already doing what you can to challenge them by staying at home. Cut yourself some slack first and then see if these (mostly free) tools help you feel a little more in control. Assuming you can find the ingredients, FoodPlanner lets you find healthy recipes you like the sound of online and add them into the app; it then generates nutritional data, creates a shopping list (including inventory-management features to help you track ingredients you already have) and creates meal plans for the next week or more. FoodPlanner doesn’t aim to pester you into exercise; it lets you choose the food you want and then gives you the information you need in order to make it.



Compromising a 2FA system is a lot easier than it seems. One of the easiest methods, especially in America, is a SIM swap, where a malicious actor switches a target’s mobile phone number to a new phone. Any subsequent text messages, such as those for 2FA, are sent to this new phone, thereby giving the malicious actor access. Certain malware has also been found to compromise 2FA systems. Cerberus, a type of Android-based malware, was found to have stolen 2FA codes for Google Authenticator in February 2020. There is also the TrickBot malware, which bypasses 2FA solutions by intercepting the one-time codes used by banking apps, sent by SMS and push notifications. Social engineering is also used to bypass 2FA security. Malicious actors may pose as a target’s bank, calling the target to “confirm their identity” by quoting the secure code that has just been sent to them, in response to an attempt to access their banking profile. “A lot of this stuff doesn’t require any real technical skill, and that’s the really scary part,” says Harding.
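The one-time codes being intercepted here are typically RFC 6238 TOTP values, and the algorithm itself is short enough to show in full: HMAC-SHA1 over a counter (the current 30-second window for TOTP), followed by dynamic truncation to six digits.

```python
import hashlib
import hmac
import struct
import time

def hotp(key, counter, digits=6):
    """RFC 4226 HOTP: HMAC-SHA1 over the 8-byte counter, then dynamic truncation."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

def totp(key, at=None, step=30, digits=6):
    """RFC 6238 TOTP: HOTP driven by the current 30-second time window."""
    if at is None:
        at = time.time()
    return hotp(key, int(at // step), digits)

# RFC test key; at t=59s the time counter is 1
print(totp(b"12345678901234567890", at=59))  # 287082
```

Note that the weakness the attacks above exploit is not this algorithm but the delivery channel: SIM swaps and TrickBot intercept codes sent over SMS or push, which is why authenticator apps and hardware keys are considered stronger second factors.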


It might not be immediately obvious, but the Search box at the top of the Teams desktop app doubles as a command line. Click in that box and then tap the slash key (/) to display a list of all available commands. ... Pressing Ctrl+E takes you to the Search box, for example, just as it does in File Explorer and your web browser. You can use Ctrl+number to go to the corresponding node in the navigation pane on the left. In the default arrangement, Ctrl+1 goes to the Activity pane, Ctrl+2 takes you to Chat, and so on. Press Ctrl+Shift+X to toggle between the bare compose box and the full editor with all its formatting options. And just as in your web browser, you can hold down Ctrl as you tap the plus or minus keys to zoom in or out, then press Ctrl+0 to go back to normal (100%) magnification. ... When posting a new conversation/thread, it's a good idea to add a subject, as I've done in the opening post here. That makes it easier to spot a specific conversation by scrolling through a channel, and also makes it easier to use the search tools to find that conversation.


What is power over Ethernet (PoE)?

Using PoE in wireless rollouts may be the technology’s primary application, but many think it will find a home in the internet of things, where wired IoT devices can receive power from their network connection. Versa Technology wrote a blog post about the use of PoE and IoT by the city of San Diego, Calif., which is using Ethernet cabling to deliver power to thousands of interconnected LED streetlights integrated into the city’s IoT network, making it possible to monitor and control them remotely. Power to the smart lamps can be turned up and down to optimize illumination for each space, and the lamps are fitted with motion sensors to conserve energy. Such lighting systems have low power requirements, making them cheaper to use. The system saves the city $250,000 or more per year, Versa stated. IP security cameras, which are often placed in difficult-to-access locations, are another key PoE application target.
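A quick way to sanity-check a deployment like the streetlight rollout is a power-budget calculation. The sketch below assumes the IEEE 802.3af per-port sourcing limit of 15.4 W; the total PoE budget of a switch is a separate, model-specific figure, and the device loads are hypothetical.

```python
AF_PER_PORT_W = 15.4  # IEEE 802.3af: maximum power a switch port may source

def poe_budget_ok(loads_w, switch_budget_w, per_port_w=AF_PER_PORT_W):
    """True if every device fits its port limit AND the total fits the switch budget."""
    return all(w <= per_port_w for w in loads_w) and sum(loads_w) <= switch_budget_w

# Three hypothetical smart lamps against a 60 W switch PoE budget
print(poe_budget_ok([7.5, 7.5, 10.0], 60.0))  # True: each lamp and the sum fit
print(poe_budget_ok([20.0], 60.0))            # False: exceeds the 802.3af port limit
```

Higher-power devices (PTZ cameras, for example) need 802.3at (30 W) or 802.3bt (up to 90 W) ports, so the `per_port_w` parameter would change accordingly.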


IBM CEO Throws Down Hybrid-Cloud Challenge

In a LinkedIn message to IBM’s employees, Krishna said he wants the company to add a greater presence in the hybrid-cloud space to its already established positions in the mainframe, services, and middleware ecosystem. “The fundamentals are already in place,” Krishna wrote. “Our approach to hybrid cloud is the most flexible and the most cost effective for our clients in the long term. Coupled with our deep expertise, IBM has unique capabilities to help our clients realize the potential of a hybrid cloud business model.” Krishna stated that IBM would take advantage of its already established presence in cloud, artificial intelligence (AI), blockchain, and quantum computing. He noted that two “strategic battles” were taking place in the journeys to hybrid cloud and AI. “We all need to understand and leverage IBM’s sources of competitive advantage,” Krishna explained. “Namely, our open source and security leadership, our deep expertise and trust, and the fact that we enable clients to build mission-critical applications once and run them anywhere.”


Cybercriminals increasingly using SSL certificates to spread malware

Recent studies have shown that cybercriminals building phishing sites now use SSL as well, complicating efforts by enterprises to keep their employees safe. The Menlo Security research revealed that while 96.7% of all user-initiated web visits are being served over https, only 57.7% of the URL links in emails turn out to be https, which means that web proxies or firewalls will be oblivious to the threats unless enterprises turn on SSL inspection. "If you think the little green lock of https equals security, think again," the report said. "The bad news is that the bad guys use encryption, too. Many people mistakenly assume that as long as an SSL certificate is present, they're safe from attack, but that couldn't be further from the truth. From Reductor to Godlua and numerous other variants, it has become all too clear that new types of malware are being secreted behind a symbol that was once seen as secure." According to the report, enterprises have long relied on on-premises proxies and next-generation firewalls for visibility and control of web access. But when it comes to decrypting and inspecting SSL sessions, the report said, "many enterprises have held back partly driven out of privacy issues and partly around performance of these proxies with SSL decryption turned on."
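A first-pass version of the email-link measurement behind that 57.7% figure is easy to sketch: extract URLs from message text and compute the share served over https. This is a toy heuristic of my own, not Menlo Security's methodology; real gateways also resolve redirects and inspect certificates.

```python
import re

URL_RE = re.compile(r"https?://[^\s\"'<>]+")

def https_share(text):
    """Fraction of URLs in `text` served over https, or None if no URLs found."""
    urls = URL_RE.findall(text)
    if not urls:
        return None
    return sum(u.startswith("https://") for u in urls) / len(urls)

sample = ("Reset here: https://bank.example/reset or "
          "http://promo.example plus https://docs.example")
print(https_share(sample))  # 2 of 3 links are https
```

Applied over a mail corpus, a rate well below the ~97% seen in ordinary browsing is exactly the gap that makes unencrypted-only inspection blind to most emailed threats.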


What are the five main barriers to digital transformation and their solutions? 

One of the biggest barriers to digital transformation initiatives is when there is no clear direct return on investment. To overcome this barrier, White explains that “businesses should rectify this by defining a clear set of digital success criteria at the start, defined based on what you are trying to achieve — are you creating additional revenue streams or enhancing internal operations? Measures can be based on anything from metrics to softer statements such as ‘we will be able to provide customers with a way to engage 24/7 around the world’.” ... The most common barrier to digital transformation, according to Steve White, head of transformation accounts at Yotta, is data and department silos. He explains: “One common example is that the software applications being used by departments are very specific to those service areas, often require specialist knowledge to use effectively and are locked down via account profiles and permissions. They also incorporate different user interface designs (UI) and user experience designs (UX) which all make access by other departments or users within the organisation extremely difficult.”


Data scientists: White House issues a call to arms

It's an opportunity for service for data scientists, a way to help healthcare workers and policymakers understand a growing dataset that holds the key to making informed decisions. At the moment, we lack the most basic knowledge about COVID-19, including an answer to the most fundamental question: how many people have been infected? Health experts agree that reliable data answering this question and other fundamental questions are needed to guide difficult decisions ahead. ... "The good news is we have lots of data," says McDonald. "The bad news is the organization and accessibility of that data is very spread out or difficult to access." Given the difficulties with the dataset, McDonald points to AI deep learning as a necessary tool. "Deep learning is not a typical algorithm. A user literally 'teaches' the platform with hundreds of examples of the various classifications or predictions. Once taught, then future classifications and predictions are in the hands of the deep learning platform." This can be applied to health data in general, which is a growing trend in data-driven medicine.


Cisco goes after wireless IoT with Fluidmesh acquisition

In January Cisco rolled out an overarching security architecture for industrial IoT (IIoT) environments that includes existing products but also new software called Cisco Cyber Vision, for the automated discovery of industrial assets attached to Cisco’s extensive IIoT networking portfolio. The new security rollout also included Cisco Edge Intelligence software to simplify the extraction of IoT data at the network edge. Together with the new software, IT and operational technology groups will be able to work together to provide advanced anomaly detection in IIoT environments, Cisco stated. Also in 2019, Cisco expanded its IoT security and management offerings by acquiring Sentryo, a company that offers anomaly detection and real-time threat detection for IIoT networks. Founded in 2014, Sentryo’s products include ICS CyberVision – an asset-inventory, network-monitoring and threat-intelligence platform – and CyberVision network-edge sensors, which analyze network flows. Last year Cisco rolled out a family of switches, including the Catalyst IE3x00 ruggedized edge switches, software, developer tools and blueprints to incorporate IoT and industrial networking into intent-based networking and classic IT security, monitoring and application-development support.



Quote for the day:


"When you expect the best from people, you will often see more in them than they see in themselves." -- Mark Miller


Daily Tech Digest - April 06, 2020

How DevOps is integral to a cloud-native strategy

Containerisation allows applications to be made environment-agnostic and eliminates application conflicts between developers and operations teams, in turn allowing greater collaboration between developers and testers. Breaking down monolithic applications into constituent microservices also increases agility and creates a common toolset, terminology, and set of processes between development and operations teams, which makes it easier for these teams to work with one another. This enables the advanced automation of processes and contributes to an organisation’s move towards agile software development (defined by the continuous delivery of software created in rapid iterations). It’s important to stress that these technologies will only be successfully implemented if that cultural shift happens too, which is where embracing DevOps becomes key. Going cloud-native is a gradual process and a learning experience. Most organisations have established IT environments that use on-premise applications.


"An increase in state digital surveillance powers, such as obtaining access to mobile phone location data, threatens privacy, freedom of expression, and freedom of association, in ways that could violate rights and degrade trust in public authorities -- undermining the effectiveness of any public health response. Such measures also pose a risk of discrimination and may disproportionately harm already marginalized communities," the joint statement said. "These are extraordinary times, but human rights law still applies. Indeed, the human rights framework is designed to ensure that different rights can be carefully balanced to protect individuals and wider societies. States cannot simply disregard rights such as privacy and freedom of expression in the name of tackling a public health crisis. On the contrary, protecting human rights also promotes public health. Now more than ever, governments must rigorously ensure that any restrictions to these rights is in line with long-established human rights safeguards." As part of the statement, the signatories set out eight proposed conditions for all governments to adhere to if increased digital surveillance is used to respond to the COVID-19 pandemic.


Fog and Edge Computing: Principles and Paradigms provides a comprehensive overview of the state-of-the-art applications and architectures driving this dynamic field of computing while highlighting potential research directions and emerging technologies. Exploring topics such as developing scalable architectures, moving from closed systems to open systems, and ethical issues arising from data sensing, this timely book addresses both the challenges and opportunities that Fog and Edge computing presents. ... The Cloud Adoption Playbook helps business and technology leaders in enterprise organisations sort through the options and make the best choices for accelerating cloud adoption and digital transformation. Written by a team of IBM technical executives with a wealth of real-world client experience, this book cuts through the hype, answers your questions, and helps you tailor your cloud adoption and digital transformation journey to the needs of your organisation. ... The updated edition of this practical book shows developers and ops personnel how Kubernetes and container technology can help you achieve new levels of velocity, agility, reliability, and efficiency.


Applications: Combining the old with the new


There are a few reasons why mainframe applications cannot be migrated to public cloud infrastructure easily. Cresswell says mainframe applications will not run on the underlying cloud hardware without significant refactoring and recompilation. “They are typically compiled into mainframe-specific machine code and the mainframe instruction-set architecture is substantially different from the x86 platforms that underpin almost all cloud services,” he says. “Legacy mainframe applications rely on infrastructure software to manage batch and online activity, data access and many other legacy mainframe features. Like the applications themselves, this infrastructure software is also tied to the physical mainframe hardware and will not run in a conventional x86 cloud environment.” Another barrier to migrating mainframe systems is that the mainframe software development pipeline cannot support many of the rapid deployment features that cloud-native applications rely on, says Cresswell, and it is virtually impossible to spin up testing environments on mainframes without extensive planning.


7 Key Principles to Govern Digital Initiatives


An important starting point is to take an inventory of digital initiatives. This may sound like a straightforward task, but it is often quite challenging. People are reluctant to share information for fear they may lose control over their initiatives. Thus, it is helpful to stress that the inventory phase is about the centralization of information about digital initiatives, not control over them. Fred Herren, senior vice president, digital and innovation at SGS, the world’s largest provider of inspection, testing, and certification services, understood that applying a top-down approach to rules rarely works in decentralized cultures. He noted, “I think it’s necessary to walk the talk rather than give instructions. I’ve managed to get a lot of information because I’m not telling employees to stop [their activities]. I walk around and ask people what’s new and I always react positively.” ... Establishing appropriate key performance indicators (KPIs) is a critical exercise, particularly for digital initiatives that are highly dependent on strategic priorities related to the company’s future vision, success, and implementation objectives. However, when we asked leaders how they measure the performance of digital initiatives, most of them answered in one of two ways: either “we don’t” or “it depends.”


Emerging from AI utopia

Facial recognition is a good example of an AI-driven technology that is starting to have a dramatic human impact. When facial recognition is used to unlock a smartphone, the risk of harm is low, but the stakes are much higher when it is used for policing. In well over a dozen countries, law enforcement agencies have started using facial recognition to identify “suspects” by matching photos scraped from the social media accounts of 3 billion people around the world. Recently, the London Metropolitan Police used the technology to identify 104 suspects, 102 of whom turned out to be “false positives.” In a policing context, the human rights risk is highest because a person can be unlawfully arrested, detained, and ultimately subjected to wrongful prosecution. Moreover, facial recognition errors are not evenly distributed across the community. In Western countries, where there are more readily available data, the technology is far more accurate at identifying white men than any other group, in part because it tends to be trained on datasets of photos that are disproportionately made up of white men. Such uses of AI can cause old problems—like unlawful discrimination—to appear in new forms. Right now, some countries are using AI and mobile phone data to track people in self-quarantine because of the coronavirus disease 2019 pandemic. The privacy and other impacts of such measures might be justified by the scale of the current crisis, but even in an emergency, human rights must still be protected. Moreover, we will need to ensure that extreme measures do not become the new normal when the period of crisis passes.
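The Met's reported numbers translate directly into a precision figure, which shows how badly base rates can bite even a seemingly accurate classifier when genuine matches are rare in the scanned population:

```python
def match_precision(flagged, false_positives):
    """Fraction of flagged matches that were genuine."""
    return (flagged - false_positives) / flagged

# 104 flagged "suspects", 102 of them false positives (London Met figures above)
p = match_precision(104, 102)
print(f"{p:.1%}")  # roughly 1.9% of flagged people were real matches
```

In other words, about 98 of every 100 people the system flagged were innocent, which is why the policing context carries the highest human rights risk.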


Is Blockchain Necessary? An Unbiased Perspective

Bankers hate blockchain. It’s obvious why they would: the technology’s greatest advantage is that it cuts costs down to little more than infrastructure. No transaction fees, no maintenance charges, nothing. Effectively, blockchain makes banking obsolete, and honestly, I feel it should. The banking industry has remained largely unchanged for millennia. It is an integral part of society whose mismanaged monetary transactions have incited myriad wars. Unfortunately, the banking industry is in a pathetic state. Bankers have too much power, too much control and too many streams of revenue. It needs to topple. It’s a legacy system, and its pain points haven’t changed since the days of Venetian merchants. There is so much abuse of power involved, and the fact that it is legal paints a grim picture. For example, the man who invented the credit card never wanted interest rates to go over 8%. Today, banks charge between 12% and 18% on average, not including transaction, processing and various other fees. Blockchain can destroy and recreate this system. However, this brings us to the greatest chink in blockchain’s armor: this transformative process is expensive and decentralized.


Remote Working: What It Means For RPA


RPA still has considerable risks with remote working. If anything, companies will need to engage in even more planning with their systems. “Enterprise grade security needs to be baked into any RPA platform from the start, which helps provide greater resilience and business continuity,” said Jason Kingdon, executive chairman at Blue Prism. There will also need to be more attention paid to managing bot development and deployment. Otherwise there could be much more sprawl across an organization, lessening the benefits of the technology. This is why it’s important to have a Center of Excellence or CoE (you can learn more about this from one of my recent Forbes.com posts). “You need to have a group of champions who control the system, and monitor what bots are being built and who is building them,” said Tabakman. “It’s best to provide regular training around bot design and consider an approval process, where your champions review bots before they’re deployed. You’ll want to ensure that a bot being created doesn’t create more problems than it solves, such as bots that go into infinite loops, resulting in more work for IT teams.”


Overcoming flat data to unlock business insight and productivity

Artificial intelligence is eliminating entire swathes of manual intervention in the processing of documents, and, more importantly, adding context to them. It’s not enough to simply scan a document and store it along with a reference number: the technology must be able to add meaning to it and to create links with other related data, structured or unstructured. This type of technology falls into a category that we call Context Driven Productivity. At its core is the ability to extract information from flat data and transform it into semantic data, whereby links are created to other data sources, both internal and external, building relationships, connections and additional meaning. Semantic data allows humans or AI robots to gain contextual information automatically, rather than having to rely on a limited number of hard-wired connections. In practical terms, the possibilities are enormous. Not only will administrative workers be freed from the tedious task of manually processing incoming documents, but the resulting context-driven data will be infinitely more useful to any organisation.


How cloud computing is changing the laboratory ecosystem


Cloud computing allows labs to partake in immense computing processes without the cost and complexity of running onsite server rooms. Switching from an onsite solution to the cloud alleviates the costs of IT infrastructure, reducing the cost of entry into the industry, while also leveling the playing field for smaller laboratories. Moreover, cloud computing can allow data to be extracted from laboratory devices and put in the cloud. Device integration between lab equipment and cloud services allows real-life data from experiments to be collated in a cloud system. One of the most popular products in the market is Cubuslab, a plug-and-play solution that serves as a laboratory execution system and collects instrument data in real time as well as managing devices remotely. Collecting data at this volume requires a centralised system that integrates scientists’ protocols and experimental annotations. The electronic lab notebook is starting to become a common tool in research by allowing users to organise all their different data inputs and retrieve this data at any point. This also lets large R&D projects keep effective control of their data as they scale.



Quote for the day:


"The art of communication is the language of leadership." -- James Humes


Daily Tech Digest - April 05, 2020

AI Transforming & Automating The Consumer Goods Industry

Using AI algorithms, machines equipped with intelligent automation can assess emerging production issues that are liable to compromise quality. When they detect a potential problem, they can automatically notify manufacturing personnel and may even autonomously execute corrective actions. By improving the customer experience, retailers can unlock entirely new approaches to customer engagement and interaction. With intelligent automation, they can identify customers’ anticipated needs at precise moments and seize the right moment with the right offer in the quest for competitive advantage. The automation of customer-experience processes is seeing somewhat less traction than other areas of intelligent automation. Today, brands and retailers have started to use AI-powered engines to automatically trigger email campaigns. A far more powerful use of this capability is to apply it to the order-fulfillment process, enabling users to make purchases directly from within the campaign.


Corporate culture complicates Kubernetes and container collaboration


When it comes to navigating corporate culture, things get a bit difficult for Kubernetes and container proponents. For example, 40% of survey respondents cited a lack of internal alignment as a problem when selecting a Kubernetes distribution. Surprisingly, in some cases, business leaders want to have a hand in the process. Plus, there are many other hands involved in the decision -- 83% say more than one team is involved in choosing a Kubernetes distribution. The primary decision-maker varies from organization to organization, depending in part on whether Kubernetes is running in development or production. Development teams are the primary decision makers 38% of the time when Kubernetes is deployed only for development, while infrastructure teams are the primary decision makers 23% of the time in production environments. It's notable that C-level executives are involved 18% of the time. "This involvement is occurring because enterprises are choosing their next-generation platform, and that earns executive attention," the survey's authors relate. The survey also finds a significant disconnect between the views of upper-level company executives and developers: 46% of executives think the biggest impediment to developers is integrating new technology into existing systems.


Accelerating data-driven discoveries

Paradigm4 allows users to integrate data from sources like genomic sequencing, biometric measurements, environmental factors, and more into their inquiries to enable new discoveries across a range of life science fields.
Matz says SciDB did 1 billion linear regressions in less than an hour in a recent benchmark, and that it can scale well beyond that, which could speed up discoveries and lower costs for researchers who have traditionally had to extract their data from files and then rely on less efficient cloud-computing-based methods to apply algorithms at scale. “If researchers can run complex analytics in minutes that used to take days, that dramatically changes the number of hard questions you can ask and answer,” Matz says. “That is a force-multiplier that will transform research daily.” Beyond life sciences, Paradigm4’s system holds promise for any industry dealing with multifaceted data, including earth sciences, where Matz says a NASA climatologist is already using the system, and industrial IoT, where data scientists consider large amounts of diverse data to understand complex manufacturing systems. Matz says the company will focus more on those industries next year. In the life sciences, however, the founders believe they already have a revolutionary product that’s enabling a new world of discoveries.
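Each of the billion regressions in a benchmark like this reduces to a tiny closed-form computation; the hard part is batching them efficiently across array data. A plain-Python sketch of one such ordinary-least-squares fit (illustrative only, not Paradigm4's API):

```python
def simple_linreg(xs, ys):
    """Closed-form ordinary least squares for y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance and variance sums give the slope directly.
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    b = sxy / sxx            # slope
    a = mean_y - b * mean_x  # intercept
    return a, b

print(simple_linreg([1, 2, 3, 4], [3, 5, 7, 9]))  # → (1.0, 2.0)
```

A system like SciDB gains its advantage not from the math, which is trivial, but from running millions of such fits in parallel directly over stored arrays instead of extracting data to files first.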



Cyber Attack Disrupts COVID-19 Payouts: Hackers Take Down Italian Social Security Site

We've already seen supposed "elite hackers" attacking the World Health Organization, cyber criminals hitting a COVID-19 vaccine testing facility with ransomware, and healthcare workers being targeted with Windows malware using coronavirus information as the lure. Now, it has been reported, hackers have forced the Italian social security website to shut down for a period, just as the most vulnerable in society started their claims for a €600 ($655) crisis payout. The general director of Italian welfare agency INPS, Pasquale Tridico, told the state broadcaster RAI on April 1 that there had been several hacker attacks across the previous few days. "They continued today, and we had to close the website," Tridico said. This came even as the site was receiving 100 application requests per second, according to Tridico. Italian police have been informed of the ongoing cyberattacks, and the ruling Democratic Party has suggested that national security services could be put on the case of finding out who is responsible.


What is a design sprint? A 5-day plan for improving products and services

Design sprints start with a team of around four to seven people, which is the recommended team size according to GV. Teams include a facilitator, designer, decision maker, product manager, engineer and someone from a relevant business unit. The decision maker on the team is often the CEO, especially at smaller companies or startups. A design sprint is intended to move quickly, lasting just five days, and it’s designed to spur ideas and create learning opportunities without having to build and launch a completed product or service. With a design sprint, you can get fast feedback, improve products and services and find new opportunities throughout the five-day sprint by creating a testable prototype. The prototype will allow your team to get a better sense of how customers and clients will react to the finished product, what needs to be changed and what customers enjoyed about the product or service. Design sprints are broken out into five major phases that take place over the five-day sprint. These phases are intended to help you develop the best team to tackle a project and to guide your business through the design sprint.


Distributed disruption: Coronavirus multiplies the risk of severe cyberattacks

When it comes to remote work, VPN servers turn into bottlenecks. Keeping them secure and available is a number-one IT priority. Hackers can launch DDoS campaigns on VPN services and deplete their resources, knocking out the VPN server and limiting its availability. The implications are clear: Since the VPN server is the gateway to a company’s internal network, an outage can keep all employees working remotely from doing their job, effectively cutting off the entire organization from the outside world. During an unprecedented time of peak traffic, the risk of a DDoS attack is growing exponentially. If the utilization of the available bandwidth is very high, it does not take much to cause an outage. In fact, even a tiny attack can become the last nail in the coffin. For instance, a VPN server or firewall can be taken down by a TCP blend attack with an attack volume as low as 1 Mbps. SSL-based VPNs are just as vulnerable to SSL flood attacks as web servers are. Making matters worse, many organizations either use in-house hardware appliances or rely on their Internet carrier to ward off incoming attacks.
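The "last nail in the coffin" point is simple arithmetic. With hypothetical figures for a link already running near saturation under peak remote-work load:

```python
# Hypothetical numbers: a 100 Mbps uplink under heavy remote-work load.
link_capacity_mbps = 100.0
legitimate_load_mbps = 99.2
attack_mbps = 1.0  # the tiny attack volume cited in the article

headroom = link_capacity_mbps - legitimate_load_mbps
saturated = attack_mbps > headroom
print(saturated)  # True: even a 1 Mbps flood tips the link past capacity
```

The closer legitimate utilization sits to capacity, the smaller the attack needed to push the link over, which is exactly why peak-traffic periods magnify DDoS risk.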


How to Prepare for Your Next Cybersecurity Compliance Audit

Reading a list of cybersecurity compliance frameworks is like looking at alphabet soup: NIST CSF, PCI DSS, HIPAA, FISMA, GDPR…the list goes on. It’s easy to be overwhelmed, and not only because of the acronyms. Many frameworks do not tell you where to start or exactly how to become compliant. Cybersecurity best practices from the Center for Internet Security (CIS) provide prioritized, prescriptive guidance for a strong cybersecurity foundation. And, they support your efforts toward compliance with the aforementioned alphabet soup. CIS offers multiple resources to help organizations get started with a compliance plan that also improves cyber defenses. Each of these resources is developed through a community-driven, consensus-based process. Cybersecurity specialists and subject matter experts volunteer their time to ensure these resources are robust and secure. What they are: The CIS Controls approach cyber defense with prioritized and prescriptive security guidance. There are 20 top-level CIS Controls and 171 Sub-Controls, prioritized into three Implementation Groups (IGs). The CIS Controls IGs prioritize cybersecurity actions based on organizational maturity level and available resources.



Trustworthy AI must be designed and trained to follow a fair, consistent process and make fair decisions. It must also include internal and external checks to reduce discriminatory bias. Bias is an ongoing challenge for humans and society, not just AI. However, the challenge is even greater for AI because it lacks a nuanced understanding of social standards—not to mention the extraordinary general intelligence required to achieve “common sense”— potentially leading to decisions that are technically correct but socially unacceptable. AI learns from the data sets used to train it, and if those data sets contain real-world bias then AI systems can learn, amplify, and propagate that bias at digital speed and scale. For example, an AI system that decides on-the-fly where to place online job ads might unfairly target ads for higher paying jobs at a website’s male visitors because the real-world data shows men typically earn more than women. Similarly, a financial services company that uses AI to screen mortgage applications might find its algorithm is unfairly discriminating against people based on factors that are not socially acceptable, such as race, gender, or age. In both cases, the company responsible for the AI could face significant consequences, including regulatory fines and reputation damage.
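One concrete internal check for the screening scenarios above is to compare selection rates across groups. A common heuristic is the "four-fifths rule" borrowed from US employment practice: if one group's selection rate falls below 80% of another's, the model deserves scrutiny. A minimal sketch (group labels and numbers here are hypothetical):

```python
def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs, selected in {0, 1}."""
    totals, selected = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + ok
    return {g: selected[g] / totals[g] for g in totals}

# Hypothetical screening outcomes for two groups of applicants.
decisions = [("a", 1)] * 6 + [("a", 0)] * 4 + [("b", 1)] * 3 + [("b", 0)] * 7
rates = selection_rates(decisions)
ratio = rates["b"] / rates["a"]  # 0.3 / 0.6 = 0.5
print(ratio < 0.8)               # True: the disparity is flagged for review
```

A failing ratio does not prove unlawful discrimination on its own, but it is exactly the kind of cheap, continuous check that can catch a biased model before it operates at digital speed and scale.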


AI runs smack up against a big data problem in COVID-19 diagnosis

It's simple in theory to identify what a computer should look for. An X-ray or a CT scan will show formations in the lung that are associated with a number of respiratory conditions including pneumonia. The feature in an image most often linked to a COVID-19 case, although not exclusive to COVID-19, is what's called "ground-glass opacity," a kind of haze hovering in an area of the lung, caused by a build-up of fluid. Opacities and other anomalies can show up even in asymptomatic COVID-19 patients. What slows things down is that neural networks have to be tuned to pick out opacities in the pixels of a high-resolution image, and that takes data. It also takes time working with physicians who know what to look for in the data. Both data and expertise are in short supply at the outset of a pandemic.  The neural network programs that Xu and others are deploying have been refined by computer scientists to a high degree of sophistication over many years and they are providing ready tools with which to build new systems. The system that Xu and team built combines two deep learning neural networks, a "ResNet-50," the standard for many years for image recognition, and something called "UNet++" that was developed at Arizona State University in 2018 for the specific purpose of processing chest CT scans.


Code Search Now Available to Browse Google's Open-Source Projects

Code Search is used by Google developers to search through Google's huge internal codebase. Now, Google has made it accessible to everyone to explore and better understand Google's open source projects, including TensorFlow, Go, Angular, and many others. Code Search aims to make it easier for developers to move through a codebase, find functions and variables using a powerful search language, readily locate where those are used, and so on. Code Search provides a sophisticated UI that supports suggest-as-you-type help that includes information about the type of an object, the path of the file, and the repository to which it belongs. This kind of behaviour is supported through code-savvy textual searches that use a custom search language. For example, to search for a function foo in a Go file, you can use lang:go function:foo. For repositories that include cross-reference information, Code Search is also able to display richer information, including a list of places from where a given symbol is referenced. Code Search repositories that provide cross-reference information include Angular, Bazel, Go, etc.



Quote for the day:


"Change your friends if they are holding you back - pick the new ones with caution and care." -- Tim Fargo


Daily Tech Digest - April 04, 2020

"Unlike regular times when you could dispatch a technician to hospitals, or you could actually show the doctors how to operate equipment, fix it, and so on, they need to do it remotely," Churchill said. "So we combined them with video and AR." Once TechSee receives an inquiry, it is given to a technician and the technician sends a web link via SMS to a hospital staff member. This allows the hospital support person to use their smartphone camera or tablet camera to show the technician the issue, Churchill noted. The user shows the technician the problem, and then the technician diagnoses the issue and uses AR to visually guide the hospital employee to a resolution, he added. Churchill said that TechSee works with more than 100 enterprises in a variety of sectors, with Medtechnica being one of its biggest clients in healthcare. While TechSee's solution can be applied to any system--including X-rays, routers, smart thermostats, and more--the demand for ventilators is amplifying that use case. This solution is completely web-based, so the user isn't forced to download an app. The AI-powered platform can recognize devices and technical issues, as well as automate the support process, Churchill said.


Very rarely can risk be completely eliminated. However, inherent risk can be mitigated through a combination of risk mitigation strategies, risk shifting, and, at the end of the day, acceptance of the residual risk. When addressing big data risks in particular, two types of risks must be discussed: the risk of data breaches and the risk of data misuse. The former is addressed through data security, while the latter is most commonly addressed through data privacy and regulation. When it comes to data security, one of the most significant sources of risk is the overreliance on fairly immutable data elements for identification, such as social security numbers, names, addresses, dates of birth, credit card numbers, and the like. When any long-lived data element is exposed and misused, the damage is usually broad and long-lasting because changing those data elements is difficult and costly. The mechanism that I’m referring to is known as public-key cryptography and digital signatures, which was invented in the ’70s. While this is widely known as the method that web browsers use to identify websites (adding the “secure” or “SSL/TLS” labels to the URL bar), it has not had enough traction outside of that specific domain.
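The digital-signature mechanism described here can be illustrated with textbook-sized RSA numbers. This sketch is purely pedagogical: the parameters are far too small to be secure, and real systems use vetted libraries, padding schemes, and much larger keys.

```python
import hashlib

# Toy RSA parameters (insecure, illustration only).
p, q = 61, 53
n = p * q   # public modulus, 3233
e = 17      # public exponent
d = 2753    # private exponent: (e * d) % ((p - 1) * (q - 1)) == 1

def sign(message: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)  # only the private-key holder can produce this

def verify(message: bytes, signature: int) -> bool:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h  # anyone with (n, e) can check it

sig = sign(b"identity claim")
print(verify(b"identity claim", sig))            # True
print(verify(b"identity claim", (sig + 1) % n))  # False: altered signature rejected
```

The point the author makes follows directly: unlike a social security number, a private key never needs to be shared to prove identity, so exposure of the public half causes no lasting damage.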


For one, the WireGuard protocol does away with cryptographic agility -- the concept of offering choices among different encryption, key exchange and hashing algorithms -- as this has resulted in insecure deployments with other technologies. Instead the protocol uses a selection of modern, thoroughly tested and peer-reviewed cryptographic primitives that result in strong default cryptographic choices that users cannot change or misconfigure. If any serious vulnerability is ever discovered in the used crypto primitives, a new version of the protocol is released and there’s a mechanism of negotiating protocol version between peers. WireGuard uses ChaCha20 for symmetric encryption with Poly1305 for message authentication, a combination that’s more performant than AES on embedded CPU architectures that don’t have cryptographic hardware acceleration; Curve25519 for elliptic-curve Diffie-Hellman (ECDH) key agreement; BLAKE2s for hashing, which is faster than SHA-3; and a 1.5 Round Trip Time (1.5-RTT) handshake that’s based on the Noise framework and provides forward secrecy. It also includes built-in protection against key impersonation, denial-of-service and replay attacks, as well as some post-quantum cryptographic resistance.
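Of the primitives listed, BLAKE2s happens to ship in Python's standard library, which makes its keyed-hash mode easy to see in action. This snippet illustrates the hash family only; it is not WireGuard's actual handshake construction:

```python
import hashlib

# Keyed BLAKE2s: BLAKE2s keys max out at 32 bytes.
key = b"shared secret up to 32 bytes"
tag = hashlib.blake2s(b"handshake data", key=key).hexdigest()

print(len(tag))  # 64 hex characters: a 256-bit keyed digest
# A receiver holding the same key recomputes the tag to authenticate the data.
same = hashlib.blake2s(b"handshake data", key=key).hexdigest()
print(tag == same)  # True
```

Fixing one strong primitive per job, as WireGuard does, removes the negotiation surface where misconfiguration and downgrade attacks have historically crept into other VPN protocols.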


How to start your career in cyber security

Unlike many professions, you don’t need cyber security experience to get into the field, although many people entering the field will come from jobs that have similar skillsets, such as systems administration or information analysis. If you can demonstrate the relevance of your existing experience – what recruiters call ‘transferable skills’ – there’s no reason why you can’t get a foothold on the cyber security career ladder. There are also plenty of entry-level positions available. Account executives and junior penetration testers, for example, tend to have little work experience, and can learn while on the job. ... The best way to gain an advantage over other prospective cyber security professionals is to become qualified. The qualifications you need will depend on your career path. If you don’t have this mapped out yet, or you simply want a strong overall understanding of how to navigate security risks, you should seek out a course that covers general topics, such as our Certified Cyber Security Foundation Training Course. This one-day course explains the fundamentals of cyber security and shows you how to protect your organisation from a range of threats.


Is COVID-19 Driving a Surge in Unsafe Remote Connectivity?

As more organizations shift to a remote workforce, new working patterns and technology adoption - including shadow IT - may lead to corporate data suddenly being poorly secured or stored in a manner that violates regulatory requirements. And more systems may be spun up that fail to secure commonly used protocols, such as RDP. "Changes to the network perimeter can also create unanticipated threats, as a higher burden is placed on remote-access systems, and if not correctly implemented, may expose systems to the internet," says Matt Linney, a senior security consultant at 7 Elements. "Looking at this now could save substantial loss in the future." The problem may be exacerbated by COVID-19 driving many organizations to rapidly embrace the equivalent of bootstrap approaches to digital transformation and moving to cloud-based platforms and core services without having first carefully planned, tested, validated and secured their approach (see: Zoom Fixes Flaw That Could Allow Strangers Into Meetings).



Why Continuous Monitoring of Critical Data Is So Essential

To ensure business continuity, manufacturers in India that now have a 100 percent remote workforce because of the COVID-19 pandemic must be vigilant about ensuring critical data is protected through continuous monitoring, says Ravikiran S. Avvaru, head of IT and security at the Gurgaon-based manufacturing group Apollo Tyres Ltd. "As part of our business continuity plan, we identified critical applications for the business which are integrated with the dealers, customers and suppliers and discussed with our third-party vendors, such as Amazon and Microsoft, how to extend support in ensuring the applications are up and running and in secure fashion," Avvaru says in an interview with Information Security Media Group. In addition to enhancing security for business-critical applications accessible in the cloud, for accessing legacy applications housed at a data center, the company has deployed personal firewalls, a VPN along with remote desktop protocols and data leak prevention tools, he explains.


According to Microsoft, Fabrikam called in Microsoft's Cybersecurity Solutions Group's Detection and Response Team (DART) eight days after the employee had opened the phishing email, by which time its computers and critical systems were failing and its network bandwidth had been completely overrun by Emotet. The malware used the victim's compromised computers to launch a distributed denial of service (DDoS) and overwhelm its network. "The virus threatened all of Fabrikam's systems, even its 185-surveillance camera network. Its finance department couldn't complete any external banking transactions, and partner organizations couldn't access any databases controlled by Fabrikam. It was chaos," Microsoft's DART team writes. "They couldn't tell whether an external cyberattack from a hacker caused the shutdown or if they were dealing with an internal virus," it explains further. "It would have helped if they could have even accessed their network accounts. Emotet consumed the network's bandwidth until using it for anything became practically impossible. Even emails couldn't wriggle through."


CSO Pandemic Impact Survey

As of March 23rd, that number had climbed to 77.7%, a 4.7-fold increase. Notably, high-tech firms grew from 31.9% to 90.2%. While 81% expressed confidence that their existing security infrastructure could handle their employees working from home, 61% were more concerned about security risks targeting WFH employees today than they were three months ago. ... Despite the high levels of confidence that their security infrastructures are up to the task at hand, 22% of organizations have found themselves shopping for new security solutions and services to address the new work dynamic. Businesses least likely to be investing in new technology or services came from the same industries that identified as most prepared: financial services (12%) and healthcare (14%). Only 7% of SMB organizations (fewer than 1,000 employees) indicated that they had to make security purchases in response to the current conditions, which may indicate either a lack of visibility into their risk environments, a lack of available budget to support new investments, or a combination of both.


If your company strongly encourages workers to stay home in response to the virus, a significant portion of your company might be working from home for extended periods of time. From a data-protection standpoint, this significantly increases the chances that important intellectual property will be created outside of your data center. If your company currently relies on storing such data on file servers or similar systems, remote employees will probably not be able to use such systems easily. As a result, they will create and store important data directly on their laptops, leaving centralized company storage out of the picture. This means that you should probably examine your company's policy regarding data protection of laptops and mobile devices. Many companies don’t provide backup and recovery for mobile devices, despite the fact that most experts feel they should. Now might be a good time to do so. The main reason early attempts at laptop backup failed was that users would kill the backup process because it slowed them down, and it cost too much. The good news is that several providers can back up your laptops and mobile devices in such a way that users never realize backups are running.


AI needs to show return
One key driver of lack of return from AI is the simple failure to invest enough. Survey data suggest most companies don’t invest much yet, and I mentioned one survey above suggesting that investment levels have peaked in many large firms. And the issue is not just the level of investment, but also how the investments are being managed. Few companies are demanding ROI analysis both before and after implementation; they apparently view AI as experimental, even though the most common version of it (supervised machine learning) has been available for over fifty years. The same companies may not plan for increased investment at the deployment stage—typically one or two orders of magnitude more than a pilot—focusing only on pre-deployment AI applications. Of course, with any technology it can be difficult to attribute revenue or profit gains to the application. Smart companies seek intermediate measures of effectiveness, including user behavior changes, task performance, and process changes, that would precede improvements in financial outcomes. But it’s rare for even these to be measured. Along with several other veterans of big data and AI, I am forming the Return on AI Institute, which will carry out programs of research and structured action, including surveys, case studies, workshops, methodologies, and guidelines for projects and programs.



Quote for the day:

"Leadership development is a lifetime journey, not a quick trip." -- John Maxwell