Daily Tech Digest - July 24, 2018

Rapid7 penetration tests reveal multitude of software flaws, network misconfigurations

People are simply too predictable when it comes to creating passwords, and that’s even if an organization enforces password length and complexity standards. For example, “Summer2018!” meets the objectives of a password that is required to have at least one uppercase letter, one lowercase letter, one number, and one special character. But Rapid7 noted that it is one of the worst passwords a person can choose. Seasonal passwords came in as the third most common type of password. ... What do organizations most care about protecting? Despite the almost-daily data breach announcements, Rapid7 found that organizations are more concerned with protecting their own sensitive data, such as internal communications and financial metrics, than with protecting the sensitive data of their customers or employees. As for organizations’ top five priorities for protecting information, sensitive internal data came in first at 21 percent, PII second at 20 percent, authentication credentials third at 14 percent, payment card data fourth at 7.8 percent, and bank account data fifth at 6.5 percent.
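To see why such passwords slip through, here is a quick sketch (the policy rules and seasonal check are hypothetical, written for illustration): a naive complexity check accepts "Summer2018!", while a trivial seasonal-pattern test flags it as predictable.

```python
import re

def meets_policy(password: str) -> bool:
    """Naive complexity policy: at least 8 chars with an uppercase
    letter, a lowercase letter, a digit, and a special character."""
    return (len(password) >= 8
            and re.search(r"[A-Z]", password) is not None
            and re.search(r"[a-z]", password) is not None
            and re.search(r"\d", password) is not None
            and re.search(r"[^A-Za-z0-9]", password) is not None)

SEASONS = ("spring", "summer", "fall", "autumn", "winter")

def looks_seasonal(password: str) -> bool:
    """Flag season-plus-year patterns such as 'Summer2018!'."""
    lowered = password.lower()
    return (any(s in lowered for s in SEASONS)
            and re.search(r"(19|20)\d{2}", password) is not None)

print(meets_policy("Summer2018!"))    # True: passes the letter of the policy
print(looks_seasonal("Summer2018!"))  # True: yet trivially guessable
```

A policy that only counts character classes cannot distinguish a strong passphrase from a calendar entry with punctuation.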



Three AI And Machine Learning Predictions For 2019


The U.S. Army is currently using machine learning to predict when combat vehicles need repair. Think about it: our Army uses millions of pieces of equipment each and every day. To keep track of the data involved, it is recruiting the help of an AI assistant. For the first implementation, a few dozen armored infantry transports will receive sensors inside the vehicles’ engines. These sensors will record temperature and RPM and transmit the readings to the software. Machine learning capabilities will look for patterns in the data that match engine failures in similar vehicles. What if your car did this? AAA might become obsolete if your car could tell you that the transmission is about to crap out on you. If the Army is using the technology, I'm sure it won't be long until we see it in the civilian world. Automotive isn't the only industry seeing potential new uses for this tech; healthcare is about to see some changes too. As if Google wasn’t already on the AI map, it has begun to predict the likelihood of a patient’s death using machine learning – with a staggering 95% accuracy.
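A toy sketch of the idea (not the Army's actual model): a rolling-average threshold on synthetic temperature readings stands in for a trained failure classifier, flagging samples that deviate sharply from recent behavior.

```python
def rolling_mean(values, window):
    """Mean of the trailing `window` samples."""
    return sum(values[-window:]) / min(window, len(values))

def flag_overheating(readings, window=5, tolerance=1.25):
    """Flag sample indices where engine temperature exceeds the
    recent rolling mean by `tolerance` (a crude stand-in for a
    learned failure signature)."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = rolling_mean(readings[:i], window)
        if readings[i] > baseline * tolerance:
            flagged.append(i)
    return flagged

# Synthetic engine temperatures (deg C): steady, then a spike
temps = [90, 91, 90, 92, 91, 90, 91, 130, 92, 91]
flagged = flag_overheating(temps)
print(flagged)  # [7]
```

A real system would learn failure signatures from fleet-wide history rather than a fixed tolerance, but the shape of the problem is the same: compare live telemetry against an expected baseline.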



NVMe is a protocol for accessing high-speed storage media that’s designed to reduce latency and increase system and application performance. It's optimized for all-flash storage systems and is aimed at enterprise workloads that require low latency and top performance, such as real-time data analytics and high-performance relational databases. Storage vendors have been re-tooling their systems to support the faster interconnect protocol, and IBM is no exception. A key change in the FlashSystem 9100 is the use of small form factor NVMe drives. IBM redesigned its FlashCore technology to fit into a standard 2.5-inch SSD form factor with NVMe interfaces – a move that reduced the physical size of the drives by more than half. That redesign made an impression on Owen Morley, director of infrastructure at online dating platform Plenty Of Fish. Morley is among a group of users of IBM's all-flash storage who came together at an event in Mexico City to share their thoughts on the new 9100 system and the potential for NVMe-accelerated storage in their own enterprises.



Edge computing will be vital for even simple IoT devices

The evolution of wearables required each generation to monitor and collate a greater number of measurements (raw data). Developers found optimal ways of doing this by processing raw data locally (on the edge of the application, using the Bluetooth chips’ increasingly powerful onboard processors) and then forwarding to a smartphone app and the cloud (for data sharing and tracking) only the essential information (desired data). The technology enabled continuous (low-latency) monitoring, and the modest Bluetooth wireless throughput was sufficient to update apps and cloud servers with the key tracking information without requiring the extended on-air duration that would otherwise be needed to stream raw data. Sending only the key information also minimized the impact on the user’s cellphone data allowance (data cost).

Things go wrong, hackers never quit

Because users didn’t always carry their smartphones, wearables had to operate autonomously when not connected. Resiliency was built into the systems. They didn’t depend on a continuous network or internet connection for successful operation (redundancy).
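The local-processing pattern described above can be sketched in a few lines. Assume, for illustration, that batches of raw sensor samples are reduced to min/max/mean summaries on the device before anything is transmitted:

```python
def summarize_on_edge(raw_samples, batch_size=50):
    """Collapse each batch of raw sensor samples into the 'desired
    data' (min/max/mean) before sending it upstream, instead of
    streaming every raw reading over the radio."""
    summaries = []
    for start in range(0, len(raw_samples), batch_size):
        batch = raw_samples[start:start + batch_size]
        summaries.append({
            "min": min(batch),
            "max": max(batch),
            "mean": sum(batch) / len(batch),
        })
    return summaries

# 200 raw heart-rate samples shrink to 4 summary records
raw = [60 + (i % 10) for i in range(200)]
summaries = summarize_on_edge(raw)
print(len(summaries))  # 4
```

Shipping 4 records instead of 200 is exactly the trade that keeps on-air duration, latency, and the user's data allowance (data cost) under control.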


Nation-State Spear Phishing Attacks Remain Alive and Well

The trouble with phishing is that it relies on social engineering - meaning it's designed to trick users - and it can potentially be used to compromise any online account. Unfortunately, we humans are both easy to trick - at least some of the time - and fallible. And attackers can pummel would-be victims with phishing attacks until one succeeds. The scale of the phishing challenge is reflected by the number of video interviews touching on phishing that I recently conducted at the London Infosecurity Europe conference. Experts described everything from the increasingly targeted nature of phishing attacks and the importance of never forgetting the human factor, to training users, using technology to extract data from emails and attachments, and tracking malicious domains to better block phishing campaigns. But as this patchwork of practices, procedures and technology demonstrates, there's no single fix for the phishing problem. Furthermore, with more of our business and personal lives now residing in the cloud, the impact of falling victim to a phishing attack continues to increase.
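One of the practices mentioned, tracking malicious domains to block campaigns, might look like this minimal sketch (the blocklisted domains here are made up for illustration):

```python
import re

# Hypothetical domains gathered by tracking phishing campaigns
BLOCKED_DOMAINS = {"examp1e-login.com", "secure-paypai.net"}

def extract_domains(email_body: str):
    """Pull the host portion out of URLs embedded in an email body."""
    return set(re.findall(r"https?://([A-Za-z0-9.-]+)", email_body))

def is_suspicious(email_body: str) -> bool:
    """True if any linked domain is on the tracked-malicious list."""
    return bool(extract_domains(email_body) & BLOCKED_DOMAINS)

msg = "Your account is locked. Verify at http://secure-paypai.net/login now."
print(is_suspicious(msg))  # True
```

Production mail gateways combine this kind of lookup with reputation feeds, attachment analysis, and lookalike-domain heuristics; no single check catches everything, which is the article's point.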



Privacy pros gaining control of technology decision-making over IT

“This global survey is critical in our efforts to better understand how privacy professionals are addressing compliance challenges and the technologies that are being deployed now and in the near future,” said Chris Babel, CEO of TrustArc. “Though security budgets remain larger, we’re seeing a marked shift in privacy teams’ influence over technology purchasing decisions. This trend confirms what we’re seeing among our customers – that they have a growing need for technology solutions to help them manage privacy compliance at scale on a global basis.” The EU GDPR and other global and domestic legal reforms, combined with technological advancements, have made the task of operationalizing privacy and data protection vastly more complicated. Businesses now must account for how data is entering the organization, how it is being used, what permissions are attached to it and who has the responsibility for managing it. To address these challenges, the demand for privacy technology continues to grow rapidly.


Measuring Tech Performance: You’re Probably Doing It Wrong


First, velocity is a relative and team-dependent measure, not an absolute one. Teams usually have significantly different contexts, which makes comparing velocities inappropriate. (Seriously, don’t do this.) Second, when velocity is used as a productivity measure, teams are very likely to game it: they inflate their estimates and focus on completing as many stories as possible, at the expense of collaboration with other teams (which might decrease their own velocity while increasing another team's, making them look bad). Not only does this destroy the utility of velocity for its intended purpose, it also inhibits collaboration between teams. Velocity as a productivity metric violates our guidelines by focusing on local measures rather than global ones. This is particularly obvious in the second critique above: by (understandably) making choices to optimize their own velocity, teams will often not collaborate with other teams. This often leaves the organization with subpar solutions because there isn't a focus on global measures.


How to spot bad data, and know the limitations when it's good

A 2016 survey of CEOs found 84 percent of them felt concerned about the quality of data they used while making decisions. And they have valid reasons for feeling wary — bad data could cause financial repercussions if business leaders put too much trust in material that’s ultimately lacking. It’s also crucial to consider the wasted time from bad data. When professionals engage in data-driven marketing, they may be relying on content filled with non-human influences such as bots or malware. If that happens, they could get false perceptions of customers’ journeys at websites or the factors that cause them to linger on certain pages versus others. There are reputational risks, too. If a company releases public research that later gets proven inaccurate, it’ll be difficult for that entity to encourage trust in future material. When business leaders blindly trust data — especially when making decisions — they inevitably set the stage for problems. Staying aware of the characteristics of bad data discussed here is an excellent first step in being proactive.
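A rough illustration of screening out non-human influences such as bots: the thresholds below are assumptions chosen for the example, and real bot detection is far more sophisticated.

```python
def filter_bot_sessions(sessions, min_dwell_seconds=2.0, max_pages_per_minute=30):
    """Drop sessions whose timing is implausible for a human:
    near-zero page dwell time or an absurd page-view rate.
    (Simple heuristics; real traffic filtering uses many signals.)"""
    human = []
    for s in sessions:
        pages, duration = s["pages"], s["duration_seconds"]
        dwell = duration / pages if pages else 0
        rate = pages / (duration / 60) if duration else float("inf")
        if dwell >= min_dwell_seconds and rate <= max_pages_per_minute:
            human.append(s)
    return human

sessions = [
    {"id": "a", "pages": 12, "duration_seconds": 300},   # plausible human
    {"id": "b", "pages": 500, "duration_seconds": 60},   # bot-like
]
humans = filter_bot_sessions(sessions)
print([s["id"] for s in humans])  # ['a']
```

Cleaning out such sessions before analyzing customer journeys reduces the risk of building marketing decisions on traffic no human ever generated.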


Law firms failing to meet their clients' digital expectations, according to study

Martin Flick, CEO of Olive Communications, said: “Today’s busy, always-on and mobile-first consumer wants to buy goods and services, and communicate with sellers whenever, wherever, and however they choose.” “Increasingly this is through digital interaction. When it comes to their lawyer or solicitor, they want to engage in the same way, without the frustration of having to wait days for paper documents to arrive in the post or for an email to come through with the answer to a question that could be easily resolved with an instant message or automated response.” “Consumers want more control over their legal affairs, sometimes with little or no human intervention, and with the speed, efficiency, and security that multiple-channel web-based communications offer.” The study found that a significant portion of law firms are embracing new technology internally; for example, 69% are using IM and chat to communicate with each other. However, few of these firms are extending the use of technology externally to enhance the client experience.


Backup best practices: A NAS is not enough

The idea of 3-2-1 is to have three copies of every file, two of which are on different physical devices, and one of which is located off-site. Our guy didn't have that. He counted entirely on one NAS for all his backups. He has an offsite backup, but it hadn't been updated. The "off" part of my strategy is to have at least one full backup air-gapped from the Internet. I do this for my stuff by keeping one backup server shut down, except for a once-a-week quick incremental backup nibble ... The point of this article, though, is to remind you of the 3-2-1-off-and-away strategy and to not be dumb. A single NAS as your backup strategy is not enough. As a rule, I have two NAS boxes running all the time. One is my hot, live working environment. The other is an offline backup. In my case, I was fortunate that the ioSafe folks sent me their flood-and-fire-proof ioSafe 1515+, so my backup NAS isn't just a second NAS, it's an armored bomb-proof bunker of a backup NAS. At some point in the future, I'll take you through my whole storage architecture.
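The 3-2-1 rule lends itself to a quick automated check. A minimal sketch, assuming each backup copy is described by its device and an off-site flag (both field names are invented for the example):

```python
def satisfies_3_2_1(copies):
    """Check the 3-2-1 rule: at least 3 copies, on at least 2
    distinct devices, with at least 1 copy off-site."""
    enough_copies = len(copies) >= 3
    distinct_devices = len({c["device"] for c in copies}) >= 2
    offsite = any(c["offsite"] for c in copies)
    return enough_copies and distinct_devices and offsite

single_nas = [{"device": "nas1", "offsite": False}]
good = [
    {"device": "nas1", "offsite": False},
    {"device": "nas2", "offsite": False},
    {"device": "cloud", "offsite": True},
]
print(satisfies_3_2_1(single_nas))  # False: one NAS is not a strategy
print(satisfies_3_2_1(good))        # True
```

The "off" part of the author's strategy, an air-gapped copy, would be a fourth condition layered on top of this check.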



Quote for the day:


"You may not control all the events that happen to you, but you can decide not to be reduced by them." -- Maya Angelou


Daily Tech Digest - July 23, 2018

Most of AI’s Business Uses Will Be in Two Areas


The business areas that traditionally provide the most value to companies tend to be the areas where AI can have the biggest impact. In retail organizations, for example, marketing and sales has often provided significant value. Our research shows that using AI on customer data to personalize promotions can lead to a 1-2% increase in incremental sales for brick-and-mortar retailers alone. In advanced manufacturing, by contrast, operations often drive the most value. Here, AI can enable forecasting based on underlying causal drivers of demand rather than prior outcomes, improving forecasting accuracy by 10-20%. This translates into a potential 5% reduction in inventory costs and revenue increases of 2-3%. While applications of AI cover a full range of functional areas, it is in fact in these two cross-cutting ones—supply-chain management/manufacturing and marketing and sales—where we believe AI can have the biggest impact, at least for now, in several industries. Combined, we estimate that these use cases make up more than two-thirds of the entire AI opportunity.



How SD-WAN Will Make The Cloud Much Much Bigger

The need to be connected to the mother ship is what brings the Cloud into its meaningful existence, because we live and work at the edges of the Cloud. SD-WAN is not just a market but a platform as well, one that will eventually evolve into user-defined WAN (UD-WAN). To clarify, the term applies to enterprise users and not consumers. And the purpose of SD-WAN is to connect and fully integrate the very edges of the enterprise – be it corporate headquarters, branch/remote offices or the mobile millions. In other words, us, the users. But if we look at the concept of the cloud, it is pretty clear that it is referenced in an abstract form. After all, what is this cloud thing? Some physical space in a nondescript windowless warehouse? Without its tentacles, the cloud is nothing more than a collection of computers, storage and cooling systems created by geeks – and for what purpose? It is those very tentacles in the form of wide-area networks (WANs) that give the Cloud its purpose. And given the explosive adoption of cloud-based applications (Box, Dropbox, Salesforce, SAP, Slack, etc.), cloud computing is not a fad; it is here to stay. However, that is just the beginning.


The value of superior UX? Priceless, but awfully hard to measure

The problem, Cooper continues, is that managers and executives outside of the bubble remain skeptical about investing any more than they have to in UX -- to them, it's a dark art. So, they ask: "What is the ROI of UX?" Asking about ROI, of course, is a manager's way of expressing doubts. "They aren't seeking enlightenment," Cooper says. ... In UX design, he continues, "ROI is often about eliminating poor design." Some industry specialists have attempted to put a monetary value on superior UX design. A recent report from CareerFoundry estimates that UX design work delivers a 100-fold return on investment, without even counting the soft benefits. Every $1 investment in UX translates to returns of at least $100, the report's authors illustrate -- mainly through e-commerce and customer-facing interactions. Add to this the softer, but just as important, ancillary benefits: "fewer support calls, increased customer satisfaction, reduced development waste, and lower risk of developing the wrong idea."


Why tech matters – the challenge for everyone in the UK tech community


If there is a magic recipe for digital innovation, then the UK surely has all the ingredients. We have created and attracted some of the world’s best and most diverse digital talent. We have world-leading businesses, universities and powerful ecosystems that enable expertise to spill over from one part of the economy to another. In almost every sector, I can point to world leaders on the cutting edge of digital transformation. Above all, we have ambition and we have each other. What sets us apart from any other country is that in the UK technology community, we stand on the shoulders of each other. But to really thrive, three things are important. We must stay focused on making tech work for people and our economy. We must not underestimate our international competitors. And, perhaps most importantly, we must accept the enormous responsibility that comes with developing powerful technology. We do have great people in this sector – but we simply don’t have enough of them. And we don’t have the depth of skills and talent that the economy needs as a whole. This, surely, is our biggest challenge.


Why Artificial Intelligence Is Not a Silver Bullet for Cybersecurity

While AI is likely to work quite well over a strictly controlled network, the reality is much more colorful and much less controlled. AI's Four Horsemen of the Apocalypse are the proliferation of shadow IT, bring-your-own-device programs, software-as-a-service systems, and, as always, employees. Regardless of how much big data you have for your AI, you need to tame all four of these simultaneously — a difficult or near-impossible task. There will always be a situation where an employee catches up on Gmail-based company email from a personal laptop over an unsecured Wi-Fi network and boom! There goes your sensitive data without AI even getting the chance to know about it. In the end, your own application might be protected by AI that prevents you from misusing it, but how do you secure it for the end user who might be using a device that you weren't even aware of? Or, how do you introduce AI to a cloud-based system that offers only smartphone apps and no corporate access control, not to mention real-time logs? There's simply no way for a company to successfully employ machine learning in this type of situation.


Unsecured server exposes 157 GB of highly sensitive data from Tesla, Toyota and more

The unsecured trade secrets and corporate documents had been exposed via rsync, a widely used file synchronization protocol. UpGuard wrote, “The rsync server was not restricted by IP or user, and the data set was downloadable to any rsync client that connected to the rsync port. The sheer amount of sensitive data and the number of affected businesses illustrate how third- and fourth-party supply chain cyber risk can affect even the largest companies. The automation and digitization of manufacturing has transformed the industry, but it has also created a new area of concern for industries, and one that must be taken seriously for organizations to thrive in a healthy digital ecosystem.” Not only could anyone connect to Level One’s rsync server, but it was also “publicly writable, meaning that someone could potentially have altered the documents there, for example replacing bank account numbers in direct deposit instructions, or embedding malware.” The exposed rsync server was discovered on July 1. Attempts to contact Level One started on July 5, but contact wasn’t established until July 9. The exposure was closed within a day, by July 10.
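The root cause, rsync modules with no user or host restrictions, can be illustrated with a simplified audit of an rsyncd.conf file. `auth users` and `hosts allow` are real rsyncd.conf directives; the parsing below is deliberately minimal and the config content is invented for the example.

```python
def find_open_modules(rsyncd_conf: str):
    """Return rsync modules that have neither an 'auth users' nor a
    'hosts allow' restriction (the misconfiguration behind the
    exposure described above)."""
    open_modules, module, settings = [], None, {}

    def check():
        if (module is not None and "auth users" not in settings
                and "hosts allow" not in settings):
            open_modules.append(module)

    for line in rsyncd_conf.splitlines():
        line = line.strip()
        if line.startswith("[") and line.endswith("]"):
            check()                      # close out the previous module
            module, settings = line[1:-1], {}
        elif "=" in line and module is not None:
            key, _, value = line.partition("=")
            settings[key.strip()] = value.strip()
    check()                              # close out the final module
    return open_modules

conf = """
[docs]
path = /srv/docs
read only = false

[backups]
path = /srv/backups
auth users = backup
hosts allow = 10.0.0.0/8
"""
print(find_open_modules(conf))  # ['docs']
```

The `docs` module is flagged: it is readable by anyone who can reach the port, and `read only = false` makes it writable too, exactly the combination UpGuard described.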


Organizations Need IT Experts Who Know Basic LAN/WAN Switching and Routing

Responsiveness, security, and reliability are the new hallmarks of networking. Automation, analytics, IoT, policy-based network management, programmability, and virtualization are enabling these changes. The technologies, and the ways they’re being applied, are new. So, IT and networking professionals need new skills to make them work for businesses. In order to appeal to hiring managers, boost their careers, and bring greater value to employers there are fundamental skills that IT and networking professionals need. At a very fundamental level, it’s critical that IT experts know the basics of LAN and WAN switching and routing. These skills will help network engineers configure, verify, troubleshoot, and secure today’s networks. In addition, the evolution of the network creates a growing need for IT professionals who can implement and manage software-centric networks. This involves using APIs, controllers, policies, and virtualization. These technologies and tools allow for greater automation, network intelligence, and agility.


The Engineer’s guide to the future


If AR is hyped, AI is basically the buzzword of the century. Lots of people aren’t really sure what it means, but they know it’s important and that their business needs it. The first thing to know is that modern-day Artificial Intelligence doesn’t actually mean a computer being intelligent — it’s basically a catch-all term for computer programs that can “learn”, to improve their operational efficiency or their success. Even at that, lots of applications that say they use AI actually don’t. A chatbot that has a big decision tree in the background isn’t AI, it’s just a big decision tree. If you ask “What is Ragnarok?” and get back the answer “It is simultaneously a great action movie and the ruin of a good character” — it’s probably not artificial intelligence, just quite wise. However, there is plenty of amazing work being done with proper AI and Machine Learning, for a whole heap of use cases. We don’t need a crystal ball to say that knowing about AI will be beneficial for a future engineering career. Similar to Apple and Google releasing tools to “democratise” Augmented Reality development, each year there are more tools available to enable developers to build AI solutions.


In the wake of GDPR, college IT security programs need to evolve

While U.S. universities that offer information security programs typically cover a range of compliance concepts related to U.S. regulations, such as the Health Insurance Portability and Accountability Act (HIPAA) or Sarbanes-Oxley (SOX), the GDPR is something of a game changer because it is not a regulation enacted by a U.S. agency, yet it requires compliance on the part of U.S. entities. The GDPR is only the first of several proposed global regulations governing data privacy. Before 2015, data exchanges between the U.S. and the EU were governed by the Safe Harbor program, which allowed the personal data of EU citizens to be exchanged with U.S. providers as long as both sides of the transaction complied loosely with the EU Data Protection Directive. The directive wasn’t as tightly defined as the GDPR and lacked teeth in the form of significant fines or penalties. As a result, up to this point in time, U.S. businesses have not had to unduly concern themselves with regulations enacted outside U.S. borders. GDPR demands a change in that mindset.


Can businesses use blockchain to solve the problem of data management?

Since the nodes are distributed and operate peer-to-peer, the possibility of bottleneck formation is nonexistent. One of the most important features of blockchain systems, however, is immutability: once an entry is appended to the database, it cannot be removed. Using blockchain for databases seems like a logical step forward. There’s definitely an emerging movement seeking to lay the foundations for a decentralised architecture across industries. With blockchain, a marketplace akin to AirBnB or Uber can materialise for storing data – nodes on the network can be incentivised to replicate and retain information using a blockchain protocol’s inbuilt payment layer. This concept can be taken a step further with the use of sharding and swarming. Sharding offers a greater degree of privacy whereby, instead of sending a file to other nodes, you distribute fragments of said file. In this way, the owner can be sure that those in possession of their data cannot access it, as they will only hold a small (and unreadable) piece – much like torrenting.
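Sharding as described can be sketched with plain byte chunking. Real systems layer encryption and erasure coding on top, so treat this purely as an illustration of why a single fragment reveals little:

```python
def shard(data: bytes, n: int):
    """Split data into n fragments; each node then holds only a
    slice that is meaningless in isolation."""
    size = -(-len(data) // n)  # ceiling division
    return [data[i * size:(i + 1) * size] for i in range(n)]

def reassemble(fragments) -> bytes:
    return b"".join(fragments)

secret = b"account: 12345678, routing: 87654321"
fragments = shard(secret, 6)
print(reassemble(fragments) == secret)               # True: the owner can rebuild
print(all(len(f) < len(secret) for f in fragments))  # True: no node holds it all
```

In a torrent-like storage network, each fragment would additionally be encrypted and replicated across incentivized nodes, so even collusion among a few holders exposes nothing readable.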



Quote for the day:


"Authentic leaders are not afraid to make mistakes, but they fix them faster than they make them." -- George Bernard Shaw


Daily Tech Digest - July 22, 2018

By reducing manual intervention, automated processes can minimise mistakes and human error – but there is still the chance that something can go wrong. Designers of automated processes need to ensure that the appropriate quality outcomes are being measured and assessed against a given specification. Importantly, this must happen throughout the entire process. Let’s think about the car production line again. The cost of finding out that something went wrong at the start of the production process after the car has been built is significant. Instead, process designers will want to identify errors quickly and allow the process to make the necessary changes to ensure a quality product is delivered. A significant quantity of data is generated through automated processes, but the quantity of data does not compensate for the quality of the data. In order to deliver a quality product at the end of an automated process, a quality data management process is critical. But what is bad data? And, if everything is being automated anyway, why should we care?
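The fail-fast idea, validating against the spec after every step rather than once the "car" is built, can be sketched like this (the steps and checks are invented for illustration):

```python
def run_with_checks(steps, checks, item):
    """Run each production step and validate its output immediately,
    so a defect is caught mid-process rather than at the end of
    the line."""
    for step, check in zip(steps, checks):
        item = step(item)
        if not check(item):
            raise ValueError(f"quality check failed after {step.__name__}")
    return item

# Invented production steps for the example
def weld(chassis):
    return {**chassis, "welded": True}

def paint(chassis):
    return {**chassis, "painted": True}

steps = [weld, paint]
checks = [lambda c: c.get("welded"), lambda c: c.get("painted")]
result = run_with_checks(steps, checks, {})
print(result)  # {'welded': True, 'painted': True}
```

The per-step check is where data quality enters: each check is only as good as the measurements feeding it, which is why a quality data management process matters as much as the automation itself.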


Python has brought computer programming to a vast new audience


Not all Pythonistas are so ambitious, though. Zach Sims, Codecademy’s boss, believes many visitors to his website are attempting to acquire skills that could help them in what are conventionally seen as “non-technical” jobs. Marketers, for instance, can use the language to build statistical models that measure the effectiveness of campaigns. College lecturers can check whether they are distributing grades properly. For professions that have long relied on trawling through spreadsheets, Python is especially valuable. Citigroup, an American bank, has introduced a crash course in Python for its trainee analysts. A jobs website, eFinancialCareers, reports a near-fourfold increase in listings mentioning Python between the first quarters of 2015 and 2018. The thirst for these skills is not without risk. Cesar Brea, a partner at Bain & Company, a consultancy, warns that the scariest thing in his trade is “someone who has learned a tool but doesn’t know what is going on under the hood”. Without proper oversight, a novice playing with AI libraries could reach dodgy conclusions.
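The marketing use case mentioned above, measuring campaign effectiveness, can be a few lines of Python. This is a sketch without the significance testing a real analysis would need; the numbers are invented:

```python
def conversion_rate(conversions, visitors):
    return conversions / visitors

def campaign_lift(control, variant):
    """Relative lift of a campaign variant over its control,
    each given as a (conversions, visitors) pair."""
    base = conversion_rate(*control)
    test = conversion_rate(*variant)
    return (test - base) / base

# (conversions, visitors): control group vs. campaign group
lift = campaign_lift((50, 1000), (65, 1000))
print(round(lift, 2))  # 0.3, i.e. a 30% relative lift
```

This is also where Brea's warning bites: without knowing what is under the hood (sample sizes, variance, confounders), a novice could read a 30% "lift" into what is actually noise.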


Top 10 Data Science Use Cases in Insurance


Customers always want personalized services that match their needs and lifestyle. The insurance industry is no exception. Insurers face the challenge of assuring digital communication with their customers to meet these demands. Highly personalized and relevant insurance experiences are assured with the help of artificial intelligence and advanced analytics, extracting insights from a vast amount of demographic data, preferences, interactions, behavior, attitudes, lifestyle details, interests, hobbies, etc. Consumers tend to look for personalized offers, policies, loyalty programs, recommendations, and options. The platforms collect all the possible data to define the major customers' requirements. After that, a hypothesis on what will or won't work is made. ... Modern technologies have brought the promotion of products and services to a qualitatively new level. Different customers tend to have specific expectations of the insurance business. Insurance marketing applies various techniques to increase the number of customers and to assure targeted marketing strategies. In this regard, customer segmentation proves to be a key method.
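A toy illustration of customer segmentation: rule-based thresholds stand in for the clustering on rich behavioral data a real insurer would use, and the segment names and cutoffs are invented.

```python
def segment_customers(customers):
    """Assign each customer to a segment by age and annual premium
    spend (illustrative thresholds only)."""
    segments = {}
    for c in customers:
        if c["age"] < 30:
            key = "young-digital"
        elif c["premium"] > 2000:
            key = "high-value"
        else:
            key = "standard"
        segments.setdefault(key, []).append(c["id"])
    return segments

customers = [
    {"id": 1, "age": 25, "premium": 800},
    {"id": 2, "age": 45, "premium": 2500},
    {"id": 3, "age": 52, "premium": 1200},
]
segments = segment_customers(customers)
print(segments)
```

Each segment then gets its own offers, loyalty programs, and channel strategy, which is the targeted marketing the excerpt describes.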


The Evolution Of Data


Traditionally, a platform was used to address an enterprise process workflow — human resources (HR), finance, manufacturing, etc. These are what we categorize as enterprise resource planning (ERP), customer relationship management (CRM), human capital management (HCM), functional setup manager (FSM), information technology operations (ITOps), etc. The data generated by these workflows was then analyzed using analytics or business intelligence applications to make further modifications to the workflow. These workflow applications were customized whenever the data warranted changes in the workflow. ... The workflow actions will be passed on to the traditional applications or directly to the people or systems that will perform the actions. These new systems of intelligence will emerge and will force existing workflow applications to change to become end-user targeted. We are already seeing a trend where AI platforms are slowly becoming a playground for new intelligent applications. More importantly, because open-source intelligent platforms in this area are as rich as the enterprise platforms, we are also noticing new generations of applications.


6 trends that are changing the face of UX

A pattern library acts as a centralised hub for all components of the user interface. Effective pattern libraries provide pattern descriptions, annotations and contextual information. They also showcase the code and pattern variations, and have the ability to add real data into the pattern structure. Once a design system is up and running, it’s only the first step in the journey. It needs to be living. Nathan Curtis, a co-founder of UX firm EightShapes, says: “A design system isn’t a project. It’s a product, serving products.” Like any good product, a design system needs maintenance and improvements to succeed. Both Google and Salesforce have teams dedicated to improving their design systems. The goal is a workflow where changes to the design system update the documentation and the code. The benefits realised by a thoughtful, unified design system outweigh the effort involved in establishing one. There is a consistency across the entire user experience. Engineers and designers share a common language, and systems are more sustainable. Designers can spend their time solving harder problems and improving the actual user experience.


What’s so special about 5G and IoT?

If we think about our current needs for IoT, what we care about are three things: price, coverage, and low power consumption. But 5G is focused on increasing bandwidth, and while increased data transfer and speeds are nice, they are not entirely necessary for IoT products. The GSMA projects that 5G could offer 1000x bandwidth per unit area. However, as it states in its own report, bandwidth per unit area is not dependent upon 5G alone, but on more devices connecting with higher bandwidths for longer durations. While it is great that 5G aims to improve this service, the rollout of LTE has already had a significant effect on bandwidth consumption. We should be excited about continued incremental improvements to Cat-M1 and NB-IoT as we get even lower-cost and lower-power solutions for our IoT applications. Unlike LTE, 5G lacks a solid definition, which means cellular providers could eventually label a slightly-faster-than-LTE connection as 5G. And truly, the only thing that is certain about 5G is that we won't know what it can and cannot do until it arrives.
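Why low power can matter more than raw bandwidth for IoT shows up in back-of-the-envelope arithmetic. All numbers below are assumptions chosen for illustration, not measured figures for any particular modem:

```python
def battery_life_days(battery_mah, sleep_ma, active_ma, active_seconds_per_hour):
    """Rough battery-life estimate for a duty-cycled IoT node:
    average the sleep and active currents by duty cycle, then
    divide into the battery capacity."""
    duty = active_seconds_per_hour / 3600
    avg_ma = active_ma * duty + sleep_ma * (1 - duty)
    return battery_mah / avg_ma / 24

# Assumed: 2400 mAh cell, 0.01 mA sleep, 100 mA radio, 3.6 s airtime/hour
days = battery_life_days(2400, 0.01, 100, 3.6)
print(round(days))  # roughly 909 days, about two and a half years
```

Doubling the radio's bandwidth does nothing for this budget; halving airtime or sleep current extends it dramatically, which is why Cat-M1 and NB-IoT improvements matter so much to IoT designers.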


Microsoft's Linux love-in continues as PowerShell Core comes to Ubuntu Snap Store

That newfound affection has been evident throughout 2018: with Ubuntu 18.04 being made available in the Microsoft Store, Windows File Explorer gaining the ability to launch a Linux shell, and a new option to install Windows Subsystem for Linux (WSL) distros from the command line. That's without mentioning Microsoft's release of the Linux-based Azure Sphere operating system. Now Microsoft has released its command-line shell and scripting language PowerShell Core as a snap package in the Ubuntu Snap Store. Snap packages are containerized applications that can be installed on many Linux distributions, which Joey Aiello, PM for PowerShell at Microsoft, says has several advantages. "Snap packages carry all of their own dependencies, so you don't need to worry about the specific versions of shared libraries installed on your machine," he said, adding that updates to snaps happen automatically and are "safe to run" as they don't interact with other applications or system files without your permission.


Why Design Thinking Should Also Serve As A Leadership Philosophy

The key here, from a leadership standpoint, is simply to drop the ego. Sweep aside titles and preconceptions about where audience insight should come from. Instead of defaulting to traditional techniques for collecting customer insight, seek it out wherever you can. Find the people who are best equipped to provide an insider's look at your customers' preferences and dislikes, whether those people are sitting in a focus group or across from you on the subway, so you can be sure you'll be giving your customers exactly what they want. Adopting a human-centric mindset can help you turn even the most fragmented experiences into seamless interactions between customer and brand. It's an investment in the customer journey that can build long-term loyalty and trust. Often, dissecting the user experience also reveals new product markets, audience segments and customer service platforms that can lead to future growth. When you consider what's at the heart of your business problem and break down the barriers between your company and your customers, it quickly becomes clear that design thinking can alter your leadership approach for the better.


Managing Engineering Complexity: Are You Ready?

So, here is the complexity loop we are in: customers demanding more capabilities leads to more complexity in IoT systems, constantly feeding data into the development processes, leading to new security and safety standards requirements, new use cases, and the need to adapt fast to changes that companies cannot always predict. These actions lead to the demand for even more complex IoT systems. And, with these new changes, new customer demands arise and the loop continues perpetually. Let’s zoom in for a second and see what that means for one of the most exciting industries today – automotive engineering, i.e., how we build a car. What characterizes the OEM leaders today is the desire for speed in product development and a capability of overcoming the complexity of connecting requirements, design, development, validation, and deployment within their engineering process and throughout their supply chain. And how do they do that?


The Role of Randomization to Address Confounding Variables in Machine Learning

Machine learning practitioners are typically interested in the skill of a predictive model and less concerned with the statistical correctness or interpretability of the model. As such, confounding variables are an important topic when it comes to data selection and preparation, but less important than they may be when developing descriptive statistical models. Nevertheless, confounding variables are critically important in applied machine learning. The evaluation of a machine learning model is an experiment with independent and dependent variables. As such, it is subject to confounding variables. What may be surprising is that you already know this and that the gold-standard practices in applied machine learning address this. ... Randomization is a simple tool in experimental design that allows the confounding variables to have their effect across a sample. It shifts the experiment from looking at an individual case to a collection of observations, where statistical tools are used to interpret the finding.
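The randomization the passage describes can be made concrete with a small sketch: a toy dataset ordered by a potential confounder (here, a collection batch), split with and without shuffling. The dataset and split function are invented for illustration:

```python
import random

# Toy dataset: 100 samples collected in two batches (a potential
# confounder). Without shuffling, a naive 50/50 split would train
# only on batch "A" and test only on batch "B".
data = [("A", i) for i in range(50)] + [("B", i) for i in range(50)]

def split(rows, train_frac=0.5, shuffle=False, seed=42):
    rows = rows[:]                         # copy so the caller's list is untouched
    if shuffle:
        random.Random(seed).shuffle(rows)  # randomization spreads the confounder
    cut = int(len(rows) * train_frac)
    return rows[:cut], rows[cut:]

train, test = split(data, shuffle=False)
print({b for b, _ in train})               # unshuffled: train sees only one batch

train, test = split(data, shuffle=True)
print(sorted({b for b, _ in train}))       # shuffled: both batches appear in train
```

In the unshuffled split, any model evaluation confounds "batch" with "train vs. test"; after shuffling, the confounder's effect is spread across both sides of the split, which is exactly what repeating the experiment over random splits (as in k-fold cross-validation) relies on.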



Quote for the day:

"Leaders must be good listeners. It’s rule number one, and it’s the most powerful thing they can do to build trusted relationships." -- Lee Ellis

Daily Tech Digest - July 21, 2018


If we are looking for traffic spikes to detect an attack, how do we know it’s actually an attack and not simply a busy time for the business? Maybe our business is experiencing the result of a successful marketing campaign that had pushed more buyers to our website. On the flip side, maybe the hacker is using a low and slow attack on a backend server that takes out that service, but never creates a large enough traffic spike to detect an issue. The answer is that traditional detection methods (rate-based) will not offer you certainty in dealing with next-generation DDoS issues. When you rate-limit traffic you are blocking some good traffic from reaching its destination, and you are allowing some bad traffic into your network. Yes, you stay up, but your users may still experience some discomfort. Traditional devices are not able to discriminate bad from good so anything over a certain threshold gets dropped. These devices lack specificity. Why would I not keep all of my good traffic and simply drop the bad?
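A minimal sketch of the rate-based detector the passage critiques makes both failure modes visible; the threshold and traffic figures are invented for illustration:

```python
# A fixed-rate threshold detector, the "traditional" approach the
# passage critiques. The numbers are illustrative, not tuned values.
THRESHOLD = 1000  # requests per second that trigger mitigation

def rate_detector(requests_per_second):
    """Flag 'attack' purely on volume, with no notion of good vs. bad traffic."""
    return requests_per_second > THRESHOLD

# A successful marketing campaign: legitimate traffic spikes...
assert rate_detector(5000)       # ...and gets flagged (false positive).

# A low-and-slow attack grinding down one backend service never
# crosses the volume threshold...
assert not rate_detector(50)     # ...and sails through (false negative).
```

Because the detector sees only a rate, everything above the threshold is dropped, good and bad alike, and everything below it is admitted, which is the lack of specificity the author describes.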




Artificial intelligence has great potential in the realm of economics and business. It not only relieves workers of repetitive or even dangerous tasks, but is also much faster at analyzing data volumes, making decisions based on them, and completing tasks. What’s more, robots will further automate production, which will open many new doors. For example, countries such as Germany will become a more attractive production location, thus increasing their competitiveness. There will no longer be any economic reasons for outsourcing production to low-wage countries. Whole new business areas will emerge as a result of AI joining up with connected products, processes and machines. AI is developing more and more into a disruptive core technology. It will revolutionize our working lives and current software applications! Just like humans, machines are also capable of making mistakes. As long as human health, life and death are not at stake or people are not being assessed, mistakes are acceptable. Using a percentage tolerance level, we humans will define probabilities which allow us to decide if a computation is correct.



Machines Teaching Each Other Could Be the Biggest Exponential Trend

Intelligent systems, like those powered by the latest round of machine learning software, aren’t just getting smarter: they’re getting smarter faster. Understanding the rate at which these systems develop can be a particularly challenging part of navigating technological change. Ray Kurzweil has written extensively on the gaps in human understanding between what he calls the “intuitive linear” view of technological change and the “exponential” rate of change now taking place. Almost two decades after writing the influential essay on what he calls “The Law of Accelerating Returns”—a theory of evolutionary change concerned with the speed at which systems improve over time—connected devices are now sharing knowledge between themselves, escalating the speed at which they improve. “I think that this is perhaps the biggest exponential trend in AI,” said Hod Lipson, professor of mechanical engineering and data science at Columbia University, in a recent interview.


Artificial Intelligence : The Potential For Better Corporate Governance

For banks in emerging markets with strong ties to US investments, investors and regulators are interested in these ratings. "Corruption and the erosion of trust in many of these countries are among the biggest challenges banks face doing business there,” said Ms Haddad. "Risk is evolving and credit risk only gives you part of the picture. Risk associated with counterparty illicit finance and conduct risk is equally – if not more – important for companies operating in complex markets. For most responsible boards we talk to, it's a top three issue," said Mr Jones, Sigma CEO. "What is unique about the Sigma Ratings team is that they saw a growing problem from inside the corridors of government and international development finance, and found an innovative way to marry their unique knowledge with cutting-edge technology," said Gareth Jones, Managing Partner at FinTech Collective. "We are excited to partner with Sigma Ratings as it changes how financial institutions around the world consume and apply risk information to conduct business," he added, describing the launch as "a game changer in today's environment where illicit finance is directly related to issues of economic development and national security."


What the Incident Responders Saw

Close to 60% of attacks involve lateral movement, or where the attacker travels from its initial victim machine to other machines in a targeted organization. PowerShell is one of the most popular tools for moving about the victim's network: 100% of IR pros say they've seen the Microsoft Windows automation and configuration management tool employed by attackers, and 84% see Windows Management Instrumentation (WMI) as a key tool weaponized by attackers. This so-called "living off the land" approach of running legitimate tools to remain under the radar is classic behavior of persistent hacker teams such as nation-states. Some 54% of IR pros say legit operating system applications like these are being abused by attackers. In addition, 16% have spotted attackers running Dropbox to assist in their movements. "The uptick of WMI is concerning," notes Kellermann, as well as the use of process-hollowing and unsigned digital certificates. "It speaks to the level of sophistication [being used] to colonize that infrastructure."


Chaos is needed to keep us smart with Machine Learning


The “relative noise reduction” paradigm of AI is reducing the outliers that you see in your interactions with the technology. However, the noise that you don’t see is somehow narrowing your frontiers. As a data scientist, I am happy to see my algorithm achieve high accuracy in prediction, recommend well, and self-correct. But these algorithms learn from what they see, exactly or partially resembling the history. They do not and cannot propose things my brain would propose in an exploratory process without any proof of obvious interest from history. And that brings us to a “side effect” of AI – the unconscious trimming of creativity. I have an interest in watching astronomy-related documentaries, and Google, Facebook, Instagram, YouTube, LinkedIn and even Pinterest all seem to be aware of this. Somehow, a chain reaction has been triggered that takes me to a gazillion resources around astronomy. How are we allowing the users of AI-powered technology to broaden their horizons by showing them something absolutely out of the blue once in a while? How are we ensuring that the power of human nature and the ability to learn is embedded in the recommendations we design?


Hyper-Localization Is Key To FinTech Expansion

“Some of the major investments we’re doing is localizing our services and opening offices worldwide to cater our value prop to the local businesses. A Chinese exporter’s needs are different than the needs of exporters in other areas of Asia, for example,” she says. The company has partnered with Chinese giants WeChat Pay and AliPay to try to increase acceptance rates in Western countries through Chinese mobile wallets. In this partnership, Chinese consumers who are traveling abroad and would like to purchase a good in store or online can pay using their Chinese mobile wallet and it will be accepted in the international store they are trying to buy from. They also made waves in 2016 with their Rakuten partnership. Payoneer’s first major client was Getty Images back in 2007, which used their services to pay photographers around the globe. As their early clients have grown into major companies, Payoneer has grown up with them. “We’ve constantly delivered more services to address their needs. We also became more licensed and more regulated, which gave us more credibility for a small business back then,” says Levy.


How Artificial Intelligence will Transform IT Operations and DevOps


While Artificial Intelligence (AI) used to be the buzzword a few decades ago, it is now being commonly applied across different industries for a diverse range of purposes. Combining big data, AI, and human domain knowledge, technologists and scientists have become able to create astounding breakthroughs and opportunities, which used to be possible in science fiction novels and movies only. As IT operations become agile and dynamic, they are also getting immensely complex. The human mind is no longer capable of keeping up with the velocity, volume, and variety of Big Data streaming through daily operations, making AI a powerful and essential tool for optimizing the analysis and decision-making processes. AI helps in filling the gaps between humans and Big Data, giving them the operational intelligence and speed to significantly ease the burden of troubleshooting and real-time decision-making. ... To identify that single log entry putting cracks in the environment and crashing your applications, wouldn’t it be easier if you knew exactly what kind of error to filter your log data for?
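The closing question — surfacing the one anomalous entry among thousands of routine lines — can be illustrated with a simple frequency heuristic. The log lines and templating rule below are invented for illustration, not any particular product's approach:

```python
from collections import Counter
import re

logs = [
    "INFO request served in 12ms",
    "INFO request served in 9ms",
    "INFO request served in 15ms",
    "ERROR OutOfMemoryError in worker-3",   # the single entry we want surfaced
    "INFO request served in 11ms",
]

def template(line):
    # Collapse numbers so similar lines share one template.
    return re.sub(r"\d+", "N", line)

# Count how often each template occurs; lines whose template is
# unique are the outliers worth a human's attention.
counts = Counter(template(line) for line in logs)
rare = [line for line in logs if counts[template(line)] == 1]
print(rare)
```

Real AIOps tooling layers statistical models on top of ideas like this, but even the toy version shows how pattern learning can pre-filter log data before a human ever looks at it.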


The future of banking


The technology that ties decentralized and open banking together is the blockchain. With the blockchain accelerating the potential to realize the benefits of these approaches to banking, the industry is touting the blockchain as the key to a future where banking is more accessible. Open and decentralized banking is enabling companies like HSBC to work with startups like R3 to put the blockchain to the test. Projects like WeTrust, Kiva, and OPTIX are working on blockchain-based solutions to address specific issues in money management, lending, and personal finance. ... The goal is to make existing online services more accessible through blockchain technology. For the unbanked, this means being able to get a loan, send remittance, or cover medical expenses just with their smartphone or walking over to their local neighborhood store. This is “universal”: decentralizing these services to make them within reach for anyone who needs them. Imagine a world where it doesn’t matter if you’re from Nigeria or Bangladesh, from the city or a small town.


Digital transformation in logistics: Delmar begins tech overhaul


The Rackspace agreement represents a shift in IT strategy for the transportation and logistics firm. "We have always taken the approach that we have to own everything," said Ron McIntyre, CTO at Delmar. But the task of managing hardware, software and data globally -- Montreal-based Delmar has operations in North America, Asia and Latin America -- became difficult. And McIntyre said the company's existing infrastructure isn't scalable enough to react quickly to business events such as an acquisition or a new market venture. The Rackspace managed services deal is part of a modernization journey in which Delmar aims to position itself to become "much more competitive and efficient within our marketplace" in the next few years, McIntyre said. In that marketplace, established providers such as Delmar face challenges from startups and emerging technologies -- hence the need for digital transformation in logistics. "There are startups around the Uber model in terms of matching shippers with carriers," McIntyre said. "That is where the push is coming from."



Quote for the day:


"Don't waste your time with explanations: people only hear what they want to hear." -- Paulo Coelho


Daily Tech Digest - July 20, 2018

As an individual, I can demand to receive my personal data from a supplier, and I can demand that the supplier then deletes all the personal information they have on me (subject to legal constraints – e.g. companies have a right to keep prior billing information, since it’s a statutory part of the accounting record). This is far from being an agreed-upon full “legal ownership right” to the information, but (as the famous saying goes), possession is nine-tenths of the law. As any economist can tell you, ownership rights have a profound impact on how markets are structured — so this might just be the start of a truly fundamental change to the entire industry. Revenge. What can you do today if you have an awful customer experience with a company? You can complain directly to the company, or to the world in general on social media. But you now have another potent weapon: you can demand that the company gives you all your personal data, and then exercise your “right to be forgotten”. Given the complex nature of computer systems in most organizations, this is currently likely to be a very manual and expensive process for the companies involved.


How to control state for so-called stateless microservices


Front-end state control is suitable if the transactions are handled by a web or mobile GUI/app, and this front-end server controls the sequence of steps being taken. Sometimes, the front-end process can make truly stateless requests to microservices. When it doesn't, the front end can provide the state as part of the data it sends to the microservice, and the microservice can then adjust its processing based on that state. This approach doesn't add any complexity or processing delay to the app's design. Back-end state control is the more complex approach to take with stateless microservices, from developmental and operational perspectives. With back-end state control, the microservice maintains a database of state information, and it accesses that database when it has to process a message. If the microservice supports numerous transactions, a problem arises because it must determine which back-end database record corresponds with the current message. Sometimes, a transaction ID, timestamp or other unique identifier is provided for logging and can be used for state control as well.
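A minimal sketch of both approaches, with invented message shapes: in the front-end case the state rides along in the request, while in the back-end case the service keys into its own store by transaction ID:

```python
# Two ways a "stateless" microservice can get the state it needs.
# The handler names and message fields are illustrative.

# 1. Front-end state control: the caller carries the state in the
#    request, so the service itself stores nothing between calls.
def handle_step_frontend(message):
    state = message["state"]                 # supplied by the front end
    return {"result": "ok", "state": {**state, "step": state["step"] + 1}}

# 2. Back-end state control: the service looks state up in its own
#    store, keyed by a transaction ID carried in the message.
state_store = {}                             # stand-in for a real database

def handle_step_backend(message):
    txn = message["txn_id"]                  # the unique ID that picks the record
    state = state_store.setdefault(txn, {"step": 0})
    state["step"] += 1
    return {"result": "ok", "step": state["step"]}

reply = handle_step_frontend({"state": {"step": 0}})
print(reply["state"]["step"])                # state travels back with the reply

handle_step_backend({"txn_id": "t-1"})
reply = handle_step_backend({"txn_id": "t-1"})
print(reply["step"])                         # state persisted server-side
```

The sketch also shows why the back-end variant needs that transaction ID: without it, the service has no way to match the current message to the right database record, which is exactly the correlation problem the excerpt describes.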


Switzerland seeks to regain cryptocurrency crown


Swiss banks are urging the authorities to give them more clarity on the rules that apply to cryptocurrency projects before providing services to the market, and at least two important players have withdrawn for now. Zuercher Kantonalbank (ZKB), the fourth largest Swiss bank and one of the few big banks in the world to welcome issuers of cryptocurrencies, has closed the accounts of more than 20 companies in the last year, industry sources told Reuters. A spokesman for ZKB declined to comment on any former or existing client relationships, but said the bank does not do business with any cryptocurrency groups. Another large Swiss bank kicked out crypto project Smart Valor at around the same time, said a person familiar with the project. The source declined to name the bank. Only a handful of Switzerland’s 250 banks ever allowed companies to deposit the cash equivalent of cryptocurrencies raised in ICOs. At least two still do, Reuters has established. But the involvement of a large bank like ZKB helped to establish Switzerland as an early cryptocurrency hub.


Tracking Disinformation by Reading Metadata


When we read metadata that’s been exploited or gamed in social media platforms as data craft, we can decode the signals and noise found in automated disinformation campaigns. Data craftwork not only gives us insight into the emerging techniques of manipulators, it is also a way of understanding the power structures of platforms themselves, a means of apprehending the currents and flows of personalization algorithms that underwrite the classification mechanisms that now structure our digital lives. But before we can understand how metadata categories are harnessed and hacked, it’s necessary to have a fuller picture of what platform metadata is, how it is encoded and decoded, and how it is created and collected for use by a range of actors — from technologists and providers, to individual users, to governments, to media manipulators. Currently, there is a range of known manipulation tactics for gaming engagement data. Social media professionals are known to inflate engagement by increasing likes, views, follower counts, and comments for profit.


Justice Department unveils strategy to fight election meddling, cybercrime

In the encryption section, DOJ notes that it cannot rely solely on purchasing workarounds like Cellebrite or GrayKey. “Expanding the government’s exploitation of vulnerabilities for law enforcement purposes will likely require significantly higher expenditures — and in the end it may not be a scalable solution,” the report warns. “All vulnerabilities have a limited lifespan and may have a limited scope of applicability.” Another problem relevant to election security is that the Computer Fraud and Abuse Act only empowers DOJ to prosecute people who hack internet-connected devices. “In many conceivable situations, electronic voting machines will not meet those criteria, as they are typically kept off the Internet,” the report notes. “Consequently, should hacking of a voting machine occur, the government would not, in many conceivable circumstances, be able to use the CFAA to prosecute the hackers.” At the Aspen event, Rosenstein said the report underscored how DOJ “must continually adapt criminal justice and intelligence tools to combat hackers and other cybercriminals.”


The cashless society is a con – and big finance is behind it

A cashless society brings dangers. People without bank accounts will find themselves further marginalised, disenfranchised from the cash infrastructure that previously supported them. There are also poorly understood psychological implications about cash encouraging self-control while paying by card or a mobile phone can encourage spending. And a cashless society has major surveillance implications. Despite this, we see an alignment between government and financial institutions. The Treasury recently held a public consultation on cash and digital payments in the new economy. It presented itself as attempting to strike a balance, noting that cash was still important. But years of subtle lobbying by the financial industry have clearly paid off. The call for evidence repeatedly notes the negative elements of cash – associating it with crime and tax evasion – but barely mentions the negative implications of digital payments. The UK government has chosen to champion the digital financial services industry. This is irresponsible and disingenuous. We need to stop accepting stories about the cashless society and hyper-digital banking being “natural progress”. 


Putting the promises of artificial intelligence to the test


To ensure valid, reliable, safe and ethical AI decision-making, we therefore need to develop robust approaches to teaching and training AI applications. This calls for a new testing regime, tailor-made for AI applications, that ensures adequate transparency in the decisioning mechanism, in a way that users can understand, and provides an assurance of fairness and non-discrimination in the decision process. The key challenge in developing such a testing regime is that AI software has many moving parts. When testing AI applications, engineers must consider many variables, including processing unstructured data, managing the variety and veracity of data, the choice of algorithms, evaluating the accuracy and performance of the learning models, and ensuring ethical and unbiased decisioning by the new system along with regulatory and compliance adherence. New testing and monitoring processes which account for the data-dependent nature of these systems also need to be developed. One way to break down the development and validation requirements for AI is to divide the work into two stages. The first stage is the ‘Teach’ stage, where the system is trained to produce a set of outputs by learning patterns in training data through various algorithms.


Are you scared yet? Meet Norman, the psychopathic AI

The program flagged that black people were twice as likely as white people to reoffend, as a result of the flawed information that it was learning from. Predictive policing algorithms used in the US were also spotted as being similarly biased, as a result of the historical crime data on which they were trained. Sometimes the data that AI "learns" from comes from humans intent on mischief-making, so when Microsoft's chatbot Tay was released on Twitter in 2016, the bot quickly proved a hit with racists and trolls who taught it to defend white supremacists, call for genocide and express a fondness for Hitler. ... "When we train machines by choosing our culture, we necessarily transfer our own biases," she said. "There is no mathematical way to create fairness. Bias is not a bad word in machine learning. It just means that the machine is picking up regularities." What she worries about is the idea that some programmers would deliberately choose to hard-bake badness or bias into machines. To stop this, the process of creating AI needs more oversight and greater transparency, she thinks.


UK alerted to potential cyber risks of Huawei equipment


The report said Huawei is failing to follow agreed security processes around the use of third-party components. “In particular, security critical third-party software used in a variety of products was not subject to sufficient control.” ... A company spokesman said: “We are grateful for this feedback and are committed to addressing these issues. Cyber-security remains Huawei's top priority, and we will continue to actively improve our engineering processes and risk management systems.” The report said the National Security Adviser Mark Sedwill had been alerted to the issues in February and that work continues to remediate the engineering process issues in other products that are deployed in the UK, prioritised based on risk profiles and deployment volumes. “This work should give us the ability to provide end-to-end assurance that the code analysed by HCSEC is the constituent code used to build the binary packages executed on the network elements in the UK,” the report said, adding that until this work is completed, the Oversight Board can offer only limited assurance due to the lack of the required end-to-end traceability from source code examined by HCSEC through to executables used by the UK operators.


The reasons are simple. Reactive maintenance work costs four to five times as much as proactively replacing worn parts. When equipment fails because there is a lack of awareness of degraded performance there are immediate costs as a result of lost productivity, inventory backup, delays in completing the finished product, and more. A study by The Wall Street Journal and Emerson reported that unplanned downtime, which is caused 42% of the time by equipment failure, amounts to an estimated $50 billion per year for industrial manufacturers. Even after production begins again, the costs of interrupting operations continue. According to the Customers’ Voice: Predictive Maintenance in Manufacturing report by Frenus, approximately 50% of all large companies face quality issues after an unplanned shutdown. In addition to savings, predictive maintenance can also result in competitive differentiation. When machine data can be used to perform predictive maintenance with a high level of precision, manufacturers can focus on differentiating products using digital capabilities like self-healing based on an awareness of technical health.



Quote for the day:


"Do not follow where the path may lead. Go instead where there is no path and leave a trail." -- Muriel Strode


Daily Tech Digest - July 19, 2018

6 usability testing methods that will improve your software

Successful software projects please customers, streamline processes, or otherwise add value to your business. But how do you ensure that your software project will result in the improvements you are expecting? Will users experience better performance? Will the productivity across all tasks improve as you hoped? Will users be happy with your changes and return to your product again and again as you envisioned? You don’t find answers to these questions with a standard QA testing plan. Standard QA will ensure that your product works. Usability testing will ensure that your product accomplishes your business objectives. Well planned usability testing will shed a bright light on everything you truly care about: workflow metrics, user satisfaction, and strength of design. How do you know when to start usability testing? Which usability tests are right for your product or website? Let’s examine the six types of usability testing you can use to improve your software.



Facial Recognition Backlash: Technology Giants Scramble

Microsoft's president responded specifically to those allegations in his blog post, first touching on Microsoft's work with ICE, a law enforcement agency that is part of the U.S. Department of Homeland Security. "We've since confirmed that the contract in question isn't being used for facial recognition at all. Nor has Microsoft worked with the U.S. government on any projects related to separating children from their families at the border, a practice to which we've strongly objected," Smith said. Instead, the contract involves supporting the agency's "legacy email, calendar, messaging and document management workloads," Smith said. But at what point should an organization put down its foot with a federal agency operating in a manner to which at least some of its employees object? "This type of IT work goes on in every government agency in the United States, and for that matter virtually every government, business and nonprofit institution in the world," Smith said. "Some nonetheless suggested that Microsoft cancel the contract and cease all work with ICE."


How to Query JSON Data with SQL Server 2016


JSON (JavaScript Object Notation) is now the ubiquitous language for moving data among independent and autonomous systems, the primary function of most software these days. JSON is a text-based way to depict the state of an object in order to easily serialize and transfer it across a network from one system to the next -- especially useful in heterogeneous environments. Because a JSON string equates to a plain text string, SQL Server and any other relational database management system (RDBMS) will let you work with JSON, as they all allow for storing strings, no matter their presentation. That capability is enhanced in SQL Server 2016, the first-ever version that lets developers query within JSON strings as if the JSON were organized into individual columns. What's more, you can read and save existing tabular data as JSON. For a structured and comprehensive overview of the JSON functions in SQL Server 2016, read the "JSON Data (SQL Server)" MSDN documentation. Also, the "JSON Support in SQL Server 2016" Redgate Community article provides a more business-oriented view of JSON in SQL Server 2016, along with a scenario-based perspective of the use of JSON data in a relational persistence layer.


Heuristic automation prevents unmitigated IT disasters


IT platforms are constantly under attack from all sorts of possible malicious efforts, ranging from open port sweeping to intrusion attacks and denial-of-service assaults, such as the sophisticated distributed DoS move that took down Dyn in 2016. Historically, IT and security professionals identify that an attack is happening and then simply apply a defined means to deal with the problem. With heuristic automation in the mix, automation becomes responsive to changes in the IT environment caused by the attack. Instead of applying a simple and often ineffective fix, a heuristic IT management system looks at the IT deployment as an overall entity and applies the right fix for the situation. In this example, heuristic automation could change traffic patterns to offload incoming streams to a separate area of the platform and block certain traffic from access to those streams. It also could reallocate running workloads to a public cloud instead of the private cloud, or vice versa, to prevent service disruption. Provide the heuristics engine with information about possible attacks, and it can harden the platform in real time to prevent them from ever happening.
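A toy sketch of the idea: instead of applying one fixed mitigation, the responder inspects a platform-wide metrics snapshot and picks the fix that suits the situation. The thresholds, metric names, and action labels are all illustrative:

```python
# A toy heuristic responder. Instead of one canned mitigation, it
# looks at the deployment as a whole and selects a response.
# Thresholds and action names are invented for illustration.

def choose_response(metrics):
    """Pick a mitigation from a platform-wide metrics snapshot."""
    if metrics["inbound_gbps"] > 10 and metrics["syn_ratio"] > 0.8:
        return "divert-traffic"          # offload streams, block the worst sources
    if metrics["private_cloud_cpu"] > 0.9:
        return "burst-to-public-cloud"   # reallocate workloads to avoid disruption
    if metrics["open_port_sweeps"] > 100:
        return "harden-firewall"         # tighten rules before an intrusion lands
    return "monitor"                     # nothing anomalous; keep watching

# A volumetric flood is met with traffic diversion, not a blanket fix:
print(choose_response({"inbound_gbps": 40, "syn_ratio": 0.95,
                       "private_cloud_cpu": 0.4, "open_port_sweeps": 3}))
```

A production heuristics engine would learn and refine these rules from telemetry rather than hard-coding them, but the shape is the same: observe the whole platform, then choose among several responses in real time.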


What’s new in the Anaconda distribution for Python

Anaconda, the Python language distribution and work environment for scientific computing, data science, statistical analysis, and machine learning, is now available in version 5.2, with additions to both its enterprise and open-source community editions. ... This enterprise edition of Anaconda, released this week, adds new features around job scheduling, integration with Git, and GPU acceleration. Earlier versions of Anaconda Enterprise were built to allow professionals to leverage multiple machine learning libraries in a business context—TensorFlow, MXNet, Scikit-learn, and more. In version 5.2, Anaconda offers ways to train models on a securely shared central cluster of GPUs, so that models can be trained faster and more cost-effectively. Also new in Anaconda Enterprise is the ability to integrate with external version control systems and code-hosting services, such as Git, Mercurial, GitHub, and Bitbucket. A new job scheduling system allows tasks to be run at regular intervals—for instance, to retrain a model on new data.
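Conceptually, interval-based job scheduling of this sort can be sketched with nothing but the Python standard library; the function and timings below are illustrative stand-ins, not Anaconda Enterprise's actual scheduling API:

```python
import sched
import time

# Generic sketch of running a job at regular intervals, the kind of
# task a job scheduler automates (e.g. periodically retraining a
# model). Uses only the standard library's sched module.

scheduler = sched.scheduler(time.time, time.sleep)
runs = []

def retrain_model(run_id):
    # Placeholder for "retrain a model on new data".
    runs.append(run_id)

# Queue three runs at 0.1-second intervals (a stand-in for, say,
# nightly retraining runs).
for i in range(3):
    scheduler.enter(0.1 * i, 1, retrain_model, argument=(i,))
scheduler.run()

print(runs)  # [0, 1, 2]
```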


Are organizations over-engineering their data centers?


With such incredible off-premises computing momentum, the potential impact of a widespread outage at a major data center provider grows daily. Enterprises are acutely aware of how outages could affect their mission-critical data: security was listed as a major concern by 77 percent of cloud users in RightScale's report. Understandably, data center owners and operators have placed resiliency at the top of their priorities and turn to third-party certifiers to help address the most common root causes of outages, including human error, software issues, network downtime, and hardware failure with a corresponding failure of high-availability architecture. However, there are limited offerings for data center operators to get a holistic audit of all the factors that contribute to the resiliency of their services. We've been hearing directly from providers that existing offerings have not kept pace with the rate of change in the industry. Incumbent programs will sometimes require a facility to be unnecessarily over-engineered. That is not cost-effective, and it takes the focus away from what truly matters to enterprise users: security and reliability.


Raspberry Pi supercomputers: From DIY clusters to 750-board monsters

While the $35 Pi is by no means a computing powerhouse, in recent years enthusiasts have begun harnessing the power of armies of the tiny boards. There's a wide range of Pi clusters out there, from modest five-board arrangements all the way up to sprawling 750-Pi machines. If you're curious to find out more, here are five Pi clusters built in recent years, starting with some you can try yourself and moving on to the Pi-based supercomputers being built by research labs. ... The Los Alamos National Lab (LANL) machine serves as a supercomputer testbed and is built from a cluster of 750 Raspberry Pis, which may later grow to 10,000 Pi boards. According to Gary Grider, head of LANL's HPC division, the Raspberry Pi cluster offers the same testing capabilities as a traditional supercomputing testbed, which could cost as much as $250m. In contrast, 750 Raspberry Pi boards at $35 each would cost just $26,250, though the actual cost of installing the rack-mounted Pi clusters, designed by BitScope, would likely be more. Grider highlights power-efficiency benefits too, and estimates that each board in a several-thousand-node Pi-based system would use just 2W to 3W.
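A quick back-of-the-envelope check of the quoted board-cost and power figures:

```python
# Back-of-the-envelope arithmetic for the 750-board LANL cluster.
boards = 750
unit_cost = 35          # dollars per Raspberry Pi
watts_per_board = 3     # upper end of the 2-3 W per-board estimate

board_cost = boards * unit_cost
total_power_w = boards * watts_per_board

print(board_cost)      # 26250 -> the boards alone cost $26,250
print(total_power_w)   # 2250  -> roughly 2.25 kW at full load
```

Note that this covers only the bare boards; racks, networking, and power supplies push the real cost higher, as the article says.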


LabCorp Cyberattack Impacts Testing Processes

"LabCorp immediately took certain systems offline as part of its comprehensive response to contain the activity," the company said in its SEC filing. "This temporarily affected test processing and customer access to test results on or over the weekend. Work has been ongoing to restore full system functionality as quickly as possible, testing operations have substantially resumed [Monday], and we anticipate that additional systems and functions will be restored through the next several days." Some customers of LabCorp Diagnostics may experience brief delays in receiving results as the company completes that process, LabCorp added. "The suspicious activity has been detected only on LabCorp Diagnostics systems. There is no indication that it affected systems used by Covance Drug Development," a research unit of LabCorp, the company said. "At this time, there is no evidence of unauthorized transfer or misuse of data. LabCorp has notified the relevant authorities of the suspicious activity and will cooperate in any investigation."


An introduction to ICS threats and the current landscape


An ICS is a key underlying element of the OT world. According to the National Institute of Standards and Technology report NIST SP 800-82 R2, "Guide to Industrial Control Systems (ICS) Security," ICS is a "general term that encompasses several types of control systems, including supervisory control and data acquisition (SCADA) systems, distributed control systems (DCS), and other control system configurations such as skid-mounted Programmable Logic Controllers (PLC) often found in the industrial sectors and critical infrastructures." ICS is used in the industrial, manufacturing and critical infrastructure sectors. For instance, railway controls are a type of SCADA. A street light controller may be a PLC, but it can also be part of a SCADA system. Finally, an ICS includes combinations of control components, including electrical, mechanical, hydraulic or pneumatic, that act together to achieve an industrial objective, such as manufacturing, transportation, or the distribution of material or energy.


Q&A on the Book Testing in the Digital Age

A good example of generating test cases is the use of an evolutionary algorithm to test automated parking in a car. With automatic parking, the number of situations the car can be in is nearly infinite: the starting position may vary, surrounding cars may be positioned in many different ways, or other obstacles that must not be hit may surround the car. The automatic parking function must park the car correctly without hitting anything. In this case we can generate a series of starting positions that the automatic parking function needs to tackle. Ideally this is done virtually so we can run a lot of tests quickly; physical tests are possible, of course, but they would take more time in test execution. We need to define a fitness function that is evaluated on each test execution run. In this case it would be the degree to which the parking succeeded: you can imagine some points for not hitting anything, and points for how well the car is parked in the end. We then generate a series of tests, run them, evaluate each outcome, and assign it a total points value.
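A minimal sketch of this evolutionary loop, with an invented scenario encoding and a toy fitness function standing in for a real parking simulator (in test generation we keep the lowest-scoring, i.e. hardest, scenarios and mutate them to explore nearby situations):

```python
import random

# Each "individual" is a starting scenario: here, three numbers for
# the car's starting offset and the positions of two surrounding cars.
def random_scenario():
    return [random.uniform(-5, 5) for _ in range(3)]

# Stand-in for running the parking function in a simulator and
# scoring the result. A real fitness would award points for not
# hitting anything and for the final parking alignment; here, more
# extreme scenarios simply score lower.
def fitness(scenario):
    return -sum(abs(x) for x in scenario)

def evolve(generations=20, pop_size=10):
    population = [random_scenario() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)           # hardest scenarios first
        hardest = population[: pop_size // 2]  # keep challenging cases
        # Mutate the hardest scenarios to explore nearby situations.
        children = [[x + random.gauss(0, 0.5) for x in s]
                    for s in hardest]
        population = hardest + children
    return min(population, key=fitness)        # hardest scenario found

hardest = evolve()
print(len(hardest))  # 3 -> a scenario is three parameters
```

Each run of the loop corresponds to generating a series of tests, evaluating their fitness, and breeding new test cases from the most interesting ones.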



Quote for the day:


"Strength lies in differences, not in similarities." -- Stephen R. Covey