Daily Tech Digest - October 19, 2019

Lip-Reading Drones, Emotion-Detecting Cameras: How AI Is Changing The World

Specific lip-reading programs can decipher what people are saying from a distance, while gait-analysis software can identify an individual just by the way they walk. “Even if the drone is at 300ft, it can still operate effectively,” Dronestream CEO Harry Howe said. While these particular drones are still in the testing phase, many intrusive technologies are already in use around the world. Take China, for example. Its Skynet system claims it can scan all 1.3 billion citizens within seconds. There are 200 million cameras scattered around the country, which can track identity thieves, find fugitives, catch sleeping students and spot jaywalkers. This surveillance system led to 2,000 arrests between 2016 and 2018. Countries like Malaysia, Jamaica, Germany and Poland are considering installing similar systems, while a number of facial recognition trials have been conducted right here on Australian soil.

7 mistakes that ISO 27001 auditors make

Checklists are a great way of quickly assessing whether a list of requirements has been met, but what they offer in convenience they lack in depth of analysis. Organisations are liable to see that a requirement has been ticked off and assume that it’s ‘mission accomplished’. However, there may still be room to improve your practices, and it might even be the case that some of your activities aren’t necessary. A good auditor will use the checklist as a summary at the beginning or end of their audit, with a more detailed assessment in their report, or they’ll use a non-binary system that doesn’t restrict them to stating that a requirement either has or hasn’t been met. ... In theory, they are a perfect fit: you already have a working relationship, and you’ll save time finding a consultant and bringing them up to speed on your organisation’s needs. Unfortunately, there is a clear conflict of interest in this relationship, as you run the risk of the auditor manipulating their findings to persuade you to hire them as a consultant.

Looking at the Enterprise Architecture Renaissance

In their enterprise architecture report, Ovum looked at the paradigm shift now under way that is transforming enterprise architecture (EA) into “architect everything” (AE). They reviewed seven EA solutions that have begun the transition from EA to AE. Interestingly, Ovum found that the vendors shared a similar idea of the direction EA should move in. Most regarded non-EA features that help with business modeling, business process mapping and analysis, GRC, and portfolio management as standard features that EA platforms should include in their solutions. ... Today’s enterprise architecture approach needs to promote stronger collaboration and teamwork throughout the organization, so that everyone is on the same page with regard to company goals and desired outcomes. One example of an EA platform that does this effectively is Planview Enterprise One, which comes with collaboration and workflow tools that enable process- and project-driven work. Elements like Kanban boards and collaborative workspaces make it easy to bring stakeholders and contributors together under one roof, where they can share information and work together to push the company forward.

Top 6 email security best practices to protect against phishing attacks ...

Complicated email flows can introduce moving parts that are difficult to sustain. For example, complex mail-routing flows to enable protections for internal email configurations can cause compliance and security challenges. Products that require unnecessary configuration bypasses to work can also create security gaps: configurations put in place to guarantee delivery of certain types of email (e.g. simulation emails) are often poorly crafted and exploited by attackers. Solutions that protect both external and internal email and offer value without needing complicated configurations or email flows are a great benefit to organizations. In addition, look for solutions that offer easy ways to bridge the gap between the security teams and the messaging teams. Messaging teams, motivated by the desire to guarantee mail delivery, might create overly permissive bypass rules that impact security. The sooner these issues are caught, the better for overall security. Solutions that give security teams insight when this happens can greatly reduce the time taken to rectify such flaws, thereby reducing the chances of a costly breach.

How operators can make 5G pay

Some operators have started to partner with over-the-top (OTT) service providers to bundle their offerings with connectivity subscriptions, sometimes with an explicit charge and sometimes without (for example, by making certain streams unmetered against the customer’s data bundle). “With the improvements in network capabilities in the 5G era, customers can expect to enjoy more network services bundled with content provider services — including accelerated gaming — and the operator could offer its network service to the customer as part of that bundle,” said a senior executive at an Asian Internet player. In the 5G world, in which the network technology allows a far greater range of functionality that can be monetized, telecom companies have many more opportunities to develop collaborations with a variety of businesses and public agencies. We see four main options for how operators could monetize this greater functionality. The higher the relevance of the telecom operator’s brand to the use case, the greater the operator’s ability to own the customer relationship and claim a bigger share of revenues.

Beyond their value in ensuring consistent, predictable service delivery, SLOs are a powerful weapon to wield against micromanagers, meddlers, and feature-hungry PMs. That is why it’s so important to get everyone on board and signed off on your SLO. When they sign off on it, they own it too. They agree that your first responsibility is to hold the service to a certain bar of quality. If your service has deteriorated in reliability and availability, they also agree it is your top priority to restore it to good health. Ensuring adequate service performance requires a set of skills that people and teams need to continuously develop over time, namely: measuring the quality of our users’ experience, understanding production health with observability, sharing expertise, keeping a blameless environment for incident resolution and post-mortems, and addressing structural problems that pose a risk to service performance. Building these skills requires a focus on production excellence, and a (time) budget for the team to acquire them. The good news is that this investment is now justified by the SLOs that management agreed to.
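
The error-budget arithmetic implied above is simple enough to sketch in Python. This is a minimal illustration; the 99.9% target and 30-day window are assumed figures, not from the text:

```python
def error_budget_minutes(slo_target: float, window_days: int = 30) -> float:
    """Minutes of allowed unavailability in the window for a given SLO."""
    total_minutes = window_days * 24 * 60
    return total_minutes * (1.0 - slo_target)

# A 99.9% availability SLO over a 30-day window leaves roughly
# 43.2 minutes of error budget to spend on incidents and risky changes.
budget = error_budget_minutes(0.999)
```

When the budget is spent, the agreement everyone signed off on makes restoring reliability, not shipping features, the team's top priority.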

How open source software is turbocharging digital transformation

Make no mistake, the ever-expanding palette of vendor solutions on the market today remains an indispensable resource for enterprise-scale digital transformation. But there are compelling reasons to explore OSS’s possibilities as well. For example, OSS in emerging technology domains often includes work contributed by highly creative developers with hard-to-find expertise. By exploring OSS projects for artificial intelligence (AI), blockchain, or other trending technologies, companies that lack in-house experts can better understand what the future holds for these disruptive tools. Moreover, CIOs are realizing that when coders can engage with domain experts and contribute their own work to an OSS ecosystem, job satisfaction and creativity often grow, along with engineering discipline, product quality, and efficiency. As any software engineer knows, the ability to take established and tested code from an existing library, rather than having to create it from scratch, can shrink development timelines significantly. These findings spotlight OSS’s formidable promise. But they also make clear that open source is not an all-or-nothing proposition. IT leaders should think of OSS as a potentially valuable complement to their broader ecosystem, vendor, or partner strategy.

Yubico security keys can now be used to log into Windows computers

Starting today, users can use hardware security keys manufactured by Swedish company Yubico to log into a local Windows OS account. After more than six months of testing, the company released today the first stable version of the Yubico Login for Windows application. Once installed on a Windows computer, the application will allow users to configure a Yubico security key (known as a YubiKey) to secure local Windows OS accounts. The YubiKey will not replace the Windows account password but will work as a second authentication factor: users will have to enter their password, and then plug a YubiKey into a USB port to finish the login process. Yubico hopes the keys will be used to secure high-value computers storing sensitive data that are used in the field, away from secured networks. Such devices are often susceptible to theft or loss, and if they are not encrypted, attackers have various ways at their disposal to bypass normal Windows password-based authentication. Securing local Windows accounts with a YubiKey makes it nearly impossible for an attacker to access the account, even if they know the password.

The Fallacy of Telco Cloud

First, we proved the viability of virtualizing Telco workloads, with the investment in defining Network Function Virtualization (NFV) and a global set of trials, beginning in and around the first ETSI NFV working group meeting in 2012. Then, we focused on optimizing that virtualization technology: investment in Virtual Infrastructure Managers (VIMs), I/O acceleration technologies like the Data Plane Development Kit (DPDK), and para-virtualization technologies such as Single Root Input/Output Virtualization (SR-IOV) for the performance and manageability of SLA-backed network functions. Now, we’ve embarked on the next set of technology advancements: separating control and user planes, accelerating I/O functions with FPGAs and SmartNICs, and starting the migration of applications towards containers and cloud-native functions. This is the beginning of a second wave of technology-led investments in the Telco Cloud. ... In short, the technology is mature. The real question is: are we actually achieving the benefits of cloud in the Telco network?

Challenges of Data Governance in a Multi-Cloud World

The traditional contracts that worked in typical telecom network services to mitigate security breaches or other types of noncompliance events have failed to deliver the goods for the cloud. Highly scaled, shared, and automated IT platforms such as the cloud can hide the geographic location of data, both from the customer’s and the service provider’s side, which can give rise to regulatory violations. Thus, contracting for the cloud is still in its infancy, and until litigation sheds light on regulatory issues and sets precedents for future cases, data-cloud breach issues will remain unresolved. Moreover, data aggregation will increase the potential risk, as more valuable data will occupy a common storage location. On the flip side, multi-cloud environments offer more transparency through event logging, and enterprise-wide solutions via automation tools: once a problem is detected, a fix can be instantly deployed across cloud networks. In recent years, risk management strategies specifically for the cloud have emerged; these now have to be tested in multi-cloud environments.

Quote for the day:

"There are some among the so-called elite who are overbearing and arrogant. I want to foster leaders, not elitists." -- Daisaku Ikeda

Daily Tech Digest - October 18, 2019

Critical PDF Warning: New Threats Leave Millions At Risk

The PDFex vulnerability exploits ageing technology that was not designed with contemporary security considerations in mind; in essence, it takes advantage of the very universality and portability of the PDF format. And while it might seem like a fairly specific attack, most companies rely on secured PDF documents for the transmission of contracts, board papers, financial documents and transactional data. There is an expectation that such documents are secure. Clearly, they are not. The PDFex attack is designed to exfiltrate the encrypted data to the attacker when the document is opened with a password, being decrypted in the process. The PDFex researchers, “in cooperation with the national CERT section of BSI,” have contacted all vendors, “provided proof-of-concept exploits, and helped them fix the issues.” Of even more concern are the multiple vulnerabilities that have been disclosed which impact the popular Foxit Reader PDF application specifically; Foxit claims it has 475 million users. Affecting Windows versions of Foxit’s reader, the vulnerabilities enable remote code execution on a target machine.

Much-attacked Baltimore uses ‘mind-bogglingly’ bad data storage

After the attack in May, Baltimore Mayor Bernard C. “Jack” Young not only refused to pay, he also sponsored a resolution, unanimously approved by the US Conference of Mayors in June 2019, calling on cities to not pay ransom to cyberattackers. Baltimore’s budget office has estimated that due to the costs of remediation and system restoration, the ransomware attack will cost the city at least $18.2 million: $10 million on recovery, and $8.2 million in potential loss or delayed revenue, such as that from property taxes, fines or real estate fees. The Robbinhood attackers had demanded a ransom of 13 Bitcoins – worth about US $100,000 at the time. It may sound like a bargain compared with the estimated cost of not caving to attackers’ demands, but paying a ransom doesn’t ensure that an entity or individual will actually get back their data, nor that the crooks won’t hit up their victim again. The May attack wasn’t the city’s first; nor was it the first time that its IT systems and practices have been criticized in the wake of attack.

'The Dukes' (aka APT29, Cozy Bear) threat group resurfaces with three new malware families

According to researchers, three new malware samples, dubbed FatDuke, RegDuke and PolyglotDuke, have been linked to a cyber campaign most likely run by APT29. The most recent deployment of these new malware families was tracked in June 2019. The ESET researchers have named all activities of APT29 (past and present) collectively as Operation Ghost. This cyber campaign has been running since 2013 and has successfully targeted the ministries of foreign affairs of at least three European countries. The researchers compared the techniques and tactics used by APT29 in its recent attacks to those used in the group's older attacks. They found many similarities across these campaigns, including the use of Windows Management Instrumentation for persistence, the use of steganography in images to hide communications with command-and-control (C2) servers, and the use of social media, such as Reddit and Twitter, to host C2 URLs. The researchers also found that the newer and older attacks hit similar targets: ministries of foreign affairs.
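
The steganography technique the researchers mention can be sketched in miniature: hide each bit of a message in the least-significant bit of a pixel byte, where the change is visually imperceptible. This is a generic LSB illustration, not APT29's actual encoding, and the hostname is made up:

```python
def embed(pixels: bytearray, message: bytes) -> bytearray:
    """Hide message bits in the least-significant bit of each pixel byte."""
    out = bytearray(pixels)
    bits = [(b >> i) & 1 for b in message for i in range(7, -1, -1)]
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & 0xFE) | bit  # clear the LSB, then set it
    return out

def extract(pixels: bytearray, n_bytes: int) -> bytes:
    """Recover n_bytes of hidden data from the pixel LSBs."""
    bits = [pixels[i] & 1 for i in range(n_bytes * 8)]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[b * 8:(b + 1) * 8]))
        for b in range(n_bytes)
    )

cover = bytearray(range(256))        # stand-in for raw image pixel data
stego = embed(cover, b"c2.example")  # hypothetical C2 hostname
```

Changing only the lowest bit alters each pixel value by at most one, which is why such images look unmodified to a casual viewer and to many automated scanners.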

Misconfigured Containers Open Security Gaps

The knowledge gap surrounding security risks and the blunders it causes are, by far, the biggest threat to organizations using containers, observed Amir Jerbi, co-founder and CTO of Aqua Security, a container security software and support provider. "Vulnerabilities in container images -- running containers with too many privileges, not properly hardening hosts that run containers, not configuring Kubernetes in a secure way -- any of these, if not addressed adequately, can put applications at risk," he warned. Examining the security incidents targeting containerized environments over the past 18 months, most were not sophisticated attacks but simply the result of IT neglecting basic best practices, he noted. ... While most container environments meet basic security requirements, they can also be more tightly secured. It's important to sign your images, suggested Richard Henderson, head of global threat intelligence for security technology provider Lastline. "You should double-check that nothing is running at the root level."
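
The "basic best practices" point lends itself to automation. Below is a hypothetical lint sketch in Python for two of the common blunders mentioned, running as root and using unpinned base images; real scanners check far more than this:

```python
def audit_dockerfile(text: str) -> list[str]:
    """Flag two common container misconfigurations in a Dockerfile."""
    findings = []
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    # No USER instruction means the container's processes run as root.
    if not any(ln.upper().startswith("USER ") for ln in lines):
        findings.append("no USER instruction: container runs as root")
    # An untagged base image is unpinned and may silently change.
    for ln in lines:
        if ln.upper().startswith("FROM ") and ":" not in ln:
            findings.append(f"unpinned base image: {ln}")
    return findings

findings = audit_dockerfile("FROM ubuntu\nRUN apt-get update")
```

Checks like this cost little to run in CI, which is the cheapest place to catch the neglect described above, before an image ever reaches production.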

Microsoft and Alibaba Back Open Application Model for Cloud Apps

OAM is a standard for building native cloud applications using "microservices" and container technologies, with the goal of establishing a platform-agnostic approach. It's kind of like the old "service-oriented architecture" dream, except maybe with less complexity. The OAM standard is currently at the draft stage, and the project is being overseen by the nonprofit Open Web Foundation. Microsoft apparently doesn't see the Open Web Foundation as the spec's permanent home, as its "goal is to bring the Open Application Model to a vendor-neutral foundation," the announcement explained. Additionally, Microsoft and Alibaba Cloud disclosed that there's an OAM implementation specifically designed for Kubernetes, the open source container orchestration solution for clusters originally fostered by Google. This implementation, called "Rudr," is available at the "alpha" test stage and is designed to help manage applications on Kubernetes clusters. ... Basic OAM concepts can be found in the spec's description, which outlines how the spec accounts for the various roles involved with building, running and porting cloud-native apps.

Why AI Ops? Because the era of the zettabyte is coming.

“It’s not just the amount of data; it’s the number of sources the data comes from and what you need to do with it that is challenging,” Lewington explains. “The data is coming from a variety of sources, and the time to act on that data is shrinking. We expect everything to be real-time. If a business can’t extract and analyze information quickly, it could very well miss a market or competitive intelligence opportunity.” That’s where AI comes in, a term originally coined by computer scientist John McCarthy in 1956. He defined AI as “the science and engineering of making intelligent machines.” Lewington thinks that the definition of AI is tricky and malleable, depending on whom you talk to. “For some people, it’s anything that a human can do. To others, it means sophisticated techniques, like reinforcement learning and deep learning. One useful definition is that artificial intelligence is what you use when you know what the answer looks like, but not how to get there.” No matter which definition you use, AI seems to be everywhere. Although McCarthy and others invented many of the key AI algorithms in the 1950s, the computers of that time were not powerful enough to take advantage of them.

Fake Tor Browser steals Bitcoin from Dark Web users

Purchases in these marketplaces are usually made using cryptocurrency such as Bitcoin (BTC) in order to mask the transaction and the user's identity. If a user visits these domains and tries to make a purchase by adding funds to their wallet, the script activates and attempts to change the wallet address, thereby ensuring funds are sent to an attacker-controlled wallet instead. The payload will also try to alter wallet addresses offered by Russian money transfer service QIWI. "In theory, the attackers can serve payloads that are tailor-made to particular websites. However, during our research, the JavaScript payload was always the same for all pages we visited," the researchers say. It is not possible to say how widespread the campaign is, but the researchers note that PasteBin pages promoting the Trojanized browser have been visited at least half a million times, and known wallets owned by the cybercriminals hold 4.8 BTC, equating to roughly $40,000. ESET believes the actual value of stolen funds is likely to be higher, considering the additional compromise of QIWI wallets.
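
The address-rewriting trick at the heart of this campaign is easy to illustrate. Here is a hedged Python sketch of the equivalent logic; the real payload was JavaScript injected into pages, and both addresses below are placeholders:

```python
import re

# Legacy Bitcoin addresses: base58 (no 0, O, I, l), starting with 1 or 3.
BTC_ADDR = re.compile(r"\b[13][a-km-zA-HJ-NP-Z1-9]{25,34}\b")

def swap_addresses(page_text: str, attacker_addr: str) -> str:
    """What such a payload does: rewrite every BTC address on the page."""
    return BTC_ADDR.sub(attacker_addr, page_text)

page = "Send your deposit to 1BvBMSEYstWetqTFn5Au4m4GFg7xJaNVN2"
tampered = swap_addresses(page, "1AttackerControlledAddrPlaceholder")
```

Defences hinge on the same observation: a wallet address displayed on a page, or sitting in the clipboard, should be re-verified against a known-good copy just before funds are sent.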

Server Memory Failures Can Adversely Affect Data Center Uptime

The Intel® MFP deployment resulted in improved memory reliability due to predictions based on the capture of micro-level memory failure information from the operating system’s Error Detection and Correction (EDAC) driver, which stores historical memory error logs. Additionally, by predicting potential memory failures before they happen, Intel® MFP can help improve DIMM purchasing decisions: Tencent was able to reduce annual DIMM purchases by replacing only DIMMs with a high likelihood of causing server crashes. Because Intel® MFP can predict issues at the memory cell level, that information can be used to avoid using certain cells or pages, a technique known as page offlining, which has become very important for large-scale data center operations. Tencent was therefore able to improve its page offlining policies based on Intel® MFP’s results. Using Intel® MFP, server memory health was analyzed and scored based on cell-level EDAC data.
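
Intel's actual failure-prediction model is not public, so purely as a loose illustration: a predictor of this kind can weight recent corrected-error counts from the EDAC logs more heavily than older ones, since an accelerating error rate on a DIMM is a classic precursor to uncorrectable failure. The scoring rule below is hypothetical:

```python
def dimm_risk_score(daily_corrected_errors: list[int]) -> float:
    """Weighted error rate: recent days count more than older days."""
    if not daily_corrected_errors:
        return 0.0
    weights = range(1, len(daily_corrected_errors) + 1)
    weighted = sum(w * e for w, e in zip(weights, daily_corrected_errors))
    return weighted / sum(weights)

healthy = dimm_risk_score([0, 1, 0, 0, 1])      # sporadic corrected errors
failing = dimm_risk_score([2, 8, 30, 90, 250])  # accelerating error rate
```

Ranking DIMMs by a score like this is what enables the purchasing decision described above: replace only the modules most likely to cause a crash, rather than entire batches.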

Three Keys To Delivering Digital Transformation

More digitally mature organisations are beginning to view digital transformation as not just an internal technology infrastructure upgrade. It is more than an opportunity to move costly in-house capabilities to the cloud, or shift sales and marketing to online multi-channel provision. The focus today is on a more fundamental review of business practices, a realignment of operations toward core values, and a stronger relationship between creators and consumers of services. Within this context, digital modernisation programmes taking place across many organisations are accelerating the digitisation of their core assets, rebalancing spending toward digital engagement channels, fixing flaws in their digital technology stacks, and replacing outdated technology infrastructure with cloud-hosted services. Such programmes are essential for organisations to remain competitive and relevant in a world that increasingly rewards those that can adapt quickly to market changes, raise the pace of new product and service delivery, and maintain tight stakeholder relationships.

Virtual voices: Azure's neural text-to-speech service

Microsoft Research has been working on solving this problem for some time, and the resulting neural network-based speech synthesis technique is now available as part of the Azure Cognitive Services suite of Speech tools. Using its new Neural text-to-speech service, hosted in Azure Kubernetes Service for scalability, generated speech is streamed to end users. Instead of multiple steps, input text is first passed through a neural acoustic generator to determine intonation before being rendered using a neural voice model in a neural vocoder. The underlying voice model is generated via deep learning techniques using a large set of sampled speech as the training data. The original Microsoft Research paper on the subject goes into detail on the training methods used, initially using frame error minimisation before refining the resulting model with sequence error minimisation. Using the neural TTS engine is easy enough. As with all the Cognitive Services, you start with a subscription key and then use this to create a class that calls the text-to-speech APIs.

Quote for the day:

"A person must have the courage to act like everybody else, in order not to be like anybody." -- Jean-Paul Sartre

Daily Tech Digest - October 17, 2019

4 tips to help data scientists maximise the potential of AI and ML
With machine learning, business process scalability has made leaps and bounds, but it’s important not to get side-tracked by that, according to Edell. Instead, focus on the things that are going wrong, rather than attempting to improve the things that are already working. “The most common mistake really anyone can make when building an ML solution is to lose sight of the problem they are trying to solve,” he said. “As such, we can spend a lot of time making the tech better, but forgetting why we’re using the tech in the first place. “For example, we may spend a lot of time and money improving the accuracy of a face recognition engine from 92pc to 95pc, when we could have spent that time improving what happens when the face recognition is wrong – which might bring more value to the customer than an incremental accuracy improvement.” The potential that emerging technologies can have for overcoming challenges with data science, no matter the industry, is monumental. But for the sectors that are client and consumer-facing, the needs of customers should still come first.

Velocity and Better Metrics: Q&A with Doc Norton
First of all, as velocity is typically story points per iteration and story points are abstract and estimated by the team, velocity is highly subject to drift. Drift is subtle changes that add up over time. You don’t usually notice them in the small, but compare over a wider time horizon and it is glaringly obvious. Take a team that knows they are supposed to increase their velocity over time. Sure enough, they do. And we can probably see that they are delivering more value. But how much more? How can we be sure? In many cases, if you take a set of stories from a couple of years ago and ask this team to re-estimate them, they’ll give you an overall number higher than the original estimates. My premise is that this is because our estimates often drift higher over time. The bias for larger estimates isn’t noticeable from iteration to iteration, but is noticeable over quarters or years. You can use reference stories to help reduce this drift, but I don’t know if you can eliminate it. Second of all, even if you could prove that estimates didn’t drift at all, you’re still only measuring one dimension - rate of delivery.
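
Norton's drift claim is easy to make concrete: re-estimate the same stories later and compare totals. A small Python sketch, with made-up story points:

```python
def estimate_drift_pct(original: list[int], re_estimated: list[int]) -> float:
    """Percent change between two estimates of the same set of stories."""
    return 100.0 * (sum(re_estimated) - sum(original)) / sum(original)

# The same ten stories, estimated two years apart (illustrative numbers):
then = [1, 2, 3, 2, 5, 3, 1, 2, 8, 3]  # totals 30 points
now = [2, 3, 3, 3, 5, 5, 2, 3, 8, 5]   # totals 39 points
drift = estimate_drift_pct(then, now)  # velocity "rose" with no change
                                       # in what was actually delivered
```

Reference stories pin some estimates in place, which dampens, but does not eliminate, this effect.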

'Graboid' Cryptojacking Worm Spreads Through Containers

This is the first time the researchers have seen a cryptojacking worm spread through containers in the Docker Engine (Community Edition). While the worm isn't sophisticated in its tactics, techniques or procedures, it can be repurposed by the command-and-control server to run ransomware or other malware, the researchers warn. The Unit 42 research report did not note how much damage Graboid has caused so far or if the attackers targeted a particular sector. "If a more potent worm is ever created to take a similar infiltration approach, it could cause much greater damage, so it's imperative for organizations to safeguard their Docker hosts," the Unit 42 report notes. "Once the [command-and-control] gains a foothold, it can deploy a variety of malware," Jay Chen, senior cloud vulnerability and exploit researcher at Palo Alto Networks, tells Information Security Media Group. "In this specific case, it deployed this worm, but it could have potentially leveraged the same technique to deploy something more detrimental. It's not dependent on the worm's capabilities."

Data Literacy—Teach It Early, Teach It Often, Data Gurus Tell Conference Goers

No one can understand everything, he said. That’s why the “real sweet spot” is the communication between the data scientists and the experts in various fields of inquiry to determine what they are seeking from the data and how it can be used. And there’s also an ethical component so that the data are not used to arrive at false conclusions. Sylvia Spengler, the National Science Foundation’s program director for Information and Intelligence systems, said that solving today’s big questions requires an interdisciplinary approach across all the sciences. “We need a deep integration across a lot of disciplines,” she said. “This is made for data science and data analytics. But it puts a certain edge on actually being able to deal with the kinds of data coming at you because they are so incredibly different.” Spengler said this integration can only happen through teams of people working on it. “You have to be able to collaborate. Those soft skills are critical. It’s not just your brains but your empathy because it makes you capable of taking multiple perspectives,” she said.

Linux security hole: Much sudo about nothing

At first glance, the problem looks like a bad one. With it, a user who is allowed to use sudo to run commands as any other user, except root, can still use it to run root commands. For this to happen, several things must be set up just wrong. First, the sudoers configuration must give a user the right to use sudo without the privilege of using it to run root commands. That can happen when you want a user to be able to run specific commands that they wouldn't normally be able to use. Next, sudo must be configured to allow a user to run commands as an arbitrary user via the ALL keyword in a Runas specification. The latter has always been a stupid idea. As the sudo manual points out, "using ALL can be dangerous since in a command context, it allows the user to run any command on the system." In all my decades of working with Linux and Unix, I have never known anyone to set up sudo with ALL. That said, if you do have such an inherently broken system, it's then possible to run commands as root by specifying the user ID -1 or 4294967295. Thus, if the ALL keyword is listed first in the Runas specification, an otherwise restricted sudo user can then run root commands.
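
The equivalence of -1 and 4294967295 comes from unsigned 32-bit wraparound: user IDs are 32-bit unsigned integers, and -1 in two's complement is all bits set. A one-line Python sketch of the arithmetic (of the arithmetic only, not of sudo's internals; in the underlying bug, the setresuid() family treats uid -1 as "leave unchanged," so the process keeps sudo's root privileges):

```python
def to_uint32(uid: int) -> int:
    """Interpret an integer as a 32-bit unsigned user ID."""
    return uid & 0xFFFFFFFF  # two's-complement wrap into 32 bits

# -1 and 4294967295 are the same 32-bit value, which is why either
# spelling of the user ID reaches the same flawed code path.
wrapped = to_uint32(-1)
```

This is also why the fix was a bounds check on the requested uid rather than anything deeper in the privilege model.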

News from the front in the post-quantum crypto wars with Dr. Craig Costello
One good thing was that this notion of public key cryptography arrived, I suppose, just in time for the internet. In the seventies, the invention of public key cryptography came along, and that’s the celebrated Diffie-Hellman protocol that allows us to do key exchange. Public key cryptography is kind of a notion that sits above however you try to instantiate it. So public key cryptography is a way of doing things, and how you choose to do them, or what mathematical problem you might base it on, I guess, is how you instantiate public key cryptography. So initially, of the two proposals put forward back in the seventies, the first was what we call the discrete log problem in finite fields. When you compute powers of numbers, and you do them in a finite field, you might start with a number and compute some massive power of it and then give someone the residue, or the remainder, of that number and say, what was the power that I raised this number to in the group? And the other problem is factorization, so integer factorization.
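
The key exchange Costello describes maps directly onto a toy example. The numbers below are tiny for readability; real deployments use moduli of thousands of bits (or elliptic curves), precisely because the discrete log problem must be infeasible at scale:

```python
# Public parameters: a prime modulus and a generator of the group.
p, g = 23, 5

a = 6   # Alice's secret exponent
b = 15  # Bob's secret exponent

A = pow(g, a, p)  # Alice publishes g^a mod p
B = pow(g, b, p)  # Bob publishes g^b mod p

# Each side raises the other's public value to its own secret exponent.
# Both arrive at g^(a*b) mod p without ever transmitting a or b.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
```

An eavesdropper sees p, g, A and B; recovering a from A = g^a mod p is exactly the "what was the power that I raised this number to" problem from the passage.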

Beamforming explained: How it makes wireless communication faster
The mathematics behind beamforming is very complex (the Math Encounters blog has an introduction, if you want a taste), but the application of beamforming techniques is not new. Any form of energy that travels in waves, including sound, can benefit from beamforming; the techniques were first developed to improve sonar during World War II and are still important to audio engineering. But we're going to limit our discussion here to wireless networking and communications. By focusing a signal in a specific direction, beamforming allows you to deliver higher signal quality to your receiver, which in practice means faster information transfer and fewer errors, without needing to boost broadcast power. That's basically the holy grail of wireless networking and the goal of most techniques for improving wireless communication. As an added benefit, because you aren't broadcasting your signal in directions where it's not needed, beamforming can reduce interference experienced by people trying to pick up other signals. The limitations of beamforming mostly involve the computing resources it requires; there are many scenarios where the time and power required by beamforming calculations end up negating its advantages.
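
The steering idea can be shown numerically. For a uniform linear array, give element n a phase offset that cancels the geometric path difference toward the steering angle, so every element's contribution adds in phase in that direction. A minimal sketch; the 8 elements and half-wavelength spacing are illustrative choices:

```python
import cmath
import math

def array_factor(theta, n_elements=8, d_over_lambda=0.5, steer_theta=0.0):
    """Magnitude of a uniform linear array's response toward angle theta."""
    k_d = 2 * math.pi * d_over_lambda  # phase advance per element spacing
    # Residual per-element phase after applying the steering offsets:
    psi = k_d * (math.sin(theta) - math.sin(steer_theta))
    return abs(sum(cmath.exp(1j * n * psi) for n in range(n_elements)))

steer = math.radians(30)  # point the beam 30 degrees off broadside
on_target = array_factor(steer, steer_theta=steer)
off_target = array_factor(math.radians(-30), steer_theta=steer)
```

At the steering angle the magnitude equals the element count (all 8 contributions aligned), while the mirror direction nearly cancels; that contrast is the higher signal quality and reduced interference described above.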

A First Look at Java Inline Classes
The goal of inline classes is to improve the affinity of Java programs to modern hardware. This is to be achieved by revisiting a very fundamental part of the Java platform — the model of Java's data values. From the very first versions of Java until the present day, Java has had only two types of values: primitive types and object references. This model is extremely simple and easy for developers to understand, but can have performance trade-offs. For example, dealing with arrays of objects involves unavoidable indirections and this can result in processor cache misses. Many programmers who care about performance would like the ability to work with data that utilizes memory more efficiently. Better layout means fewer indirections, which means fewer cache misses and higher performance. Another major area of interest is the idea of removing the overhead of needing a full object header for each data composite — flattening the data. As it stands, each object in Java's heap has a metadata header as well as the actual field content. In HotSpot, this header is essentially two machine words — mark and klass.
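Inline classes themselves are Java-specific, but the layout difference they target can be illustrated in any language. The Python sketch below is a loose analogy, not the Java mechanism: the stdlib `array` module packs raw values contiguously with no per-element header, while a list of boxed objects pays a header and a pointer indirection per element (sizes are CPython-specific).

```python
import array
import sys

# Boxed layout: each element is a pointer to a separately allocated
# heap object that carries its own header (the "indirection").
boxed = [float(i) for i in range(1000)]

# Flat layout: raw 8-byte doubles packed contiguously, no per-element
# header and no pointer chasing -- the layout flattening aims for.
flat = array.array('d', range(1000))

boxed_bytes = sys.getsizeof(boxed) + sum(sys.getsizeof(x) for x in boxed)
flat_bytes = sys.getsizeof(flat)
assert flat_bytes < boxed_bytes   # flat layout is far smaller per element
```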

Developer Skills for Successful Enterprise IoT Projects
The first step of any successful IoT project is to define the business goals and build a proof-of-concept system to estimate whether those goals are reachable. At this stage, you need only a subset of the skills listed in this article. But once a project is successful enough to move beyond the proof-of-concept level, the required breadth and depth of the team increases. Often, individual developers possess several of the skills; sometimes, each skill on the list will require its own team. The number of people needed depends both on the complexity of the project and on its success: more success usually means more work, but also more revenue that can be used to hire more people. Most IoT projects include some form of custom hardware design. The complexity of the hardware varies considerably between projects. In some cases, it is possible to use hardware modules and reference designs, for which a basic electrical engineering education is enough. More complex projects need considerably more experience and expertise. To build Apple-level hardware, you need an Apple-level hardware team and an Apple-level budget.

IoT in Vehicles: The Trouble With Too Much Code
The threat and risk surface of internet of things devices deployed in automobiles is increasing exponentially, which poses risks for the coming wave of autonomous vehicles, says Campbell Murray of BlackBerry. To get a sense of how complicated today's cars are, Murray notes that while an A380 airplane runs around four million lines of code, an average mid-size car has 100 million lines of code. Statistically, that means there are likely many software defects. Using a metric of 0.015 bugs per line of code, cars with that much code could have as many as 1.5 million bugs, Murray says in an interview with Information Security Media Group. Reducing those code bases is one way to reduce the risks, he says. "It's absolutely astonishing - the number of vulnerabilities that could exist in there," Murray says. Meanwhile, enterprises deploying IoT need to remember the principles of safe computing: assigning the least amount of privileges, using dual-factor authentication and strong access controls, he adds.

Quote for the day:

"Leading people is like cooking. Don't stir too much; it annoys the ingredients and spoils the food." -- Rick Julian

Daily Tech Digest - October 16, 2019

Ransomware: These are the most common attacks targeting you right now

Analysis of over 230,000 ransomware attacks which took place between April and September has been published by cyber security researchers at Emsisoft, and one family of malware accounted for over half (56%) of reported incidents: the 'Stop' ransomware. Stop – also known as DJVU – first emerged in late 2018 and several different variants are known to exist. The ransomware is typically distributed by torrent websites, often hidden in cracked versions of software which users are attempting to download while avoiding paying for the product. However, this can come with an additional cost for the victim, as Stop encrypts files with AES-256 encryption and demands a ransom of $490 in bitcoin in exchange for the decryption key. The ransom amount doubles after 72 hours in an effort to scare the victim into paying up immediately. "While attacks against home users have declined, Stop proves that consumer ransomware continues to be profitable," Fabian Wosar, CTO at Emsisoft, told ZDNet. Commodity malware attacks remain common, but the most successful ransomware operations are making large amounts by compromising entire networks, then demanding ransom payments of hundreds of thousands of dollars in exchange for the network being restored.

Settling the Soft Skills vs. Technical Skills Debate

In the tech industry, there’s a lot of talk about hiring candidates who may have the soft skills but fall short on some of the technical skills. There are many opinions on the topic, but is there really a right or wrong on this issue? With IT unemployment at its lowest on record, companies are desperate for tech talent. So, are you settling for talent if you hire someone without all the qualities and skills on your checklist? Absolutely not. How do you know when it’s okay to hire a candidate when they don’t check every box? The answer can depend on the type of talent you are hiring. Hiring is very different for permanent and contract positions. You are on an entirely different timeline, with very different goals. For contract talent, the goal is a sprint to successfully deliver a project. For full-time employees, it’s a marathon where winning means the growth, development and success of your team and company. These different goals mean you’re looking for different candidates and using a different interview process to identify and secure them. While the best-case scenario is to hire a candidate with both technical and soft skills, for permanent positions you can, and should, absolutely hire for soft skills, even when all the technical expertise isn’t there.

6 ways AI and automation could improve process mining

"It's easy to think of tasks as the individual actions, which can often be generic, that make up an overall process," said Pankaj Chowdhry, founder and CEO of FortressIQ, based in San Francisco. A simple example would be sending an email. That's a task that, in context -- such as responding to a complaint or updating a salesperson with a new shipping date -- would be part of a process. Processes are composed of tasks and decisions and usually span multiple systems and user groups. Workflows are typically the technical view of a process, implemented in BPMN systems. "We've seen aha moments from clients when they see how a process spans multiple systems and sometimes even extends outside their enterprise," Chowdhry said. Access to accurate, up-to-date process information changes the way a business is run. The field is evolving from simple log analysis to a "Fitbit for the enterprise," bringing the same simplicity to software activity tracking and analysis.
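The task/process distinction can be sketched with a toy miner over an event log. The log fields, task names, and system names below are invented for illustration; real process-mining tools work from much richer logs.

```python
from collections import defaultdict

def mine_variants(event_log):
    """Group a flat event log into per-case task sequences ("variants").
    Each event is (case_id, task, system); events per case are assumed
    to appear in order."""
    cases = defaultdict(list)
    for case_id, task, system in event_log:
        cases[case_id].append(task)
    # Count how often each end-to-end sequence of tasks occurs.
    variants = defaultdict(int)
    for steps in cases.values():
        variants[tuple(steps)] += 1
    return dict(variants)

log = [
    (1, "receive_complaint", "CRM"),
    (1, "send_email", "Outlook"),      # the same generic task...
    (2, "update_ship_date", "ERP"),
    (2, "send_email", "Outlook"),      # ...appears in two distinct processes
]
variants = mine_variants(log)
assert ("receive_complaint", "send_email") in variants
```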

Why growing companies shouldn’t compromise when it comes to data protection

One of the biggest hurdles to establishing a strong data protection framework is that small and mid-size businesses have far fewer resources than large corporations. Data protection and back-up is costly, and business owners may be left wondering whether spending on it is actually justified. When in doubt, it’s paramount to consider the costs associated with breached or lost data. Compromised customer or intellectual property information not only results in revenue loss, but is also damaging to a business’s reputation, which can have long-term consequences in the marketplace. Then there are the damage control costs: Following a data breach, a business may need to spend money on legal fees, public relations to remediate their image, and new technology to secure their data moving forward. Solutions are emerging that are specifically tailored to the growing business needs of midsized companies that have fewer IT resources than their enterprise counterparts yet face the same technical challenges. Metallic, a new division of Commvault, offers a comprehensive data protection software as a service (SaaS) portfolio, with recovery and backup, that was designed for scalability and flexibility.

Robotics and Automation: UK must tackle skills, immigration and SME challenges

Rachel Reeves, chair of the Business, Energy and Industrial Strategy (BEIS) committee, said: “The switch to automation brings challenges for businesses and workers, with fears for livelihoods or disruption to job roles coming to the fore. The real danger for the UK economy and for future jobs growth is, however, not that we have too many robots in the workplace, but that we have too few. The government has failed to provide the leadership needed to help drive investment in automation and robot technologies. If we are to reap the potential benefits in the future of improved living standards, more fulfilling work, and the four-day working week, the government needs to do more to support British businesses and universities to collaborate and innovate.” Factors limiting the ability of the UK to take advantage of automation and robotics include management teams who do not understand or recognise the potential of automation; a lack of digital skills among parts of the workforce;

Poor cyber resilience: an organization’s Achilles' heel

Around the world, organizations are showing a worrisome disconnect between their acknowledgement of cyber-risk as a top-rank priority and the way they are dealing with it. Essentially, organizations are zeroing in more on technology and prevention than on setting aside the time, resources, and activities they need to build meaningful cyber-resilience. Seventy-nine percent of respondents ranked cyber-risk as a top-five concern in their organization, up from 62 percent in 2017. In fact, the number of firms that cited cyber-risk as their prime concern almost quadrupled, from 6 percent to 22 percent. This year’s survey revealed a notable drop in firms’ confidence in every cyber-resilience area that matters: understanding, assessing, and measuring potential cyber-risks; the ability to reduce the likelihood of cyber-attacks or avert potential damage; and managing, responding to, and recovering from adverse cyber-events. This year, a mere 11 percent of companies reported a high degree of confidence in all three aspects of cyber resilience.

SAS Targets Operationalizing AI with ModelOps

The ModelOps and ModelOps Health Check Assessment announcement marks a change in tone for SAS, a company that has traded on its reputation as a top provider of analytics software but hasn't been quick to embrace the changes happening in the market all around it. For instance, the market has welcomed a huge number of open source tools, new players in the field of big data, new programming languages such as Python, and new momentum for old programming languages like R. Plus, don't forget about cloud computing and storage. All the while, SAS remained confident in its core products and offerings, which had been closed to the open source technology that had gained so much popularity with undergraduate and graduate students pursuing data science degrees. In recent years SAS has gained a reputation as "your grandfather's data science platform," according to Hare. "There are things that were happening outside the world of SAS that SAS was pretty much ignoring," Hare said. "SAS recognized over the past 3 to 5 years that they needed to do something different."

Pitney Bowes Says Ransomware Behind System Outages

Ransomware continues to put many businesses, municipal governments and schools in a bind. Earlier this month, the FBI issued a new advisory saying that ransomware attacks "are becoming more targeted, sophisticated and costly, even as the overall frequency of attacks remains consistent." The FBI says it has seen a sharp decrease in indiscriminate ransomware attacks. But the losses from the successful attacks - often targeting healthcare, industrial and transportation companies - have become increasingly costly (see: Texas Pummeled by Coordinated Ransomware Attack). Attackers often first infect systems by sending phishing emails with attached malware. They also hunt for vulnerabilities in remote desktop protocol - or go on dark web marketplaces to buy stolen credentials for access to organizations via RDP - which is widely used in Windows environments but may have vulnerabilities or weak authentication credentials, the FBI says.

How to prevent IPv6 VPN breakout

Many enterprises do not realize how often IPv6 is being used on devices that access their networks via VPN. Phones, tablets and laptops used for remote access to corporate networks commonly support IPv6 as do broadband and cellular services they might use to access the internet. As a result, enterprises often don’t recognize IPv6 as a security factor. They configure their VPNs to inspect only IPv4 traffic, which can leave mobile devices free to access IPv6 sites that could prove dangerous to business networks, devices and data. The way IPv4 protections work is, once the VPN has been established, the VPN concentrator inspects traffic bound for the internet and blocks traffic bound for destinations judged out of bounds by the policies the enterprise has configured. Most corporate VPNs enforce what is called no split-tunneling to enhance security by forcing all IPv4 connections to traverse the VPN. With no split-tunneling, once a VPN connection has been established, remote devices cannot make a separate connection to the internet at large.
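One practical check follows directly from the risk described above: a remote device whose interfaces include a globally routable IPv6 address can reach the internet around an IPv4-only tunnel. A small sketch using Python's stdlib `ipaddress` module (the helper name and the address list are hypothetical):

```python
import ipaddress

def offnet_ipv6_risk(addresses):
    """Return the globally routable IPv6 addresses among those assigned
    to a remote device. If any exist and the VPN only tunnels IPv4,
    traffic can flow around the tunnel."""
    return [a for a in addresses
            if ipaddress.ip_address(a).version == 6
            and ipaddress.ip_address(a).is_global]

# A typical laptop: an RFC 1918 IPv4 address, a link-local IPv6
# address, and a global IPv6 address assigned by the broadband provider.
addrs = ["192.168.1.10", "fe80::1", "2607:f8b0::1"]
assert offnet_ipv6_risk(addrs) == ["2607:f8b0::1"]
```

Link-local (`fe80::/10`) addresses are excluded because they cannot carry traffic to the internet at large; only global addresses represent a breakout path.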

Regression Testing: Answers and Strategies

Regression testing helps detect bugs in the source code early in the deployment cycle whenever code changes, saving the product team considerable time and effort that would otherwise go into resolving a build-up of defects after release. Unlike functional testing, which focuses only on inspecting the behavior of new features, regression testing goes further to confirm compatibility between newly added features and existing ones. Hence, developers or testers can find it easier to investigate the primary cause of a test failure. For a software project that requires frequent updates and continuous improvements, regression testing is vital to guarantee stable application performance. With fast regression testing, software teams receive informative feedback instantly and can resolve problems more quickly and effectively. Regression testing lets developers and testers put their effort into building new features rather than returning to fix bugs uncovered by earlier test cases.
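In its simplest form, a regression suite is just a set of assertions pinning down established behavior, re-run on every change. A minimal sketch (the `format_price` function is invented for illustration):

```python
# A function whose behavior existing features depend on.
def format_price(cents):
    """Render an integer number of cents as a dollar string."""
    return f"${cents // 100}.{cents % 100:02d}"

# Regression suite: re-run on every code change so that a new feature
# touching format_price cannot silently break established behavior.
def test_format_price_regressions():
    assert format_price(0) == "$0.00"
    assert format_price(105) == "$1.05"
    assert format_price(99999) == "$999.99"

test_format_price_regressions()
```

In practice such checks live in a test runner (pytest, JUnit, etc.) wired into continuous integration, so the "instant feedback" the article mentions arrives on every commit.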

Quote for the day:

"If you don't write your own story, you'll be nothing but a footnote in someone else's." -- Peter Shankman

Daily Tech Digest - October 15, 2019

Microsoft's TypeScript language: Startup reveals long journey to JavaScript spin-off

With the focus now on developer productivity, engineers looked for areas of the codebase that would produce obvious gains, such as concentrating on converting files that delivered quick wins, as well as prioritizing tooling and configuration. There were also technical barriers to be overcome. The migration team, for example, noticed engineers often needed to import special utilities to use TypeScript. So it focused on helping developers, so that they could write TypeScript in any service or component. "Whether it's the back-end, front-end, scripts, or devops tasks, we wanted our engineers to be able to write code in TypeScript and have it just work," explained Autry. He details a number of other steps the company took to ensure that the conversion process was simple, safe and automated, as well as improving the code review process. Of course, ensuring everyone was able to feel and be productive meant offering a way for teammates to share problems that others might have the experience to solve. So the migration team created a #typescript channel in Slack where developers leading the migration could answer questions and monitor for common issues.

The software-defined data center drives agility

With Open Networking, you are not reinventing the way packets are forwarded, or the way routers communicate with each other. Why? Because, with Open Networking, you are never alone and never the only vendor. You need to adapt and fit, and for this, you need to use open protocols. The value of SDN is doing as much as possible in the software so you don’t depend on the delivery of new features to come from a new generation of the hardware. You want to put as much intelligence as possible into the software, thus removing the intelligence from the physical layer. You don’t want to build the hardware features; instead, you want to use the software to provide the new features. This is an important philosophy and is the essence of Open Networking. From the customer's point of view, they get more agility as they can move from generation to generation of services without having hardware dependency. They don’t have to incur the operational costs of swapping out the hardware constantly.

"At a basic level, a web API and a web app are very similar in that they are both typically code hosted on some form of web or application server, such as Apache, Tomcat or Node," said Peter Blum, a vice president of technology at Instart, a cloud service for web application performance and security. There are plenty of differences between APIs and web applications. On the one hand, the goal of a web application is to provide a complete user experience to a client, typically delivered through a web browser. On the other hand, a web API is typically just an assortment of methods that can be invoked to perform a specific task. But web applications and web APIs are similar in that they are both accessed over HTTP, both process input from a client and both have access to backend services, such as databases, NoSQL stores and mail servers. Since both share the same attack surface, when breached, they can provide hackers access to a similar set of resources.

What are microservices? Your next software architecture

A catchphrase you’ll often hear used about microservices architectures is that they should feature “smart endpoints and dumb pipes.” In other words, microservices should aim to use basic and well-established communication methods rather than complex and tight integration. As noted, this is another thing that distinguishes microservices from SOA. In general, communication between microservices should be asynchronous, in the sense that code threads aren’t blocked waiting for responses. (It’s still fine to use synchronous communications protocols such as HTTP, though asynchronous protocols such as AMQP, the Advanced Message Queuing Protocol, are also common in microservices architectures.) This kind of loose coupling makes a microservices architecture more flexible in the face of the failure of individual components or parts of the network, which is a key benefit. Among the many frameworks for building microservices, one of the most popular is Spring Boot, which is specifically designed for microservices; Boot is extended by Spring Cloud, which, as the name suggests, allows you to deploy those services to the cloud as well.
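The "smart endpoints and dumb pipes" idea can be sketched with Python's asyncio, using an in-process queue as a stand-in for a broker such as AMQP. The service names are invented; the point is that the producer never blocks a thread waiting for the consumer.

```python
import asyncio

async def order_service(queue):
    """Publishes order IDs onto the pipe; fire-and-forget, no blocking."""
    for order_id in (1, 2, 3):
        await queue.put(order_id)
    await queue.put(None)            # shutdown signal

async def inventory_service(queue, processed):
    """Consumes messages from the pipe at its own pace."""
    while True:
        order = await queue.get()
        if order is None:
            break
        processed.append(order)

async def main():
    queue = asyncio.Queue()          # the "dumb pipe"; a broker in production
    processed = []
    await asyncio.gather(order_service(queue),
                         inventory_service(queue, processed))
    return processed

assert asyncio.run(main()) == [1, 2, 3]
```

Because the two coroutines know nothing about each other beyond the message format, either side can fail or be replaced without tightly coupled changes to the other, which is the loose coupling the paragraph describes.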

Apple says Tencent isn’t snooping on your browsing habits

As Matthew Green, cryptographer and professor at Johns Hopkins University explains, safe browsing providers send a list of hashed prefixes for malicious sites to users’ phones. If Safari matches the prefix of a site that the user tries to visit against that list, it goes back and asks the provider for a full list of the sites with that prefix, enabling it to check for the malicious site without divulging its address to the provider. Apple’s statement suggests that only devices registered to China get the Tencent list (the rest of us get Google’s), and that the web addresses you visit are never sent to either company. However, as Apple’s message in iOS settings clearly states, the company may still be able to log your own IP address. Green explains that this could represent a privacy issue if the provider chose to aggregate all the requests that your phone sent it to “extract a signal from the noisy Safe Browsing results”. The worry here is that if a single company sees your IP address enough times, along with a list of site prefixes that you’re worried about, it might be able to start making deductions about your surfing habits.
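Green's description of the prefix-matching flow can be sketched as follows. This is a simplification assuming a 4-byte prefix and a simulated provider lookup; the real Safe Browsing API and its update protocol are considerably more involved.

```python
import hashlib

def url_hash(url):
    return hashlib.sha256(url.encode()).digest()

def check_url(url, prefix_list, full_hash_lookup, prefix_len=4):
    """Match against a local prefix list first; only on a hit ask the
    provider for the full hashes sharing that prefix. The URL itself
    never leaves the device."""
    h = url_hash(url)
    if h[:prefix_len] not in prefix_list:
        return "safe"                # no round trip to the provider at all
    return "malicious" if h in full_hash_lookup(h[:prefix_len]) else "safe"

bad = "http://evil.example/"
prefixes = {url_hash(bad)[:4]}                 # local hashed-prefix list
provider = lambda p: {url_hash(bad)}           # simulated provider query
assert check_url(bad, prefixes, provider) == "malicious"
assert check_url("http://good.example/", prefixes, provider) == "safe"
```

The privacy concern Green raises lives in the second step: each prefix hit triggers a request tied to your IP address, and a provider aggregating enough of those requests could start inferring browsing habits.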

PKI’s Role In Forging Digital Trust

PKI-based security solutions have been popular among organizations thanks to an unbeatable track record in mission-critical enterprise applications. PKI is built upon complex mathematical cryptographic functions designed to create, store, manage and send out authenticated digital certificates and their associated encryption keys. ... Disruption of services, leading to financial loss and damage to business reputation, is among the major business risks that enterprises must deal with during a confirmed cyber breach. Today, enterprises that deal with sensitive user and business data have realized that they can fortify digital trust at the user, device, or server level. This is where PKI-based solutions can offer assurance and much-needed immunity against breaches while mitigating business risks. PKI can be easily integrated into a digital ecosystem made up of IoT devices, IoT-enabled networks, surveillance systems, and other mission-critical applications. It does this by facilitating the secure exchange of data between end users, connected devices, embedded systems, web servers and applications, addressing the business risks and vulnerabilities that form the basis of digital trust.
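The certificate-signing idea at the heart of PKI can be shown with textbook RSA. This is a toy: the numbers are tiny, there is no padding, and the certificate contents are invented; real PKI uses far larger keys and standardized formats such as X.509.

```python
import hashlib

# Toy CA key pair: n = 61 * 53 = 3233, and 17 * 2753 = 46801 ≡ 1 (mod 3120),
# so e = 17 (public) and d = 2753 (private) form a valid textbook-RSA pair.
n, e, d = 3233, 17, 2753

def digest(cert):
    """Hash the certificate contents down to a residue modulo n."""
    return int.from_bytes(hashlib.sha256(cert.encode()).digest(), "big") % n

def ca_sign(cert):
    return pow(digest(cert), d, n)     # requires the CA's private exponent d

def verify(cert, signature):
    return pow(signature, e, n) == digest(cert)   # needs only (n, e)

cert = "subject=device-42;pubkey=ABCD"   # invented certificate contents
sig = ca_sign(cert)
assert verify(cert, sig)   # anyone holding the CA's public key can check this
# Altering the certificate contents invalidates the signature.
```

This asymmetry is what lets millions of devices authenticate a server or peer they have never met: they need only the CA's public key, while issuing a valid certificate requires the private key the CA guards.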

5 Google Cloud tools you should know

Google added Access Transparency to enable users to view Google's services logs. Transparency has been a huge concern for cloud customers in recent years. They want to know how their cloud provider manages the underlying infrastructure that supports their applications. IT teams can use Access Transparency to monitor Google's internal logs pertaining to their accounts. The logs outline what exactly a Google admin did to resolve any issues that may have occurred with a specific customer's service. This Google Cloud tool works with six other Google services: Compute Engine, App Engine, Cloud Storage, Persistent Disk, Cloud Key Management Service and Cloud Identity and Access Management -- with more additions on the way. On top of helping monitor any maintenance being done to their workloads, Access Transparency also helps admins with system audits. 

Three Major Cybersecurity Pain Points to Address for Improved Threat Defense

An organization with a relatively small security budget and pool of experts can opt for all-in-one packages such as security information and event management (SIEM) software. This product is especially useful for organizations whose network is comprised of several disparate systems that run different applications. Of course, this may not be a bulletproof solution, as, like any program, SIEM software has its limitations. SIEM solutions and similar tools such as unified threat management (UTM) systems can benefit from additional threat intelligence sources to cross-check and vet initial findings. Readily available data feeds and application programming interfaces (APIs) can prove handy in threat correlation — finding connections among potential threat sources, attacks, and malicious actors, for instance. A company that doesn’t have its own threat experts or IT security team, meanwhile, can opt to hire third parties to take care of its needs.

Multimodal learning: The future of artificial intelligence

Use cases for multimodal applications span industries, according to the post. In the automotive industry, Advanced Driver Assistance Systems (ADAS), In-Vehicle Human Machine Interface (HMI) assistants, and Driver Monitoring Systems (DMS) are all adopting multimodal systems. Robotics vendors are integrating multimodal systems into robotics HMIs and movement automation, the post said. Consumer device companies are seeing multimodal system applications in security and payment authentication, recommendation and personalization engines, and personal assistants. Medical companies and hospitals are in the early stages of adopting multimodal learning techniques, but promising applications exist with medical imaging. The media and entertainment industries are also beginning to adopt multimodal learning with content structuring, content recommendation systems, personal advertising, and automated compliance marketing, the post said. Until more companies publicly adopt this way of operating, multimodal learning systems will remain unfamiliar to most people. However, the future of AI is heading in the multimodal direction.

The Top Three Benefits of Enterprise Architecture

For many organizations, these standards have been impossible to meet because their enterprise architectures are burdened by systems that were not built for purpose. Basic visualization tools, spreadsheets and even word processors have typically stood in for dedicated EA solutions. These non-purpose-built systems lacked the industry standards needed to accurately capture and align business and IT elements and how they link together. Additionally, collaboration was often marred by outdated and even disparate file versions and types, because businesses lacked the systems necessary to continuously and methodically maintain models, frameworks and concepts as they evolve. A key milestone in maturing a modern enterprise architecture initiative is therefore developing a single source of truth, consistent across the enterprise. This requires the implementation of a dedicated, centralized and collaborative enterprise architecture tool, be that on-premises or via the cloud.

Quote for the day:

"Pull the string and it will follow wherever you wish. Push it and it will go nowhere at all." -- Dwight D. Eisenhower

Daily Tech Digest - October 14, 2019

WiFi 6 will face 5G competition

The decision about whether to put your faith in WiFi 6 or look to 5G for IoT applications and other use cases with a high density of connected devices isn't easy. Both technologies offer faster connections than their predecessors, along with the ability to support more concurrent devices. On the other hand, retro-fitting devices to support either of the new standards is a massive undertaking, and then there's the question of which option is the best fit. In a recent interview, Qualcomm's Rasmus Hellberg said: "We looked at all the Wi-Fi access points we have today. Then we added in millimetre wave at the exact same points. It's a big opportunity to drive millimetre wave indoors as a private network". With the number of IoT devices forecast to increase rapidly over the next few years, businesses will need to consider how to link all of those devices to existing networks. WiFi 6 seems like a safe path for transitioning from existing WiFi 4 and WiFi 5 networks. But 5G could make tasks such as comms configurations easier: once a SIM card is programmed correctly, simply installing it would allow devices to connect almost instantly.

What Top Innovation Leaders in Banking Have in Common

The increasing influence of non-traditional financial services companies has been driven by their ability to leverage new technology and data for an improved customer experience. According to the Innovation in Retail Banking 2019 research, as many as one-third of consumers across the world are now using at least one FinTech app, whether that’s peer-to-peer payments, financial planning, savings and investments, borrowing, or insurance. Consumers value these apps because they provide exceptional digital-first experiences, and because they seamlessly integrate with other apps that they are using. In other words, the battlefield has changed and legacy banking organizations need to respond. The most obvious way for traditional financial institutions to respond to new competition is to innovate ‘like a fintech’. While this may not be a ‘build from within’ solution, even a partnering opportunity can move the needle and either retain or acquire customers. The research done for the Innovation in Retail Banking 2019 report found that the most advanced innovators are acutely focused on improving the customer experience and are seeing results far superior to those of less advanced firms.

Eight ways to secure your data on IoT devices

Before you get an IoT device or solution, first ensure that it is secure by design. This might not be feasible in some environments because of factors such as legacy products, price points, or simply a lack of detail about how, or whether, they are secured. You should reconsider acquiring an IoT device if the provider is unable to supply adequate information about the security approach of the device. If your IoT devices are already in place, security can be built around them. This is a usual occurrence in environments that have legacy products. You should also make sure that the manufacturer of your IoT devices is able to provide timely patches and updates for the entire lifetime of the device. A lot of security vulnerabilities are revealed only after a product has entered the market, and as the lifetime of IoT devices can be as long as 10 to 20 years, the manufacturer should be able to provide patches and updates for that entire duration. Following are the eight effective ways in which you can secure your data on IoT devices.

Analysis reveals the most common causes behind mis-issued SSL/TLS certificates

One of the conclusions that the researchers reached is that, as things stand now, Root Program owners have tremendous power in the PKI network and they should use it to penalize those CAs that put their own welfare over that of the Public Key Infrastructure. “With just their independent decision they can end the business of a CA, especially if several Root Program owners are aligned with the revocation posture,” they noted. “Given that they are also the owners of the web browsers, they are judge, jury and executioner in the network. On the other hand, if a misbehavior is detected in a CA but not all the program owners agree with a mass removal of it, a removal by a sole owner may have a negative impact on this owner given the potential loss of customers that that decision may carry; therefore, if there is no consensus between Root Program owners, CAs may keep up their malpractices.” They have also proposed several solutions for CAs and Root Program owners to implement to improve trust in PKI.

Why the future of data security in the cloud is programmable

With the ability to monitor data access, govern it, and selectively protect data even from developers themselves wired into applications, there is another door that swings wide open: application portability. Many companies, from traditional manufacturing all the way to software companies themselves, are looking for ways to leverage the economics and flexibility of cloud infrastructure. For most of these companies, the number 1 and number 2 concerns as to going to infrastructure that they don’t control are security and compliance. But when a development team wires in tools to allow for the control of data regardless of where the application is deployed, the business is free to determine the best infrastructure for the application in question based on performance, cost, reliability and other IT priorities. Cloud options from platform-as-a-service all the way to serverless architecture, where IT doesn’t have to maintain any of the infrastructure stack, are all on the table. Through this lens the economic benefits of programmable data security come completely into focus. Adopting this approach, by way of example, ALTR has been able to help a business optimise its digital footprint based on delivery of technology services, not on the security of them.

Prayer Is Brexit Strategy as Services Brace for Red Tape Storm

Canary Wharf, with a Docklands Light Railway (DLR) train passing by, August 2019
“There is no fallback position as far as data is concerned, which is a bit of an oversight,” said Jack Bedell-Pearce, 4D’s CEO. You need to understand “how you’re going to be impacted when the rug is pulled from underneath your feet.” 4D helped survey businesses about data sovereignty — the location where data is stored — in the aftermath of the Brexit vote, and found that 87% were not looking at the issue. Judging by anecdotes, the figure is probably similar now, Bedell-Pearce said. Brexiteers have proclaimed that falling back on trade rules established by the Geneva-based World Trade Organization will be more than sufficient to ensure that the U.K. can continue to trade with Europe and the rest of the world. But the rules regarding services are particularly complicated and will provide less certainty to exporters. Further complicating matters is the fact that the European Commission does not have exclusive authority over the EU’s internal services market as it does with goods under the customs union. That means each EU member state can impose its own rules on services.

Microsoft Defender 'Tamper Protection' reaches general availability

Microsoft says that Tamper Protection "essentially locks Microsoft Defender" and prevents security settings from being changed through third-party apps and methods such as: configuring settings in Registry Editor on a Windows machine; changing settings through PowerShell cmdlets; and editing or removing security settings through group policies. The feature will be available both for the free version of Microsoft Defender (the one that ships with all modern Windows versions) and for Microsoft Defender Advanced Threat Protection (ATP), the commercial version primarily deployed on enterprise networks. Work on Tamper Protection began in December 2018, when it was first rolled out to Windows Insider previews. In March this year, Microsoft rolled Tamper Protection out to Microsoft Defender ATP for further testing. Starting today, the feature is available to all Microsoft Defender users. Microsoft told ZDNet in a phone call last week that the feature will be enabled by default for all users in the coming weeks, in a multi-stage rollout. Users who don't want to wait can enable Tamper Protection right now.
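For administrators who want to check the feature's state programmatically, a common (though not officially documented) approach is to read the Defender registry key. Treat the path and the value meanings below as assumptions — value 5 is widely reported to mean enabled, and 4 or 0 disabled — and note that with Tamper Protection on, writing to this key through third-party tools is exactly what gets blocked:

```python
import sys

# Widely reported (not officially documented) registry location for the
# Tamper Protection state; the path and value semantics are assumptions.
KEY_PATH = r"SOFTWARE\Microsoft\Windows Defender\Features"
VALUE_NAME = "TamperProtection"

def interpret(value: int) -> str:
    # Commonly observed values: 5 = enabled, 4 or 0 = disabled.
    return "enabled" if value == 5 else "disabled"

def tamper_protection_status() -> str:
    if sys.platform != "win32":
        return "unavailable (not Windows)"
    import winreg  # stdlib, Windows-only
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        value, _ = winreg.QueryValueEx(key, VALUE_NAME)
    return interpret(value)

print(tamper_protection_status())
```

On a Windows machine this reads the current state; elsewhere it degrades gracefully rather than raising an import error.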

Q&A on the Book Real-World Bug Hunting

The impact of an SQL injection vulnerability really depends on the information the database contains. In the best case, no harm is done because the database holds no sensitive information and is properly isolated from other sensitive systems. In the worst case, SQL injection can be devastating: an attacker may be able to extract large amounts of personally identifiable information, create administrative accounts for themselves on the application, or read local files from the server. An example of this is covered in the book: an SQL injection in Drupal version 7 from 2015, which allowed for privilege escalation, arbitrary PHP execution, or other attacks depending on the configuration of the application. Content management systems and other development frameworks are only as secure as the developers creating web applications with them. It isn't hard to introduce SQL injection vulnerabilities if you mistakenly add user-controlled input to SQL statements. That said, content management systems like Drupal and WordPress and frameworks like Ruby on Rails do a great job of making it harder to introduce the vulnerability.
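The core mistake — adding user-controlled input directly to a SQL statement — and its fix can be shown in a few lines. This is a generic illustration using SQLite, not the Drupal bug discussed in the book; the table and payload are made up for the demo:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 1), ('bob', 0)")

def find_user_unsafe(name):
    # Vulnerable: user input is concatenated straight into the statement.
    query = f"SELECT name FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Parameterized query: the driver treats the input strictly as data.
    return conn.execute("SELECT name FROM users WHERE name = ?",
                        (name,)).fetchall()

payload = "x' OR '1'='1"
print(find_user_unsafe(payload))  # every row comes back: injection succeeded
print(find_user_safe(payload))    # empty: the payload is just a literal string
```

The parameterized version is what frameworks like Rails and the Drupal/WordPress database layers generate for you by default, which is why they make the vulnerability harder to introduce.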

10 hot micro-data-center startups to watch

Data-hungry technology trends such as IoT, smart vehicles, drone deliveries, smart cities and Industry 4.0 are increasing the demand for fast, always-on edge computing. One solution that has emerged to bring the network closer to the applications generating that data and the end users consuming it is the micro data center. The micro-data-center sector is a new space with more noise than signal. If you go hunting for a micro data center for your business, you'll find everything from suitcase-sized computing stacks that replace a server closet, to modular enclosures delivered by semi-trucks, to larger units that sit at the foot of cell towers, to dedicated edge data centers with standardized designs that can spring up wherever there's demand and where real estate or access rights are available, including easements, rooftops and industrial sites. Several of the startups here started out in adjacent spaces, such as IaaS or colocation services, and have only recently added micro data centers to their portfolios. Now, with the arrival of 5G, demand for edge data centers could be ready to explode; several of the startups below intend to drop micro data centers at the base of every 5G tower they can gain access to.

When Using Cloud, Paranoia Can Pay Off

While Google has locked down G Suite with encryption, two-factor authentication and a security-focused culture, concerns remain about situations where governments can compel data disclosure, as well as whether automated scans or collected metadata can leak significant private details. "The short version is that, theoretically, Google can see anything that you can see in G Suite," says Jeremy Gillula, technology projects director with the Electronic Frontier Foundation. "Whether or not they actually do is a totally different story." Users of any cloud productivity software generally have three threats to worry about: hackers, providers, and governments. Because both Microsoft and Google encrypt data at rest in their clouds, the information is protected against direct online attack: steal the data, and it is still unreadable. However, online attackers have increasingly focused on stealing credentials and accessing the cloud by impersonating the authorized user. To foil such attacks, companies and individuals need to add multi-factor authentication, experts say.
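The most common second factor is a time-based one-time password (TOTP), the six-digit code shown by authenticator apps. A minimal sketch of the underlying RFC 6238 algorithm, using only the standard library, shows why stolen static credentials alone are not enough — the code is an HMAC over the current 30-second time step:

```python
import hmac
import hashlib
import struct
import time

def totp(secret, for_time=None, step=30, digits=6):
    """Minimal RFC 6238 TOTP: HMAC-SHA1 over the time-step counter."""
    if for_time is None:
        for_time = time.time()
    counter = int(for_time) // step            # which 30-second window we're in
    msg = struct.pack(">Q", counter)           # counter as 8-byte big-endian
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                 # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test key; at T=59s the expected 8-digit code is 94287082.
print(totp(b"12345678901234567890", for_time=59, digits=8))  # 94287082
```

A real deployment would use a vetted library and per-user secrets provisioned over QR codes; the point here is simply that the code changes every 30 seconds, so a phished password without the shared secret does not grant access.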

Quote for the day:

"The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn." -- Alvin Toffler