Daily Tech Digest - February 23, 2018

Cisco automation tools make it easier for network admins

Cisco has a new automation software portfolio that helps global service providers manage massive amounts of network data and better prepare for impending security threats. "We built out an entirely new portfolio of automation tools. It really centers on the fact that our customers have a whole set of challenges. They're currently spending on average somewhere between 3-4 times the amount to operate an infrastructure than they are just to purchase the infrastructure," said Jonathan Davidson, senior vice president and general manager of Cisco Service Provider Networking. In 2016, there were 17 billion devices and connections running on service provider networks and this is forecast to grow to 27 billion by 2021. To address this shift, the Cisco Crosswork Network Automation portfolio will assist industry adoption of complete lifecycle network automation and intent-based networking to help networks predict change and react in near real time.

Leveraging Security to Enable Your Business

The first step is to look into more modern technologies, such as a reverse proxy, which can overcome the cumbersome nature of multiple VPNs and ensure quick, seamless, and secure access from anywhere, on any device. With this approach, there is no need to repeatedly require MFA once a user has "passed the test" of proving who they are. Businesses can also leverage adaptive authentication technology, which automatically adjusts authentication requirements relative to the risk of the request. For example, an initial login may require MFA, but subsequent logins by the same user, from the same device, in the same day would not. If, however, the request suddenly comes from an unknown device, there could be something fishy going on. With adaptive authentication, the rules for an MFA requirement for specific risky login instances can be preset and automatically enforced.
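
The adaptive rule described above, challenging unknown devices while skipping MFA for a recently verified user and device pair, can be sketched in a few lines of Python. This is a hypothetical illustration, not any vendor's API: the `mfa_sessions` store, the 24-hour window and the function names are all assumptions.

```python
from datetime import datetime, timedelta

# Known (user, device) pairs and when they last passed MFA.
# In a real deployment this state would live in a session store.
mfa_sessions = {}

MFA_VALIDITY = timedelta(hours=24)  # the "same day" grace period

def requires_mfa(user, device, now=None):
    """Return True if this login attempt should be challenged with MFA.

    Rule sketch: an unknown (user, device) pair is always challenged;
    a known pair is re-challenged once its last MFA pass is older than
    the validity window.
    """
    now = now or datetime.utcnow()
    last_pass = mfa_sessions.get((user, device))
    return last_pass is None or now - last_pass > MFA_VALIDITY

def record_mfa_pass(user, device, now=None):
    """Remember that this (user, device) pair just passed MFA."""
    mfa_sessions[(user, device)] = now or datetime.utcnow()
```

A real adaptive-authentication engine would weigh many more risk signals (geolocation, IP reputation, time of day); the structure of the decision, though, is this same preset-rule lookup.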

AI for good: Can AI be trusted - and is it too late to ask?

The answer seems to point towards human input: in the words of AI researcher Professor Joanna Bryson, “if the underlying data reflects stereotypes, or if you train AI from human culture, you will find bias.” And if we’re not careful, we risk integrating that bias into the computer programs that are fast taking over the running of everything from hospitals to schools to prisons – programs that are supposed to eliminate those biases in the first place. Nigel Willson, global strategist at Microsoft, points out that no technology is ever black and white. “The reality is that AI is like anything else – it can be very dangerous, or it can be amazing, based on how it’s used or misused,” he says. AI is only as accurate as the information on which it is trained – meaning that we must be very careful with how we train it. Awareness of ‘unfair’ bias integrated into decades of data has led researchers to attempt the design of algorithms that counteract that bias when scraping the data: but this raises the question of what constitutes ‘fairness’.
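
One common, if partial, statistical reading of ‘fairness’ is demographic parity: whether a system hands out favorable outcomes at the same rate across groups. A minimal sketch of that check, where the function names and data shape are illustrative assumptions:

```python
def selection_rates(records):
    """records: list of (group, outcome) pairs, outcome 1 = favorable.
    Returns the favorable-outcome rate per group."""
    totals, positives = {}, {}
    for group, outcome in records:
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + outcome
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(records):
    """Largest difference in favorable-outcome rate between any two
    groups; 0.0 means all groups fare identically under this metric."""
    rates = selection_rates(records).values()
    return max(rates) - min(rates)
```

Counteracting bias then means constraining or reweighting a model so this gap stays small; the catch, as the excerpt notes, is that demographic parity is only one of several mutually incompatible definitions of fairness.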

Telecom Opportunities: How to Monetize IoT

When League of Legends, one of the most popular online video games, ran into problems with lag, its developers effectively built their own internet to let players connect to the game. Riot Games assembled routers, data centers and peer ISPs into a network that prioritized latency over cost. Players from any part of the country would be directly connected to Riot’s access servers rather than routers on the regular ISP network. With 5G, telcos can offer new levels of latency, but network connectivity is not all they can offer to gaming companies. One example Ericsson showed me during a recent visit to Kista, Sweden was an interface that allowed the gamer to manage their account from inside the game; for example, they could top up their data allowance without having to exit the game.

“There is also growing use of managed security services to complement their on-site capability and provide secure file transfers and software updates, as well as continuous monitoring,” he said. However, he said that although there is a high level of awareness of the need for good cyber security in industrial operations, in many cases cyber security fundamentals are not yet in place. A recent Honeywell-sponsored survey by LNS Research of 130 decision makers from industrial companies revealed that only 37% were monitoring their plant systems for suspicious behaviour and 20% were not conducting regular risk assessments. “The survey also found that 53% said they had already experienced a cyber security breach, but that is not surprising, given how young we are globally in cyber protection for critical infrastructure and industrial cyber security,” said Zindel.

Big Data Isn’t a Thing; Big Data is a State of Mind

Big Data is about exploiting the unique characteristics of data and analytics as digital assets to create new sources of economic value for the organization. Most assets exhibit a one-to-one transactional relationship. For example, the quantifiable value of a dollar as an asset is finite – it can only be used to buy one item or service at a time. Same with human assets, as a person can only do one job at a time. But measuring the value of data as an asset is not constrained by those transactional limitations. In fact, data is an unusual asset as it exhibits an Economic Multiplier Effect, whereby it never depletes or wears out and can be used simultaneously across multiple use cases at near-zero marginal cost. This makes data a powerful asset in which to invest. Understanding the economic characteristics of data and analytics as digital assets is the first step in monetizing your data via predictive, prescriptive and preventative analytics.

How long does it take to detect a cyber attack?

The study found that US companies took an average of 206 days to detect a data breach. This is a slight increase on the previous year (201 days). Ponemon suggests all organizations should aim to identify a breach within 100 days. The average cost of identifying a breach within this time was $5.99 million, but for breaches that took longer to identify, the average cost rose to $8.70 million. There is a similar correlation in terms of containing a breach. Breaches that took less than 30 days to contain had an average cost of $5.87 million, but this rose to $8.83 million for breaches that took longer to contain. The good news is that organizations have become significantly better at containing breaches, with the average time dropping from 70 days in 2016 to 55 days. The majority of breached organizations are notified by someone other than their own staff, according to Mandiant’s M-Trends 2017 report. It found that 53% of breaches were discovered by an external source.

Hackers are selling legitimate code-signing certificates to evade malware detection

Code-signing certificates are designed to give your desktop or mobile app a level of assurance by making apps look authentic. Whenever you open a code-signed app, it tells you who the developer is and provides a high level of integrity to the app that it hasn't been tampered with in some way. Most modern operating systems, including Macs, only run code-signed apps by default. But not only does code-signing have an effect on users who inadvertently install malware, code-signed apps are also harder to detect by network security appliances. The research said that appliances using deep packet inspection to scan network traffic "become less effective when legitimate certificate traffic is initiated by a malicious implant." That's been picked up by some hackers, who are selling code-signing certificates for as little as $299. Extended validation certificates, which are meant to go through a rigorous vetting process, can sell for $1,599.
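
The integrity half of code signing, detecting whether an app has been tampered with, boils down to comparing digests. The sketch below shows only that half, using Python's standard `hashlib`; real code signing additionally encrypts the digest with the publisher's private key and binds it to an identity via an X.509 certificate, which this illustration deliberately omits:

```python
import hashlib

def sign_digest(artifact: bytes) -> str:
    """Compute the digest that a code-signing scheme would protect.
    A real scheme signs this digest with the publisher's private key
    and ships it alongside an X.509 certificate; here we keep only
    the tamper-detection half."""
    return hashlib.sha256(artifact).hexdigest()

def verify(artifact: bytes, expected_digest: str) -> bool:
    """True only if the artifact matches the digest it shipped with."""
    return hashlib.sha256(artifact).hexdigest() == expected_digest
```

The story above is precisely about the other half: when criminals buy a legitimate certificate, the identity binding itself is genuine, so signed malware passes both checks.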

Machine-learning promises to shake up large swathes of finance

Natural-language processing, where AI-based systems are unleashed on text, is starting to have a big impact in document-heavy parts of finance. In June 2016 JPMorgan Chase deployed software that can sift through 12,000 commercial-loan contracts in seconds, compared with the 360,000 hours it used to take lawyers and loan officers to review the contracts. Machine-learning is also good at automating financial decisions, whether assessing creditworthiness or eligibility for an insurance policy. Zest Finance has been in the business of automated credit-scoring since its founding in 2009. Earlier this year it rolled out a machine-learning underwriting tool to help lenders make credit decisions, even for people with little conventional credit-scoring information. It sifts through vast amounts of data, such as people’s payment history or how they interact with a lender’s website.
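
Automated credit decisions of the kind described here are, at their core, often logistic models: weighted features squashed into a probability. A toy sketch, where the feature names and weights are invented for illustration and bear no relation to any real underwriting model:

```python
import math

# Illustrative feature weights; a real underwriting model learns
# thousands of such weights from historical repayment data.
WEIGHTS = {
    "on_time_payment_ratio": 4.0,    # fraction of past bills paid on time
    "debt_to_income": -3.0,          # higher ratio lowers the score
    "site_sessions_per_month": 0.05, # weak behavioural signal
}
BIAS = -1.0

def repayment_probability(features):
    """Logistic model: squash a weighted feature sum into (0, 1)."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-z))
```

The value of tools like the one described lies less in this formula than in the breadth of inputs, such as payment history or website interactions, that stand in for conventional credit-scoring data.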

The emerging link between employee well-being and cyber security services

This epidemic means big problems for employees and employers alike — and a significant opportunity for brokers who can provide solutions that protect employees’ financial well-being. When identity thieves take advantage of employees’ stolen personal information to obtain credit or loans, or commit various types of fraud, both employees and employers pay a steep price. ...  In other words, the identity theft resolution process is not only stressful for employees, it has a significant impact on their productivity at work. That's because, without the assistance of an identity theft resolution resource, employees have to do a lot of legwork, such as filing police reports, writing letters and making trips to financial institutions to report fraud.

Quote for the day:

"You never really learn much from hearing yourself speak." -- George Clooney

Daily Tech Digest - February 22, 2018

Organizations are investing more money in their analytics programs. These programs now do more than recommend a new blouse or what to watch next on Netflix. If you are SpaceX and your data is incorrect, it could result in the loss of a multi-million-dollar rocket, Biltz said. That's a big deal. The Accenture report, culled from survey responses of more than 6,300 business and IT executives worldwide, found that 82% of those executives are using data to drive critical and automated decisions. What's more, 97% of business decisions are made using data that managers consider to be of unacceptable quality, Accenture notes, citing a study published in HBR. "Now it becomes vitally important that the data you have is as true, as correct, as you can make it," Biltz said. "Right now, organizations don't have the systems in place to do that." Plus, there's just more data now, coming from a variety of different sources, than there ever has been in the past.
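
At a minimum, making data "as true as you can make it" means mechanically checking it before decisions are driven from it. A minimal data-quality sketch, where the rule format and field names are assumptions for illustration:

```python
def quality_report(rows, rules):
    """rows: list of dicts; rules: {field: predicate}.
    Returns, per field, the fraction of rows that are missing the
    field or fail its validity predicate."""
    n = len(rows)
    report = {}
    for field, is_valid in rules.items():
        bad = sum(1 for r in rows if field not in r or not is_valid(r[field]))
        report[field] = bad / n
    return report
```

Production data-quality systems add schema checks, cross-field constraints and drift monitoring, but each reduces to this shape: a rule, a violation rate, and a threshold for acting on it.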

9 ways to overcome employee resistance to digital transformation

While it's easy to assume technology changes would cause the most issues in the transformation process, tech isn't actually the root of the problem, said R/GA Austin's senior technology director Katrina Bekessy. "Rather, it's usually organizing the people and processes around the new tech that's difficult," Bekessy said. "It's hard to change the way people work, and realign them to new roles and responsibilities. In short, digital transformation is not only a transformation of tech, but it also must be a transformation in a team's (or entire company's) culture and priorities." Inertia and ignorance are two key parts of employee resistance to transformation, according to Michael Dortch, principal analyst and managing editor at DortchOnIT.com. "Inertia results in the 'but we've always done it this way' response to any proposed change in operations, process, or technology, while ignorance limits the ability of constituents to see the necessity and benefits of digital transformation," Dortch said.

8 Machine Learning Algorithms explained in Human language

What we call “Machine Learning” is none other than the meeting of statistics and the incredible computation power available today (in terms of memory, CPUs, GPUs). This domain has become increasingly important because of the digital revolution of companies, which is producing massive data of different forms and types at ever increasing rates: Big Data. On a purely mathematical level, most of the algorithms used today are already several decades old. ... You are looking for a good travel destination for your next vacation. You ask your best friend for his opinion. He asks you questions about your previous trips and makes a recommendation. You then decide to ask a group of friends, who ask you questions at random. They each make a recommendation, and the chosen destination is the one recommended most often by your friends. Both methods will produce good destination choices, but while the first may work very well for you in particular, the second will be more reliable across a wider range of people.
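
The friends analogy maps directly onto decision trees and random forests: one friend is a single tree, and the group is an ensemble whose answer is a majority vote over opinions each formed from a random subset of the data. A toy rendering of the analogy itself, where the destinations, sampling scheme and vote counts are all invented for illustration:

```python
import random
from collections import Counter

def friend_opinion(past_trips, rng):
    """One 'friend' (a single decision tree): forms a recommendation
    from a random subset of what they know about your past trips."""
    considered = rng.sample(past_trips, k=max(1, len(past_trips) // 2))
    return Counter(considered).most_common(1)[0][0]

def group_recommendation(past_trips, n_friends=25, seed=0):
    """The 'group of friends' (a random forest): majority vote over
    many independently sampled opinions."""
    rng = random.Random(seed)
    votes = Counter(friend_opinion(past_trips, rng) for _ in range(n_friends))
    return votes.most_common(1)[0][0]
```

Real random forests randomize both the training rows and the features each tree may split on, but the reliability argument is the same: averaging many noisy opinions cancels their individual quirks.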

3 Things You Need to Know (and Do) Before Adopting AI

AI enables machines to learn and act, either in place of humans or to supplement the work of humans. We’re already seeing widespread use of AI in our daily lives, such as when brands like Netflix and Amazon present us with options based on our buying behaviors, or when chat bots respond to our queries. AI is used to pilot airplanes and even streamline our traffic lights. And, that’s just the beginning as we enter the age of AI and machine learning, with these technologies replacing traditional manufacturing as drivers of economic growth. A McKinsey Global Institute study found that technology giants Baidu and Google spent up to $30 billion on AI in 2016, with 90 percent of those funds spent on research and development and deployment, and 10 percent on AI acquisitions. In 2018, AI adoption is expected to jump from 13 percent to 30 percent, according to Spiceworks' 2018 State of IT report.

Is the IoT backlash finally here?

As pretty much everyone knows, the Internet of Things (IoT) hype has been going strong for a few years now. I’ve done my part, no doubt, covering the technology extensively for the past 9 months. As vendors and users all scramble to cash in, it often seems like nothing can stop the rise of IoT. Maybe not, but there have been rumblings of a backlash against the rise of IoT for several years. Consumers and experts worry that the IoT may not easily fulfill its heavily hyped promise, or that it will turn out to be more cumbersome than anticipated, open up serious security issues, and compromise our privacy. Others fear the technology may succeed too well, eliminating jobs and removing human decision-making from many processes in unexamined and potentially damaging ways. As New York magazine put it early last year, “We’re building a world-size robot, and we don’t even realize it.” Worse, this IoT robot “can only be managed responsibly if we start making real choices about the interconnected world we live in.”

Intel expects PCs with fast 5G wireless to ship in late 2019

Intel will show off a prototype of the new 5G connected PC at the Mobile World Congress show in Barcelona. In addition, the company will demonstrate data streaming over the 5G network. At its stand, Intel said that it will also show off eSIM technology—the replacement for actual, physical SIM cards—and a thin PC running 802.11ax Wi-Fi, the next-gen Wi-Fi standard. Though 5G technology is the mobile industry’s El Dorado, it always seems to be just over the next hill. Intel has promoted 5G for several years, saying it will handle everything from a communications backbone for intelligent cars to swarms of autonomous drones talking amongst themselves. Carriers, though, have started nailing down when and where customers will be able to access 5G technology. AT&T said Wednesday, for example, that a dozen cities including Dallas and Waco, Texas, and Atlanta, Georgia, will receive their first 5G deployments by year’s end. Verizon has plans for three to five markets, including Sacramento, California.

Who's talking? Conversational agent vs. chatbot vs. virtual assistant

A conversational agent is more focused on what it takes in order to maintain a conversation. With virtual agents or personal assistants, those terms tend to be more relevant in cases where you're trying to create this sense that the conversational agent you're dealing with has its own personality and is somehow uniquely associated with you. At least for me, the term virtual assistant sort of metaphorically conjures the idea of your own personal butler -- someone who is there with you all the time, knows you deeply but is dedicated to just you and serving your needs. ... I think there becomes an intersection between the two ideas. For it to serve you on a personal level, any kind of good personal assistant or virtual assistant needs to retain a great deal of context about you but then use that context as a way of interacting with you -- to use the conversational agent technique for not just anticipating your need but responding to your need and getting to know you better to be able to respond to that need better in the future.
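
The distinction drawn here, a conversational agent that also retains per-user context, can be caricatured in a few lines. This is purely an illustrative toy, not a real assistant framework; the class and method names are invented:

```python
class VirtualAssistant:
    """Sketch: a conversational agent that keeps per-user context
    and folds what it has learned back into its responses."""

    def __init__(self):
        self.context = {}  # facts learned about this one user

    def tell(self, key, value):
        """Remember a fact about the user for future turns."""
        self.context[key] = value

    def respond(self, utterance):
        """Answer one turn, personalizing when context allows."""
        if "restaurant" in utterance:
            cuisine = self.context.get("favorite_cuisine")
            if cuisine:
                return f"Booking a {cuisine} restaurant, as usual?"
            return "What kind of food do you like?"
        return "Sorry, I can only help with restaurants in this sketch."
```

A bare conversational agent is the `respond` method alone; what makes it a virtual assistant, in the sense above, is the `context` it accumulates about one particular user.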

Why the GDPR could speed up DevOps adoption

One of the key trends that's happening now, especially with the changing demographics and change in technology, is most people are interacting with businesses digitally, via their phones, via their computers and so on. A lot of businesses, whether it's retail or banking or insurance or whatever have you—the face of those businesses has started to become digital and where they're not becoming digital there are new companies that are springing up that are disrupting those businesses. DevOps, the whole movement, the single biggest thing about it is agility, which is the ability to bring applications to market quicker, so this new demographic that's interacting with all the businesses digitally can consume or can interact with these businesses in ways that they're used to interacting with everything else, and for these businesses to protect themselves against disruption from other people.

Cisco Report Finds Organizations Relying on Automated Cyber-Security

Among the high-level findings in the 68-page report is that 39 percent of organizations stated they rely on automation for their cyber-security efforts. Additionally, according to Cisco's analysis of over 400,000 malicious binary files, approximately 70 percent made use of some form of encryption. Cisco also found that attackers are increasingly evading defender sandboxes with sophisticated techniques. "I'm not surprised attackers are going after supply chain, using cryptography and evading sandboxed environments, we've seen all these things coming for a long time," Martin Roesch, Chief Architect in the Security Business Group at Cisco, told eWEEK. "I've been doing this for so long, it's pretty hard for me to be surprised at this point." Roesch did note, however, that he was pleasantly surprised that so many organizations are now relying on automation, as well as machine learning and artificial intelligence, for their cyber-security operations.

Artificial general intelligence (AGI): The steps to true AI

AI lets a relatively dumb computer do what a person would do using a large amount of data. Tasks like classification, clustering, and recommendations are done algorithmically. No one paying close attention should be fooled into thinking that AI is more than a bit of math. AGI is where the computer can “generally” perform any intellectual task a person can and even communicate in natural language the way a person can. This idea isn’t new. While the term “AGI” harkens back to 1987, the original vision for AI was basically what is now AGI. Early researchers thought that AGI (then AI) was closer to becoming reality than it actually was. In the 1960s, they thought it was 20 years away. So Arthur C. Clarke was being conservative with the timeline for 2001: A Space Odyssey. A key problem was that those early researchers started at the top and went down. That isn’t actually how our brain works, and it isn’t the methodology that will teach a computer how to “think.” In essence, if you start with implementing reason and work your way down to instinct, you don’t get a “mind.”

Quote for the day:

"A man's character may be learned from the adjectives which he habitually uses in conversation." -- Mark Twain

Daily Tech Digest - February 21, 2018

The New Era Of Artificial Intelligence

AI will soon become commoditized and democratized, just as electricity was in its time. Today we use computers, smartphones, other connected devices, and, mostly, apps. Whilst access to internet technologies has constantly improved over the past decades, very few people are able to program these and generate income by intelligently exploiting consumer data, which, in theory, is not theirs. GAFA (Google, Amazon, Facebook and Apple) and the Chinese BAT (Baidu, Alibaba and Tencent,) are among the most prominent players in these fields. Tomorrow’s world would be different with the emergence of relatively simple, portable AI devices, which might not necessarily be connected to each other by the internet, but would feature completely new protocols and peer-to-peer technologies. This will significantly re-empower consumers. Because it is decentralized, portable AI will be available for the masses within a decade or so. Its use will be intuitive; just as driving a car is today. Portable AI will also be less expensive than motorized vehicles, 

What is DevSecOps and Vulnerabilities?

The principles of security and communication should be introduced at every step when building applications. The philosophy of DevSecOps was created by security practitioners who seek to “work and contribute value with less friction”. These practitioners run a web site that details an approach to improving security, explaining that “the goal of DevSecOps is to bring individuals of all capabilities to a high level of security efficiency in a short period of time. Security is everyone's responsibility.” The DevSecOps manifesto includes principles such as building a lower-access platform, focusing on science, avoiding fear, uncertainty and doubt, collaboration, continuous security monitoring and cutting-edge intelligence. The DevSecOps community promotes action aimed at detecting potential issues and exploiting weaknesses. In other words: think like an adversary and use similar tactics, such as attempting to penetrate systems to identify exploitable gaps that need to be addressed.

7 essential technologies for a modern data architecture

At the center of this digital transformation is data, which has become the most valuable currency in business. Organizations have long been hamstrung in their use of data by incompatible formats, limitations of traditional databases, and the inability to flexibly combine data from multiple sources. New technologies promise to change all that. Improving the deployment model of software is one major facet of removing barriers to data usage. Greater “data agility” also requires more flexible databases and more scalable real-time streaming platforms. In fact, no fewer than seven foundational technologies are combining to deliver a flexible, real-time “data fabric” to the enterprise. Unlike the technologies they are replacing, these seven software innovations are able to scale to meet the needs of both many users and many use cases. For businesses, they have the power to enable faster and more intelligent decisions and to create better customer experiences.

Tesla cloud systems exploited by hackers to mine cryptocurrency

Researchers from the RedLock Cloud Security Intelligence (CSI) team discovered that cryptocurrency mining scripts, used for cryptojacking -- the unauthorized use of computing power to mine cryptocurrency -- were operating on Tesla's unsecured Kubernetes instances, which allowed the attackers to steal Tesla's AWS compute resources to line their own pockets. Tesla's AWS system also contained sensitive data, including vehicle telemetry, which was exposed by the credential theft. "In Tesla's case, the cyber thieves gained access to Tesla's Kubernetes administrative console, which exposed access credentials to Tesla's AWS environment," RedLock says. "Those credentials provided unfettered access to non-public Tesla information stored in Amazon Simple Storage Service (S3) buckets." The unknown hackers also employed a number of techniques to avoid detection. Rather than using typical public mining pools in their scheme

Micron sets its sights on quad-cell storage

Single-level cell (SLC) flash, with one bit per cell, first emerged in the late 1980s when flash drives first appeared for mainframes. In the late 1990s came multi-level cell (MLC) drives capable of storing two bits per cell. Triple-level cell (TLC) didn't come out until 2013, when Samsung introduced its 840 series of SSDs. So, these advances take a long time, although they are being sped up by a massive increase in R&D dollars in recent years. Multi-bit flash memory chips store data by managing the number of electronic charges in each individual cell. With each additional bit per cell, the number of voltage states doubles. SLC NAND tracks only two voltage states, while MLC has four voltage states, TLC has eight voltage states, and QLC has 16 voltage states. This translates to much lower tolerance for voltage fluctuations. As density goes up, the computer housing the SSD must be electrically rock-stable, because without that stability you risk damaging cells. This means supporting electronics around the SSD to protect it from fluctuations.
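
The doubling of voltage states per added bit is easy to verify: a cell storing n bits must distinguish 2^n charge levels.

```python
def voltage_states(bits_per_cell):
    """A cell storing n bits must distinguish 2**n charge levels."""
    return 2 ** bits_per_cell

CELL_TYPES = {"SLC": 1, "MLC": 2, "TLC": 3, "QLC": 4}
states = {name: voltage_states(bits) for name, bits in CELL_TYPES.items()}
# The margin between adjacent charge levels shrinks as the state
# count doubles, which is why QLC is far more sensitive to voltage
# fluctuation than SLC.
```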

When it comes to cyber risk, execute or be executed!

Accountability must be clearly defined, especially in strategies, plans and procedures. Leaders at all levels need to maintain vigilance and hold themselves and their charges accountable to execute established best practices and other due care and due diligence mechanisms. Organizations should include independent third-party auditing and pen-testing to better understand their risk exposure and compliance posture. Top organizations don’t use auditing and pen-testing for punitive measures, but rather, to find weaknesses that should be addressed. Often, they find that personnel need more training, and regular cyber drills and exercises to get to a level of proficiency commensurate with their goals. Those organizations that fail are those that do not actively seek to find weaknesses or fail to address known weaknesses properly. Sound execution of cyber best practices buys down your overall risk. With today’s national prosperity and national security reliant on information technology, the stakes have never been higher.

Hack the CIO

CIOs have known for a long time that smart processes win. Whether they were installing enterprise resource planning systems or working with the business to imagine the customer’s journey, they always had to think in holistic ways that crossed traditional departmental, functional, and operational boundaries. Unlike other business leaders, CIOs spend their careers looking across systems. Why did our supply chain go down? How can we support this new business initiative beyond a single department or function? Now supported by end-to-end process methodologies such as design thinking, good CIOs have developed a way of looking at the company that can lead to radical simplifications that can reduce cost and improve performance at the same time. They are also used to thinking beyond temporal boundaries. “This idea that the power of technology doubles every two years means that as you’re planning ahead you can’t think in terms of a linear process, you have to think in terms of huge jumps,” says Jay Ferro, CIO of TransPerfect, a New York–based global translation firm.

Taking cybersecurity beyond a compliance-first approach

With high profile security breaches continuing to hit the headlines, organizations are clearly struggling to lock down data against the continuously evolving threat landscape. Yet these breaches are not occurring at companies that have failed to recognize the risk to customer data; many have occurred at organizations that are meeting regulatory compliance requirements to protect customer data.  Given the huge investment companies in every market are making in order to comply with the raft of regulation that has been introduced over the past couple of decades, this continued vulnerability is – or should be – a massive concern. Regulatory compliance is clearly no safeguard against data breach. Should this really be a surprise, however? With new threats emerging weekly, the time lag inherent within the regulatory creation and implementation process is an obvious problem. It can take over 24 months for the regulators to understand and identify weaknesses within existing guidelines, update and publish requirements, and then set a viable timeline for compliance.

Three sectors being transformed by artificial intelligence

While these industries will see significant AI adoption this year, the AI platforms and products that scale to mainstream adoption won’t necessarily be the household names you may expect. As the “Frightful Five” continue to grow and expand their reach across industries, they have designed powerful AI products. However, these platforms present challenges for smaller companies looking to implement AI solutions, as well as larger companies in competitive industries such as retail, online gaming, shipping, and travel to name a few. How can an advertiser on Facebook feel comfortable entrusting its data to a tech behemoth that may sell a product that competes with its business? Should a big data company using a Google AI feature be concerned about the privacy of its data? These risks are very real, yet businesses have options. They can instead choose to host data on independent platforms with independent providers, guarding their intellectual property while also supercharging the advancement of AI technology.

What the ‘versatilist’ trend means for IT staffing

IT staff who once only focused on systems in the datacenter now focus on systems in the public cloud as well. This means that while they understand how to operate the LAMP stacks in their enterprise datacenters, as well as virtualization, they also understand how to do the same things in a public cloud. As a result, they have moved from one role to two roles, or even more roles. However, the intention is that eventually the traditional systems will go away completely, and they will just be focused on the cloud-based systems. I agree with Gartner on that, too. While I understand where Gartner is coming from, the more automation that sits between us and the latest technology, the more technology specialists we need, not fewer. So, I’m not convinced that IT versatilists will gain new business roles to replace the loss of the traditional datacenter roles, as Gartner suggests will happen.

Quote for the day:

"We're so busy watching out for what's just ahead of us that we don't take time to enjoy where we are." -- Bill Watterson