Daily Tech Digest - July 11, 2017

The Future of Digital Business is Self-Improving Products

That’s a lot of what digital business is all about — turning data into better products and services. But Tesla is taking things to a whole new level. The data from every car is sent to headquarters and shared with every other car on the road. So your car knows what to look out for even if you’ve never been on that street before. Tesla has essentially turned itself into a massively parallel learning machine. The Tesla customer experience improves automatically the more people use the products. And the company is gathering detailed information that can be used for many other business opportunities. That’s perhaps why Tesla is now the most valuable US car company, eclipsing General Motors, even though GM makes over one hundred times as many cars.

NIST to security admins: You've made passwords too hard

NIST recommends administrators leave out overly complex security requirements that make it harder for users to do their jobs and don't really improve security, since frustrated users are more likely to look for shortcuts. For example, users struggle to memorize large numbers of passwords—the average user accesses more than 40 accounts—so they may either write down passwords, which defeats the purpose of having a "secret" password; reuse passwords, which makes it easier to break into accounts; or use variations of existing passwords, which makes it easier for attackers to guess the patterns. "The username and password paradigm is well past its expiration date," said Phil Dunkelberger, CEO of Nok Nok Labs. "Increasing password complexity requirements and requiring frequent resets adds only marginal security while dramatically decreasing usability."
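The NIST approach can be sketched in a few lines: enforce a minimum length and screen candidates against known-compromised passwords, but impose no composition rules. This is a minimal illustrative sketch, not NIST's reference implementation; the tiny blocklist here is a stand-in for a real breached-password corpus.

```python
# NIST-style password acceptance: length bounds plus a blocklist check,
# with no "must contain a digit/symbol" composition rules.

# Stand-in for a real corpus of breached/common passwords.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def is_acceptable(password: str) -> bool:
    # Minimum 8 characters; allow long passphrases (up to 64 here).
    if not 8 <= len(password) <= 64:
        return False
    # Reject anything found in the known-compromised list.
    if password.lower() in COMMON_PASSWORDS:
        return False
    return True
```

Note that a long, memorable passphrase passes while a short "complex" string does not — exactly the usability trade NIST is arguing for.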

Digital is Driving Transformation in the Financial Sector

What’s certain is that the pressure on financial institutions is mounting to adapt the way they deliver services to customers. The ubiquity of smartphones, laptops and tablets and shifting consumer habits are driving banks to be creative in how they differentiate. In a market where transactional services are becoming increasingly commoditized, many are seeking to redefine the customer experience as a point of competitive differentiation. Given the vital role of digital maturity, we examine five tenets of digital transformation in the banking and financial sector. ... Everywhere you look there is feedback, potentially valuable snippets of information about your business and how it might improve. The challenge financial firms face is channeling that feedback intelligently, i.e. gathering it, analyzing it, and extracting value from disparate pieces of data.

GraphQL vs REST: Overview

GraphQL is a query language, specification, and collection of tools, designed to operate over a single endpoint via HTTP, optimizing for performance and flexibility. One of the main tenets of REST is to utilize the uniform interface of the protocols it exists in. When utilizing HTTP, REST can leverage HTTP content-types, caching, status codes, etc., whereas GraphQL invents its own conventions. Another main focus for REST is hypermedia controls (a.k.a. HATEOAS), which lets a well-designed client run around an API like a human runs around the Internet; starting with a search for "How to complete my tax returns", reading a perfectly relevant article, and after a few clicks ending up on a BuzzFeed article about Miley Cyrus throwing Liam Hemsworth a "Weed-Themed" birthday party.
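The single-endpoint-versus-many contrast above can be made concrete by comparing the shape of the requests each style sends. The resource paths and field names below are hypothetical, chosen only for illustration — not a real API.

```python
# REST: several resource endpoints; the server decides each response's shape,
# so assembling a user with posts and followers takes multiple round trips.
rest_requests = [
    "GET /users/42",
    "GET /users/42/posts",
    "GET /users/42/followers",
]

# GraphQL: one endpoint; the client names exactly the fields it wants,
# and the whole graph comes back in a single response.
graphql_request = ("POST /graphql", """
{
  user(id: 42) {
    name
    posts { title }
    followers { name }
  }
}
""")
```

The trade-off the article describes follows directly: GraphQL collapses round trips and over-fetching, but because every query goes through one POST endpoint, it gives up the per-resource HTTP caching and status-code semantics that REST gets for free.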

Apache Flink: The Next Distributed Data Processing Revolution?

The Hadoop framework is capable of storing a large amount of data on a cluster. This storage layer is known as the Hadoop Distributed File System (HDFS), and it is used at almost every company that needs to store terabytes of data every day. Then the next problem arose: how can companies process all the stored data? Here is where distributed data processing frameworks come into play. In 2014, Apache Spark was released and it now has a large community. Almost every IT department has implemented at least some lines of Apache Spark code. Companies are gathering more and more data, and the demand for faster data processing frameworks keeps growing. Apache Flink (whose 1.0 release arrived in March 2016) is a new face in the field of distributed data processing and is one answer to that demand.

Why the Blockchain Needs More Failures to Succeed

In the world of startups, learning from failures is an inescapable reality, and part of the prevalent conventional wisdom. That is how the ecosystem and entrepreneurs move forward to greater heights, and with more successes. But in the burgeoning blockchain segment, we haven’t seen that many failures yet. At least, not of the scale and variety required to extract long-lasting lessons for the entire industry. And certainly, not enough to warrant a call for an imminent crash or correction. Failures are important because their sum results in a new body of knowledge that is rich with useful insights and best practices. An aftermath of real failures can make the whole blockchain ecosystem more resilient, because it will reveal the boundaries and realities of what’s possible, useful, absurd, impossible, repeatable and scalable.

Big Data's Potential For Disruptive Innovation

Disruptive innovations are more accessible (with respect to distribution or usability) and cheaper (from a customer perspective) than their existing counterparts in the market, and they utilize a business model with structural cost advantages over existing solutions. The reason these characteristics of disruption are important is that when all three exist, it is very difficult for an existing business to stay in competition. Whether an organization is saddled with an outmoded distribution system, highly trained specialist employees or a fixed infrastructure, adapting quickly to new environments is challenging when one or all of those things become outdated. Writing off billions of dollars of investment, upsetting the distribution partners of your core business, firing hundreds of employees – these things are difficult for managers to contemplate, and with good reason.

Hackers Find ‘Ideal Testing Ground’ for Attacks: Developing Countries

“India is a place where newer A.I. attacks might be seen for the first time, simply because it is an ideal testing ground for those sorts of attacks,” said Nicole Eagan, the chief executive of Darktrace. At times, these attacks are simply targeting more susceptible victims. While companies in the United States will often employ half a dozen security firms’ products as defensive measures, a similar company elsewhere may have just one line of defense — if any. In the case of attacks carried out by a nation-state, companies in the United States can hope to receive a warning or assistance from the federal government, ... Cybersecurity experts now speculate that a February 2016 attack on the central bank of Bangladesh, believed to have been carried out by hackers linked to North Korea, was a precursor to similar attacks on banks in Vietnam and Ecuador.

Common Misconceptions Found in the World of IoT

A lot of people believe that IoT is only related to collecting data, something along the lines of Big Data. This belief has often been fueled by the fact that IoT is commonly discussed alongside Big Data. However, IoT is not limited to the collection of data. It is actually about the exchange of data between devices and how they are connected to the internet. These devices can include any electronics or gadgets that fall under the smart category, some of them consumer products such as TVs, fridges, etc. However, it is not limited to these categories, and extends to other sectors such as cars, smart grids, power plants, and so on. ... People believe all IoT devices are safe, or rather they underestimate how unsafe the devices can be. IoT devices are often insecure because their need for a constant internet connection leaves them vulnerable to hacking if the network itself is compromised.

Where’s the value in big data?

Increased revenue will be yours, competition will disappear and customers will love you even more. And yet, the reality is not matching the hype. "How do I really drive value from big data?" is a question that needs to be fully answered. Frustration seems to be building and there’s a danger that disillusionment will set in. But it doesn’t have to be this way. There is a route to driving value, but you have to be realistic and you have to be methodical in your approach. You also have to start by recognising that, in reality, there are only three kinds of big data projects. The first is simply focused on replacing aging traditional infrastructure; in effect to re-platform an environment and make it fit for purpose in today’s economy – let’s call this the makeover.

Quote for the day:

"You have to put in many, many, many tiny efforts that nobody sees or appreciates before you achieve anything worthwhile." -- Brian Tracy