Daily Tech Digest - June 05, 2017

Artificial intelligence will outperform humans in all tasks in just 45 years

A recent report suggests that workers paid less than $20 an hour face a high risk of seeing their jobs filled by robots in the coming years. For workers who earn between $20 and $40 an hour the risk is only 31 percent, and for those paid more than $40 an hour it falls to just 4 percent. To reach these numbers, the CEA's 2016 economic report drew on a 2013 study of the automation of jobs by Oxford researchers that assigned a risk of automation to 702 different occupations. Those occupations were then matched to wages to determine each worker's risk of having their job taken over by a robot. 'The median probability of automation was then calculated for three ranges of hourly wage: less than 20 dollars; 20 to 40 dollars; and more than 40 dollars,' reads the report. The risk of having your job taken over by a robot 'varies enormously based on what your salary is,' Council of Economic Advisers Chairman Jason Furman told reporters.
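
As a sketch of that calculation, here is a minimal pandas example that buckets occupations into the report's three wage bands and takes the median automation probability per band; the five occupations, scores, and wages are invented stand-ins for the 702 scored occupations.

    # Minimal sketch of the CEA methodology described above.
    # Occupations, probabilities, and wages are hypothetical illustrations.
    import pandas as pd

    occupations = pd.DataFrame({
        "occupation": ["cashier", "truck driver", "paralegal", "nurse", "surgeon"],
        "automation_probability": [0.97, 0.79, 0.94, 0.009, 0.004],
        "hourly_wage": [11.0, 21.0, 24.0, 34.0, 111.0],
    })

    # Bucket each occupation into one of the report's three wage ranges.
    bins = [0, 20, 40, float("inf")]
    labels = ["< $20", "$20-$40", "> $40"]
    occupations["wage_band"] = pd.cut(occupations["hourly_wage"], bins=bins, labels=labels)

    # The median automation probability per band is the figure the report quotes.
    print(occupations.groupby("wage_band")["automation_probability"].median())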


Deep Learning 101: Demystifying Tensors

TensorFlow and systems like it are all about taking programs that describe a machine learning architecture (such as a deep neural network) and adjusting the parameters of that architecture to minimize some sort of error value. They do this by creating not only a data structure representing our program, but also a data structure that represents the gradient of the error value with respect to all of the parameters of our model. Having such a gradient function makes optimization much easier. But here is the kicker: you can write programs using TensorFlow or Caffe or any of the other systems that work basically the same way, and the programs you write don't have to optimize machine learning functions. You can optimize all kinds of programs, as long as you express them using the tensor notation provided by the package you have chosen.
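
As a concrete illustration, here is a minimal sketch using the TensorFlow 1.x graph API (current when this article was written) that relies on the automatically built gradient to minimize a plain quadratic, f(x) = (x - 3)^2, rather than a machine learning loss.

    # Minimize f(x) = (x - 3)^2 -- not a machine learning loss -- using the
    # gradient data structure TensorFlow constructs automatically (TF 1.x API).
    import tensorflow as tf

    x = tf.Variable(0.0)          # the "parameter" being adjusted
    loss = tf.square(x - 3.0)     # the error value to minimize

    # GradientDescentOptimizer uses the gradient d(loss)/dx that TensorFlow
    # derived from the program structure above.
    train_step = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(100):
            sess.run(train_step)
        print(sess.run(x))        # converges toward 3.0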


Get ready for the ‘first robotic empathy crisis’

Perhaps the most important thing humans can do to keep AI in check, according to Brin, is to apply accountability measures and regulation. “The only way that you have been able to make it so that our previous AIs — corporations, governments, and such — don’t become cheaters the way the kings and lords and priests were in the past is by breaking up power and setting it against each other in regulated competition, and that is the method by which we have division of powers, that’s the way we have healthy markets,” Brin said. Regulated competition and accountability have been vital to the protection and advancement of what Brin called the “five great arenas” over powerful interests: democracy, science, sports, law and courts, and markets. Beyond his work as a consultant to federal agencies and his writing, Brin is a Scholar-in-Residence at the Arthur C. Clarke Center for Imagination at the University of California, San Diego (UCSD).


Serverless computing takes another step forward

Like most other serverless environments, Functions uses webhooks as triggers. You're able to define the type of webhook used, and Functions' code editor will help you extract payloads that can then be delivered to the Twilio APIs. It's a relatively simple environment at present, offering Node.js functionality for your code. That makes sense, as Node.js works best when treated as a switch in a message-driven environment. Messages for Functions come in on webhooks and, once processed, the output is delivered through Twilio's messaging APIs. Malatack sees the serverless model as a style of programming where you need to think as little as possible about anything other than your code. "It's a progression for developers as things are abstracted. That's important: as abstractions are better, they make it easier to focus on the outcome." It should result in developers getting more leverage, as there's no need to think about complex issues like scaling.
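
Twilio Functions itself hosts Node.js, so purely as a language-neutral sketch of the same webhook-in, reply-out pattern, here is a hypothetical Python/Flask handler; the route name and reply text are invented, though the Body and From form fields and the TwiML reply format are standard Twilio webhook conventions.

    # Hypothetical Flask handler illustrating the webhook-trigger pattern
    # described above (Functions itself runs Node.js; this is just the shape).
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/sms", methods=["POST"])
    def incoming_sms():
        # Twilio delivers inbound messages as a form-encoded webhook payload.
        body = request.form.get("Body", "")
        sender = request.form.get("From", "")
        # Reply with TwiML, which Twilio's messaging API turns into an SMS.
        twiml = ("<Response><Message>Got {} characters from {}</Message>"
                 "</Response>").format(len(body), sender)
        return twiml, 200, {"Content-Type": "application/xml"}

    if __name__ == "__main__":
        app.run(port=5000)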


Five Ways To Cultivate a Data-driven Culture for Your Business

In today’s dynamic marketplace, a business’s ability to use data to identify challenges, spot opportunities, and adapt to change with agility is critical to its survival and long-term success. It has therefore become an absolute necessity for businesses to establish an objective, data-driven culture that empowers employees with the capabilities and skills they need to analyse data and use the insights extracted from it to make faster, more accurate decisions. Contrary to what many people think, cultivating a data-driven culture is not a one-time transformation; it is a journey that requires effort from employees and direction from both managers and executives. In this article, I am sharing five ways businesses can accelerate their transformation into a data-driven enterprise.


How to find the real value in operational big data

You must first understand how to conceptually rationalize all your sites. What is the strategic importance of each site to your business operations, and how will you measure performance in aggregate? How do business operations contribute to the overall health of the company, and how will you communicate that through key performance indicators? This exercise should result in a scorecard that aligns with your technical efforts. Metadata analysis is your next step. The strategic team and the technical team must work together to map data from the scorecard all the way down to the sites. Generalization, filtration, and transformation are three key activities this team must perform. Generalization involves mapping similar data points into one common concept; filtration is the purposeful inclusion and exclusion of data; and transformation is the rule set that moves included data into common concepts.
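
As a sketch of how those three activities might look in code, here is a small Python example on invented site readings; the field names, concept map, and Fahrenheit-to-Celsius rule are all hypothetical.

    # Hypothetical site-level readings; names and rules invented for illustration.
    raw_readings = [
        {"site": "plant-a", "metric": "temp_f", "value": 98.6},
        {"site": "plant-b", "metric": "temperature_fahrenheit", "value": 101.3},
        {"site": "plant-c", "metric": "debug_counter", "value": 7},
    ]

    # Generalization: map similar site-specific names onto one common concept.
    CONCEPTS = {"temp_f": "temperature", "temperature_fahrenheit": "temperature"}

    # Filtration: purposefully include only metrics that map to a concept.
    included = [r for r in raw_readings if r["metric"] in CONCEPTS]

    # Transformation: the rule set that moves included data into the common
    # concept (here, normalizing Fahrenheit to Celsius for the scorecard).
    scorecard_rows = [
        {
            "site": r["site"],
            "concept": CONCEPTS[r["metric"]],
            "value_c": round((r["value"] - 32) * 5 / 9, 1),
        }
        for r in included
    ]
    print(scorecard_rows)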


The digital revolution: The future is … disruptive banking

A full-service digital bank is now a more attractive option, especially because, by definition, these banks are more flexible and instantly responsive, focused on removing the bottlenecks in the existing experience and delivering increased convenience and a more humanised experience with the customer’s interest at its core. These start-up digital banks are likely not only to add competition to the marketplace but to disrupt the entire sector, forcing change with their greater focus on customer experience and satisfaction. Digital banks are more agile than established institutions and create a better user experience and more value for the customer. This value is not fiscal but is delivered through a more personalised experience for each customer.


Is technology destroying the quality of human interaction?

Looking at the world around us today, many things that did not exist 10 years ago are now fully ingrained in our daily lives. A day without Google, FaceTime or WhatsApp would be quite an experience for me. I use WhatsApp to stay in touch with my family, and thanks to FaceTime my kids can see their dad every day even if I am on the other side of the planet. Although I have never met any of the makers of the software, I trust these brands because their products work seamlessly. So instead of trusting the people behind the brand, I have built trust in the brand because their products continuously satisfy my expectations. Obviously, banking is richer than a simple messaging service. Our spectrum ranges from a basic transaction to a complex corporate financing deal. The latter cannot be acknowledged by just ‘two blue ticks’, as is the case for a WhatsApp message.


Continuous Delivery with Kubernetes the Hard Way

Why is continuous delivery so important? Deploying application releases continuously eliminates the “release day” mindset. The theory is that the more frequently releases happen, the less risky each release becomes. Developers can also ship changes to the application (whether new code or a rollback to an older version) as soon as the code is ready. This means the business can make changes more quickly, and therefore be more competitive. Since microservices use APIs to speak to one another, a certain level of backwards compatibility needs to be guaranteed between releases to avoid having to synchronize releases between teams. According to Conway’s law, software takes on the structure of the organization that builds it. Thus, microservices and containers are just as much about organizational change as they are about technological change.
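
One common way to provide that backwards compatibility is the tolerant-reader pattern: a consuming service defaults missing new fields and ignores unknown ones, so producer and consumer teams can release independently. A minimal Python sketch, with entirely hypothetical field names:

    # Tolerant reader: accept payloads from older and newer releases alike.
    def parse_order(payload: dict) -> dict:
        return {
            "order_id": payload["order_id"],            # required in every version
            # v2 added "currency"; default it so v1 producers still work.
            "currency": payload.get("currency", "USD"),
            # Unknown fields from newer producers are simply ignored.
        }

    # A v1 payload (no "currency") and a v3 payload (extra field) both parse.
    print(parse_order({"order_id": 1}))
    print(parse_order({"order_id": 2, "currency": "EUR", "gift_wrap": True}))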


Benchmarks Don't Have to Die

Benchmarks are not just useful for improvements. They can be used as the basis for regression testing, ensuring that performance stays consistent as code is changed. Given that performance isn’t a binary state like broken/working, it’s not always obvious when a regression has occurred. Methodically tracking performance regressions is extremely important for mature projects. Perhaps most important of all, publishing a well-crafted benchmark can codify an entire community’s understanding of interesting workloads and user behaviour. Since the best optimisations figure out the common case and tune for that, benchmarks can guide all developers (especially new ones) towards improving the code that matters most. However, as the Google Chromium team pointed out, there are several drawbacks to benchmarks.
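
To make the regression-testing use concrete, here is a minimal Python sketch that times a workload and fails when it drifts past a stored baseline; the workload, baseline figure, and 20 percent tolerance are all invented for illustration.

    # Use a benchmark as a regression test: time a workload, compare against
    # a stored baseline, and fail on a meaningful slowdown. The workload,
    # baseline, and tolerance below are hypothetical.
    import timeit

    def workload():
        sorted(range(10_000), key=lambda n: -n)  # stand-in for real work

    BASELINE_SECONDS = 0.005   # recorded from a known-good build
    TOLERANCE = 1.20           # allow 20% noise before calling it a regression

    # Take the best of several runs to reduce scheduling noise.
    elapsed = min(timeit.repeat(workload, number=100, repeat=5)) / 100

    if elapsed > BASELINE_SECONDS * TOLERANCE:
        raise SystemExit(f"regression: {elapsed:.6f}s vs baseline {BASELINE_SECONDS}s")
    print(f"ok: {elapsed:.6f}s within tolerance")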




Quote for the day:


"What I've really learned over time is that optimism is a very, very important part of leadership." -- Bob Iger

