Daily Tech Digest - January 25, 2018

How policymakers should approach AI
AI is already superhuman in many domains, and in the next 5-20 years it is quite likely that we will be able to capture and express all extant, culturally communicated human knowledge with it. We are already far better at predicting individuals' behaviour than individuals are comfortable knowing, and therefore better than companies are willing to reveal publicly. Individuals and parties exploiting this are very likely compromising democracy globally, notably in the UK. There is an enormous project here for the social sciences and the humanities as we urgently address the political, economic, and existential (in the philosophical sense) challenges of massive improvements in communication, computation, and prediction. The natural laws of biology also tell us to anticipate an accelerating pace of change, given the increased plasticity that comes with increased intelligence. We therefore need to ensure our societies are robust to this change, with sufficient resilience built into the system to allow individuals periods out of work while they find a new place in the economy.



Implement OAuth in 15 minutes with Firebase

This article provides a 15-minute, step-by-step guide to adding OAuth support to a CLI-generated Angular application using Firebase. We will implement OAuth with a Google account, but Firebase also supports other providers, including Facebook, Twitter, and GitHub. But first, what is Firebase? Firebase got its start as a realtime, cloud-hosted NoSQL database supporting multi-user synchronization. Since being acquired by Google in October 2014, it has grown into an entire publishing platform for web and mobile applications. Many major companies, including Lyft, Shazam, The New York Times, and NPR, use Firebase to support their apps. Some of these applications see over 100 million monthly users and update the database more than 3,000 times per second, providing strong evidence that the platform can scale.
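As a taste of what the article walks through, here is a minimal sketch of Google sign-in using the plain, namespaced Firebase JS SDK. The config values are placeholders, the exact import style varies by SDK version, and the article itself may use an Angular wrapper such as AngularFire rather than the raw SDK.

```typescript
// Minimal sketch: Google OAuth sign-in with the namespaced Firebase JS SDK.
// The config values below are placeholders; real values come from the Firebase console.
import firebase from 'firebase/app';
import 'firebase/auth';

firebase.initializeApp({
  apiKey: '<your-api-key>',
  authDomain: '<your-project>.firebaseapp.com',
  projectId: '<your-project>',
});

export async function signInWithGoogle(): Promise<firebase.User | null> {
  const provider = new firebase.auth.GoogleAuthProvider();
  // Opens the Google OAuth consent popup and resolves once the user signs in.
  const credential = await firebase.auth().signInWithPopup(provider);
  return credential.user;
}
```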


Why NoSQL Needs Schema-Free ETL Tools

Even developers don't like writing boring "plumbing code": code that just links data from one place to another. It's dull and repetitive. Customers don't like it either, because wherever that code is needed it inevitably means a maintenance headache, not to mention the time required to write and test it in the first place. This means increased costs when initially deploying a new technology like NoSQL. Equally, on the output side, if you can't rapidly visualize the insights you can glean from your data, then you cannot fully realize the benefits of your investment in NoSQL database technology. Trying to code around the problem leads to longer project times and the aforementioned increase in costs associated with custom coding. Many NoSQL companies have tried to shoehorn SQL support into their products in an effort to bridge the gap between traditional BI vendors and their products. This has only been partially successful.
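To make concrete what that plumbing typically looks like, here is a minimal, hypothetical sketch; the Source and Sink interfaces are placeholders rather than any real ETL library, and the point of a schema-free tool is to avoid writing and maintaining this kind of glue by hand.

```typescript
// Hypothetical "plumbing code": hand-written glue that copies records
// from one data store to another. Source and Sink are placeholder
// interfaces, not a real library.
interface Source {
  read(): Promise<Record<string, unknown>[]>;
}

interface Sink {
  write(doc: Record<string, unknown>): Promise<void>;
}

// Copy every record from the source to the sink, unchanged.
// Every new pairing of systems tends to need another variant of this.
export async function copyAll(source: Source, sink: Sink): Promise<void> {
  const rows = await source.read();
  for (const row of rows) {
    await sink.write(row);
  }
}
```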


NHS Wales IT outage: What went wrong with its datacentres?


Guillaume Ayme, IT operations evangelist at big data analytics software supplier Splunk, raised concerns about the datacentres’ setup, given that running dual sites usually means that in the event of an outage, one will fail over to the other. “For the issue to be impacting two datacentres suggests it is severe, as one would normally be the backup for the other,” he said. “This may suggest there has been a problem in the failover procedure. Once the service is restored, it will be essential to find the root cause to avoid a potential repeat. This can be complex for organisations that do not have full visibility into the data generated by their IT environment.” ... “While systems are now back up and running, the chaos the outage created shows why we need to move from hours to minutes to resolve problems like this,” said Anderson. “Ultimately, it comes down to our reliance on software and the need for it to work perfectly – and that’s difficult in IT environments that are getting more complex by the day.”


Exploring Modern Data Platforms (Podcast)

The DevOps thing is something that everyone is trying to get their head around right now. You have a whole staff of people who know SQL and know relational databases, and now we say, ‘Okay, but all of your data is going to go to an object data store.’ Like, what does that look like, or how should the data be organized? How do you query it? How do you use it? That type of training is needed, but, to be honest, it really is not as much of a leap as it was even a year ago. The evolution is happening very, very, very rapidly. A year or two ago we’d say, ‘You need to use an object data store’, and we were speaking some foreign language. Now they get it, and they say, ‘Okay, let’s do it.’ I think what’s happened is that over the years people started dipping their toes in, and they’re realizing the economics of it. It’s like Hadoop was the gateway drug for this type of platform, where you could start experiencing drastic cost reduction with enhanced capabilities.


How CIOs Can Ensure a Seat at the Strategy Table


Digital disruption has placed technology at the heart of most business discussions, yet many CIOs are still fighting for a seat at the strategy table. Monika Sinha, research director at Gartner, says information and technology are considered too late in the strategy process at many enterprises. “IT is fundamental to the new business challenges,” Sinha told CIOs at Gartner Symposium/ITxpo in Goa, India, this week. “It underpins new business models, products and services that are disrupting existing industries and creating new markets. As strategists, CIOs are flexible, agile and opportunistic in their approach.” “Once your ambition is clear, appropriately position IT at the heart of your business strategy.” The traditional “wait and respond” approach to enterprise strategy – the business strategy is finalized, the CIO reviews the strategy for IT’s contribution, and an IT strategy is developed in response – is no longer viable.


The Benefits Of Open Standards For Process Automation


Once we see the overall total cost of ownership of these process automation systems being reduced in the long run, we’ll be able to take advantage of the built-in, intrinsic cybersecurity features that are being designed into these open process automation systems. New technologies, new capabilities, and innovations will be inserted into the formerly closed systems in a much faster and cheaper way. Ultimately, that translates in manufacturing to increased equipment reliability, faster time to market, increased quality of production, and other benefits. ... It’s important to also remember that the intellectual property (IP) of those vendors is still preserved. There are points where we’re breaking up existing hardware and software systems into modules. The modules will still contain the vendors’ intellectual property, but the interfaces between them are what is standardized. In the future, the vendors’ IP will still be there in the hardware and software and in the application layer.


Cozy is building a personal cloud service that respects your privacy


Instead of creating yet another ecosystem of hosted services financed by ads, Cozy wants to change the balance and create a platform where the user is in charge. As you can read in the terms of service, you remain the owner of your data, and your data won’t be shared with anyone unless you give your consent. The most privacy-conscious users can also install a Cozy instance on their own server. The main GitHub repositories have been updated today, and the company just unveiled the first services of this new platform. First, there is a good old file-syncing service: with Cozy Drive, you can install an app on all your computers, synchronize files with Cozy’s servers and find them everywhere – on your other computer, on your phone or on the web. Second, Cozy Photos lets you back up your photos. This works like Google Photos, Microsoft OneDrive’s Camera Upload and similar features.


IIoT and the Open Process Automation Forum

OPAF envisions a future open control system that will take information and data from any device and optimize it for better decision making. It will empower the workforce to be more actively involved in, and responsible for, good business outcomes. For example, secondary measures such as differential pressure or sensor temperatures will be key. We will be able to collect and communicate data about the overall health status of an instrument or sensor, which will drive new levels of reliability and overall operational integrity and profitability. This new level of control and these new control functions will drive incredible value. ... Fitzgerald: Much depends on the scale and relevant policies of a given client. While DHCP might be “easier” for both wired and wireless integrations, discrete IP addresses associated with given subnets provide additional needed security and robustness of operations.


Robots are needed to manage the automation robots


Dube says the combination of physical robotic machine bodies and AI software brains will eventually make it hard to tell humans and robots apart. “We are carbon-based organisms and robots are silicon-based, but I think the boundaries around them are going to get progressively diffused to the extent that you will not be able to distinguish between a human and an android in the next nine years,” he says. “Robots are becoming fairly smooth in terms of mechanical motion. They can easily walk through a crowded mall, avoiding people. They can take an escalator, climb down stairs and even run faster than humans. In five years, their dexterity will be as good as humans. “But one component is missing – the brain – and that is the area we specialise in. When we implant the brain into the robot frame, it will be able to be asked a question, analyse what was said, and provide an answer. It will be able to walk and talk to you.”



Quote for the day:


"A leader must have the courage to act against an expert_s advice." -- James Callaghan

