Healthcare is an industry that's been described as ‘data rich but insight poor.’ Yet it’s on the cusp of a data revolution. Two-thirds of today’s ‘data generation’ are comfortable sharing their personal information with the healthcare sector, more than with any other sector, and they're more willing to share vital health statistics than any other form of data. This generation understands that by making better use of the data available – be it patient data, performance data, clinical data or administrative data – the healthcare industry will be able to better understand patient conditions. As a result, it will be able to diagnose diseases more quickly, support better interventions, deliver new models of care and improve quality of life.
There have been many instances where businesses were ahead of their time with regard to big data – for example, when Boots first introduced the loyalty card in 1997 as a way of rewarding people in return for their data. ... These larger organizations have now moved on from simply analysing their data to report on past performance, and are now attempting to analyse their data to predict the future. Staples is a great example of this – it measures how customers behave in the period directly before they stop buying, giving it the ability to predict which current customers are about to leave. With this data, Staples can focus its marketing efforts on the people who look like they are about to stop buying – it’s cheaper to retain a customer than it is to acquire a new one.
Now, the fourth Industrial Revolution is gathering steam thanks to developments in the Internet of Things (IoT), automation and robotics. These technologies, along with big data and analytics, are key elements in this new industrial revolution, commonly called Industry 4.0. This is driving developments in a range of sectors. In manufacturing, smart factories with automated maintenance and significant improvements to efficiency are becoming a reality. In the financial services sector, automation is being used to cope with an ever-increasing volume of data, whether for customer service or for shifting focus to areas such as security and risk. No matter what industry you’re in, cloud technology is a critical enabler of the next Industrial Revolution, by providing the means for businesses to innovate around these technologies.
Technology isn’t just centered around medicine. It can also enhance the patient experience, help ease a patient’s stay in hospital and reassure families. Dr Samir Abbas Hospital has also deployed tamper-proof RFID ankle bands for babies to track the location of infants within the hospital. The system has been designed to eliminate the risks of abduction or baby swapping. You may be surprised to find that Orange has been involved in these transformation developments at Dr Samir Abbas Hospital, as you probably don’t associate us with health. In fact, we are making huge investments in the sector. A few years ago, we set up a dedicated unit, Orange Healthcare, within Orange Business Services to provide reliable and effective solutions that can help both patients and healthcare professionals.
For most software developers, there have historically been many barriers to entry in machine learning, most notably software libraries designed more for academic researchers than for software engineers, as well as a lack of sufficient data. ... In tandem, the last few years have seen a proliferation of cutting-edge, commercially usable machine learning frameworks, including the highly successful scikit-learn Python library and well-publicized releases of libraries like TensorFlow by Google and CNTK by Microsoft Research. The last two years have also seen the major cloud providers Amazon Web Services and Google Cloud Services release machine learning–specific services — both Machine Learning as a Service platforms and graphics processing unit (GPU) machines optimized for machine learning work.
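As a rough illustration of how far those barriers have fallen, a working classifier now takes only a few lines of scikit-learn. This is a minimal sketch, assuming nothing beyond the library's standard fit/predict API; the dataset and model choice are arbitrary.

```python
# Minimal scikit-learn sketch: train and evaluate a classifier in a few lines.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000)  # engineer-friendly API: fit/predict
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)     # mean accuracy on held-out data
print(f"test accuracy: {accuracy:.2f}")
```

The same fit/score pattern applies across nearly every estimator in the library, which is much of why it appeals to software engineers rather than only researchers.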
Edge analytics can increase your ability to monitor and react to equipment health, resulting in lower costs. Evaluating asset performance at the point of monitoring helps drive corrective action and reduce premature degradation. It is estimated that companies can recover as much as half their annual maintenance budget by aligning maintenance investment to asset condition. Increased condition-based and predictive maintenance frees up capital investment for other expansion projects or for returning value to shareholders. As examples, consider condition-based maintenance (CBM) and predictive maintenance. CBM uses sensor data from equipment and applies a monitoring strategy that uses the actual condition of the asset to decide when and what maintenance should be done. CBM can augment a time-based maintenance strategy and help reduce failures.
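At its simplest, a CBM rule maps sensor readings to a maintenance action. The sketch below is a toy illustration of that idea; the sensor names and thresholds are invented for this example, not taken from any standard or real deployment.

```python
# Hypothetical condition-based maintenance (CBM) rule: decide maintenance
# from the asset's actual measured condition rather than a fixed schedule.
from dataclasses import dataclass

@dataclass
class Reading:
    vibration_mm_s: float   # RMS vibration velocity (illustrative sensor)
    temperature_c: float    # bearing temperature (illustrative sensor)

def cbm_action(r: Reading) -> str:
    """Map a sensor reading to a maintenance action (invented thresholds)."""
    if r.vibration_mm_s > 7.1 or r.temperature_c > 95:
        return "shutdown"   # imminent failure risk: stop and repair now
    if r.vibration_mm_s > 4.5 or r.temperature_c > 80:
        return "schedule"   # degradation detected: plan maintenance
    return "ok"             # condition normal: no action needed

print(cbm_action(Reading(vibration_mm_s=5.0, temperature_c=70)))  # schedule
```

Running a rule like this at the edge, next to the sensors, is what lets corrective action happen at the point of monitoring instead of waiting for a central system.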
“We urgently need more due process with the algorithmic systems influencing our lives,” says Kate Crawford, a principal researcher at Microsoft Research who has called for big data due process requirements. “If you are given a score that jeopardizes your ability to get a job, housing or education, you should have the right to see that data, know how it was generated, and be able to correct errors and contest the decision.” The European Union has recently adopted a due process requirement for data-driven decisions based “solely on automated processing” that “significantly affect” citizens. The new rules, which are set to go into effect in May 2018, give European Union citizens the right to obtain an explanation of automated decisions and to challenge those decisions.
Exciting things have happened over the past two years: Reactive programming has come to the forefront in front-end JS on several fronts, and has become mainstream in terms of technique and mindshare. However, the current generation of implementations has some issues that we need to solve. For example, the hot vs. cold distinction can add significant cognitive overhead for developers, and glitches and weird gotchas – such as diamond shapes like combine(f, stream1, stream1) – while infrequently encountered, will be surprising to many devs. The community is just starting to research other implementation techniques, including rigorous FRP approaches to correctness, such as signal functions, signal vectors, pull-based, and push-pull-based. These can help us create implementations that rule out glitches.
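To make the diamond-shape glitch concrete, here is a deliberately naive push-based stream sketch (hypothetical Stream and combine names, written in Python for brevity rather than any real JS library). When one source feeds a combine through two derived paths, the combined stream briefly observes an inconsistent pair, because one branch has updated before the other.

```python
# Naive push-based streams, illustrating the "diamond" glitch.
class Stream:
    def __init__(self):
        self.subs, self.value = [], None

    def subscribe(self, fn):
        self.subs.append(fn)

    def emit(self, v):
        self.value = v
        for fn in self.subs:
            fn(v)  # push to subscribers eagerly, one at a time

    def map(self, f):
        out = Stream()
        self.subscribe(lambda v: out.emit(f(v)))
        return out

def combine(f, s1, s2):
    """Re-emit f(s1, s2) whenever either input changes (glitch-prone)."""
    out = Stream()
    def recompute(_):
        if s1.value is not None and s2.value is not None:
            out.emit(f(s1.value, s2.value))
    s1.subscribe(recompute)
    s2.subscribe(recompute)
    return out

src = Stream()
a = src.map(lambda x: x + 1)   # diamond: two paths from the same source
b = src.map(lambda x: x * 2)
seen = []
combine(lambda x, y: (x, y), a, b).subscribe(seen.append)

src.emit(1)
src.emit(2)
print(seen)  # the (3, 2) pair is a glitch: a had updated but b had not yet
```

The glitch-free techniques the excerpt mentions (signal functions, push-pull implementations) avoid this by propagating updates transactionally, so observers never see a state where only one side of the diamond has refreshed.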
The description of a programming language is usually split into the two components of syntax (form) and semantics (meaning). Some languages are defined by a specification document (for example, the C programming language is specified by an ISO Standard), while other languages (such as Perl) have a dominant implementation that is treated as a reference. Some languages have both, with the basic language defined by a standard and extensions taken from the dominant implementation being common. You don’t need us to tell you that when it comes to tech, staying ahead of the curve is a pretty good idea. In such an innovative and fast-paced industry, new technologies are emerging every week, every day… basically all the time!
Wouldn't it be nice if software developers had something like a spellchecker, but instead of catching typos and simple grammar mistakes, it caught basic security problems? Developers would be able to fix them immediately, and also learn to write more secure code in the process. The traditional approach is to test software for vulnerabilities after it has been written. But today the testing is moving to earlier in the development process, to when commits are made, or even earlier, while the developer is actually writing the code. "We really need to be implementing this type of application security in our software development stage," said Doug Cahill, analyst at research firm Enterprise Strategy Group. "There are some organizations that are integrating these types of security best practices into their software methodology, but not enough."
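As a toy illustration of what a "security spellchecker" might do (this is not how any real SAST tool works internally, and the deny-list is invented for the example), a few lines of Python's standard ast module can flag a dangerous call while the code is still just text in an editor:

```python
# Toy "security spellchecker": flag risky calls in source before it ships.
# Illustrative only; real static analysis tools are far more sophisticated.
import ast

RISKY_CALLS = {"eval", "exec"}  # a tiny, hypothetical deny-list

def find_risky_calls(source: str) -> list[tuple[int, str]]:
    """Return (line number, function name) for each risky call found."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in RISKY_CALLS):
            findings.append((node.lineno, node.func.id))
    return findings

snippet = "user_input = input()\nresult = eval(user_input)\n"
print(find_risky_calls(snippet))  # [(2, 'eval')]
```

Running a check like this on every keystroke or commit is exactly the shift-left idea in the excerpt: the developer sees the finding immediately, the same way a spellchecker underlines a typo.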
Quote for the day:
"Oh yes the past can hurt. But you can either run from it or learn from it." -- Rafiki, The Lion King