Daily Tech Digest - September 16, 2019

What is Computer Vision And The Amazing Ways It’s Used In Business

Many car manufacturers, from Ford to Tesla, are scrambling to get their version of the autonomous vehicle into mass production, and computer vision is a critical technology that makes autonomous vehicles possible. The systems on autonomous vehicles continuously process visual data, from road signs to other vehicles and pedestrians on the road, and then determine what action to take. Computer vision in medicine helps in diagnosing diseases and other ailments and extends the sight of surgeons during operations. There are now smartphone apps that allow you to diagnose skin conditions using the phone's camera. In fact, 90 percent of all medical data is image-based (X-rays, scans, etc.), and much of this data can now be analyzed using algorithms. In digital marketing, by using computers to sort and analyze millions of online images, marketers can bypass traditional demographic research, still target the right online audience, and do this work dramatically faster than humans could. Marketers even use computer vision to ensure ads are not placed near content that is contradictory or problematic for their audiences.

Research explores economic benefits of full-fibre and 5G at local level

“Knowledge-intensive sectors are shown to benefit most,” said the report. “Education and health sectors have also been shown to experience larger-than-average productivity impacts of increased connectivity… [and] there is also a likelihood for full-fibre and 5G in particular to lead to productivity improvements in industrial and manufacturing settings.” Therefore, an area with a particularly high density of knowledge workers will benefit more than an area with a relatively low density. Likewise, an area with a high concentration of manufacturing businesses, such as the West Midlands, where the UK’s first regional testbed for 5G is taking place, will benefit more than an area with a low concentration. “Many reports already estimate the benefits that full-fibre and 5G can bring to the UK economy,” said BSG CEO Matthew Evans. “But what does it mean for Manchester, Merthyr Tydfil or the Midlothian hills?”

Brain hack devices must be scrutinised, say top scientists

In future, "people could become telepathic to some degree" and being able to read someone else's thoughts raises ethical issues, experts said. This could become especially worrying if those thoughts were shared with corporations. ... Among the risks highlighted by the report was the idea of thoughts or moods being accessed by big corporations, as well as the bigger question of whether such devices fundamentally change what it means to be human. Dr Tim Constandinou, director of the Next Generation Neural Interfaces (NGNI) Lab at Imperial College London and co-chair of the report, said: "By 2040 neural interfaces are likely to be an established option to enable people to walk after paralysis and tackle treatment-resistant depression; they may even have made treating Alzheimer's disease a reality. "While advances like seamless brain-to-computer communication seem a much more distant possibility, we should act now to ensure our ethical and regulatory safeguards are flexible enough for any future development. "In this way we can guarantee these emerging technologies are implemented safely and for the benefit of humanity."

Microservices Migration Use Cases

By migrating to microservices, IT will enable your teams to become more innovative, freeing them from the mundane daily tasks of supporting and developing on a legacy system that simply cannot compete in today's competitive world. The other primary benefit customers see is scale: an elastic environment that allows your business to auto-scale takes the worry out of slow performance during critical events or peak traffic seasons. This could be a retail outlet during Black Friday/Cyber Monday, or an insurance company during a natural disaster or macro-economic changes that cause a flurry of activity on Wall Street. We create value on mobile apps, with external developers given an entry point into the data center to consume our APIs. We enable hundreds to thousands of microservices with a self-service platform that lets developers publish new services and new versions as needed. All of this is automated, allowing the platform team to set boundaries on what teams can do.
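The auto-scaling behaviour described above can be sketched in a few lines. This is a minimal, hypothetical illustration (not any specific platform's API): scale replicas to the observed load, clamped to the boundaries the platform team has set.

```python
import math

# Minimal sketch of an elastic auto-scaling decision, assuming a
# hypothetical platform that reports requests per second (rps) per service.

def desired_replicas(observed_rps: float,
                     target_rps_per_replica: float,
                     min_replicas: int = 2,
                     max_replicas: int = 50) -> int:
    """Scale the replica count so each replica serves ~target_rps_per_replica."""
    needed = math.ceil(observed_rps / target_rps_per_replica)
    # Clamp to the boundaries the platform team sets for each service.
    return max(min_replicas, min(max_replicas, needed))

# Peak-season spike (e.g. Black Friday): 12,000 rps at 500 rps per replica.
print(desired_replicas(observed_rps=12_000, target_rps_per_replica=500))  # 24
```

The clamp is what makes self-service safe: developers publish freely, but the platform team still bounds what any one service can consume.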

It’s not easy going green – but the Internet of Things can help

Only through cross-system communication are compliance and energy efficiency possible. Much of the IoT’s value lies in its ability to integrate the various, complex components and IT systems that make up any modern building or facility. When building systems can ‘talk’ with each other, the resilience of the infrastructure is strengthened. This provides access to a greater volume of intelligence, leading to more robust compliance and better use of resources. An IoT-connected system enhances an organisation’s pursuit of greater energy efficiency, where the rapid collection of, and reaction to, massive amounts of information is essential. For example, having IoT devices and sensors integrated with a heating, ventilation and air conditioning system means that organisations can collect real-time data on energy consumption and device health. Armed with this information, organisations are empowered to take a fresh look at their current practices, generate business change and create efficiencies that cut costs and emissions. From an energy management perspective, Schneider Electric’s PowerLogic ION9000 is the ideal connected solution.
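The kind of analysis this enables can be illustrated with a small sketch. Everything here is hypothetical (device names, readings and the flagging rule are invented for illustration, not Schneider Electric's API): aggregate real-time power readings from HVAC devices and flag any unit drawing well above the fleet average, a possible sign of a failing component wasting energy.

```python
from statistics import mean

# Illustrative sketch: hypothetical HVAC power readings (kW) per device.
def flag_outliers(readings: dict[str, list[float]],
                  threshold: float = 1.5) -> list[str]:
    """Return device IDs whose average power draw exceeds
    `threshold` times the fleet-wide average."""
    averages = {device: mean(values) for device, values in readings.items()}
    fleet_avg = mean(averages.values())
    return sorted(d for d, avg in averages.items() if avg > threshold * fleet_avg)

readings = {
    "ahu-1": [4.1, 4.3, 4.0],   # air handling unit, normal draw
    "ahu-2": [4.2, 4.4, 4.1],
    "ahu-3": [9.8, 10.2, 9.9],  # possible clogged filter or failing damper
}
print(flag_outliers(readings))  # ['ahu-3']
```

In practice the readings would arrive continuously from the connected sensors, but the principle is the same: cross-device visibility turns raw consumption data into actionable efficiency targets.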

Open source and open data

First and foremost, our primary mission is “to organize the world’s information and make it universally accessible and useful.” Certainly one obvious way to make information universally accessible and useful is to give it away! Second, making these materials available stimulates scientific research outside of Google. We know we can’t do it all, and we spend a lot of time reading, understanding and often extending work done by others, some of which has been developed using tools and data we have provided to the research community. This mix of competition and cooperation among groups of researchers is what pushes science forward. Third, when we hire new employees, it’s great if they can hit the ground running and already know and use the tools we have developed. Familiarity with our software and data makes engineers productive from their first day at work. There are many more reasons to share research data, but these three alone justify the practice. We aren’t the only internet company to appreciate the power of open data, code, and open research.

New NetCAT CPU side-channel vulnerability exploitable over the network

The culprit is Intel’s Data Direct I/O (DDIO) technology, which gives peripheral devices such as network cards direct access to the processor’s internal cache to achieve better performance, lower power consumption, and higher data throughput. Before DDIO, these devices exchanged data with the CPU through RAM, whose latency can be a bottleneck. DDIO was designed with Ethernet controllers and fast datacenter networks in mind, to allow servers to handle 10-gigabit Ethernet (10GbE) connections and higher. The technology was first introduced in 2011 in the Intel Xeon E5 and Intel Xeon E7 v2 enterprise-level processor families. CPU attacks like Spectre and Meltdown and their many variants have used the CPU cache as a side channel to infer sensitive data. Researchers from the VUSec group at Vrije Universiteit Amsterdam have now shown that DDIO’s cache access can be exploited in a similar manner. In a new paper released today, the researchers described an attack dubbed NetCAT which abuses DDIO over the network to monitor access times in the CPU cache triggered by other clients connected to the same server over SSH (Secure Shell).
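The core idea behind monitoring cache access times can be shown with a toy simulation. This is not the NetCAT attack itself; the "cache" and latencies below are simulated constants, purely to illustrate how an observer who can only measure latency learns which entries another party touched.

```python
# Toy simulation of a cache-timing side channel: fast accesses mean the
# line was still cached, slow ones reveal another party's activity.
CACHE_HIT_NS, CACHE_MISS_NS = 10, 80  # illustrative latencies, not real hardware

class ToyCache:
    def __init__(self):
        self._lines = set()

    def access(self, line: int) -> int:
        """Return a simulated latency; accessing a line caches it."""
        latency = CACHE_HIT_NS if line in self._lines else CACHE_MISS_NS
        self._lines.add(line)
        return latency

cache = ToyCache()
monitored = range(8)
for line in monitored:      # observer "primes" every monitored line
    cache.access(line)
cache._lines -= {3, 5}      # stands in for victim activity evicting lines 3 and 5

# Probe step: the slow accesses reveal exactly which lines the victim disturbed.
evicted = [l for l in monitored if cache.access(l) == CACHE_MISS_NS]
print(evicted)  # [3, 5]
```

NetCAT's insight is that DDIO lets this prime-and-probe pattern be driven remotely over the network, with packet round-trip times standing in for the latency measurement.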

NHSX emphasises need for ethical patient data access

“NHS and care organisations have an obligation to protect patient data, but in my view, they also have the obligation to make best use of it,” she said. “Collaborations need to benefit everyone involved – patient lives are at stake.” Donnelly also mentioned that “citizen juries” are currently taking place to debate how patient data should be used and what constitutes a fair partnership between the NHS and researchers, charities and industry on uses of patient and operational data from the NHS. “By testing different commercial models against the principles on which our citizens are not prepared to compromise, we hope to reach a consensus on what good looks like and how best we achieve the promised benefits.” In July, a programme was launched by Public Health England and NHSX with the aim of ushering in a “new era of evidence-based self-care”, with patients increasingly expected to allow access to their personal data.

Gartner sees blockchain as ‘transformational’ across industries – in 5 to 10 years

"Once it has been combined with the Internet of Things (IoT) and artificial intelligence (AI), blockchain has the potential to change retail business models forever, impacting both data and monetary flows and avoiding centralization of market power," Gartner said. As a result, Gartner believes that blockchain has the potential to transform business models across all industries — but the opportunities demand that enterprises adopt complete blockchain ecosystems. Without tokenization and decentralization, most industries will not see real business value. The journey to create a multi-company blockchain consortium is inherently awkward, Gartner said. "Making wholesale changes to decades-old enterprise methodologies is hard to achieve in any situation. However, the transformative nature of blockchain works across multiple levels simultaneously (process, operating model, business strategy and industry structure), and depends on coordinated action across multiple companies."

Rethinking Flink’s APIs for a Unified Data Processing Framework

Flink’s existing API stack consists of the Runtime as the lowest level abstraction of the system, responsible for deploying jobs and running Tasks on distributed machines. It provides fault tolerance and network interconnection between the different Tasks in the JobGraph. On top of Flink’s Runtime sit two separate APIs, the DataSet and DataStream APIs. The DataSet API has its own DAG (directed acyclic graph) representation for tying together the operators of a job, as well as operator implementations for different types of user-defined functions. The DataStream API has a different DAG representation as well as its own set of operator implementations. The two APIs' operators are implemented on disjoint sets of Tasks, which are given to the lower-level Runtime for execution. Finally, we have the Table API / SQL, which supports declarative-style programming and comes with its own representation of logical operations, plus two different translation paths for converting Table API programs to either the DataSet or DataStream API, depending on the use case and/or the type of sources the program comes with.
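The layering described above can be sketched conceptually. The classes below are hypothetical stand-ins, not Flink's real Java/Scala API: two separate APIs, each building its own DAG of operators, both handed to a single shared runtime for execution — the duplication that the unification effort aims to remove.

```python
# Conceptual sketch (hypothetical classes, not Flink's actual API) of two
# APIs with separate operator DAGs sharing one lower-level runtime.

class Runtime:
    """Lowest layer: executes any operator DAG, here on simulated Tasks."""
    def execute(self, dag: list, data: list) -> list:
        for op in dag:          # each operator would become a Task in the JobGraph
            data = op(data)
        return data

class DataSetAPI:               # batch-style API with its own DAG representation
    def __init__(self):
        self.dag = []
    def map(self, fn):
        self.dag.append(lambda rows: [fn(r) for r in rows])
        return self

class DataStreamAPI:            # streaming-style API with a separate DAG
    def __init__(self):
        self.dag = []
    def filter(self, pred):
        self.dag.append(lambda rows: [r for r in rows if pred(r)])
        return self

runtime = Runtime()
batch = DataSetAPI().map(lambda x: x * 2)
stream = DataStreamAPI().filter(lambda x: x > 2)

print(runtime.execute(batch.dag, [1, 2, 3]))   # [2, 4, 6]
print(runtime.execute(stream.dag, [1, 2, 3]))  # [3]
```

Note how `Runtime.execute` is agnostic to which API built the DAG; the cost of the current design is that each API must maintain its own operator implementations on top of it.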

Quote for the day:

"Courage is the ability to execute tasks and assignments without fear or intimidation." -- Jaachynma N.E. Agu
