Google DeepMind's AI can now detect over 50 sight-threatening eye conditions
In a project that began two years ago, DeepMind trained its machine learning algorithms using thousands of historic and fully anonymized eye scans to identify diseases that could lead to sight loss. According to the study, the system can now do so with 94 percent accuracy, and the hope is that it could eventually be used to transform how eye exams are conducted around the world. AI is taking on a number of roles within health care more widely. ... AI is also being used to help emergency call dispatchers in Europe detect heart attack situations. Diagnosing eye diseases from ocular scans is a complex and time-consuming task for doctors. At the same time, an aging global population means eye disease is becoming more prevalent, increasing the burden on healthcare systems. That's providing the opportunity for AI to pitch in. "The number of eye scans we're performing is growing at a pace much faster than human experts are able to interpret them," said Pearse Keane, consultant ophthalmologist at Moorfields, in a statement. "There is a risk that this may cause delays in the diagnosis and treatment of sight-threatening diseases, which can be devastating for patients."
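The excerpt describes the approach only at a high level: supervised learning over labelled, anonymized scans. As a rough illustration (not DeepMind's actual pipeline; the class count, data layout, and model choice here are assumptions), a minimal image-classification sketch in PyTorch might look like this:

```python
# Illustrative only: a minimal supervised classifier over labelled eye scans.
# Class count, transforms, and folder layout are assumptions, not DeepMind's method.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CONDITIONS = 50  # "more than 50 sight-threatening conditions" in the article

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Assumes scans are stored as one folder per diagnosis label.
train_data = datasets.ImageFolder("scans/train", transform=transform)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, NUM_CONDITIONS)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```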
How Fintech Is Transforming Access to Finance
A percentage of the digital transactions that merchants receive is set aside to repay their advances. This arrangement keeps repayments fluid, bite-sized, and in line with cash flow. In India, Capital Float, a nonbank finance company, provides instant decisions on collateral-free loans for small entrepreneurs. A risk profile assessment is carried out in real time by analyzing MSMEs’ cash flows, using data from Paytm, an e-commerce payment system and digital wallet company, the mobile financial services firm Payworld, and smartphones. Capital Float customers carry out electronic know-your-customer authentication, receive the loan offer, confirm acceptance, and sign the loan agreement on a mobile app. The loan amount is credited to their account the same day, with nil paperwork. Cash flow loans help MSMEs seize opportunities when they arise and are an excellent example of the targeted, niche innovation that enables fintech to compete with more prominent, but slower, traditional banks. They are well suited to businesses that maintain very high margins but lack enough hard assets to offer as collateral.
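The holdback mechanism described in the first sentence is simple to model. A minimal sketch, with a hypothetical 10 percent holdback rate and made-up daily volumes (not Capital Float's actual terms):

```python
# Illustrative sketch of a transaction-holdback repayment, as described above.
# The 10% rate and the sample sales figures are hypothetical.
def apply_daily_holdback(balance, daily_sales, holdback_rate=0.10):
    """Set aside a share of the day's digital sales to repay the advance."""
    repayment = min(daily_sales * holdback_rate, balance)
    return balance - repayment, repayment

balance = 50_000.0  # outstanding advance
for sales in [8_000, 2_500, 12_000]:  # repayments rise and fall with cash flow
    balance, paid = apply_daily_holdback(balance, sales)
    print(f"sales={sales:>6} repaid={paid:>7.2f} remaining={balance:>9.2f}")
```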
The Commercial HPC Storage Checklist – Item 3 – Protection at Scale
Many HPC storage solutions provide only replication for data protection. Replication protects against media failure within a node by creating two or three additional copies of data on other nodes in the storage cluster. The problem is that a replication-only model forces the organization to store two or three full additional copies of its data. While replication does maintain performance during a failure, the capacity overhead is enormous. Most enterprise storage systems support a single- or dual-parity protection scheme. While parity does not carry the capacity waste of a replicated system, it can hurt storage performance if the storage system is not designed to maintain performance during a failure/rebuild process. A commercial HPC storage system needs to provide a parity-based protection scheme so that it does not waste capacity or, by extension, data center floor space. Because restarting workloads is so time-consuming, it also needs multiple layers of redundancy so that one or two drive failures don't stop an HPC process from executing.
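The capacity argument is easy to quantify. A quick worked comparison, assuming 3-way replication versus a hypothetical 8+2 dual-parity layout (the exact layout any given HPC system uses will differ):

```python
# Illustrative capacity math: 3-way replication vs. a hypothetical 8+2 dual-parity layout.
def raw_capacity_needed(usable_tb, data_units, protection_units):
    """Raw capacity required to store `usable_tb` with the given layout."""
    overhead = (data_units + protection_units) / data_units
    return usable_tb * overhead

usable = 1_000  # TB of usable data
replication = raw_capacity_needed(usable, 1, 2)   # 1 data copy + 2 replicas = 3x
dual_parity = raw_capacity_needed(usable, 8, 2)   # 8 data + 2 parity = 1.25x

print(f"3-way replication: {replication:,.0f} TB raw")  # 3,000 TB
print(f"8+2 dual parity:   {dual_parity:,.0f} TB raw")  # 1,250 TB
```

Both layouts tolerate two device failures, but the parity layout does so with roughly 25 percent overhead instead of 200 percent.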
How artificial intelligence is shaping our future
A world fuelled and enhanced by AI is one to look forward to. Autonomous cars will mean efficient and safe transport. Real-time translation buds that let you speak one language and hear another will transform our travel experiences. Despite the cries of alarmists, there is little reason to believe that our AIs are going to "wake up" and decide to do away with us. ... New drugs, therapies and treatments will produce a revolution in the delivery of healthcare. What's true for health is true for education, leisure, finance and travel. Every aspect of how individuals, corporations and governments function can be more effectively managed with the right application of the right data. ... Humans will come to confide in, trust and rely on our new companions. They will support us for better or worse, in our prime and our decline. Powered by AI and abundant data, they may assume the characteristics of those near or dear to you. Imagine your late grandmother or your favourite rock star chatting helpfully in your living room.
Microsoft may soon add multi-session remote access to Windows 10 Enterprise
At this point, multi-session Remote Desktop Services (RDS) is a Windows Server-only feature, one that lets users run applications hosted on servers, whether those servers are on-premises or cloud-based. But the evidence uncovered by Alhonen hints that Microsoft will expand a form of RDS to Windows 10. "There's a ton of unanswered questions," said Wes Miller, an analyst at Directions on Microsoft, noting Microsoft's silence on such a move. He expected that some answers would be revealed at Microsoft Ignite, the company's massive conference for IT professionals set for Sept. 24-28, or with the release of Windows 10 1809 this fall. One thing he's sure of, however: "You won't see this running on hardware at a user's desktop," Miller said of Windows 10 Enterprise for Remote Sessions. Instead, he believes the SKU should be viewed as back-end infrastructure installed at server farms, in the virtual machines that populate those systems. If Windows Server serves - no pun intended - as the destination for remote sessions accessing applications or even desktops, why would Microsoft dilute the market with the presumably less expensive Windows 10 Enterprise SKU?
Will network management functions as a service arrive soon?
The cloud also eliminates the need for patching and upgrading software; those functions would be handled by the vendor. In considering NMaaS, Laliberte said organizations should understand the underlying architectures, which in some cases could simply be individual licenses. "After that, it would come down to the cost model of Opex versus Capex, along with maintenance," he said. Laliberte said it is important to find out how the NMaaS offering charges and to determine whether there are any ingress charges for the data collected. One of the other big issues, he added, is security: "If you are in a regulated industry or have sensitive information traversing your network and that data is being sent to the cloud, make sure to get the security team engaged and that they approve the model." NMaaS also enables the collection and dissemination of benchmarking data, which companies can use to determine how their networks compare with those of their peers. "It is a capability that could be very helpful for organizations to understand and to improve their own environment," Laliberte said.
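The Opex-versus-Capex question Laliberte raises can be framed as a simple annualized comparison. A sketch with entirely hypothetical figures (license cost, subscription price, and ingress rate are all assumptions, not vendor pricing):

```python
# Hypothetical cost-model comparison for the Capex-vs-Opex question raised above.
# Every figure here is an assumption for illustration only.
def on_prem_annual_cost(license_capex, amortization_years, maintenance_per_year):
    """Straight-line amortization of the license plus annual maintenance."""
    return license_capex / amortization_years + maintenance_per_year

def nmaas_annual_cost(subscription_per_month, gb_ingested_per_month, ingress_per_gb):
    """Subscription plus per-GB ingress charges for the data collected."""
    return 12 * (subscription_per_month + gb_ingested_per_month * ingress_per_gb)

print("On-prem NMS:", on_prem_annual_cost(250_000, 5, 40_000))  # 90,000 per year
print("NMaaS:      ", nmaas_annual_cost(6_000, 500, 0.09))      # 72,540 per year
```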
Apcela optimizes Office 365 performance, improving user productivity
The architecture of the network Apcela has built follows the network-as-a-service model. It starts with a core network anchored in globally distributed, carrier-neutral commercial data centers such as Equinix, which Apcela calls application hubs, or AppHUBs. These data centers are then connected with high-capacity, low-latency links. That high-performance core then interconnects to the network edge, which can be enterprise locations such as branches, manufacturing facilities, regional headquarters, data centers, and so on. The core network also interconnects with the cloud, connecting to the public internet or peering directly with cloud data centers, such as those operated by Microsoft, where the vendor hosts Office 365. ... A full security stack is also deployed in these commercial data centers. By distributing security out of the enterprise data center and into these network nodes, a branch office simply goes to the nearest AppHUB, clears security there, and from there reaches the internet or whatever SaaS applications it needs, rather than having to go all the way back through the enterprise data center before getting out to the cloud.
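The routing idea is straightforward: each branch is steered to whichever AppHUB it can reach with the lowest latency, and security inspection and cloud egress happen there. A toy sketch, with hypothetical hub names and latencies:

```python
# Sketch of the "nearest AppHUB" idea: send a branch to the hub with the lowest
# measured latency, then egress to the internet or SaaS from there.
# Hub names and latency figures are hypothetical.
branch_latency_ms = {
    "AppHUB-Ashburn": 18,
    "AppHUB-Frankfurt": 92,
    "AppHUB-Singapore": 210,
}

def nearest_apphub(latencies):
    """Pick the hub a branch should use for security inspection and cloud egress."""
    return min(latencies, key=latencies.get)

print(nearest_apphub(branch_latency_ms))  # AppHUB-Ashburn
```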
8 guidelines to help ensure success with robotic process automation
The first step is to find out what really goes on day-to-day in your organization. It is surprising how many variants of a process can build up. Use process mining, process discovery tools or consultants to figure out what you actually do in a process. Methods might include extracting system logs, or capturing mouse clicks and keystrokes, to find out how many ways an activity can happen, then eliminating the less optimal variants and automating the most common paths. Many different tools can be used to support automation, especially ones that have best-practice processes already built in. Filtering to decide whether RPA should be used needs to start with understanding the process, in order to then understand the automation choices available in the short, medium and long term. If people don't know what they do or how they do it, they're not ready to start with RPA. Standardized, repetitive re-keying of digital data is the optimal place to start when thinking about whether RPA makes sense.
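Variant discovery of this kind can start very simply: group an event log by case, then count how often each distinct activity sequence occurs. A minimal sketch with a made-up log and layout:

```python
# Minimal process-variant count from an event log, in the spirit of the
# process-mining step above. The sample log and activity names are assumptions.
from collections import Counter

event_log = [  # (case_id, activity), already in timestamp order
    ("c1", "receive"), ("c1", "validate"), ("c1", "approve"),
    ("c2", "receive"), ("c2", "approve"),               # skips validation
    ("c3", "receive"), ("c3", "validate"), ("c3", "approve"),
]

traces = {}
for case_id, activity in event_log:
    traces.setdefault(case_id, []).append(activity)

variants = Counter(tuple(t) for t in traces.values())
for variant, count in variants.most_common():
    print(count, "x", " -> ".join(variant))
# The most common variant is the candidate path to standardize and automate with RPA.
```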
How Smaller Financial Services Firms Can Win With Open-Banking Disruption
Even though only nine of Europe’s largest banks are required to comply with PSD2, many small and midsize financial services firms – as well as their much larger rivals – are warming to the idea of opening up their customers’ transactional data. Banks and insurers like the idea of using this information to propose more compelling lending options, credit lines, and investment services to their customers. More importantly, their customers will have the power to dictate how their information is exchanged with other institutions to find the best way to manage their financial growth. According to the Oxford Economics study "The Transformation Imperative for Small and Midsize Financial Services Firms," sponsored by SAP, small and midsize banks and insurers seem to be on the right digital path towards open banking. Surveyed participants indicated that they are investing heavily in efficient, scalable, and connected technology that can help keep their data and systems more secure and support innovation.
The Ethics of Security
The usual Black Mirror-style thought experiment (admittedly not one available in the 18th century) is to imagine you kindly drop by to visit a friend in hospital. On walking through the door, their new Benthamometer detects that your healthy heart, lungs, liver and kidneys could save the lives of five sick patients inside, and your low social media friend count suggests few folk would miss you. Statistically, sacrificing you to save those five more popular patients is not only OK, it is morally imperative! It is the extreme edge cases of a utilitarian or statistical approach that are often the cause of algorithmic unfairness. If the target KPIs are met, then by definition the algorithm must be good, even if a few people do suffer a bit. No omelettes without some broken eggs! If you think this would never happen in reality, we need only look at the use of algorithmically generated drone kill lists by the US government in Yemen. Journalist and human rights lawyer Cori Crider has revealed that thousands of apparently innocent people have been killed by America's utilitarian approach to acceptable civilian casualties in a country it is supposed to be helping.
Quote for the day:
"Leaders know the importance of having someone in their lives who will unfailingly and fearlessly tell them the truth." -- Warren G. Bennis