Daily Tech Digest - November 21, 2018

Gotcha pricing from the cloud pushes workloads back on premises
An interesting development is the return of apps from the cloud to on-premises systems. Many companies that moved to the cloud to reduce costs got a nasty case of sticker shock. The survey found that organizations using public cloud spend 26 percent of their annual IT budget on it, that just 6 percent of public cloud users came in under budget, and that 35 percent overspent on public cloud resources. Why? It's because of the cost of reserved instances. Many apps start in the cloud in a virtualized instance like Amazon EC2, but once developed and running regularly, they need a more permanent home, especially if the app needs to scale. So the customer moves to reserved instances, where more resources can be brought to bear and the instance is permanent, not temporary. And while cloud service providers offer discounts up front, the costs can still add up fast and become unexpectedly expensive.
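To make the reserved-instance math concrete, here is a back-of-the-envelope sketch; the hourly rate, the 40% discount, and the 730-hour month are hypothetical stand-ins, not any provider's actual pricing.

```python
# Illustrative comparison of on-demand vs reserved instance pricing.
# All rates and the discount below are made up for the example.

HOURS_PER_MONTH = 730

def monthly_cost_on_demand(hourly_rate):
    return hourly_rate * HOURS_PER_MONTH

def monthly_cost_reserved(hourly_rate, discount=0.40):
    # A reserved instance trades a long-term commitment for a discount,
    # but the commitment bills whether or not the capacity is used.
    return hourly_rate * (1 - discount) * HOURS_PER_MONTH

rate = 0.10  # hypothetical $/hour
print(f"on-demand: ${monthly_cost_on_demand(rate):.2f}/mo")
print(f"reserved:  ${monthly_cost_reserved(rate):.2f}/mo")

# The sticker shock: a reserved fleet sized for peak load keeps
# billing at this rate even when utilization drops, eroding the
# per-hour discount that made the commitment look attractive.
```

The discount is real, but only if the fleet stays busy; an idle reserved instance still costs its full committed rate.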



How AI will shape the future of digital payments
As humankind takes giant leaps in technology, AI lets machines and gadgets imitate human actions, perceive the environment around them, and adjust to a diverse set of circumstances. Many global companies are using smart technologies like AI to directly improve the consumer experience across education, daily life, and commerce. At a time when the Indian government is pushing both the digital payments and financial inclusion agendas, AI can and must offer real, tangible benefits such as tighter security and risk management in a world that is increasingly complex and moving at the speed of light. The Indian financial sector has been quick to realise the potential of AI in operations. Over the last few months, Indian PSU heavyweights like SBI and Bank of Baroda have invested heavily in AI platforms to improve efficiencies and offer enhanced services. SBI uses a chat assistant called SBI Intelligent Assistant (SIA), which resolves queries from NRI customers exactly as a bank representative would, without the need to wait in a queue for customer service.


SWIFT India Partners With Fintech Firm for Blockchain Pilot
Per the announcement, the new program, based on MonetaGo’s financial services network technology, will be integrated through standardized SWIFT financial messages. The banks will purportedly deploy a shared distributed ledger network that complies with industry-level governance, security and data privacy requirements in order to improve the efficiency and security of their financial products and procedures. According to Kiran Shetty, CEO of SWIFT India, the company will digitize trade processes, while MonetaGo will provide “fraud mitigation solutions to avoid double-financing and check authenticity of e-way Bill.” An e-way bill is an electronically generated bill for the specific movement of goods with a value of more than 50,000 rupees ($700). "Given India's focus on a digital infrastructure which is supported by both policy and technological innovation, it makes sense that large institutional players are interested in these products and initiatives," said Jesse Chenard, CEO of MonetaGo.


Hyperscale cloud reliability and the art of organic collaboration

Connecting network policies to logical formulas that capture their intent has been a recurring topic in networking research, but prior tools were ad hoc, written for a specialized format and hardly extensible. Microsoft Research’s satisfiability modulo theories solver Z3 is a state-of-the-art theorem prover that is specifically tailored to capture domains that are found commonly in software and hardware descriptions. It is used prominently in software verification, testing, and analysis. By 2012, network verification was an area of nascent interest, and the Azure network was going through early stages of build-out. It quickly became evident that ACLs could be expressed directly as logical formulas and that the machinery that reasons about such formulas was well-suited and sufficiently efficient for checking properties of ACLs. Jayaraman and Bjørner developed the SecGuru tool, replacing manual what-if policy reviews with automated analysis for ACL updates.
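The ACL-to-formula idea can be illustrated without an SMT solver: encode each ACL as a predicate over packet-header fields and compare the two predicates over a small header space. Tools like SecGuru hand such formulas to Z3 instead of brute-forcing them; the rule sets below are invented for the sketch.

```python
# Toy version of automated ACL analysis: two ACLs expressed as
# logical predicates over (source address, destination port), checked
# for semantic equivalence by exhaustive search over a tiny space.
# A real verifier would pass these formulas to an SMT solver.

from itertools import product

def acl_a(src, dst_port):
    # permit sources 0-127 to port 80, deny everything else
    return 0 <= src <= 127 and dst_port == 80

def acl_b(src, dst_port):
    # refactored rule set intended to be equivalent to acl_a
    return dst_port == 80 and src < 128

def equivalent(f, g, srcs=range(256), ports=range(1024)):
    """Return a counterexample packet where f and g differ, else None."""
    for src, port in product(srcs, ports):
        if f(src, port) != g(src, port):
            return (src, port)
    return None

print(equivalent(acl_a, acl_b))  # None: the ACLs agree on every packet
```

A returned counterexample is exactly the "what-if" a manual policy review would hunt for by hand.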


9 cyber security predictions for 2019

In 2019, we’ll see how the EU reacts to those complaints, which should provide some much-needed clarity about the risk that GDPR and other privacy regulations present. If regulators don’t act, that’s telling, too: it sends the message not to take the regulation seriously. Rising concern over how companies use and protect personal information will encourage many Americans to hold those companies more accountable. “The reaction by consumers to constant security breaches and other unethical information disclosures (e.g., Facebook) leads U.S. consumers to demand more default privacy and control over their own information,” says CSO contributor Roger Grimes. Grimes expects to see an effort to enact GDPR-like privacy laws nationally in 2019. The California Consumer Privacy Act has already passed into law and goes into effect in 2020. On November 1, Sen. Ron Wyden introduced a bill titled the Consumer Data Protection Act (CDPA), which includes stiff penalties, including jail time, for privacy violations.


Machine learning, meet quantum computing

The big advantage of quantum computing is that it allows an exponential increase in the number of dimensions it can process. While a classical perceptron can process an input of N dimensions, a quantum perceptron can process 2^N dimensions. Tacchino and co demonstrate this on IBM’s Q-5 processor. Because of the small number of qubits, the processor can handle N = 2, which corresponds to 2^2 = 4 inputs, the equivalent of a 2x2 black-and-white image. The researchers then ask: does this image contain horizontal or vertical lines, or a checkerboard pattern? It turns out that the quantum perceptron can easily classify the patterns in these simple images. “We show that this quantum model of a perceptron can be used as an elementary nonlinear classifier of simple patterns,” say Tacchino and co. They go on to show how it could be used in more complex patterns, albeit in a way that is limited by the number of qubits the quantum processor can handle.
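The dimension counting can be sketched classically: N qubits carry 2^N amplitudes, so 2 qubits suffice for a 2x2 image. The pattern test below uses an ordinary dot product as a stand-in for the quantum inner product the paper implements; the encoding and patterns are illustrative.

```python
# N qubits encode 2**N amplitudes, so 2 qubits cover a 4-pixel image.
N_QUBITS = 2
assert 2 ** N_QUBITS == 4  # a 2x2 black-and-white image

# pixels encoded as +1 (white) / -1 (black), row-major order
patterns = {
    "horizontal": [1, 1, -1, -1],
    "vertical":   [1, -1, 1, -1],
    "checker":    [1, -1, -1, 1],
}

def match(image, weights):
    # perfect correlation or anti-correlation makes |dot product|
    # equal to the pixel count; orthogonal patterns score 0
    return abs(sum(p * w for p, w in zip(image, weights))) == len(image)

image = [1, 1, -1, -1]
label = next(name for name, w in patterns.items() if match(image, w))
print(label)  # "horizontal"
```

The quantum version evaluates the same inner product as a measurement outcome, which is where the exponential input size pays off.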


Dell XPS 13: The best Linux laptop of 2018


What makes it a "Developer Edition," besides the top-of-the-line hardware, is its software configuration. Canonical, the company behind Ubuntu, and Dell worked together to certify Ubuntu 18.04 LTS on the XPS 13 9370. This worked flawlessly on my review system. Now, Ubuntu runs without a hitch on almost any PC, but the XPS 13 was the first one I'd seen that comes with the option to automatically install the Canonical Livepatch Service. This Ubuntu Advantage support package automatically installs critical kernel patches in such a way that you won't need to reboot your system. With new Spectre and Meltdown bugs still appearing, you can count on more critical updates coming down the road. The XPS 13's hardware is, in a word, impressive. My best-of-breed laptop came with an 8th-generation Intel Kaby Lake R Core i7-8550U processor, a quad-core CPU that turbos up to 4GHz. The system comes with 16GB of RAM.


How open source is fuelling an explosion in fintech innovation

The open technologies that fintech services are built upon are new and speak to the new age of financial services in the palm of consumers’ hands. “Fintech firms are establishing themselves not only as significant players in the industry, but also as the benchmark for financial services,” states Ernst & Young in its Fintech Adoption Index. “Their new propositions are increasingly attractive to consumers who are underserved by existing financial services providers, and their use will only rise as fintech awareness grows, consumer concerns fall, and technological advancements, such as open APIs, reduce switching costs.” The blockchain is a central technology that many fintech services have built themselves upon, especially across the P2P payments space. AI, big data and the cloud are all vital components of the services fintech companies are innovating with. Together with open APIs and intuitive UIs, these technologies form a new toolbox that startups, in particular, are exploiting.


Inside the chief data privacy officer role with Barbara Lawler

While GDPR sucked all the oxygen out of the room and continues to drive new or revised privacy rules around the globe, it is important to keep in mind that no single country or region owns the rules. Each country interprets privacy according to its cultural norms and legal frameworks. Other international efforts such as APEC’s Cross Border Privacy Rules, the EU-US Privacy Shield, along with legal data protection regulations and frameworks in 126 countries and across 50 U.S. states prove that responsibly handling people’s data is serious and critical for business success. New on the horizon is the California Consumer Privacy Act (CCPA) of 2018, inspired by GDPR but carrying its own unique set of requirements. It’s highly likely that other U.S. states will replicate some of it or all of it. This is currently driving renewed dialog and debate in the U.S. 



“Microsoft systematically collects data on a large scale about the individual use of Word, Excel, PowerPoint and Outlook. Covertly, without informing people. Microsoft does not offer any choice with regard to the amount of data, or possibility to switch off the collection, or ability to see what data are collected, because the data stream is encoded,” Privacy Company wrote in a blog post covering its findings. While Microsoft is considered a data processor, the report warned that the way it collects diagnostic data from users means it should be classified as a joint controller as defined in Article 26 of the GDPR. The DPIA report recommended that IT administrators for Dutch government users configure the “zero exhaust” setting in Microsoft Office to prevent sensitive data from being leaked, centrally prohibit the use of Microsoft Connected Services for spell checking and language translation, and disable access to SharePoint Online, OneDrive Online and the web version of Office 365 Live.



Quote for the day:


"Good leaders make people feel that they're at the very heart of things, not at the periphery." -- Warren G. Bennis


Daily Tech Digest - November 20, 2018


Making the banking business even more difficult, smaller fintech and large techfin companies are developing solutions that use insight and digital technology to improve the customer experience across product lines. These new competitors threaten legacy financial institutions of all sizes. ... Failing to respond could lead to the demise of less agile organizations. The good news is that many of the new technologies that are threatening the banking industry also present significant opportunities. In fact, those organizations that can leverage big data, advanced analytics and new technologies to improve the customer experience can build trust, loyalty and revenues that are the keys to success in the future. According to Dan Cohen, Senior Vice President, Global Financial Services and Insurance at Atos, “Banks are at a crossroads. Continuous finTech innovation and new technologies such as blockchain are disrupting the market. While it creates threats, it also opens multiple opportunities for financial services to reinvent themselves and thrive.”



How automating feature engineering can help data scientists

Deep Feature Synthesis is an automated feature engineering approach that, essentially, can be applied to many different types of data, ranging from marketing use cases to financial services use cases to healthcare use cases. The general principle behind it is we're trying to emulate how human data scientists would approach these problems. Deep Feature Synthesis works by having a library of feature engineering building blocks called primitive functions, and each one of these primitives is labeled with the type of data it can input and the type of data it can output. To give you a very simple example, you can imagine a primitive that took in a list of numbers and outputted the maximum value in that list. We have a library of many of these primitives and when we get a new data set, Deep Feature Synthesis looks at the specific columns and relationships in the data and figures out which primitives to apply. That's how it can take the generic primitives and create specific features.
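The typed-primitive idea can be sketched in a few lines: each primitive declares what it consumes and produces, and an engine applies whichever primitives fit a column. The registry and names below are illustrative, not the actual Featuretools API.

```python
# Minimal sketch of "primitive functions" for automated feature
# engineering: primitives carry input/output type labels, and a
# synthesis step applies every primitive that matches a column.

PRIMITIVES = []

def primitive(input_type, output_type):
    """Decorator that registers a primitive with its type signature."""
    def register(fn):
        PRIMITIVES.append((fn, input_type, output_type))
        return fn
    return register

@primitive(input_type=list, output_type=float)
def maximum(values):
    # the simple example from the text: list of numbers -> max value
    return float(max(values))

@primitive(input_type=list, output_type=float)
def mean(values):
    return sum(values) / len(values)

def synthesize(column):
    """Apply every primitive whose input type matches the column."""
    return {fn.__name__: fn(column)
            for fn, in_type, _ in PRIMITIVES if isinstance(column, in_type)}

print(synthesize([3, 1, 4, 1, 5]))  # {'maximum': 5.0, 'mean': 2.8}
```

The real system additionally stacks primitives across related tables, which is where the "deep" in Deep Feature Synthesis comes from.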


Managing cloud infrastructure post-migration — a CTO guide

“This is something many businesses have quickly realised as they have continued along their deployment journeys. ...” “The skills gap has been an extremely prevalent issue in the cloud world for some time, with many businesses either lacking the budget to meet the substantial salaries that people with cloud skill sets now command, or simply unable to find people with the required level of technical expertise. This highlights the importance of finding the right partners so that businesses can hand off the most complicated jobs to a team of experts.” “However, it also highlights the need for better tooling for lifecycle management and operations. Lowering the barrier to entry, a solid choice of orchestration and management frameworks will take a pragmatic view on what’s needed to increase productivity around the day to day operations, and exceed expectations around even complicated processes such as upgrades of complex infrastructure software.”


Has storage become sexy?

The web-scale companies adopted this ethos with gusto and enterprise organizations soon began to follow suit. This march towards a commodity hardware dominated and software-driven world seemed inexorable. And then AI happened. Considering how long AI has been part of the public consciousness, it's almost funny that it snuck up on the entire tech industry. While industry leaders have been working on AI technologies for decades, until recently it didn't play a meaningful role in enterprise strategy nor was it a significant element of tech company go-to-market motions. And then, AI was everywhere. Because of its sudden rise as a top-of-mind issue, enterprise leaders were largely unprepared to deal with AI — and most critically, were ill-equipped to deal with the impact these new AI workloads would have on their newly cloudified architectures. As enterprises have begun working with AI, machine learning, advanced analytics, and other data- and resource-intensive workloads, they have found that commodity-based architectures built for traditional workloads buckle under the demands of these much more intense workloads.


Is Artificial Intelligence Dangerous? 6 AI Risks Everyone Should Know About


AI programmed to do something dangerous, as is the case with autonomous weapons programmed to kill, is one way AI can pose risks. It might even be plausible to expect that the nuclear arms race will be replaced with a global autonomous weapons race. Russia’s president Vladimir Putin said: “Artificial intelligence is the future, not only for Russia, but for all humankind. It comes with enormous opportunities, but also threats that are difficult to predict. Whoever becomes the leader in this sphere will become the ruler of the world.” Aside from being concerned that autonomous weapons might gain a “mind of their own,” a more imminent concern is the dangers autonomous weapons might have with an individual or government that doesn’t value human life. Once deployed, they will likely be difficult to dismantle or combat.


Code First: Girls teaches more women to code in UK than universities

“We are working very closely with, for example, the Institute of Coding,” she said. “We are very much working together to try and address this challenge because they also acknowledge that these numbers just aren’t good enough.” The social enterprise has announced a partnership with telecoms and broadband provider BT to teach cohorts of 30 women the skills they need for a job in tech in a free four-month course. The programme will teach women skills such as web development, Python programming, databases, test-driven development, agile development and cyber security, and participants will be given the opportunity to be interviewed for a job in a BT tech team. De Alwis said BT approached Code First: Girls to ask for help in training groups of women with the potential goal of hiring them, and she pointed out that the organisation helps companies feel confident in hiring outside their usual talent pool.


A closer look at HTC’s blockchain phone, the Exodus 1


The future of all of this is still very much up in the air. “I see us as the trusted Android,” Chen says, vaguely alluding to a future road map that finds HTC shifting its focus from hardware to software and IP. “We’re not talking about [monetization] right now, but we have some ideas.” While the devoted blockchain phone is largely a stepping stone toward incorporating that technology into more mainstream devices, there are plans to continue development on the line, as the Exodus 1 name optimistically implies. Chen explains that the company is working on follow-ups that will be further distinguished from other handsets, though he’s not ready to discuss specifics. Presently HTC has between 20 and 30 engineers working on the blockchain project, bringing in experts in the space to educate them on the intricacies of the technologies. Even among those who are currently devoted to building out the device, this is all clearly very much a learning process.


The actual cost of downtime in the manufacturing industry

Of course, while gathering data is a key driver in solving problems and better understanding downtime, simply obtaining more data does not mean that an organization will know what to do with it. According to a recent study by Accenture, 60% of operators cite dealing with the outcomes of gathered data as a major challenge. It is important to understand the reasons for collecting increasing amounts of data and how the data can be applied to improve condition-based monitoring and predictive maintenance, including: the ability to identify data-based patterns; cognitive learning capabilities; opportunities to leverage data in the cloud for cross-organization/industry comparisons; and the ability to share data with trusted service providers for additional analysis and insights. There is a significant opportunity to continue carving down unplanned downtime through digitization, but as Deloitte notes in a recent report, “Simply ‘doing’ digital things will not make an organization digital.” Organizations need to go beyond technology changes to truly embrace the benefits of digitization.
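A hedged illustration of the "data-based patterns" point: the simplest form of condition-based monitoring flags a machine when a rolling average of a sensor reading drifts outside a tolerance band. The window, nominal value, and readings are invented for the example.

```python
# Toy condition monitoring: smooth noisy sensor readings with a
# rolling mean, then flag values that drift beyond a tolerance band
# around the nominal operating point.

def rolling_mean(readings, window):
    return [sum(readings[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(readings))]

def drift_alerts(readings, window=3, nominal=50.0, tolerance=5.0):
    """Return the smoothed values that exceed the tolerance band."""
    return [m for m in rolling_mean(readings, window)
            if abs(m - nominal) > tolerance]

vibration = [49, 51, 50, 52, 55, 58, 61, 64]  # trending upward
print(drift_alerts(vibration))  # [58.0, 61.0]
```

Predictive maintenance builds on exactly this kind of signal: catching the upward trend before the threshold that triggers unplanned downtime.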


Supporting Multiple API Protocols with Thriftly


Bitfire Safety is a fictional fire protection company. Bitfire Safety provides dry pipe sprinkler system installations for customers that own cold-climate structures, such as parking garages. These systems are installed and configured with a command panel system interface and software that is used to locally monitor and test various aspects of the system. As part of a modernization initiative, Bitfire Safety is enhancing their services to include remote monitoring and issue remediation. They are first concentrating on the monitoring of the supervisory pressure switches. These switches are responsible for ensuring the proper system pressure and will pump or release pressure through a ball valve to maintain the correct levels. Through monitoring, Bitfire Safety can identify when pressures are tracking low or high. Low pressures could be indicative of an air compressor failing or a leak in the system; pressures tracking high could lead to damaging clappers and gaskets in the system, and could pose a safety risk in the event of a fire where open clappers would just bleed off system air rather than delivering water to a fire.
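The supervisory-pressure check described above reduces to classifying a reading against low and high setpoints. Since Bitfire Safety is fictional, the setpoint values below are invented for illustration; a real system would take them from the sprinkler design specification.

```python
# Sketch of supervisory pressure-switch monitoring for a dry pipe
# sprinkler system. Setpoints are hypothetical example values.

LOW_SETPOINT_PSI = 10.0   # below: possible compressor failure or leak
HIGH_SETPOINT_PSI = 20.0  # above: risk of damaging clappers and gaskets

def classify_pressure(psi):
    if psi < LOW_SETPOINT_PSI:
        return "low"      # pump air back in and investigate for leaks
    if psi > HIGH_SETPOINT_PSI:
        return "high"     # bleed pressure through the ball valve
    return "nominal"

for reading in (8.5, 15.0, 22.3):
    print(reading, classify_pressure(reading))
```

Remote monitoring then amounts to streaming these classifications off the command panel instead of reading them locally.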


Building Human Interfaces With Artificial Intelligence

The main trick here is to allow humans to stay human. For decades computers were not exciting to use as they required us to change our ways. We needed to click the right buttons, in the right order to achieve a task. We needed to remember passwords and addresses and know which program to use for different tasks. In essence, we needed to get conditioned to software to use it and to learn how to interface with it before we enjoyed it. When you talk to Cortana, Siri or Google, you don’t need to use a keyboard or a mouse and you can ask questions like "what is the temperature today in the capital of Denmark?" without having to know what the capital is or tell the computer what "today" means. We have a lot of data already out there and computers can analyse the data without extra work from our side. That way we add the extra information the computer needs to give us the right results for the questions we ask.



Quote for the day:



"The final test of a leader is that he leaves behind him in other men, the conviction and the will to carry on." -- Walter Lippmann


Daily Tech Digest - November 19, 2018

Tips for protecting your data when losing an employee


The hard reality is that the majority of your departing employees will try to take company data with them, but there are proactive steps companies can take to ensure their data is safe after the staffers leave. You can’t protect what you don’t know you have. So, the first step is to perform a detailed inventory of your organization’s data and where it’s stored. This involves a thorough audit of the files within your company, which may include in-depth questionnaires for every employee or department. The end result should be a data “map” that details where all of your data is kept, who has access to which files, and when those files were created and modified. Regardless of a former employee’s motives for removing data from your business, if you confront them with evidence of the file-copying, many times they will simply delete or return the files to settle the matter without the need for further action.


Cyber crime: why business should report it as soon as possible

Data breach investigations reveal that some organisations can take weeks or months to discover a cyber attack, but some cyber criminal activities are identifiable immediately, such as distributed denial of service (DDoS) attacks, ransomware and other types of extortion. The message here is not to delay in reporting cyber criminal activity. “Report as soon as possible, particularly if it is a crime in action. We have much more chance of being able to help and of being able to catch the criminals responsible if the crime is reported to us while it is taking place,” says Hulett. The NCA recognises that it can appear to be a “cluttered landscape” from the businesses’ point of view in terms of how to go about reporting a cyber crime, particularly as many organisations will have to report personal data breaches to their data protection authority for the first time under the EU’s General Data Protection Regulation (GDPR) and new GDPR-aligned data protection laws in the UK.


What network pros need to know about IoT

When it comes to IoT, latency is the enemy. With thousands of devices spread across offices, factories, hospitals, and remote locations, more and more data and computing resources will reside on the edges of the network. "I always say, 'I don't care how fast your network is, you don't deploy your car's airbag from the cloud,'" says Shepherd. "Similarly, if I'm an operations person who needs real-time control over a manufacturing line, I want to move computing for process control and quality as close as feasible to the line, so I'm not relying on a wide-area network to respond." By 2022, Gartner estimates that 75% of all enterprise data will be generated and processed on the network's edge. And that raises a host of new data governance issues. Determining which data stays on the edge and what travels across the network can be complicated, says Kimberly Clavin, vice president of engineering for Pillar Technology, which designs IoT solutions for the automotive, healthcare, and retail industries.


These are the programming language features that really matter to developers

In general, developers want more of a safety net when creating complicated applications, writes Thomas Elliott, data scientist at GitHub. That desire for safety and predictability is evident in the rise of languages that support static typing, where developers can specify the type of each variable, allowing many errors to be flagged when code is compiled. "With the exception of Python, we've seen a rise in static typing, likely because of the security and efficiency it offers individual developers and teams working on larger applications," writes Elliott, who adds there is also an increased appetite for languages that make it easier to build stable multi-threaded applications. "TypeScript's optional static typing adds an element of safety, and Kotlin, in particular, offers greater interactivity, all while creating trustworthy, thread-safe programs." Among the fastest-growing languages, Elliott identifies a common theme of modern, more fully featured languages that can interoperate with older languages, and that, in some cases, are starting to supersede them.
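The safety-net idea translates into Python as optional static typing via type hints, which a checker such as mypy flags before the code ever runs; TypeScript and Kotlin apply the same principle to JavaScript and the JVM. The function below is a made-up example of the kind of error a checker catches at compile time.

```python
# Optional static typing: annotations let a type checker reject
# mismatched arguments before runtime, the safety net described above.

def total_cents(prices: list[float]) -> int:
    """Sum a list of prices in dollars and return whole cents."""
    return round(sum(prices) * 100)

ok = total_cents([1.25, 2.50])      # passes the type checker
# bad = total_cents("1.25, 2.50")   # a checker rejects this call;
#                                   # untyped code would fail at runtime
print(ok)  # 375
```

The trade-off the survey data reflects: a little annotation effort up front in exchange for whole classes of errors caught before deployment.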


CarsBlues Bluetooth attack Affects tens of millions of vehicles

A new Bluetooth hack, dubbed CarsBlues, potentially affects millions of vehicles, Privacy4Cars warns. The CarsBlues attack leverages security flaws in the infotainment systems installed in several types of vehicles via Bluetooth, and it affects users who have synced their smartphones to their cars. Privacy4Cars develops a mobile app for erasing PII from vehicles; according to the firm, tens of millions of vehicles could be affected worldwide, and that is a conservative estimate, because the number could be much greater. The riskiest scenario involves drivers who sync their phones to vehicles that have been rented, borrowed, or leased and then returned. Their data might be exposed to attackers who can use it for various malicious purposes. “The attack can be performed in a few minutes using inexpensive and readily available hardware and software and does not require significant technical knowledge,” reads the post published by the company. “As a result of these findings, it is believed that users across the globe who have synced a phone to a modern vehicle may have had their privacy threatened. It is estimated that tens of millions of vehicles in circulation are affected worldwide, with that number continuing to rise into the millions as more vehicles are evaluated.”


IoT Home Assistant API for Raspberry Pi

Home Assistant is an open-source home automation platform running on Python 3. It is used to track and control all devices at home and has many utilities to help with automation control. You can see on the Home Assistant blog how dynamic the community is, with constant updates and upgrades to the platform. We expect Home Assistant to interact with the embryonic API available on the IoT.Starter.Pi thing device. There are many ways to install Home Assistant, since it supports many different hardware platforms. This project focuses on Haspbian, a disk image that contains everything needed to run Home Assistant on a Raspberry Pi. The Haspbian image is built with the same script that generates the official Raspbian images from the Raspberry Pi Foundation: the tool used to create the raspberrypi.org Raspbian images was forked into the home-assistant/pi-gen repository, where the final stages were ripped out and a new stage-3 put in place to install Home Assistant. With the exception of git, all dependencies are handled by the build script.


Can Artificial Intelligence Improve Learning?


Hard data can indeed help identify learning challenges for individual students. Virtual reality can enliven a science lesson visually, and for engineering students, in particular, simulate and break down connections between moving parts in ways that even the most imaginative teacher cannot put together in a lecture. Engineering education in India is being criticized for churning out unemployable graduates in large numbers. Most of them seem to lack communication skills and find themselves at a loss when asked to solve practical challenges in the workplace. Technologies such as Artificial Intelligence and Virtual Reality can help monitor and identify personal preferences and aptitudes. And they can do this much faster than any human, providing the opportunity for much-needed intervention at exactly the stage at which it is required. That is the crux of providing students with a complete vocational experience and making their education relevant to what is required by industry.


A quick guide to important SDN security issues

Traditional network security vulnerabilities are bad enough without adding SDN security issues to the mix. But, as organizations deploy SDN, they risk exposing their networks to new types of threats and attacks, especially if they don't have proper plans in place. A prevalent concern with SDN security focuses on the SDN controller. The controller contains and provides intelligence for the entire network. Whoever has access to the controller has control of the network. This means organizations need to configure policies and design the network to make sure the right people are in charge. Here are four useful tips to help organizations avoid detrimental SDN security issues and get the most from their SDN deployments. ... The SDN controller is a vital part of the security discussion, because successful attacks on the controller can totally disrupt network operations, he said. To combat these attacks, organizations can configure role-based authentication to make sure the right people get access to applications and data.
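Role-based authentication for controller access boils down to a mapping from roles to permitted actions. The roles and permissions below are illustrative, not taken from any particular SDN controller.

```python
# Toy role-based access control for an SDN controller: each role
# carries a set of permitted actions, and every request is checked
# against that set before the controller acts on it.

ROLES = {
    "netops":  {"read_topology", "push_flow_rules"},
    "auditor": {"read_topology"},
}

def authorize(role, action):
    """Return True only if the role's permission set allows the action."""
    return action in ROLES.get(role, set())

print(authorize("netops", "push_flow_rules"))   # True
print(authorize("auditor", "push_flow_rules"))  # False
```

Unknown roles fall through to an empty permission set, so the default is deny, which is the posture you want when the controller is the whole network's brain.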


How open source makes lock-in worse (and better)

Open source creates lock-in? Surely not! Well, surely yes, at least in the enterprise. Why? Because enterprise computing doesn't like change. As hard as it is to get an enterprise to embrace new technologies, once they do, they tend to stick around forever. Remember when mainframes died a decade or two back? Except, of course, they didn't die: Enterprises continue to spend billions each year on old-school tech that had its day back when Flock of Seagulls was still on the radio. Fast forward to Amazon vs. Oracle. Amazon, with a multi-billion dollar database business of its own that directly competes with Oracle's, had every reason to move off the legacy database vendor. And yet it didn't. Year after year, Amazon wrote massive checks worth tens of millions to Oracle, its stated enemy. Finally, on November 9, AWS chief Andy Jassy said that Amazon's consumer business finally weaned itself off Oracle's data warehouse for Amazon Redshift, and was getting close to moving all other applications to Amazon Aurora and DynamoDB.


Robots and the NHS: How automation will change surgery and patient care

Surgeons are one of the first medical specialties to welcome their robot overlords: in the NHS, surgical robots can already be found assisting with a range of operations, including urology, colorectal, and prostate procedures. These robots -- which are made up of a set of arms wielding cameras, lights and medical instruments -- are controlled by a surgeon sitting at a console who is then able to control the actions of the robot's arms with great precision. Using robots means surgeons can make smaller incisions, reducing blood loss and pain for patients, which can mean a faster recovery time and a shorter stay in hospital. That's good news for the patients, who can get back to their normal life quicker, but also good news for the NHS, which has fewer infections and complications to deal with, and sees beds freed up faster. Another attraction is that these robots can reduce the physical burden on surgeons -- bending over patients for several hours a day over years is not kind on the back -- which can allow clinicians to carry on operating for longer.



Quote for the day:


"Honor bespeaks worth. Confidence begets trust. Service brings satisfaction. Cooperation proves the quality of leadership." -- James Cash Penney


Daily Tech Digest - November 18, 2018


“Before integrating any new technologies into American life, we must be absolutely sure that those innovations are imbued with our values,” Democratic Sen. Edward Markey, who sent a letter to Amazon CEO Jeff Bezos expressing his concern about the company's facial recognition services, told BuzzFeed News. “I am not convinced Rekognition passes that test.” By contrast, decision-makers from Orlando seem prepared to go full steam ahead with tests of Amazon’s technology, though emails between city officials and Amazon reveal there were setbacks. Sgt. Eduardo Bernal, a public information officer for the city’s police department, told BuzzFeed News that Amazon provided no hands-on training on Rekognition, just standard documentation. Test results were flawed. There were miscommunications, including an embarrassing misstep that required an apology from Amazon — to the public and to Orlando PD.


Alphabet stops its project to create a glucose-measuring contact lens for diabetes patients

Google smart contact lens to measure glucose levels in tears.
"Our clinical work on the glucose-sensing lens demonstrated that there was insufficient consistency in our measurements of the correlation between tear glucose and blood glucose concentrations to support the requirements of a medical device," the company said. Verily made a big splash when it first launched the program in 2014, while it was still known as Google Life Sciences. The company partnered with Alcon, Novartis' eye-care division, on the project. However, it's been quiet about the project in the past few years, leading to speculation that it was winding down. Verily said it did have some success with the experiment in a controlled environment, but not in actual tests, because of the dynamic environment of the eye. It's a problem that goes beyond Verily. Billions of dollars have been spent on research and development, but companies across both technology and life sciences have struggled. There's even a book dedicated to documenting these failures, titled "The Pursuit of Noninvasive Glucose: Hunting the Deceitful Turkey."


Is the Ransomware Scare Over?

The primary emphasis in ransomware preparation, other than user education and perimeter defense, is backups. In response to ransomware, IT needs to protect all data more frequently, including file servers and endpoints. To some extent, backing up all data is data protection 101, but in our experience, most organizations back up most of their data, except for critical applications, only once per night. Ransomware makes once-per-night backups obsolete. While the public announcement of ransomware attacks may be down, the “creativity” of these attacks is on the rise. According to Proofpoint, the number of ransomware variants is up 30X. The variations make it harder for perimeter defense solutions to detect them. Some of the variants specifically attack components of the data protection process, like protected data stores and backup configuration files. Also, some malware strains now sit idle instead of immediately executing their encryption attack. This ensures that the malware file is backed up repeatedly by the data protection process.


Why tech-enabled go-to-market innovation is critical for industrial companies

While most industrial companies have come to terms with the need to make more strategic use of technology, they are often unsure of how to proceed or are focused on the wrong initiatives, resulting in halting action and a failure to build significant value. On the other hand, those companies that move quickly and decisively to transform their go-to-market channels, models, and culture through technology should be able to unlock substantial value: top-quartile B2B players generate 3.5 percent more revenue and are 15 percent more profitable than the rest of the B2B field. Our detailed analysis has identified a pool of $74 billion to $298 billion in revenue growth that could be tapped through enabling technology in sales (Exhibit 2). The value comes primarily through new customer experiences, refined pricing, and enhanced selling processes. ... Our experience in working with dozens of industrial companies has helped to identify where the main source of value is across the four main steps of the selling process: the presales stage, the sales process, the transaction itself, and IoT-enabled selling.


Big banks are not feeling the FinTech heat (yet)

It’s the push-pull syndrome. FinTech apps push a lot of information to me because they’re intelligent; big bank apps force me to pull the information because they’re dumb. FinTech apps can predict and present my financial lifestyle to me intelligently; big bank apps show me what I’ve spent in a traditional debit and credit ledger that has no insight at all. Or that’s my experience of two of the most frequently used big bank apps. They’re pretty dumb. Meantime, my experience of some of the most popular FinTech apps is the opposite. ... Top of the FinTechs is established payments unicorn TransferWise, with just 0.5 percent of the visitor share in the most recent week. Revolut, which recently announced it had signed up 1 million UK users, has just 0.3 percent of the market share, while Starling Bank has 0.2 percent. Traditional banks even dominate the new downloads list, though Starling manages to sneak into the top 10, with 4.6 percent of downloads in the most recent week.


How Do HIPAA Regulations Apply to Wearable Devices?

HIPAA regulations could potentially apply to new technologies used by covered entities and business associates.
How HIPAA regulations apply to wearable devices is a very difficult issue, Spencer said. “There is a lot of ambiguity about exactly where HIPAA is triggered and where it's not,” she stated. “The only real clarity is where a company that offers a wearable, or a mobile app that collects health information, where that arrangement is just directly between the device maker and the individual. Or it’s between the app maker and the individual, and there's no covered entity or business associate involved. Then there's no application of HIPAA, that's clear.” HIPAA regulations only apply to covered entities and business associates, Spencer reiterated. These include health plans, healthcare clearinghouses, and certain healthcare providers that engage in certain payment and other financial transactions. Business associates are those organizations that specifically have access to health information to provide a service or perform a function on behalf of a covered entity, she noted.


Are We Nearing The End Of Hadoop And Big Data?

So, it’s no longer just Hadoop. Cloudera chief executive Tom Reilly admitted as much in his comments after the merger: “Hadoop has evolved so drastically that we don’t even mention it anymore.” This analysis provides an overview of the different options available to enterprises instead of using Hadoop. And you have to wonder, if this trend continues, what the future will be for the technology. As the author writes, “The center of gravity has moved elsewhere.” What this development represents is how big data is now becoming just data. Every organization, large and small, now has access to data of greater quantity, quality, and currency than at any time in history. They have more technological options to build services using this data — and this is important because different use cases (using different types of data) mean it’s possible to choose the right technology for what you need. For example, there are numerous open-source options, as well as proprietary machine learning platforms. Many of these make the 10-year-old Hadoop technology look dated.


In bigger crackdown of crypto abuses, SEC goes after unregistered coin offerings

The U.S. Securities and Exchange Commission in Washington, D.C.
The settlement comes a week after the agency notched another "first," settling charges that a crypto firm called EtherDelta was operating as an unregistered exchange. The cases underscore the SEC's insistence that the relatively new digital financial products must follow traditional securities rules. "We have made it clear that companies that issue securities through ICOs are required to comply with existing statutes and rules governing the registration of securities," Stephanie Avakian, the SEC's co-director of enforcement, said in a statement. "These cases tell those who are considering taking similar actions that we continue to be on the lookout for violations of the federal securities laws with respect to digital assets." On Thursday, federal prosecutors in New York announced a guilty plea by a man who defrauded investors with two cryptocurrencies he founded during the initial coin offering boom.


All Roads Lead to Liquidation: Crypto Companies Cash in Big

The rising trend of acquisition could be the result of simple, sudden opportunity. Of the Bitstamp acquisition, CEO Nejc Kodrič said that “the sale wasn’t planned. There was no active effort to go around and solicit buyers. The vibrant industry last year sparked potential interest from buyers to make a footprint in the industry. We started to get approached by buyers in the middle of last year.” Indeed, acquisition is a swift, simple way for a company’s owners to profit while maintaining some control over the company’s operations. Kodrič still holds a 10 percent stake in the company; Damian Merlak, his co-founder, sold all of his 30 percent stake. Generally speaking, “the benefits of [acquisition] include receiving valuable intellectual property and the talented employees of the acquired company – those are precious resources that can help companies grow quickly. Communities and a new user-base are also precious resources the acquirer gets after the deal,” explained Ruslan Gavrilyuk, co-founder and president of Kepler Finance.


Spark Application Performance Monitoring using Uber JVM Profiler, InfluxDB and Grafana

Apache Spark provides a web UI and REST API for metrics. Spark also provides a variety of sinks, including console, JMX, servlet, and Graphite sinks. There are a few other open-source performance monitoring tools available, like dr-elephant, sparklint, prometheus, etc. The metrics provided by these tools are mostly server-level metrics, and a few of them also provide information about running applications. Uber JVM Profiler collects both server-level and application code metrics. This profiler can collect all metrics (CPU, memory, buffer pool, etc.) from the driver, executor, or any JVM. It can instrument existing code without modifying it, so it can collect metrics about methods, arguments, and execution time. For storing metrics for time-series analysis, we will use InfluxDB, which is a powerful time-series database. We will extend Uber JVM Profiler and add a new reporter for InfluxDB so metrics data can be stored using its HTTP API. For the dashboard of graphs and charts we will use Grafana, which will query InfluxDB for metrics data.
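The reporter extension described above can be sketched in a few lines: Uber JVM Profiler supports pluggable reporters, and InfluxDB accepts batches of line-protocol text over a simple HTTP write endpoint. The sketch below is illustrative Python (the real profiler is written in Java), and the measurement name, tags, database name, and URL are assumptions for the example, not the article's actual code.

```python
import urllib.request

def to_line_protocol(measurement, tags, fields, timestamp_ns):
    """Encode one metric point in InfluxDB line protocol."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {timestamp_ns}"

def report(points, url="http://localhost:8086/write?db=metrics"):
    """POST a batch of encoded points to InfluxDB's HTTP write endpoint."""
    body = "\n".join(points).encode("utf-8")
    req = urllib.request.Request(url, data=body, method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status

# One CPU sample from an executor JVM, encoded but not sent here.
point = to_line_protocol(
    "jvm_cpu", {"role": "executor", "app": "spark-app-1"},
    {"process_cpu_load": 0.42}, 1542796800000000000)
```

A reporter like this would be called by the profiler on each collection interval; Grafana then queries the same database to render the dashboards.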



Quote for the day:


"We get our power from the people we lead, not from our stars and our bars." -- J. Stanford


Daily Tech Digest - November 17, 2018

The researchers from New York University detail in a new paper how they used a neural network to create 'DeepMasterPrints', or realistic synthetic fingerprints that have the same ridges visible when rolling an ink-covered fingertip on paper. The attack is designed to exploit systems that match only a portion of the fingerprint, like the readers used to control access to many smartphones. The aim is to generate fingerprint-like images that match multiple identities to spoof one identity in a single attempt. DeepMasterPrints are an improvement on the MasterPrints the researchers developed last year, which relied on modifying details from already captured fingerprint images used by a fingerprint scanner for matching purposes. The previous method was able to mimic the images stored in the file, but couldn't create a realistic fingerprint image from scratch. The researchers tested DeepMasterPrints against NIST's ink-captured fingerprint dataset and another dataset captured from sensors.


The strategy of treating containers as logically identical units that can be replaced, spun up, and moved around without much thought works really well for stateless services but is the opposite of how you want to manage distributed stateful services and databases. First, stateful instances are not trivially replaceable, since each one has its own state which needs to be taken into account. Second, deployment of stateful replicas often requires coordination among replicas—things like bootstrap dependency order, version upgrades, schema changes, and more. Third, replication takes time, and the machines which the replication is done from will be under a heavier load than usual, so if you spin up a new replica under load, you may actually bring down the entire database or service. One way around this problem—which has its own problems—is to delegate the state management to a cloud service or database outside of your Kubernetes cluster. That said, if you want to manage all of your infrastructure in a uniform fashion using Kubernetes, then what do you do?
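The bootstrap-dependency problem described above can be made concrete with a toy sketch: replica N must not start until replica N-1 is ready, which is the ordered-deployment guarantee that Kubernetes StatefulSets provide for exactly this reason. The replica names and the readiness check below are illustrative assumptions, not real cluster calls.

```python
def bootstrap_in_order(replicas, is_ready):
    """Start stateful replicas one at a time, waiting on each predecessor."""
    started = []
    for i, name in enumerate(replicas):
        # A stateless Deployment would launch all of these in parallel;
        # stateful bootstrap must respect the dependency order.
        if i > 0 and not is_ready(started[-1]):
            raise RuntimeError(f"predecessor {started[-1]} not ready; halting")
        started.append(name)
    return started

# With every predecessor ready, the replicas come up strictly in order.
order = bootstrap_in_order(["db-0", "db-1", "db-2"], lambda name: True)
```

The same ordering discipline applies in reverse for scale-down and upgrades, which is why stateful workloads resist the "identical, disposable unit" model.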


A data lake is where vast amounts of raw data, or data in its native format, are stored, unlike a data warehouse, which stores data in files or folders (a hierarchical structure). Data lakes provide unlimited space to store data, unrestricted file size, and a number of different ways to access data, as well as providing the tools necessary for analysing, querying, and processing. In a data lake, each data item is assigned a unique identifier and metadata tags. In this way the data lake can be queried for relevant data, and that smaller set of relevant data can be analysed. Data can also be stored in data lakes before being curated and moved to a data warehouse. ... The Azure Data Lake is a Hadoop Distributed File System (HDFS), enabling Microsoft services such as Azure HDInsight and Revolution R Enterprise, as well as industry Hadoop distributions like Hortonworks and Cloudera, to connect to it. Azure Data Lake has all Azure Active Directory features, including Multi-Factor Authentication, conditional access, role-based access control, application usage monitoring, and security monitoring and alerting.
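The identifier-plus-metadata-tags pattern described above can be sketched in a few lines. This is a toy in-memory stand-in, not any particular data lake product; the tag names and sample payloads are invented for the example.

```python
import uuid

lake = {}

def ingest(raw_bytes, **tags):
    """Store raw data in its native format with a unique ID and metadata tags."""
    item_id = str(uuid.uuid4())
    lake[item_id] = {"data": raw_bytes, "tags": tags}
    return item_id

def query(**criteria):
    """Return IDs of items whose metadata tags match all criteria."""
    return [item_id for item_id, item in lake.items()
            if all(item["tags"].get(k) == v for k, v in criteria.items())]

# Raw items of different shapes sit side by side; tags make them findable.
ingest(b"<sensor dump>", source="plant-a", kind="telemetry")
ingest(b'{"user": 1}', source="web", kind="clickstream")
matches = query(kind="telemetry")  # the smaller, relevant subset
```

Querying by tags yields the smaller relevant set for analysis, exactly the workflow the paragraph describes before data is curated into a warehouse.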


Harvard researchers want to school Congress about AI

Funded by HKS’s Shorenstein Center on Media, Politics, and Public Policy, the initiative will focus on expanding the legal and academic scholarship around AI ethics and regulation. It will also host a boot camp for US Congress members to help them learn more about the technology. The hope is that with these combined efforts, Congress and other policymakers will be better equipped to effectively regulate and shepherd the growing impact of AI on society. Over the past year, a series of high-profile tech scandals have made increasingly clear the consequences of poorly implemented AI. This includes the use of machine learning to spread disinformation through social media and the automation of biased and discriminatory practices through facial recognition and other automated systems. In October, at the annual AI Now Symposium, technologists, human rights activists, and legal experts repeatedly emphasized the need for systems to hold AI accountable.  “The government has the long view,” said Sherrilyn Ifill, president and director-counsel of the NAACP Legal Defense Fund.


Role of digitisation and technologies like AI & ML in digital transformation of SMEs?


More specifically, AI-based solutions like automation can be greatly beneficial to SMEs in streamlining several processes like sales planning, managing finances and supply chain, marketing, etc. These processes, which most SMEs still conduct through offline methods, considerably reduce the efficiency of the enterprise, since the managers’ focus is largely on the operations rather than on serving customers and retaining them. Simultaneously, digitised business management and enterprise mobility solutions can enable SMEs to expand their business to any region within the country or outside, without having to worry about the associated infrastructural and monetary challenges.

Customised, enterprise-centric solutions with AI and Machine Learning: Every organisation faces a different set of issues and challenges. The solutions, then, to effectively tackle these challenges should also be specific to the business segment, as well as the industry, which the enterprise is involved in.


What Edge Computing Means for Infrastructure and Operations Leaders

Edge computing solutions can take many forms. They can be mobile in a vehicle or smartphone, for example. Alternatively, they can be static — such as when part of a building management solution, manufacturing plant or offshore oil rig. Or they can be a mixture of the two, such as in hospitals or other medical settings. The capabilities of edge computing solutions range from basic event filtering to complex-event processing or batch processing. “A wearable health monitor is an example of a basic edge solution. It can locally analyze data like heart rate or sleep patterns and provide recommendations without a frequent need to connect to the cloud,” says Rao. More complex edge computing solutions can act as gateways. In a vehicle, for example, an edge solution may aggregate local data from traffic signals, GPS devices, other vehicles, proximity sensors and so on, and process this information locally to improve safety or navigation. More complex still are edge servers, such as those found in next-generation (5G) mobile communication networks.


The rare form of machine learning that can spot hackers who have already broken in


In cybersecurity, supervised learning works pretty well. You train a machine on the different kinds of threats your system has faced before, and it chases after them relentlessly. But there are two main problems. For one, it only works with known threats; unknown threats still sneak in under the radar. For another, supervised-learning algorithms work best with balanced data sets—in other words, ones that have an equal number of examples of what it’s looking for and what it can ignore. Cybersecurity data is highly unbalanced: there are very few examples of threatening behavior buried in an overwhelming amount of normal behavior. Fortunately, where supervised learning falters, unsupervised learning excels. The latter can look at massive amounts of unlabeled data and find the pieces that don’t follow the typical pattern. As a result, it can surface threats that a system has never seen before, and it needs only a few anomalous data points to do so.
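As a minimal illustration of the unsupervised approach, the sketch below learns a baseline from unlabeled data and flags the few points that deviate from it. Real products use far richer models; the z-score method, the 3-sigma threshold, and the login-count data are simplifying assumptions for the sketch.

```python
from statistics import mean, stdev

def anomalies(values, z_threshold=3.0):
    """Return indices of values far from the unlabeled data's own baseline.

    No labels are used: the 'typical pattern' is estimated from the data
    itself, so previously unseen behavior can still be surfaced.
    """
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values)
            if sigma > 0 and abs(v - mu) / sigma > z_threshold]

# 200 "normal" login counts plus one burst that was never labeled a threat.
traffic = [10, 11, 9, 10, 12] * 40 + [500]
suspicious = anomalies(traffic)
```

Note how unbalanced the data is (one anomaly in 201 points), which is exactly the regime where supervised classifiers struggle and anomaly detection shines.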


Building a Web App With Yeoman

Released in 2012, Yeoman is an efficient open-source software system for scaffolding web applications, used for streamlining the development process. It is known primarily for its focus on scaffolding, which means the use of many different tools and interfaces coordinated for optimized project generation. Yeoman is hosted on GitHub. The Yeoman experience is three-tiered. Though they work together seamlessly, each part of Yeoman was developed separately and works individually. First, Yeoman includes "Yo," the command-line utility used with Yeoman. This is the baseline of the Yeoman software platform. Next, Yeoman has "Grunt" and "Gulp," which are application builders that help automate your application development. Finally, the Yeoman software features "npm," which is a package manager. Package managers manage code packages for back-end and front-end development, and their dependencies, for you to develop your application. Yeoman provides developers with many options to combine in their development process.


Enterprise architecture still matters


Rather than checking in on how each team is operating, EAs should generally focus on the outcomes these teams have. Following the rule of team autonomy (described elsewhere in this booklet), EAs should regularly check on each team’s outcomes to determine any modifications needed to the team structures. If things are going well, whatever’s going on inside that black box must be working. Otherwise, the team might need help, or you might need to create new teams to keep the focus small enough to be effective. Most cloud-native architectures use microservices, hopefully to safely remove dependencies that can deadlock each team’s progress as they wait for a service to update. At scale, it’s worth defining how microservices work as well: for example, are they event-based, how is data passed between different services, how should service failure be handled, and how are services versioned? Again, a senate of product teams can work at a small scale, but not on the galactic scale.


Put Your BLL Monster in Chains

A very popular architecture for enterprise applications is the triplet Application, Business Logic Layer (BLL), Data Access Layer (DAL). For some reason, as time goes by, the Business Layer starts getting fatter and fatter, losing its health in the process. Perhaps I was doing it wrong. Somehow, very well designed code gets old and turns into a headless monster. I ran into a couple of these monsters that I have been able to tame using FubuMVC's behaviour chains, a pattern designed for web applications that I have found useful for breaking down complex BLL objects into nice, maintainable pink ponies. ... High code quality is very important if you want a maintainable application with a long lifespan. By choosing the right design patterns and applying some techniques and best practices, any tool will work for us and produce really elegant solutions to our problems. If, on the other hand, you learn just how to use the tools, you are going to end up programming for the tools and not for the ones that sign your pay-checks.
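The behaviour-chain idea can be sketched outside FubuMVC as well. Below is an illustrative Python transliteration (FubuMVC itself is a .NET framework): each behavior does one small job and then invokes the next link, so a fat BLL method dissolves into small, composable, individually testable pieces. All class names and the discount logic are invented for the example.

```python
class Behavior:
    """One link in a chain; delegates to the next link when done."""
    def __init__(self, inner=None):
        self.inner = inner  # next behavior in the chain, or None at the end

    def invoke(self, context):
        if self.inner:
            self.inner.invoke(context)

class Validate(Behavior):
    def invoke(self, context):
        if not context.get("order"):
            raise ValueError("missing order")
        super().invoke(context)

class ApplyDiscount(Behavior):
    def invoke(self, context):
        context["total"] = context["order"]["amount"] * 0.9
        super().invoke(context)

class Persist(Behavior):
    def invoke(self, context):
        context["saved"] = True  # stand-in for the DAL call
        super().invoke(context)

# Compose the chain instead of writing one monolithic BLL method.
chain = Validate(ApplyDiscount(Persist()))
ctx = {"order": {"amount": 100}}
chain.invoke(ctx)
```

Because each behavior only knows about its inner link, steps can be reordered, replaced, or unit-tested in isolation, which is what keeps the "monster" in chains.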



Quote for the day:


"A positive attitude will not solve all your problems. But it will annoy enough people to make it worth the effort." -- Herm Albright