Daily Tech Digest - June 15, 2017

How big data is disrupting education

Some people are visual learners and others are more hands-on. Also, some people are simply good at taking tests. There’s a difference between passing a test on a subject and actually mastering that subject. Big data analytics allows educators to use a wider range of sources when grading a student, producing a more realistic picture of how well they’ve learned a subject. It also provides the opportunity to... Data analytics has pointed out some startling truths about education in America's inner cities. Studies show that inner-city schools are more likely to hire less-educated teaching staff to reduce costs. Unfortunately, those cost savings have translated directly into lower grades and higher dropout rates. This problem, which big data brought to light, can be monitored in real time and addressed using the same tools.

Trends Shaping Machine Learning in 2017

Businesses greatly value data as a basis for appropriate action, whether to understand consumer demand or to assess a company’s financial standing. However, it is not the data alone they should value, because without an appropriate algorithm, that data is worth nothing. Peter Sondergaard, Senior Vice President of Gartner Research, says that “data is inherently dumb and the real value lies in the algorithms which deduce meaningful results from a cluster of meaningless data." The algorithm economy has taken center stage for the past couple of years, and the trend is expected to continue as machine learning tools develop further. The algorithm economy will distinguish small players from the market dominators in 2017. Small businesses that have just entered the transitional phase of embedding machine learning processes in their business models will be using canned algorithms in tools such as BI, CRM and predictive analytics.
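The "canned algorithm" idea above can be sketched in a few lines: an off-the-shelf model (here scikit-learn's LogisticRegression, chosen as one hypothetical example; the customer-segment data is synthetic) turns raw data into predictions with no bespoke modelling work.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Toy stand-in for business data: two well-separated customer segments.
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

model = LogisticRegression().fit(X, y)   # the "canned" part: one line
print(model.score(X, y))                 # separable data -> near-perfect fit
```

The value the excerpt describes lies less in this one-liner than in choosing which algorithm to apply to which data.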

The evolution of data center networks

With an open platform powering each, we no longer have to solve the same problem twice (once for networking, and once for compute). Often, the two solutions are vastly different for no reason other than history. Computing was opened up when Linux replaced proprietary server OSes; this changed the application landscape as we know it. I think solutions such as cloud computing and Google search would have been harder to invent if not for the widespread use of Linux. For too long people have innovated around networks, not with them. If networking can be opened up as computing has been, and if people can innovate with the network rather than around it, I think networking and computing as we know them can change dramatically.

Stakeholders Collectively Designing Future Of Artificial Intelligence

The goals of the partnership are to understand the main issues “in the pervasive use of AI in our lives,” along with economic and societal impacts such as data policy, data privacy, data ownership, and sharing. ... A number of issues have been identified for further focus over the next two years, she said, such as safety, trustworthiness and transparency. IBM Watson, as explained by WhatIs.com, is “an IBM supercomputer that combines artificial intelligence (AI) and sophisticated analytical software for optimal performance as a ‘question answering’ machine.” According to Techopedia, “IBM’s Watson supercomputer is a question-answering supercomputer that uses artificial intelligence to perform cognitive computing and data analysis. With a processing rate of 80 teraflops, Watson performs deep-level input analysis to present accurate answers to questions generated by humans.”

Miniservices may deliver, much to microservices purists' chagrin

Scalability is probably one of the big reasons why we've kind of been looking at moving toward a microservices architecture. I think, in that case, it's fair to say HTTP doesn't solve the problem. I'm not sure that there's a firm line in the sand here. [But] just because you've got a hundred services or a thousand services doesn't necessarily mean that HTTP is the wrong choice. It may be just as functional as any other integration technology. But, you will get to a point, I'm sure, where the amount of traffic starts to get in your way. And then the other angle around HTTP, or within web-centric integration, that's perhaps useful is the concept of governance. HTTP affords us more capability there simply because it gives us an easier way of seeing where traffic is coming from and flowing to.

Facebook teaches machines to negotiate with humans

Over the course of the interactions, machines naturally adopted many common negotiation tactics — like placing false emphasis on a low-value item in an attempt to use it as a more valuable bargaining chip later. Under the hood, Facebook’s rollout technique takes the form of a decision tree. Decision trees are a critical component of many intelligent systems. They allow us to model future states from the present to make decisions. Imagine a game of tic-tac-toe: at any given point in the game, there is a finite option set (the places you can put your “X” on the board). In that scenario, each move has an expected value. Humans don’t usually consider this value in an explicit way, but if you decompose your decision process when playing the game, you are effectively shorthanding this math in your head.
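The tic-tac-toe analogy can be made concrete with a tiny rollout: score each legal move by playing the game tree out to its end states (+1 X wins, 0 draw, -1 O wins) and pick the best-valued one. This is a toy illustration of the expected-value idea, not Facebook's actual dialogue-rollout code.

```python
# All eight winning lines on a 3x3 board, indexed 0..8 left-to-right, top-to-bottom.
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(board):
    for a, b, c in LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def value(board, to_move):
    """Game value for X assuming both sides play optimally from here."""
    w = winner(board)
    if w:
        return 1 if w == "X" else -1
    moves = [i for i, c in enumerate(board) if c == " "]
    if not moves:
        return 0  # draw
    nxt = "O" if to_move == "X" else "X"
    vals = [value(board[:m] + to_move + board[m + 1:], nxt) for m in moves]
    return max(vals) if to_move == "X" else min(vals)

def best_move(board, player):
    """Roll out every legal move and return the one with the best value."""
    nxt = "O" if player == "X" else "X"
    moves = [i for i, c in enumerate(board) if c == " "]
    scored = {m: value(board[:m] + player + board[m + 1:], nxt) for m in moves}
    pick = max if player == "X" else min
    return pick(scored, key=scored.get)

# X completes the top row (index 2) for an immediate win.
print(best_move("XX OO    ", "X"))  # -> 2
```

Negotiation differs from tic-tac-toe in that the opponent's responses are uncertain, so Facebook's system simulates likely continuations rather than enumerating them exhaustively — but the decision-tree shape of the computation is the same.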

Public cloud services show no sign of slowing down

Cloud experts predict growth in the PaaS realm. "For ten years, cloud companies focused on getting companies to purchase IaaS," said Maribel Lopez, strategic advisor at Lopez Research. "Today, cloud providers are delivering more value in the PaaS area to help people build new mobile-cloud ready apps as well as cybersecurity services that will help companies secure everything from devices to network and cloud-resident services." "As companies put more data into the cloud," said Lopez, "developing a richer set of services as a way to manage and secure corporate data has become a critical set of features and services for cloud providers. The cybersecurity market is the toughest market to crack but has tremendous opportunity." While it is clear that the cloud is showing signs of huge growth, it has still not quite caught up with the use of data centers.

Fintech developers tell you what to look for and why the fintech revolution arose

"Connecting the dots is what matters these days, so the trend is towards establishing a data lake for investigation. AI needs data to be effective, the more the merrier, and the rise of 'big' unstructured data from social media, news and other sources helps this, as does the cloud as a connective layer. More data translates into better insights for behavioral systems that spot fraud or money-laundering patterns, for financial market surveillance systems, and for other such applications. These are the key trends and technologies for me." What you do with the technology is important too. If people, process and technology aren't aligned and embedded in stringent management, risk and governance layers, then the full effectiveness of any new technical development won't be unleashed. People rely on machines and vice versa.

7 Techniques to Handle Imbalanced Data

What do datasets in domains like fraud detection in banking, real-time bidding in marketing, or intrusion detection in networks have in common? Data used in these areas often contains less than 1% of rare, but “interesting” events. However, most machine learning algorithms do not work very well with imbalanced datasets. ... It is noteworthy that cross-validation should be applied properly when using over-sampling to address imbalance problems. Keep in mind that over-sampling takes observed rare samples and applies bootstrapping to generate new random data based on a distribution function. If over-sampling is applied before the cross-validation split, we are effectively overfitting our model to a specific artificial bootstrapping result. That is why the cross-validation split should always be made before over-sampling the data, just as it should be made before feature selection.
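The split-before-oversampling rule can be sketched as follows: the data is partitioned into folds first, and the minority class is bootstrapped up only within each training portion, so no resampled copy can leak into the fold used for validation. The labels here are synthetic and the fold logic is a minimal NumPy stand-in for a real cross-validation utility.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
y = (rng.random(n) < 0.1).astype(int)      # ~10% minority class: imbalanced
indices = rng.permutation(n)
folds = np.array_split(indices, 5)          # split FIRST

balanced_train_sets = []
for k in range(5):
    train_idx = np.concatenate([folds[j] for j in range(5) if j != k])
    # Bootstrap minority rows up to the majority count -- training rows only.
    minority = train_idx[y[train_idx] == 1]
    majority = train_idx[y[train_idx] == 0]
    boot = rng.choice(minority, size=len(majority), replace=True)
    balanced_train_sets.append(np.concatenate([majority, boot]))

# Every rebalanced training set is exactly 50/50 ...
print(all(y[t].mean() == 0.5 for t in balanced_train_sets))
# ... and shares no rows with its held-out validation fold.
print(all(set(balanced_train_sets[k]).isdisjoint(folds[k]) for k in range(5)))
```

Doing it in the opposite order — oversampling the whole dataset and then splitting — would place bootstrapped duplicates of training rows into validation folds, inflating the measured score exactly as the excerpt warns.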

Will Augmented and Virtual Reality Replace the Bank Branch?

Banking bots will soon be able to support most banking queries, with 24/7/365 access benefiting the consumer and significant cost savings being the driving force for the banking industry. Once the transition is complete for simple teller transactions, will the technology be used for full-function robo-advising as well? ... USAA was the first to offer limited voice banking, with Capital One enabling banking by voice, connecting via Amazon’s Alexa – embedded in devices like the Amazon Echo, Echo Dot and newly introduced devices. Capital One customers can check balances, review transactions, make payments and more using simple voice commands. Bank of America is also working on voice recognition technology called Erica that will allow people to do virtual banking by voice with a computer, similar to how people already use Amazon Alexa or Apple’s Siri.

Quote for the day:

"A pessimist is somebody who complains about the noise when opportunity knocks." -- Oscar Wilde
