June 05, 2016

The barbell effect of machine learning

One of machine learning’s most lasting areas of impact will be to democratize basic intelligence through the commoditization of an increasingly sophisticated set of semantic and analytic services, most of which will be offered for free, enabling step-function changes in software capabilities. These services today include image recognition, translation and natural language processing and will ultimately include more advanced forms of interpretation and reasoning. Software will become smarter, more anticipatory and more personalized, and we will increasingly be able to access it through whatever interface we prefer – chat, voice, mobile application, web, or others yet to be developed. Beneficiaries will include technology developers and users of all kinds.


Unstructured Data Miners Chase Silver with Deep Learning

There’s a tremendous amount of value hidden in unstructured data, including social media posts, news stories, legal documents, and other free sources of data. But as indico CEO Slater Victoroff explains, it can be very difficult to extract useful insights, such as sentiment, from these sources. “We found the majority of people are either not doing anything with unstructured data, or not doing nearly enough with it,” Victoroff tells Datanami. “The main barrier to sentiment analysis is not making a better model. It’s getting more data.” Indico claims it has come up with a better way to analyze unstructured data. While much of how the platform works is secret, key breakthroughs involve the use of silver-standard training corpora combined with transfer learning techniques to accelerate the training of its recurrent neural network (RNN) model.
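As a rough illustration of those two ideas (indico’s actual pipeline is proprietary and RNN-based; the data, labels, and scikit-learn model below are invented for the sketch), one can pretrain a classifier on a large, noisily labelled “silver-standard” corpus and then keep training it on a small hand-labelled set:

```python
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

# Silver-standard data: plentiful but noisy labels from a cheap heuristic
# (here, emoticons). Gold data: scarce but hand-labelled.
silver_texts = ["loved it :)", "great value :)", "total junk :(", "broke fast :("]
silver_labels = [1, 1, 0, 0]
gold_texts = ["subtle but effective", "disappointing build quality"]
gold_labels = [1, 0]

vec = HashingVectorizer(n_features=2**16)
clf = SGDClassifier(loss="log_loss", random_state=0)

# Step 1: pretrain on the noisy silver corpus.
clf.partial_fit(vec.transform(silver_texts), silver_labels, classes=[0, 1])
# Step 2: transfer by continuing training on the small gold set, reusing
# the weights learned in step 1 instead of starting from scratch.
clf.partial_fit(vec.transform(gold_texts), gold_labels)

print(clf.predict(vec.transform(["really pleased with this"])))
```

The second partial_fit call starts from the weights learned on the cheap silver data rather than from zero, which is the essence of the transfer step.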


Tech Conferences: Where the Conversation Has Been and Where It Should Go

Security conferences are an important way to educate and engage an audience beyond the security professionals and hackers who attend. The most important conversation we can have is that cybersecurity can no longer be an aftermarket consideration; it needs to be built into every new network and infrastructure system. For example, the idea of collective problem solving would “crowdsource” information from beyond just cybersecurity professionals, drawing on the collective knowledge of individuals with diverse skill sets to identify and address where existing technologies are lacking and vulnerable. These conferences strive to foster community building among different groups of people working on the same problems. They create an open space where ideas can be shared, people can connect, and collaboration can flourish, ultimately leading to better technology and stronger systems.


Machine Learning Is Redefining The Enterprise In 2016

The good news for businesses is that all the data they have been saving for years can now be turned into a competitive advantage and lead to strategic goals being accomplished. Revenue teams are using machine learning to optimize promotions, compensation and rebates to drive the desired behavior across selling channels. Predicting propensity to buy across all channels, making personalized recommendations to customers, forecasting long-term customer loyalty and anticipating potential credit risks of suppliers and buyers are among the most common applications today. Figure 1 provides an overview of machine learning applications by industry.
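As a minimal sketch of one item on that list, propensity-to-buy scoring, here is a logistic regression fit on synthetic data (the feature names and coefficients are invented for the example; a real model would be trained on a company’s own CRM and sales history):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic engagement features for 1,000 hypothetical prospects.
rng = np.random.default_rng(42)
n = 1000
visits = rng.poisson(5, n)              # site visits last quarter
emails_opened = rng.integers(0, 10, n)  # marketing emails opened
past_purchases = rng.integers(0, 4, n)  # prior orders

# Invented ground truth: engagement loosely drives purchase probability.
logit = 0.3 * visits + 0.2 * emails_opened + 0.8 * past_purchases - 4.0
bought = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([visits, emails_opened, past_purchases])
model = LogisticRegression().fit(X, bought)

# Score a new prospect: estimated probability of buying next period.
print(model.predict_proba([[8, 6, 2]])[0, 1])
```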


Why stop at DevOps? How about AllOps?

We fully embrace this mindset, but we think it is too limiting. Why stop at providing seamless integration and automation for just the developers? After all, there are many other teams that surround and interact with IT operations. Why not let them do the same thing: use their tools of choice to interact with the operations environment in a fully integrated and automated manner? For instance, the virtualization admin may want to provision new infrastructure, update existing infrastructure, or see how virtual resources, such as networks, are mapped to the physical infrastructure. Rather than requiring two teams, two tools, and manual processes, shouldn’t the virtualization admin expect a VirtOps environment? Shouldn’t they be able to work in their preferred virtualization tool and still interact with the physical infrastructure in an integrated and automated way?
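There is no standard “VirtOps” API today, but a hypothetical sketch makes the idea concrete; the endpoint, payload fields, and token below are all invented for illustration:

```python
import requests

# Hypothetical only: the endpoint, fields, and token are invented to show
# how a virtualization admin's tool might drive physical infrastructure
# through one integrated, automated interface.
OPS_API = "https://ops.example.com/api/v1"
HEADERS = {"Authorization": "Bearer <token>"}

# Provision new infrastructure directly from the virtualization tool...
spec = {"cluster": "prod-east", "cpus": 16, "memory_gb": 64,
        "network": "vlan-120"}
host = requests.post(f"{OPS_API}/hosts", json=spec, headers=HEADERS).json()

# ...then ask how a virtual network maps onto the physical fabric.
mapping = requests.get(f"{OPS_API}/networks/vlan-120/physical-ports",
                       headers=HEADERS).json()
print(host, mapping)
```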


A Self Motivated Tester in an Agile World

We have to concentrate on the ‘Agile Manifesto’, which tells us to value individuals and interactions over processes and tools. As testers, we need to work as members of the team and to engage with others within it, regardless of titles. The role of a tester is to test; however, if you see a task that is stuck and needs immediate attention, you should take it up even if it is part of the developer’s job. Titles and roles should carry little weight in such matters. Regular and timely communication is another key aspect: use the daily scrum meeting as the best platform to communicate effectively with the entire team. We have to look at incorporating some best practices so that our efforts as Agile testers are truly effective.


EU's market watchdog to take deeper look at blockchain

"It is too early at this stage to form a definite opinion on whether DLT (blockchain) will be able to address these issues in an efficient way," the watchdog said in a statement. "ESMA stresses that firms willing to use DLT (blockchain) should be mindful of the existing regulatory framework." The watchdog also published a discussion paper asking for more views on blockchain's risks and benefits to help it decide whether new rules are needed for securities markets. "ESMA believes that the DLT will need to overcome a number of possible challenges and shortcomings before its benefits can be reaped," the watchdog said. "Some of these challenges are related to the technology itself. Others are mainly related to possible governance, privacy and regulatory issues." Regulators are trying to work out what is really new about blockchain in case it takes off.


Confluent Platform 3.0 Supports Kafka Streams for Real-Time Data Processing

Kafka Streams is a lightweight solution for real-time processing of data, useful in use cases such as fraud and security monitoring, Internet of Things (IoT) operations and machine monitoring. It provides a new, native streaming development environment for Kafka, and developers can leverage this library to build distributed stream processing applications on top of Kafka. Kafka covers the messaging and transport of data, while Kafka Streams covers the processing of that data. Kafka Streams supports both stateful and stateless processing, as well as distributed, fault-tolerant processing of data. No dedicated cluster, message translation layer, or external dependencies are required to use Kafka Streams. It processes one event at a time rather than in micro-batches of messages, and it handles late-arriving data and windowing over out-of-order data.
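Kafka Streams itself is a Java library, so as a conceptual sketch of its one-event-at-a-time model, here is the same idea in Python using the kafka-python client (topic names are invented): a stream processor is just an ordinary application reading from one topic and writing to another, with no dedicated cluster.

```python
from kafka import KafkaConsumer, KafkaProducer

# Conceptual sketch only: Kafka Streams is a Java library; this loop uses
# the kafka-python client and invented topic names to mimic its core model.
consumer = KafkaConsumer("sensor-readings",
                         bootstrap_servers="localhost:9092",
                         value_deserializer=lambda b: b.decode("utf-8"))
producer = KafkaProducer(bootstrap_servers="localhost:9092",
                         value_serializer=lambda s: s.encode("utf-8"))

for record in consumer:        # one event at a time, as it arrives
    reading = record.value
    if "ERROR" in reading:     # a stateless filter-and-forward step
        producer.send("machine-alerts", reading)
```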


Artificial intelligence is changing SEO faster than you think

Google’s RankBrain is in the camp of the Connectionists. Connectionists believe that all our knowledge is encoded in the connections between neurons in our brain. And RankBrain’s particular strategy is what experts in the field call a back propagation technique, rebranded as “deep learning.” Connectionists claim this strategy is capable of learning anything from raw data, and therefore of ultimately automating all knowledge discovery. Google apparently believes this, too. On January 26th, 2014, Google announced it had agreed to acquire DeepMind Technologies, which was, essentially, a back propagation shop. So when we talk about RankBrain, we can now tell people it is built on one particular artificial narrow intelligence (ANI) technique. Now that we have that out of the way, just how much is this field progressing?
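For readers unfamiliar with the term, back propagation is simply the rule for pushing a network’s prediction error backwards through its layers to update the weights. Here is a minimal numpy sketch on a toy problem (it illustrates the generic technique only, not RankBrain itself, which is proprietary):

```python
import numpy as np

# Toy back propagation: a tiny 2-layer network learning XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input -> hidden weights
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for step in range(5000):
    # Forward pass: compute the prediction.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error gradient layer by layer.
    d_out = (out - y) * out * (1 - out)   # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # gradient pushed back to hidden
    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2))  # approaches [[0], [1], [1], [0]]
```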


Zimbabwe: IT Governance Integral Part of Corporate Governance

IT governance now constitutes a key component of every company's strategic plan and consequently it has become a standing agenda item at board meetings. There are also other factors that have catapulted focus onto IT governance. IT systems and e-commerce, despite their advantages, have also brought with them a lot of operational risks which organisations need to mitigate. Cybercrime has become a very big challenge to organisations, with computer database hacking, data corruption, manipulation and loss, phishing, identity theft, card fraud, virtual money laundering and the like becoming more rampant globally. This has produced disastrous consequences not only for organisations, their customers and stakeholders, but also for global economies and stability. The rise in cybercrime has placed organisations in "panic mode", catalysing them to invest more in risk mitigation measures as part of their IT governance framework.



Quote for the day:


"Any powerful idea is absolutely fascinating and absolutely useless until we choose to use it." -- Richard Bach

