IBM is also planning hardware and software for inferencing, which requires lighter processing on the edge or end device. The inferencing engine takes a trained model, applies it to additional data or input, and produces results such as predictions. Drones, robots, and autonomous cars use inferencing engines for navigation, image recognition, and data analysis. Inferencing chips are also used in data centers to boost deep learning models. Google has created its own chip called the TPU (Tensor Processing Unit), and other companies like KnuEdge, Wave Computing, and Graphcore are creating inferencing chips. IBM is working on a different model for its inferencing hardware and software, Gupta said. He did not provide any further details.
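IBM has not published details of its inferencing approach, so the sketch below only illustrates the general pattern described above: training happens offline, and the inference step simply applies the frozen, pre-trained parameters to new input. The weights, bias, and sensor reading are hypothetical values, not anything from IBM or the vendors named.

```python
def predict(weights, bias, features):
    """Apply frozen, pre-trained parameters to a new input vector."""
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return 1 if score > 0 else 0  # e.g. obstacle / no obstacle

# Parameters produced earlier by a (hypothetical) offline training run:
trained_weights = [0.8, -0.5, 0.3]
trained_bias = -0.1

# New sensor reading arriving on the edge device:
reading = [0.9, 0.2, 0.4]
print(predict(trained_weights, trained_bias, reading))  # prints 1
```

The point of dedicated inferencing hardware is that this forward pass is far cheaper than training, so it can run continuously on a drone or in-car computer.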
“Developing strong business acumen in IT is a prerequisite to effectively shift IT focus from optimizing IT operational efficiency to driving business effectiveness, value creation and growth,” said Lily Mok, research vice president at Gartner. “At the heart of an effective IT communication strategy is the ability to clearly link the vision, strategy and action plans of IT to the business to drive desired behaviors in the workforce that contribute to improved IT performance and business outcomes.” In the digital business world, it is essential that CIOs and IT leaders be able to show the role of IT within the business strategy. Business acumen enables CIOs to combine technology know-how with a broad understanding of the market and industry, and design solutions that create new business value and growth.
Modern production machine learning systems often contain common inference and preprocessing techniques that PMML can represent, but they also contain data transformations and feature engineering steps that are particular to the problem at hand, and modeling approaches that have not yet been canonized in the PMML standard. Thus we commonly see PMML-based deployment strategies decorated with Python scripts or other kludges that cover those functional gaps but detract from the raison d'être of the intermediate representation: a single description of the computation that ops needs to manage in production. Because of its restriction to a canned set of functionality, PMML is a failed standard.
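A minimal sketch of the "Python kludge" pattern described above, under assumptions: the feature names, the log-ratio transform, and the `pmml_scorer` callable are all hypothetical stand-ins, not a real PMML evaluator API. The bespoke feature engineering lives outside the PMML document, so the intermediate representation no longer describes the whole computation.

```python
import math

def engineer_features(raw):
    """Custom transformation with no direct PMML equivalent (illustrative)."""
    return {
        "log_ratio": math.log(raw["clicks"] / raw["impressions"]),
        "is_weekend": 1 if raw["day"] in ("sat", "sun") else 0,
    }

def score(raw, pmml_scorer):
    # The PMML model only sees the engineered features; the logic above
    # is invisible to the intermediate representation that ops manages.
    return pmml_scorer(engineer_features(raw))

# Stand-in for a real PMML evaluator (which would parse a .pmml file):
fake_scorer = lambda feats: 0.5 * feats["log_ratio"] + feats["is_weekend"]
result = score({"clicks": 27, "impressions": 1, "day": "sat"}, fake_scorer)
```

The operational problem is exactly this split: the `.pmml` artifact and the Python wrapper must be versioned and deployed together, defeating the goal of a single self-contained description.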
With the growth of mobile devices and the many applications that now support them, IT departments everywhere have witnessed an uptick in requests for new applications at a time when staffing has remained relatively flat. This has prompted the adoption of IT automation to speed the development and deployment of new applications, and it has prompted leaders to look at other areas of IT that can benefit from automation. At the same time, moving to IT process automation can be difficult. It requires rewriting policies and procedures, vetting those rewrites for governance compliance, reworking workflows in the data center to accommodate the automation, and, of course, retraining and/or redirecting IT personnel to work with the automation in their daily routines.
“CIOs are tired of hearing about the cost of IT and how to shave it further. By being able to discuss value, they can get a strategic seat at the table,” says Apptio’s Gupta. “They can take it to the level of a services construct, with cost metrics behind that. Security is a great example: The board asks why you are spending so much on security. With TBM, you can have a model of the cost and value of security. That means you can talk to business executives at a higher level. And the transparency of the standard TBM model builds a foundation of trust.” Thus, when a CIO comes to the board, the discussion isn’t “why are we spending so much on storage?” but “we can use these existing systems more effectively for these business needs if we invest more here and less there, or replace some with different technologies or cloud services.”
"It takes some time for the ecosystem to realise the importance of disruptive technology, but now the world is waking up to the reality of AI. AI will disrupt and change the way HR functions and all the current cutting-edge technology in prevalence is going to disappear. This technology is going to grow and since we are one of the few early entrants we will get our fair share of growth," says Sanan.
Growing at a fast clip, Spire is poised to take advantage of a world that is slowly but steadily coming to accept the impact, and understand the efficiencies, of an AI-based platform. While the US, as the frontrunner of the tech world, is Spire's primary market, many businesses in India are also keen to take advantage of disruptive technologies like Spire TalentSHIP.
“Freedom on the Net,” Freedom House’s annual study on internet freedom around the world, sheds light on the fact that in many parts of the world, access to the free and open internet is simply not a reality. While this is due to a number of factors, for many people government-mobilized access restrictions and shutdowns present a significant barrier to the tremendous benefits of the internet. The Internet Society is a proud sponsor of this year’s report. As an organization that supports the global development of an open internet free of undue regulation and censorship, we consider the findings of “Freedom on the Net” troubling. In particular, we believe the disruptions in internet access identified by Freedom House have harmful social, free speech, political and even economic consequences.
In addition to the partnership with OpenAI, Microsoft also announced its Azure Bot Service preview, which allows developers to build bots on the "Microsoft Bot Framework and easily deploy and manage them in a serverless environment on Azure." These bots run on Azure Functions—available now—to help them scale and use resources more efficiently. However, Microsoft hasn't always had the best luck with bots. Back in April, a chatbot from Microsoft called Tay was manipulated by users into spewing racist and sexist messages before being taken offline. The partnership between Microsoft and OpenAI was preceded by the launch of the Partnership on AI in September, when Microsoft joined Facebook, IBM, Google, and Amazon to promote best practices in AI development.
Google fired its latest salvo on Tuesday, announcing a set of enhancements to its suite of cloud machine-learning capabilities. The first was a new Jobs API aimed at helping match job applicants with the right openings. In addition, the company is slashing the prices on its Cloud Vision API and launching an enhanced version of its translation API. On top of that, Google is offering GPUs in its cloud both through the company's managed services and its infrastructure-as-a-service product. Companies that want to roll out their own machine learning systems and algorithms can now take advantage of the new hardware. These moves are important steps for the company as it continues to compete with Microsoft, Amazon, IBM and other vendors.
"What makes it legacy is when it can no longer perform its purpose and deliver value to the business at any cost. That's when it becomes a hindrance to move your business forward," said David Mayer, vice president of product management-software at Insight. He pointed to legacy-related issues with corporate networks as case in point. Many companies have networks initially built to handle data moving only through internal systems; now those companies expect their networks to handle significant amounts of data moving both internally and externally. "There are a lot of networks designed for 'Old World' interchanges and not for interchanges out in the cloud," Mayer said, adding that legacy networks create bottlenecks that slow down business.
Quote for the day:
"Success is what happens after you have survived all your mistakes." -- Anora Lee