Once we were confident that we were not about to make a tragic mistake, we plunged in and started the sometimes challenging process of reworking what we had to fit into the OpenStack framework. We started with one of the available implementations of OpenStack/Cloud Foundry. (As with Linux, you can go entirely open source or choose a supported version from a number of providers.) But as our knowledge and experience of the OpenStack framework grew, we identified some gaps that created issues around segregation of duties (which is critical for SOX, SOC 2, and other compliance standards). We gradually modified our deployment into our own version, which includes some technologies we created to better handle application-level security and data access controls.
Bremeau illustrated what a successfully integrated – but simplified – big-picture model would look like, using a classic Enterprise Architecture for Data Warehousing. “If there is one thing to learn and master in any Metadata Management and Data Governance solution, it’s known as ‘the big picture.’” In his example, “there are a couple of data stores on one side, could be files, and then I have some ETL tools that are bringing everything into the Data Warehouse,” with the BI tools on the other side. “Now this is very simplified, because most of the customers that we deal with use a lot of staging areas before. And they’re not using one ETL but three or four different types of ETLs, and some hand-written SQL scripts, and you have to deal with all that. That’s the reality, if you truly want to know the lineage of what’s going on in the enterprise.”
What many have tried is to give more power to the IT department, or to impose stricter controls, more principles and guidelines, reviews, gates, and so forth. All are ways in which the agility of the enterprise suffers, with nothing much to show for it. Because in the end, the immediate business goals almost always outweigh the long-term architecture goals. So these approaches never last. What we need is more agility in architecture (especially now that agile change methods have become popular), not less. My answer is that boards of enterprises should not give these IT architecture goals to the IT department; they must explicitly give them to the business units instead. And they must have the strength of conviction to actually hold those business units accountable for the IT goals, in the same way that they hold those units accountable for compliance with external demands, from owner/shareholder to regulator.
Per voice-of-the-customer (VoC) research conducted by our firm, today’s personalization is broken. It relies on implicit data, i.e., web browsing behavior, data mined from social media, data modeling, and purchase-based behaviors. These do not provide the necessary depth of information to drive relevant communications and offers. As a result, most attempts at personalization simply do not drive the expected increases in response. Marketers must now make a profound shift and move to human data, which is based on explicit, self-profiled, opt-in preference data. Human data personalization is unique in that it lends itself to segmentation based on self-described personality types, attitudes, and life stages. Human data-based personalization is consistently driving double-digit response rates.
The most crucial part of an AI system, the part that brings machines slightly closer to humans, is continuous learning. How does a recommendation engine on eCommerce sites work? It uses humongous historical data on the browsing and purchase behavior of website visitors, learns patterns in the data for product preferences, and makes relevant product recommendations. This learning is not one-time; it happens continuously whenever a chunk of new data becomes available. In fact, learning is a key component that differentiates the current generation of AI systems from earlier generations. This advancement is made possible mainly by the prevalence of machine learning algorithms. For example, Google recently announced that its language translation tool has improved dramatically thanks to a versatile machine learning technique called deep learning.
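The idea of a recommender that keeps learning as new data arrives can be sketched in a few lines. This is a deliberately minimal, hypothetical illustration (the class and method names are invented, and real engines use far richer models): an item-based recommender built on co-purchase counts, where each new chunk of basket data updates the counts rather than requiring a one-time retraining.

```python
from collections import defaultdict
from itertools import combinations

class CoPurchaseRecommender:
    """Toy item-based recommender: co-purchase counts, updatable in chunks."""

    def __init__(self):
        # co_counts[a][b] = number of baskets containing both a and b
        self.co_counts = defaultdict(lambda: defaultdict(int))

    def update(self, baskets):
        """Incorporate a new chunk of purchase baskets (continuous learning)."""
        for basket in baskets:
            for a, b in combinations(set(basket), 2):
                self.co_counts[a][b] += 1
                self.co_counts[b][a] += 1

    def recommend(self, item, k=3):
        """Top-k items most often bought together with `item`."""
        ranked = sorted(self.co_counts[item].items(), key=lambda kv: -kv[1])
        return [i for i, _ in ranked[:k]]

rec = CoPurchaseRecommender()
rec.update([["phone", "case"], ["phone", "charger"], ["phone", "case"]])
rec.update([["case", "screen_protector"]])  # a later chunk of new data
print(rec.recommend("phone"))  # → ['case', 'charger']
```

The point of the sketch is the `update` method: the model's state is refreshed whenever new behavioral data arrives, which is the "continuous learning" property the excerpt describes.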
From a macroeconomic perspective, there is significant pressure on banks, especially European banks, because growth opportunities are limited. Revenue cannot grow when interest rates are low, and transaction volume is also low in a market full of uncertainty. So there is revenue pressure, and the pressure on the cost side is going up significantly. Cost side meaning your cost-income ratio is under pressure compared to US banks: US banks run at about 55% of cost versus revenue, while most European banks are at over 70%. So there is significant pressure on those banks to be very careful and reduce operating expenses, which also has an impact on potential investments going forward. It is a constrained and stressed environment, and the new technology is, from my perspective, triggering an even bigger, significant change.
Interactions via natural language processing must be quick and simple - in a word, functional. From a bot perspective, one key differentiator is the capacity for banks to allow richer “mini-apps” as part of their messaging experience, in which each message has the potential to become an atomic application. That means functionality must be broken down into manageable chunks supported by services or, better said, micro-services, in the integration layers of core systems. Sadly, if you are a banker, these micro-services are unlikely to exist in your organization. ... If we limit the choice of what users can do in a chat, we will need to somehow train the users or offer “menu” choices, which are much more obvious in a traditional interface - but that reduces usability and defeats the purpose of a “conversation” in the first place. Check out this example from BI Intelligence:
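The "atomic application per message" idea amounts to routing each parsed chat intent to one narrowly scoped service. Here is a minimal, hypothetical sketch of that dispatch layer (the intent names, service functions, and returned values are all invented for illustration; a real bank would call actual micro-services over its integration layer):

```python
# Hypothetical micro-services: each handles one atomic banking task.
def balance_service(params):
    """Stand-in for a balance-lookup micro-service."""
    return {"balance": 1250.00, "account": params.get("account", "checking")}

def transfer_service(params):
    """Stand-in for a funds-transfer micro-service."""
    return {"status": "queued", "to": params["to"], "amount": params["amount"]}

# Each chat "mini-app" message maps onto exactly one service.
INTENT_ROUTES = {
    "check_balance": balance_service,
    "transfer_funds": transfer_service,
}

def handle_message(intent, params):
    """Route a parsed chat intent to its backing micro-service."""
    service = INTENT_ROUTES.get(intent)
    if service is None:
        # Unsupported intent: fall back to offering menu choices.
        return {"error": "unsupported intent; offer menu choices instead"}
    return service(params)

print(handle_message("check_balance", {"account": "savings"}))
```

The design choice the excerpt argues for is visible in the routing table: functionality only becomes available in chat once it exists as a discrete service, which is why the absence of micro-services in most banks is the real bottleneck.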
The incoming description of a problem can be analyzed for its underlying patterns. Much time is wasted in IT services when one service desk listener responds to a problem one way and another responds in a different way. "Incident management needs better categorization. There's sometimes misinterpretation of what the incident is about and what skill set is needed to resolve it," said Hough. ServiceNow's existing configuration management database and change management products will be tied into machine learning to make the incident management process more accurate. The information available through them will also make it possible for a machine-learning system to look at pending changes and "assess the risk as changes come through, based on its learnings from what's happened in the past," she said.
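To make the categorization idea concrete, here is a deliberately simple, hypothetical sketch (not ServiceNow's actual method, and the categories and keywords are invented): scoring an incoming incident description against keyword sets so that similar problems are consistently routed to the same skill set. A production system would learn these associations from historical tickets rather than hard-code them.

```python
from collections import Counter

# Hypothetical mapping from incident category to indicative keywords.
CATEGORY_KEYWORDS = {
    "network": {"vpn", "wifi", "dns", "latency", "connection"},
    "access": {"password", "login", "locked", "permission"},
    "hardware": {"laptop", "printer", "monitor", "disk"},
}

def categorize(description):
    """Assign the category whose keywords overlap most with the description."""
    words = set(description.lower().split())
    scores = Counter()
    for category, keywords in CATEGORY_KEYWORDS.items():
        scores[category] = len(words & keywords)
    category, score = scores.most_common(1)[0]
    return category if score > 0 else "uncategorized"

print(categorize("User cannot login after password reset"))  # → access
```

Even this toy version shows the payoff Hough describes: two different service desk listeners feeding the same description through the same classifier get the same category, removing the person-to-person variation.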
At first glance, it appears that MDM and big data are two mutually exclusive systems with a degree of mismatch. An enterprise MDM initiative is all about solving business issues and improving data trustworthiness through the effective and seamless integration of master information with business processes. Its intent is to create a central trusted repository of structured master information accessible by enterprise applications. A big data system deals with large volumes of data arriving in unstructured or semi-structured formats from heterogeneous sources such as social media, field devices, log files, and machine-generated data. A big data initiative is intended to support specific analytics tasks within a given span of time, after which it is taken down. Figure 1 shows the characteristics of MDM and big data.
Data is without a doubt a boardroom responsibility in a digital economy. Organizations have to think of their business from customer and data perspectives if they want to thrive amid the rapid progress of data-enabled technologies and increasingly competitive environments. The recent Microsoft Asia Data Culture Study 2016, which polled 940 business leaders from medium to large-sized companies across 13 Asian markets, found that 87% of respondents felt a data culture should be driven from the top down, and that there should be a formalized role on the leadership team to drive successful adoption of the data strategy. ... A data strategy is needed to define what data the organization will use – and how that will add long-term value. As part of an overall data governance framework, this requires an understanding of the value, risk, and constraints inherent in all data.
Quote for the day:
"If someone's criticism is completely unfounded on data, then I don't want to hear it. It doesn't hold up to scrutiny." -- Tim Ferriss