In reality, granular software services add unnecessary complexity that stymies growth, increases overhead, and is at odds with how most contemporary cloud systems operate. This is "because [microservice] applications must be refactored to realize their value," Sweet said. "It's easy to get containerization mixed up with microservices," he said. "But where a traditionally monolithic application can be delivered in a large container model, moving an application from a traditional monolithic architecture to microservices requires complete refactoring. And as many enterprises learned when they tried to build private clouds, just because a new technology is hot doesn't mean there's enough engineering talent to go around." Microservice systems also demand a skilled talent ecosystem: market demand and interest in microservices currently exceed the pool of available, trained workers.
Zhang observes that IIoT is not a new concept for manufacturing, as most production facilities have had process control systems, SCADA data and historians for decades. However, with the advent of IIoT, the next wave of manufacturing productivity is at the doorstep (or on the loading dock). The application of analytics to the volumes of data produced by instrumented, connected assets can deliver quantifiable savings and benefits across supply chain and manufacturing processes. Yet one of the immediate challenges frequently voiced by manufacturers undertaking an IIoT initiative is the acknowledgement that much of the data generated by their assets is never captured, particularly “unstructured” data. And this is where IBM’s Watson IoT cognitive capabilities can transform that data, once captured, into meaningful insight.
Several concepts central to data science will be examined. Or, at least, central in my opinion. I will do my best to put forth how they relate to one another and how they fit together as individual pieces of a larger puzzle. As an example of somewhat divergent opinions, and prior to considering any of the concepts individually, KDnuggets' Gregory Piatetsky-Shapiro has put together the following Venn diagram which outlines the relationship between the very same data science terminology we will be considering herein. The reader is encouraged to compare this Venn diagram with Drew Conway's now famous data science Venn diagram, as well as my own discussion below and modified process/relationship diagram near the bottom of the post.
Big data can act as an external source of master information for the MDM hub and can help enrich internal master data in the context of the external world. MDM can help aggregate the required and useful information coming from big data sources with internal master records. An aggregated view and profile of master information can help link the customer correctly and in turn support effective analytics and campaigns. MDM can act as a hub between the system of records and the system of engagement. However, not all data coming from big data sources will be relevant for MDM. There should be a mechanism to process the unstructured data and distinguish the relevant master information and its associated context. NoSQL offerings, natural language processing, and other semantic technologies can be leveraged to distill the relevant master information from a pool of unstructured and semi-structured data.
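As a rough illustration of that distillation step, the sketch below extracts name-like mentions from unstructured text and fuzzy-matches them against master records. It is a minimal stand-in, not IBM's or any vendor's implementation: the record names are hypothetical, and a naive capitalized-word regex stands in for real NLP entity extraction.

```python
# Minimal sketch: distill candidate master-data mentions from unstructured
# text and link them to existing master records via fuzzy matching.
# All names and the extraction heuristic are illustrative assumptions.
import re
from difflib import SequenceMatcher

master_records = ["Acme Corporation", "Globex Inc", "Initech LLC"]  # hypothetical MDM hub entries

def candidate_names(text):
    # Naive stand-in for NLP entity extraction: runs of capitalized words.
    return [m.strip() for m in re.findall(r"(?:[A-Z][\w&]+\s?)+", text)]

def match_to_master(candidate, threshold=0.6):
    # Link a candidate mention to its closest master record, if similar enough.
    def score(m):
        return SequenceMatcher(None, candidate.lower(), m.lower()).ratio()
    best = max(master_records, key=score)
    return (best, score(best)) if score(best) >= threshold else (None, score(best))

text = "Complaint filed by Acme Corp regarding a delayed shipment."
for name in candidate_names(text):
    record, similarity = match_to_master(name)
    if record:
        print(f"{name!r} linked to master record {record!r} (similarity {similarity:.2f})")
```

In a production pipeline the regex would be replaced by a proper entity extractor and the similarity score by trained matching rules, but the shape — extract, score against the hub, link above a threshold — is the same.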
Security providers are increasingly using machine learning to tell good software from bad, and good domains from bad. Now, there are signs that the bad guys are using machine learning themselves to figure out what patterns the defending systems are looking for, said Evan Wright, principal data scientist at Anomali. "They'll test a lot of good software and bad software through anti-virus, and see the patterns in what the [antivirus] engines spot," he said. Similarly, security systems look for patterns in domain generation algorithms, so that they can better spot malicious domains. "They try to model what the good guys are doing, and have their machine learning model generate exceptions to those rules," he said. That said, there is still little hard evidence that this is actually happening.
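To make the cat-and-mouse concrete, here is a toy domain generation algorithm of the kind those defensive models try to fingerprint. The seeding scheme is purely illustrative and not taken from any real malware family.

```python
# Toy DGA sketch: derive deterministic pseudo-random domains from a shared
# seed and date, so infected clients and an operator can rendezvous without
# hard-coded C2 hostnames. The scheme is illustrative, not a real family's.
import hashlib

def generate_domains(seed, day, count=5, tld=".com"):
    domains = []
    for i in range(count):
        digest = hashlib.sha256(f"{seed}-{day}-{i}".encode()).hexdigest()
        # Map hex digits onto letters to produce a plausible-looking label.
        label = "".join(chr(ord("a") + int(c, 16) % 26) for c in digest[:12])
        domains.append(label + tld)
    return domains

print(generate_domains("examplebotnet", "2024-01-01"))
```

Defenders' models key on exactly the statistical tells such labels leave (character entropy, n-gram frequencies); Wright's point is that attackers can probe those same models and tune their generators to slip under the learned thresholds.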
In cybersecurity, defenders acknowledge that highly motivated and creative adversaries are launching sophisticated attacks. There’s also the realization that when software is used as a weapon, building a stronger or taller wall may not necessarily keep out the bad guys. To such attackers, additional defensive measures simply provide more opportunities to find weak spots and gain access to a network. This mentality goes against the fundamental principle in IT security of erecting multiple defensive layers around what you’re trying to protect. By separating what you’re trying to protect from the outside world, you’re keeping it safe—at least in theory. While this works in physical security, where IT security has its roots, it doesn’t really work when you’re facing enemies who need to be successful just once to carry out their mission.
Security is the categorical imperative of the IoT. Many companies have always understood this and have never abdicated their responsibilities. But that understanding needs to be made absolute. Security must be baked directly into every IoT solution; incorporated into the development process of all devices and systems and suppliers; normalized across every application. All stakeholders need to be on a common ground — and education is the first step. Efforts like those of the Internet of Things Consortium (IoTC) Privacy and Security Committee seek to establish and disseminate guidelines for minimum viable products and policies to strengthen privacy and security. There is no such thing as infallible security and there will always be people looking for ways to exploit and subvert IoT technologies. But we don’t have to make it so easy for them.
Vital data needed by organizations frequently is found not only outside the enterprise data warehouse, but outside the enterprise itself. Businesses are pressed to recognize the value that can come from integrating data from a variety of sources. Data management and data integration solutions have been strongly challenged to handle continuous changes in data and how it's used, increasingly in real time. Modern data integration builds on technologies and processes that have long been part of the bigger world of data integration, beyond basic ETL functions. Practices like data quality, data profiling and data governance (also highly relevant to business users) comprise important capabilities that are central to reliable, up-to-date data, no matter the source or structure. Modern data integration offerings encompass interoperating multi-platform solutions (iPaaS and on-premises), as well as pure-play cloud and SaaS solutions, where the lines continue to blur between application and data integration.
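Of the practices mentioned above, data profiling is the easiest to picture. The sketch below runs a minimal profiling pass over a CSV extract, reporting per-column completeness and distinct-value counts; the file contents and column names are made up for the example.

```python
# Minimal data-profiling sketch: per-column completeness and distinct-value
# counts over a CSV extract. Sample data and column names are illustrative.
import csv
import io
from collections import defaultdict

sample = io.StringIO(
    "customer_id,email,country\n"
    "1,a@example.com,US\n"
    "2,,US\n"
    "3,c@example.com,\n"
)

nulls = defaultdict(int)
distinct = defaultdict(set)
rows = 0
for row in csv.DictReader(sample):
    rows += 1
    for col, val in row.items():
        if val == "":
            nulls[col] += 1          # track missing values per column
        else:
            distinct[col].add(val)   # track distinct non-null values

for col in ("customer_id", "email", "country"):
    print(f"{col}: {rows - nulls[col]}/{rows} populated, {len(distinct[col])} distinct")
```

Commercial profiling tools add type inference, pattern analysis and cross-source checks, but the core signal, how complete and how varied each column is, is what feeds the data-quality and governance practices the paragraph describes.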
Cybersecurity is inherently interdisciplinary, Hurley said. "One thing I've done over and over is bring people from different disciplines into a room, to create a common vocabulary and work through a particular issue or problem that needs to be resolved," she said. Depending on your background, you may be able to make the leap to security within your own company, Hurley said. "There are tons of opportunities in cyber and many doors of entry," Hurley said. "Whatever doorway you come through, you will be working with colleagues from many disciplines, and becoming more expert." Shelley Westman, senior vice president of alliances and field operations at Protegrity, started her career as a lawyer. She left the field and went to work at IBM in a number of different roles ranging from procurement to product management. Eventually, she was assigned a role in hardware security.
So optimizing in one area can cause problems downstream in another area, and we don't recognize that if we're not looking at the big picture. The first step is getting people to see and understand the big picture, and then having that help bring previously (maybe even warring) tribes together to have a conversation about how to fix it. Secondly, DevOps has a lot to do with culture. It's not just about automation; it's about how we can change culture from a top-down command-and-control approach to a more distributed one that gives people the autonomy to do the work they need to do. One of the models we're looking at is the Westrum organizational model, which contrasts a pathological organization that's run by fear, a bureaucratic organization that's usually run by rules, and a generative organization that's run by high cooperation and diversity.
Quote for the day:
"Don't let today's opportunities become tomorrow's what ifs." -- Pat Flynn