There is a clear trend among CEOs to magnify the relative importance of technology in the future of work with 67 percent saying they believe that technology will create greater value in the future than human capital will. Another 63 percent of CEOs said they perceive that technology will become their firm’s greatest source of future competitive advantage. But the economic reality differs sharply, with human capital, not physical capital, creating the greatest value for organizations. CEOs’ distorted perceptions demonstrate the extent to which people are being painted out of the future of work ... A full 44 percent of leaders in large global businesses told Korn Ferry that they believe that the prevalence of robotics, automation, and artificial intelligence (AI) will make people “largely irrelevant” in the future of work.
An interesting anecdote from The Lean Startup, one of the manifestos for startup founders, is that Intuit holds itself accountable for being innovative and agile by using two key metrics: (1) the number of customers using products that didn’t exist three years ago and (2) the percentage of revenue coming from offerings that did not exist three years ago. Historically, it took a new Intuit product an average of 5.5 years to reach $50 million in revenue; at the time the book was written, the company had multiple products generating $50 million in revenue that were less than a year old. In particular, as the world moves toward cloud computing, continuous development and continuous updates are the name of the game.
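As a rough illustration, Intuit's second metric can be computed from a product portfolio like this (a minimal sketch with made-up figures, not Intuit's actual numbers or code):

```java
import java.util.List;

public class InnovationMetrics {
    record Product(String name, int launchYear, double revenueMillions) {}

    // Percentage of revenue coming from offerings introduced within the last three years
    static double newOfferingRevenueShare(List<Product> products, int currentYear) {
        double total = products.stream().mapToDouble(Product::revenueMillions).sum();
        double recent = products.stream()
                .filter(p -> currentYear - p.launchYear() < 3)
                .mapToDouble(Product::revenueMillions).sum();
        return total == 0 ? 0 : 100.0 * recent / total;
    }

    public static void main(String[] args) {
        // Hypothetical portfolio: one legacy product, two recent launches
        List<Product> portfolio = List.of(
                new Product("LegacySuite", 2010, 700),
                new Product("CloudApp", 2016, 200),
                new Product("MobileApp", 2017, 100));
        System.out.printf("Revenue from offerings under 3 years old: %.1f%%%n",
                newOfferingRevenueShare(portfolio, 2017)); // 300 of 1000 = 30.0%
    }
}
```

Tracking this number over time is what makes it a useful accountability metric: a falling share signals that the company is coasting on old offerings.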
“Hadoop is known to be self-healing, so if a node goes down on a server, it’s not a problem,” Dijcks says. “But if you buy inexpensive servers, you’re more likely to have nodes down and spend more time fixing hardware. And when you have a chunk of nodes that aren’t working, you’ve lost that capacity.” ... IT departments figure, “‘We’ve invested a lot of time, we’ve worked on this very hard, and now we need to put it into production,’” Dijcks says. “You can learn on throwaway servers, because if [the environment] goes down, no worries—just restart it. But in production, the cluster needs to stay up through hardware failures, human interaction failures, and whatever can happen.”
"Knowledge is power, and knowledge of yesterday is not as valuable as knowledge about what's happening now in many -- but not all -- circumstances," said W. Roy Schulte, vice president and analyst at Gartner. Businesses want to analyze information in real time, an emerging practice dubbed fast data. Traditionally, acting on large volumes of data instantly was viewed as impossible because the hardware needed to support such applications was prohibitively expensive. ... The use of commodity servers and the rapidly decreasing cost of flash memory now make it possible for organizations to process large volumes of data without breaking the bank, giving rise to the fast data architecture. In addition, new data management techniques enable firms to analyze information instantly.
During a recent customer visit, the discussion turned to the financial impact of Data Governance. To help explain this point, I thought I’d share some of the more common problems associated with NOT having data governance. By looking at it from this angle, we can see what the business is doing to overcome these issues, and then associate a value with that effort. ... This isn’t meant to be an exhaustive paper on the subject, more a sharing of thoughts and ideas. I’d also add that this blog isn’t suggesting these impacts will happen; rather, it describes some common challenges we see in the world of Financial Services and a way to estimate the potential financial impact they might cause. These challenges should be seen from the perspective of potentially being part of a broader Data Governance initiative.
It makes us feel more than think, and it comforts more than challenges. The result is a deeply fragmented society, driven by emotions, and radicalized by lack of contact and challenge from outside. This is why Oxford Dictionaries designated “post-truth” as the word of 2016: an adjective "relating to circumstances in which objective facts are less influential in shaping public opinion than emotional appeals." ... Social media, in contrast, uses algorithms to encourage comfort and complacency, since its entire business model is built upon maximizing the time users spend inside of it. Who would like to hang around in a place where everyone seems to be negative, mean, and disapproving? The outcome is a proliferation of emotions, a radicalization of those emotions, and a fragmented society.
The virtual intelligent eye works by generating a digital "fingerprint" of behavior for every single login by every single user in every single application and database across the organization. This information is a record of who, what, when, where, why, and how data is being accessed within an organization. Once a baseline for behavior is established, the system can easily identify anomalies in user activity and immediately send out the appropriate alerts when there are deviations from normal behavior. The cost of this technology will keep falling thanks to the continuing decline in the cost of storage and processing power from cloud computing giants like Amazon, Microsoft and Alphabet. The healthcare data security war can be won, but it will require action and commitment from the industry.
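The baseline-plus-deviation idea described above can be sketched in a few lines. This is an illustrative toy, not any vendor's actual product: it models a user's "fingerprint" as the mean and standard deviation of historical daily record-access counts and flags observations with a large z-score.

```java
import java.util.Arrays;

public class AccessAnomalyDetector {
    // Mean of a user's historical daily access counts (the behavioral baseline)
    static double mean(double[] xs) {
        return Arrays.stream(xs).average().orElse(0);
    }

    // Standard deviation of the baseline
    static double stdDev(double[] xs) {
        double m = mean(xs);
        return Math.sqrt(Arrays.stream(xs).map(x -> (x - m) * (x - m)).average().orElse(0));
    }

    // Flag an observation whose z-score against the baseline exceeds the threshold
    static boolean isAnomalous(double[] baseline, double observed, double zThreshold) {
        double sd = stdDev(baseline);
        if (sd == 0) return observed != mean(baseline);
        return Math.abs(observed - mean(baseline)) / sd > zThreshold;
    }

    public static void main(String[] args) {
        double[] dailyRecordAccesses = {40, 42, 38, 41, 39, 40}; // typical workload
        System.out.println(isAnomalous(dailyRecordAccesses, 41, 3.0));  // within baseline
        System.out.println(isAnomalous(dailyRecordAccesses, 500, 3.0)); // e.g. a bulk export
    }
}
```

A real system would build such baselines per user, per application, and per time of day, and route the flagged deviations to an alerting pipeline rather than printing them.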
Data socialization is an evolution in data accessibility and self-service across individuals, teams and organizations that is reshaping the way organizations think about, and employees interact with, their business data. Data socialization involves a data management platform that unites self-service visual data preparation, data discovery and cataloging, automation, and governance features with key attributes common to social media platforms, such as the ability to leverage user ratings, recommendations, discussions, comments and popularity to make better decisions about which data to use. It enables groups of data scientists, business analysts and even novice business users across a company to search for, share and reuse prepared, managed data to achieve true enterprise collaboration and agility.
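To make the "social signals" concrete, here is a minimal sketch of how a catalog search might rank datasets by blending user ratings with usage popularity. The dataset names and the scoring formula are hypothetical, chosen only to illustrate the idea:

```java
import java.util.Comparator;
import java.util.List;

public class DataCatalog {
    record Dataset(String name, double avgRating, int usageCount) {}

    // Hypothetical ranking: weight the average user rating by (log-damped)
    // popularity, so heavily reused, well-rated data surfaces first
    static List<Dataset> recommend(List<Dataset> catalog, String keyword) {
        return catalog.stream()
                .filter(d -> d.name().toLowerCase().contains(keyword.toLowerCase()))
                .sorted(Comparator.comparingDouble(
                        (Dataset d) -> d.avgRating() * Math.log1p(d.usageCount())).reversed())
                .toList();
    }

    public static void main(String[] args) {
        List<Dataset> catalog = List.of(
                new Dataset("sales_2016_clean", 4.8, 320),
                new Dataset("sales_raw_dump", 2.1, 15),
                new Dataset("marketing_leads", 4.2, 90));
        // A search for "sales" surfaces the curated, popular dataset first
        DataCatalog.recommend(catalog, "sales").forEach(d -> System.out.println(d.name()));
    }
}
```

The point of such ranking is exactly the one made above: a novice user searching the catalog is steered toward the prepared, well-regarded dataset instead of the raw dump.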
“Every breach occurs because someone in that company did something they were not supposed to do or because someone in that company failed to do something they were supposed to do,” Abagnale said. “There is not a master hacker sitting in Russia who will get through the company. The hacker will say, ‘I am not getting into JP Morgan Chase because they spend a fortune every year on cybersecurity, but they employ 200,000 people worldwide, so all I am looking for is one of those people who failed to do something they were supposed to or did something they were not supposed to do.’” Abagnale said he will explain the weaknesses and soft spots in companies and instill in attendees that the most important job they have is to keep the information entrusted to them safe.
Reactor's two main types are Flux<T> and Mono<T>. A Flux is the equivalent of an RxJava Observable, capable of emitting 0 or more items, and then optionally either completing or erroring. A Mono, on the other hand, can emit at most one item; it corresponds to both the Single and Maybe types on the RxJava side. Thus an asynchronous task that just wants to signal completion can use a Mono<Void>. This simple distinction between two types makes things easy to grasp while providing meaningful semantics in a reactive API: by just looking at the returned reactive type, one can know if a method is more of a "fire-and-forget" or "request-response" (Mono) kind of thing or is really dealing with multiple data items as a stream (Flux).
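The distinction can be seen in a few lines of code. This sketch assumes the reactor-core library is on the classpath; Flux.just, Mono.just, Mono.fromRunnable, subscribe, and block are all part of the public Reactor API:

```java
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class ReactorTypesDemo {
    public static void main(String[] args) {
        // Flux: a stream of 0..N items, like an RxJava Observable
        Flux<Integer> numbers = Flux.just(1, 2, 3)
                .map(n -> n * 10);
        numbers.subscribe(System.out::println); // 10, 20, 30

        // Mono: at most one item, covering RxJava's Single and Maybe
        Mono<String> greeting = Mono.just("hello")
                .map(String::toUpperCase);
        System.out.println(greeting.block()); // HELLO

        // Mono<Void>: a "fire-and-forget" completion signal carrying no value
        Mono<Void> done = Mono.fromRunnable(() -> System.out.println("task finished"));
        done.block(); // completes empty after the task runs
    }
}
```

Note that a caller receiving the Mono<Void> learns everything it needs from the type alone: there is no data to consume, only completion or error to observe.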
Quote for the day:
"Simplicity is a great virtue but it requires hard work to achieve it and education to appreciate it." -- Edsger W. Dijkstra