An email packed with information can cause readers to skim and overlook important details. Use GIFs to make key pieces of information stand out, or to help explain concepts. When we redesigned MailChimp’s dashboard in 2013, for example, we used several GIFs in an email to show the changes to our users. Breaking the information into short, digestible visuals helped users understand the concepts and eased them into the new dashboard without a ton of reading. Most email clients don’t offer great support for video: currently, video renders only in Apple Mail, Thunderbird, iOS 10’s native client, and Samsung Galaxy’s native client. On top of poor support, video files can be large and slow to load on a poor connection, which can disengage viewers. Making subscribers download large files isn’t a great idea either, because it can push users over their data plans if they’re not on WiFi.
IoT enablement in Insurance is the new normal for both Insured and Insurer. For Insurers, it improves the underwriting process through finer risk segmentation, agile pricing, better loss and combined ratios, and cross-sell and upsell opportunities. It also enables customer-centric product offerings, increased brand loyalty, reduced customer churn, simplified claims processing, and more. The Insured, in turn, reaps the benefits of competitive pricing, quicker policy and claims servicing, personalized offerings, constant updates on risk variations through proactive alerts, advice on risk management, and more. The incremental adoption of IoT-enabled connected devices by Insurers is helping conceptualize a “Pay as You Use” model that offers customized pricing and servicing to eligible customers.
Sigler notes that cloud-native means developers no longer have to keep reinventing the wheel, and “going cloud native acts as a ‘forcing function’ for how applications are built on top of infrastructure”. By standardising on the behaviour of lower-level components such as compute and networking, he says businesses are effectively telling individual teams working on these smaller, more agile units of software to stop wasting their time on changing everything below the application layer. This, says Sigler, is different to the approach previously taken with traditional or virtualised application designs, where developers tended to spend lots of time reinventing how they would ship the software. Not only is this a painful process, it is one that does not often result in useful business value, he says.
Most intriguing is the possibility of AI identifying new associations and correlations yet to be detected by humans. For example, UK researchers fed an AI system the records of about 295,000 patients so it could correlate medical history with the rate of heart attacks. The AI was then given the records of another 82,000 patients whose heart-attack outcomes were already known, and asked to predict which were most likely to have a heart attack. When its predictions were compared to those based on the current “best practice” American College of Cardiology/American Heart Association (ACC/AHA) guidelines, which factor in patient age, smoking history, cholesterol levels, diabetes history, and so on, the AI beat the humans hands down.
Design Thinking is one of the ways in which this change can be brought about. It is part of a broad methodology that amalgamates imagination, intuition, holistic reasoning, and logic to explore all the probable solutions to a given problem. It begins by identifying the unarticulated needs of the consumer. Once those needs are identified, the team creates solutions that address all of them and aims for the “wow” effect. Solutions are generated both creatively and analytically, since Design Thinking is more solution-oriented than problem-oriented, and it reliably converges on feasible outcomes. The risk inherent in innovative solutions is minimized by walking users through successive prototypes that leave room for learning, testing, and refining the ultimate solution.
In many cases these new databases were “NoSQL” or “non-relational”: solutions based on data models other than the dominant relational model, such as document, key-value, column-oriented, and even graph databases. Frequently these databases sacrificed some of the familiar guarantees of relational databases, like strong consistency, ACID transactions, and joins. At the same time as this revolution in database technology, the SOA trend of the early 2000s was maturing into the microservices architectural style, as many organizations began to move away from heavyweight SOA infrastructure such as the enterprise service bus (ESB) toward a more decentralized approach. The appeal of microservices architecture lies in the ability to develop, manage, and scale services independently. This gives us a ton of flexibility in implementation choices, including infrastructure technology such as databases.
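The data-model contrast above can be sketched with a toy example. The schema, keys, and order record below are illustrative assumptions, using Python's stdlib sqlite3 as a stand-in relational store and plain dicts as stand-ins for a document store and a key-value store:

```python
# Contrast of three data models: relational vs document vs key-value.
import sqlite3
import json

# Relational: fixed schema, typed columns, joins and rich queries available.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
db.execute("INSERT INTO orders VALUES (1, 'alice', 42.50)")

# Document: each order is a self-contained JSON blob; the schema can vary
# per document, but there are no cross-document joins.
doc_store = {}
doc_store["order:1"] = json.dumps(
    {"customer": "alice", "items": [{"sku": "A1", "qty": 2}], "total": 42.50}
)

# Key-value: an opaque value looked up by key; fastest, least queryable.
kv_store = {"order:1:total": "42.50"}

row = db.execute("SELECT customer, total FROM orders WHERE id = 1").fetchone()
doc = json.loads(doc_store["order:1"])
```

Each model trades query power for flexibility or speed, which is why independently managed microservices often pick different stores for different jobs.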
"One of the big reasons we continue to see such demand for data scientists is every company out there is becoming a tech company," Allison Berry, Glassdoor community expert, told TechRepublic. "In any industry that has to deal with digitized data, or has an app or an online presence, you need people who can help support all of that and find insights from the data." However, we are currently facing a shortage of professionals with data science skills: by 2020, the number of annual job openings for all data-savvy professionals in the US will increase to 2.7 million, IBM predicted. Those with data science skills can command an average salary of $96,441 in the US as of October 2017, with 0.9% year-over-year growth, according to Glassdoor. To help those interested in the field better understand how to break into a career in data science, we've created a guide with the most important details and resources.
“The new cyber threat to deal with, which we’ve never dealt with before, is how do we ensure that the information from our customers is really accurate? Is it really our customers?” Sloan said, pointing to the “amount of data that is now out there” following the Equifax hack. “We haven’t dealt with that, and we’re going to all figure it out,” Sloan added. Several of the executives emphasized the importance of collective action in addressing the growing threat from cybercriminals. “We have a tremendous amount of data on our customers,” said Grayson Hall, chairman and CEO of Regions Financial. “With that information comes an awful lot of responsibility and accountability.” The comments, made at an industry conference sponsored by The Clearing House, represent some of the most pointed commentary to date on what the massive Equifax breach means for banks’ core businesses.
There are two forces that demonstrate my point. The first is the reality that enormous data leaks, most recently Equifax, Yahoo, the Securities and Exchange Commission, and Sonic, make breaking news on a weekly basis, accompanied by a stunning lack of clarity about the extent and scope of the data compromised in each case. The second force is the European Union's General Data Protection Regulation: organizations have not mapped out their data, and they are now struggling to comply with EU regulations. As a result, enterprises are moving to locate, classify, and understand who is accessing their data and where it is stored, and to adopt more advanced frameworks for data monitoring and controls. This data transparency is no longer a nice-to-have, particularly given impending regulatory deadlines.
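The "locate and classify" step described above can be sketched as a simple pattern scan over text. The categories and regexes below are illustrative assumptions, not a compliance tool; real data-discovery products handle far more formats and contexts:

```python
# Minimal sketch of classifying text by the kinds of personal data it contains.
import re

# Illustrative patterns; real tools cover many more categories and locales.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text):
    """Return a mapping of data category -> matches found in `text`."""
    found = {}
    for label, pattern in PATTERNS.items():
        hits = pattern.findall(text)
        if hits:
            found[label] = hits
    return found

record = "Contact alice@example.com; SSN on file: 123-45-6789."
report = classify(record)
```

Running a scanner like this across file shares and databases is one crude way to build the data map that GDPR compliance work starts from.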
There is a perception problem with encryption: companies consider it a time-consuming process that is not worth the effort compared to the perceived risk of being hacked. The “it won’t happen to us” mentality is pervasive, despite industry predictions that cybercrime damages will cost the world $6 trillion annually by 2021 (according to Cybersecurity Ventures). Whether a firm believes its current safeguards are sufficient or that hackers won’t target its business, it avoids encryption until it’s simply too late. Such firms are not performing the usual risk/reward analysis that organizations should run when weighing the value of data against the downsides of a breach. Encryption is also not as mysterious and complex as many believe: it simply involves translating data into a different form that requires an access key to read, share, and edit.
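The "translate data into a form that requires a key" idea can be shown with a toy cipher. The XOR scheme and fixed key below are illustrative assumptions only; it is NOT secure, and real systems use vetted ciphers such as AES through an audited library:

```python
# Toy illustration of encryption: data transformed with a key, recoverable
# only by a key holder. XOR with a repeating key is NOT secure in practice.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the repeating key. XOR is its own inverse, so
    # applying the same function twice restores the original bytes.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"0123456789abcdef"       # illustrative fixed key; real keys are random
plaintext = b"customer record #1"
ciphertext = xor_cipher(plaintext, key)

assert ciphertext != plaintext                    # unreadable without the key
assert xor_cipher(ciphertext, key) == plaintext   # key holder recovers it
```

The whole sketch is a dozen lines, which is the point of the paragraph above: the concept is simple, and production-grade libraries make the secure version nearly as easy to adopt.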
Quote for the day:
"Don't wait for inspiration. It comes while one is working." -- Henri Matisse