Daily Tech Digest - December 27, 2016

Industries Thrive On Cognitive Cloud

Referred to as “Cognitive on cloud”, this model delivers cognitive services running in the cloud that are consumable via representational state transfer (REST) APIs. These services are available as part of platform-as-a-service (PaaS) offerings such as Bluemix and can easily be bound to an application during development. This approach lets cognitive analytics capabilities such as voice (tone analyzer, speech-to-text) and video (face detection, visual recognition) quickly analyze petabytes of unstructured data. Developing cognitive applications to run on mobile devices has yielded new insights that help organizations create entirely new revenue streams. When selecting a cloud service provider, however, cognitive-on-cloud ROI requires more than just a total cost of ownership comparison. Beyond this basic analysis, an organization must consider which cloud is cognitive-enabled at the PaaS layer.
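To make the "consumable via REST APIs" idea concrete, here is a minimal Python sketch of calling such a service from an application. The endpoint URL, credential handling, and response shape are hypothetical placeholders, not a documented Bluemix API.

```python
import requests

# Hypothetical values: in a PaaS offering these would come from the
# service binding rather than being hard-coded.
API_KEY = "your-service-api-key"
ENDPOINT = "https://cognitive.example.com/v1/tone"  # made-up tone-analyzer URL

def analyze_tone(text):
    """Send unstructured text to the cognitive service and return its JSON analysis."""
    resp = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": text},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

print(analyze_tone("Our customers love the new release!"))
```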


Could IT change control have prevented an IT deployment failure?

Build all configuration cleanup into IT change plans. Take the time to flesh out a plan that addresses every configuration related to a given maintenance task: IT teams typically should remove switch port configurations for decommissioned NICs, update firewall policies and groups when changing a server IP address, and remove domain name system records that are no longer in use. Managing the issue on a daily basis is the best way to prevent IT configuration build-up over time. Combat the risk of larger change scopes with a robust IT change control process. Teams dealing with live production systems fear the service impact of any change, and a formal IT change control policy lessens these unknowns. Thoroughly plan changes, formulate test and backout plans, hold peer reviews, follow a set approval process, and schedule and communicate maintenance windows for all changes to critical systems.
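One way to keep those control steps from being skipped is to treat the change plan itself as structured data and gate execution on it. The sketch below is illustrative only; the field names are assumptions, not taken from any particular ITSM product.

```python
# Hypothetical change-plan record: every control-process field must be
# filled in before the change may run.
CHANGE_PLAN = {
    "summary": "Decommission server web-03",
    "maintenance_window": "2016-12-30 02:00-04:00 UTC",
    "steps": [
        "Remove switch port configuration for decommissioned NICs",
        "Update firewall policies and address groups",
        "Delete stale DNS records for web-03",
    ],
    "test_plan": ["Verify remaining web servers still answer on the VIP"],
    "backout_plan": ["Re-apply saved switch and firewall configs from backup"],
    "peer_reviewed": True,
    "approved_by": "change-advisory-board",
}

def ready_to_execute(plan):
    """Allow execution only when cleanup, test, backout, and approvals are all present."""
    required = ("steps", "test_plan", "backout_plan", "maintenance_window")
    return bool(plan["peer_reviewed"] and plan["approved_by"]
                and all(plan[k] for k in required))

assert ready_to_execute(CHANGE_PLAN)
```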


After a big 2016, next year may be A.I. tipping point

"We are on the cusp of a change as big as when e-commerce hit," said Chris McCann, president and CEO of 1-800-Flowers.com, in an interview this fall. "It's giving us the opportunity to have such deep relationships with our customers that it'll be like the company hasn't existed before." This past May, Google showed how focused it is on A.I. during its annual Google I/O developers conference, unveiling A.I.-powered products like Google Assistant, its Google Home device, the Allo chat app and the Duo video chat app. And this past October, IBM president and CEO Ginni Rometty said during a keynote at the company's World of Watson conference that in the next five years, every major decision -- personal or business -- will be made with the help of IBM's Watson A.I. system. A statement like that takes a lot of confidence in advances coming in the technology.


How blockchain can create the world’s biggest supercomputer

One of the fields where centralized and cloud-based computing falls short is the Internet of Things, Sønstebø says. “As IoT grows the need for distributed computing becomes an absolute necessity,” he says. Latency in round trips, network congestion, signal collisions and geographical distances are some of the challenges faced when processing data produced at edge devices in the cloud. “Devices need to be able to trade computational resources with each other in real time so that the computational load can be distributed,” he says. Some of the emerging lines of software will not be supported by centralized architectures at all, iEx.ec’s Fedak says, such as decentralized applications (DApps), which, among other things, will power fog computing, distributed AI and parallel stream processing. “This class of application is extremely challenging because they’re both data and compute-intensive, and they don’t cope well with centralized infrastructure,” Fedak says.


The Full Spectrum: How a Visual Analytics Platform Empowers the Business

BI and analytics software can help uncover this story, but there are now hundreds of companies offering technologies designed to dig into data. This proliferation of tools is both a blessing and a curse. While competition breeds excellence, there are few standards or best practices embraced across the board. As a result, the onus is on individual businesses to embrace and uphold policies that will enable the effective use of data in a responsible, governable way. One increasingly attractive solution for doing data right is to leverage a visual analytics platform. Unlike standalone data visualization tools (which can provide useful but sometimes misleading views of the enterprise), a visual analytics platform weaves together all the elements of a full technology stack.


Ransomworm: The Next Level Of Cybersecurity

To make matters worse, Nachreiner expects cybercriminals will mix ransomware with a network worm. Years ago, network worms like CodeRed, SQL Slammer and, more recently, Conficker were quite common. Hackers exploited network vulnerabilities and tricks to make malware automatically spread itself over networks. “Now, imagine ransomware attached to a network worm. After infecting one victim, it would tirelessly copy itself to every computer on your local network it could reach,” he says. “Whether or not you want to imagine such a scenario, I guarantee that cyber criminals are already thinking about it.” Nir Polak, Co-Founder & CEO of Exabeam, a provider of user and entity behavior analytics, agrees that ransomware will move from a one-time issue to a network infiltration problem, as Nachreiner describes. “Ransomware is already big business for hackers, but ransomworms guarantee repeat business.”


When the Data Thief is a Company

Companies are particularly vulnerable to this type of attack when they offer their data for free or at low cost to the public but charge professionals a fee to access and use it on their own sites or in other materials. What’s to be done? Companies that offer such online data stores by subscription need to be ever vigilant about unusual traffic patterns that suggest a machine – not a human – is behind the wheel. The differences in use patterns between a crawler and an individual human researcher are easy to spot – if you’re looking for them. Behind the scenes, companies also need to pick the low-hanging fruit: making sure that free or temporary accounts can’t be abused to siphon off reams of data, and watching for patterns of abuse among registered and paying members. Application security flaws such as weak authentication and SQL injection should, of course, be patched.
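As a minimal sketch of the "easy to spot if you’re looking" point, one could flag accounts whose per-minute request rate no human researcher would sustain. The threshold and log format below are illustrative assumptions, not a vetted detection policy.

```python
from collections import defaultdict

# Illustrative threshold: a human paging through records rarely sustains this rate.
MAX_REQUESTS_PER_MINUTE = 30

def flag_suspected_crawlers(access_log):
    """access_log: iterable of (account_id, minute_bucket) tuples from the web tier."""
    counts = defaultdict(int)
    for account_id, minute in access_log:
        counts[(account_id, minute)] += 1
    return {acct for (acct, _), n in counts.items() if n > MAX_REQUESTS_PER_MINUTE}

# A free-trial account hammering the site stands out against a normal subscriber.
log = [("free-trial-42", "12:01")] * 200 + [("subscriber-7", "12:01")] * 5
print(flag_suspected_crawlers(log))  # {'free-trial-42'}
```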


Using big data for security only provides insight, not protection

Big data cheerleaders will say you can use this hindsight to fix the problems that let the hacker into your system in the first place. After all, since you know what went wrong, you can patch your system so that it doesn’t happen again, right? While that may be true – you may be able to prevent that specific problem from happening again – cybersecurity simply doesn’t work that way. The threat landscape is dynamic, with new technologies, and thus new vulnerabilities, emerging every day. Additionally, hackers are like any other criminal: they are savvy, adaptable, and know how to play on human nature. They’re always going to find your weaknesses – and your biggest weakness is your own people, your trusted employees. Most hackers don’t break into systems through the back door. They get their hands on legitimate login credentials and, essentially, walk right in the front door.


Cloud data recovery is critical, but won't always come easy

Public cloud vendors focus more on the front end of data issues, such as system availability and uptime, and less on recovery. They figure that making their systems available 99.999% of the time should let users work with the information they need. However, glitches such as read/write errors still arise, forcing corporations to recover data, and typically, public cloud vendors offer only rudimentary recovery functions. In addition, these vendors draw lines between their own and their customers' backup responsibilities, something not seen with on-premises backup systems. For instance, Microsoft Azure tries to restore customer data lost due to Azure outages, but won't attempt to restore data if users delete files or if files become infected by a virus. In response, more sophisticated ways to back up cloud applications are emerging. Vendors like Commvault Systems and Veeam Software have well-developed, on-premises systems that they are extending to the cloud.


How Zalando Delivers APIs with Radical Agility

As you know, REST is more of an architectural style and does not really specify API design details. We need some standards in our API design practices to establish a consistent API look and feel. Ideally, all the APIs should look like they were created by the same person. That’s a very ambitious target, but our guidelines help. We recently open-sourced them and have already received external contributions. The API guidelines standardize the easier things like naming conventions and resource definitions, but also cover more complex topics like non-breaking changes and how we want to do versioning. ... In the end, the more critical aspect is that all the different services that are part of the platform fit into an overall architecture with really clear, separated functions that can easily be orchestrated to build the business functionality we have in mind.
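To illustrate what guideline-driven consistency can look like from a client’s side, here is a hedged Python sketch of two conventions such guidelines commonly cover: plural, lowercase, hyphenated resource names, and versioning carried in the Accept header rather than the URL. The host, resource, and media type are invented for illustration, not copied from Zalando’s actual rules.

```python
import requests

# Made-up host and resource; naming follows a "plural, lowercase,
# hyphenated resource names" convention.
URL = "https://api.example.com/sales-orders/1234"

# Version pinned through the media type instead of the path, so the URL
# stays stable across non-breaking revisions. The media type is invented.
headers = {"Accept": "application/x.example.sales-order+json;version=2"}

resp = requests.get(URL, headers=headers, timeout=10)
resp.raise_for_status()
order = resp.json()
print(order)
```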



Quote for the day:


"I believe that the only courage anybody ever needs is the courage to follow your own dreams." -- Oprah Winfrey

