June 03, 2014

High availability strategy key to saving costs, says Sungard AS
Enterprises have to determine which apps are critical to their business. For airlines, the applications that handle customer bookings and track pilot hours are the most critical and need to be highly available, he said. For retailers it may be the website, and for financial services providers it could be the trading desk apps. "Businesses need to understand what apps are critical for them. It is not always the most obvious ones," he said. "They should also think about seasonal peaks. For instance, payroll apps run only once a month but, for that time frame, availability of those apps is critical."

Making the Clinical Data Warehouse Relevant Again
It's the "minimal transformation" part that reduces the overhead and complexity typically associated with data warehousing, Barlow says. Instead of reconciling every data element that's mapped to the model, Health Catalyst's approach sticks to the core elements. In turn, this lightens the data governance load that's typically necessary for an enterprise master patient index. "Having been exposed to a lot of data warehousing solutions, the successful ones need to be adaptive and agile," Barlow says. "The data models that are common in healthcare are incredibly dynamic and complex, so a platform needs to be able to adjust to that."

Samsung Pushes its Triple Level Cell SSD into Data Centers
Samsung plans to drive down SSD costs the way Intel and AMD drove down processor costs: by creating mass-produced consumer versions first, driving up production yields and driving down prices, then developing datacenter adaptations at competitive price points. That acceleration strategy put Intel first one, then two, steps ahead of AMD in 2009, and Samsung is hoping for the same effect. While competitor Toshiba still sells its PX series SSDs with 24 nm multi-level cell NAND, Samsung is already mass-producing consumer SSDs based on its 19 nm TLC process. (Samsung marketing calls this its "10-nm class" process generation, though that's probably jumping the gun a bit.)

The Big Data Dump: How Info-Hoarding Hinders Startups, Spy Agencies
"How much data is enough data?" de Vries said. "It's a lot less than what a lot of people think it is." In fact, some companies and analysts are pushing an idea that could be the next phase in the evolution of big data: small data. By that, they mean information that is not only more manageable but also accessible, immediate, and actionable today, not months down the road after a lengthy analysis. It's what Internet companies are accustomed to doing -- making many small, quick decisions based on real-time data.

Data Analytics as a Service: unleashing the power of Cloud and Big Data
To validate the base concepts of a DAaaS platform, Atos' Scientific Community conducted research, based on a concrete scenario for the oil and gas industries, that exercises some of the most important problems a DAaaS must solve. This White Paper from Atos' Scientific Community describes how the Data Analytics as a Service model proposed in the paper can be applied to multiple use cases and industries. Beyond that, DAaaS makes analytics a first-class component in a new vision of Enterprise Computing, one that makes extensive use of the advantages of Cloud technologies.

Beware the next circle of hell: Unpatchable systems
"An attacker with ICS knowledge would use the features rather than an unpatched [vulnerability] to compromise the system," Peterson says. Of course, not all IT systems are the same. Security experts agree there are scenarios in which a lower level of security is acceptable. Perry Pederson, a principal at The Langner Group, says those customers who have taken steps to harden and isolate systems should be more confident that they are protected. However, it is harder than ever for companies to know for sure that air-gapped systems aren't accessible from the Internet or an adjacent network.

Met Office swaps Oracle for PostgreSQL
Tomkins said: "We have a large number and heavy investment in Oracle skilled staff but we are looking at a programme of upskilling development and operational staff in PostgreSQL." 2ndQuadrant initially trained all staff to prepare them for deploying the two pilot Postgres migration projects into production. "Our intention is to build skills and increase our PostgreSQL deployments. But we haven't yet reached the same level of skills as we have in Oracle, so we need to build our skills and confidence in PostgreSQL," he said.

10 steps to building a project timeline in Excel
An informed team is a productive team. Everyone on your team should be familiar with a project's schedule -- when each task is scheduled to start and end. A project timeline shares the project's overall schedule and progress while communicating each task's relationship to the others. In this article, I'll show you an old but reliable method that uses a scatter chart to implement a project timeline in Excel. ... Excel 2013's chart interface is significantly different from earlier versions, so this article includes instructions for finding options in 2013 and 2010. I'll also include Excel 2003 instructions for finding options via that version's menus.
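The article builds the chart in Excel, but the idea behind a scatter-chart timeline is simple: each task becomes a pair of points on the same row, one at its start date and one at its end date, and connecting the pair draws the task's bar. As a rough language-neutral sketch (the task names and dates here are invented, not from the article), the data layout looks like this:

```python
from datetime import date

# Hypothetical task list for illustration: (name, start, end).
tasks = [
    ("Requirements", date(2014, 6, 2), date(2014, 6, 6)),
    ("Design", date(2014, 6, 9), date(2014, 6, 13)),
    ("Build", date(2014, 6, 16), date(2014, 6, 27)),
]

def timeline_points(tasks):
    """Turn each task into the two (date, row) scatter points that,
    joined by a line segment, draw its bar on the chart.
    Dates go on the x axis; each task gets its own y-axis row."""
    points = []
    for row, (name, start, end) in enumerate(tasks, start=1):
        points.append((name, start, row))  # left end of the bar
        points.append((name, end, row))    # right end of the bar
    return points

for name, d, row in timeline_points(tasks):
    print(f"{name}: ({d.isoformat()}, y={row})")
```

In Excel the same shape appears as two columns per task (start and end) sharing one row number, which is exactly what the scatter series plots.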

Tibco Acquires Jaspersoft to Energize Analytics Portfolio
The acquisition makes sense in that more capabilities are needed to address the expanding scope of business intelligence and analytics. In practice, it will be interesting to see how the open source community and culture of Jaspersoft mesh with the culture of Tibco's Spotfire division. For now, Jaspersoft will continue as a separate division, so business will likely continue as usual until management decides on specific areas of integration. With respect to development efforts, it will be critical to blend the discovery capabilities of Tibco Spotfire with Jaspersoft's reporting, which will be a formidable challenge. Another key to success will be how Tibco integrates both with the capabilities from Extended Results, a mobile business intelligence provider Tibco bought in 2013.

Ansible’s View on IT Automation
Foremost, it’s important to understand that Ansible is a general-purpose IT automation system, but does not wish to be considered exclusively a configuration management system. This is because, for many of our users, the more interesting part of the equation is in how business applications are deployed on top of the OS configuration - or how upgrade processes and ad-hoc server maintenance processes are orchestrated. While Ansible does contain a robust series of modules for Configuration Management tasks, Ansible is also concerned with cloud provisioning, software deployment, and automation of orchestration procedures like zero downtime rolling upgrades.
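The zero-downtime rolling-upgrade pattern mentioned above (which Ansible expresses declaratively, for example via the `serial` keyword in a playbook) can be sketched in plain Python. This is a minimal illustration of the orchestration idea, not Ansible's implementation; the host names and the upgrade/health-check steps are hypothetical stand-ins:

```python
def rolling_upgrade(hosts, batch_size, upgrade, healthy):
    """Upgrade hosts in small batches so most of the fleet keeps
    serving traffic; halt if a batch fails its health check."""
    for i in range(0, len(hosts), batch_size):
        batch = hosts[i:i + batch_size]
        for host in batch:
            upgrade(host)  # stand-in for drain, deploy, restart
        if not all(healthy(h) for h in batch):
            raise RuntimeError(f"batch {batch} failed health check")

upgraded = []
rolling_upgrade(
    hosts=["web1", "web2", "web3", "web4", "web5"],
    batch_size=2,
    upgrade=upgraded.append,   # stand-in for the real deploy step
    healthy=lambda h: True,    # stand-in for a real health check
)
print(upgraded)
```

Batching plus a per-batch health gate is the whole trick: at any moment at most `batch_size` hosts are out of rotation, and a bad release stops after the first failing batch instead of taking down the fleet.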

Quote for the day:

"Hire late when it already hurts, not early in anticipation of needing people later" -- Jason Fried
