December 13, 2014

Interview: Lucie Glenday, chief digital officer, Surrey County Council
“It’s a hard thing,” she says, calling herself a troublemaker. “That’s what the role is – somebody to come in and ask questions no one else has asked.” But CIO Paul Brocklehurst has taken it all in his stride. Already the council is advanced when it comes to digital thinking and has managed to deliver £250m worth of savings over the past four years. “But we’ve got more to go, and when you’ve got that constant battering of someone saying ‘I think you can do that better’, that’s really tricky,” says Glenday. “Especially when, to all intents and purposes, it’s a new face, someone who’s come down from the Cabinet Office and doesn’t seem to act in the same way everyone else does.”

The Cost of Data Loss on a Smaller Scale
If these numbers don’t scare you enough, the researchers pull out another familiar figure: the dollar sign. These companies like to estimate the cost of data loss for a company. In 2012, Seagate estimated that $40 million is lost annually to data loss events. According to an Aberdeen research study, the average company loses $163,674 in unused labor and lost revenue for each hour of downtime due to data loss. But most of the time, these daunting percentages and huge numbers seem like empty threats. Data loss can’t possibly be that common, right? And even if it were, it certainly couldn’t cost that much, right?
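To make the Aberdeen figure concrete, a quick back-of-the-envelope calculation (the function and the eight-hour scenario are illustrative, not from the study) shows how fast the per-hour cost compounds:

```python
# Illustrative sketch of the Aberdeen downtime figure cited above:
# at $163,674 per hour, even a short outage adds up quickly.
HOURLY_COST = 163_674  # avg. unused labor + lost revenue per hour (Aberdeen)

def downtime_cost(hours: float, hourly_cost: float = HOURLY_COST) -> float:
    """Estimate the cost of a data-loss outage of the given duration."""
    return hours * hourly_cost

# A single business day of downtime (8 hours) tops $1.3 million.
print(f"${downtime_cost(8):,.0f}")  # -> $1,309,392
```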

Seven areas to focus on when complying with PCI DSS v3.0
Merchants and service providers are required to comply with the Payment Card Industry Data Security Standard (PCI DSS), which represents a set of guidelines for securing card data. If a business is found to be noncompliant, it can suffer considerable repercussions. Despite this, the Verizon 2014 PCI Compliance Report revealed that in 2013 only 11.1% of organisations fully complied with the requirements of the PCI DSS, and only one in five organisations came close to complying and passed 95%+ of controls.

Building a Data Governance Team with an Eye on Information Security
Potter noted that information is essential to supporting good patient care, as well as to operational effectiveness and cost reduction; but data in itself also introduces a level of risk for organizations. Historically, she said, there has been a perception that data should be available any time, anywhere. There needs to be recognition that provider organizations must maintain control of data, which is essential to the care of the patient, she said. She noted that provider organizations today have access to vast amounts of data, from personally identifiable information and personal health information to corporate information, intellectual property and research.

For Long-Term Data Archive Solutions, Tape Storage Still on Top
For long-term data archive solutions, why does Google prefer to use tape storage drives and devices to archive and back up every email it stores? Because Google recognizes that tape is less expensive, has greater longevity and reliability, and is more portable and compatible with a variety of data formats than hard disk drives (HDDs). The need for long-term data archive solutions that will endure well into the future is only increasing. Recent advances in the Linear Tape File System (LTFS) and tape libraries from IBM, Oracle, Quantum, Spectra Logic and others are making data access times much faster. In addition, Linear Tape-Open (LTO) standardization, now on its sixth iteration (LTO-6), guarantees data access across devices well into the future.

Hyper-V Replica for Disaster Recovery
Microsoft introduced the appropriately named Hyper-V Replica with Windows Server 2012 and upgraded it in Windows Server 2012 R2. While it provides replication designed to ensure business continuity, Hyper-V Replica is not a substitute for failover clustering. If your organization has the budget to build a clustered Hyper-V deployment, you should definitely do so. Although there are similarities between replication and failover clustering, failover clustering is the preferred method for protecting your virtual machines (VMs).

Managed Failovers To Overtake Traditional Disaster Recovery Testing By 2018
“It has been a target for criticism that while it’s technically an open-source technology, it’s really controlled by the company behind it. Rival products are being launched to ‘rein in’ the problem, producing a slew of open projects that developers can tailor to meet their specific needs. We can expect to see more of this in the next couple of years.” Cloud object storage is another “one to watch” in 2015, says Dymacz: “We have been saying this for years, but object storage is something that’s hugely underused at the moment. Data is continuing to grow faster than most organisations know what to do with, and the costs associated with storing that data are growing year on year.

US technology companies facing growing UK pressure over internet spying
No such criticism of American institutions, much less companies, has ever been made before by a senior British government official. According to John Hemming, the Liberal Democrat MP for Birmingham and a cryptographer, "this statement from GCHQ is without precedent". "The US has an unfortunate track record of ignoring other countries' laws, in the area of high tech especially. It is good that they have been given a clear message from the UK that this must stop happening in the UK. I welcome GCHQ's openness and clarity on this matter. This is a very welcome development," he said.

Mac McMillan’s Clarifying Moment on Data Security
“The events of this past year,” McMillan, CEO and founder of the consulting firm CynergisTek Inc., told his audience, “have begun to show what’s going on, that the folks who want to do harm to us in healthcare have absolutely found us, and they’re not going away.” McMillan cited and briefly summarized 12 different, very serious data breaches in 2014 that in a variety of ways are illustrative of all the threats facing patient care organizations in the U.S. What’s more, as he pointed out, the external threats—from hostile foreign governments, foreign-based criminal syndicates, and other entities—are beginning to emerge as potentially devastating for the U.S. healthcare system.

Forecasting to Improve Your Data Center Portfolio
Building and managing data center infrastructure represents a large investment; it can easily reach hundreds of millions of dollars. Knowing whether you need additional capacity now, or whether your company can wait a few months to make that investment, can translate into a significant financial improvement to your data center portfolio. In retail, there are two forecasting models that allow you to manage inventory levels between original equipment manufacturers (OEMs), distribution centers, retail stores and end customers: the sell-in and sell-through models.
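As a rough sketch of how these two models differ (the numbers, function, and moving-average method here are hypothetical illustrations, not from the article): sell-in tracks units shipped into the channel, sell-through tracks units actually consumed, and comparing forecasts of the two hints at whether inventory is building up:

```python
# Hypothetical illustration of sell-in vs. sell-through forecasting,
# applied to data center capacity. A simple moving average stands in
# for whatever forecasting method a real portfolio would use.

def moving_average_forecast(history: list[float], window: int = 3) -> float:
    """Forecast the next period as the mean of the last `window` periods."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Made-up monthly figures, in racks of capacity
sell_in = [120, 135, 128, 140]       # racks shipped into the channel
sell_through = [110, 118, 125, 131]  # racks actually deployed by customers

next_sell_in = moving_average_forecast(sell_in)
next_sell_through = moving_average_forecast(sell_through)

# If forecast sell-in outpaces sell-through, channel inventory is
# accumulating, so additional capacity investment can likely wait.
inventory_building = next_sell_in > next_sell_through
print(round(next_sell_in, 1), round(next_sell_through, 1), inventory_building)
```

The design point is simply that the two series answer different questions: sell-in drives purchasing commitments, while sell-through reveals real demand.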

Quote for the day:

"To handle yourself, use your head; to handle others, use your heart." -- Eleanor Roosevelt
