2015 Top Five Data Center Trends
As conversations circulate between IT companies and data center providers, opinions differ on what will emerge in the new year. Although there are many predictions about what will rise and what will fall in popularity, a few areas of the industry have generated increased chatter among the experts. In this article, we walk through the top five up-and-coming themes predicted for 2015. They include trends in cloud, virtualization, the internet of things (IoT), and the size of the industry.
Mobile users encountered malware 75% more often in 2014 compared to 2013
"We've seen a significant increase in both the frequency and sophistication of attacks that would truly represent a concern for the enterprise, like exploits that would let the bad guys get access to corporate networks," he said. "We also saw a greater prevalence and sophistication of applications that enable rooting or jail breaking the device." For enterprises in particular, the top security threats associated with mobile devices are loss of sensitive data and illicit access to corporate networks. "The threats that we found targeted both of these issues," Cockerill said.
Bruns-Pak: Datacenters vs. colocation vs. cloud computing
Owning and operating one's own data center might actually be the lowest-cost option if the overall cost is considered. This approach is not without its challenges, however: it requires the largest up-front investment as well as the largest staff. It also makes it possible for an enterprise to take advantage of the reduced costs of purchasing systems, storage, software, power and networking in bulk, along with the potential tax advantages of owning real estate, buildings and the like.
Intel-backed OIC advances in fast-moving IoT standards race
Though it seems too soon to be pushing out specifications and code, given that the industry isn’t expected to settle on standards until next year or later, this may be the best time to capture the hearts and minds of product developers. The International CES show last week in Las Vegas was rife with emerging (and some half-baked) IoT devices, especially for smart homes. Those that make it to market will eventually need to lock into some platform for working with other connected products. The OIC is developing its own standard for IoT connectivity but turned to the Linux Foundation to organize the project that is developing IoTivity. That project is open to anyone who wants to participate, whether they belong to OIC or not.
Nine CIO tips for surviving and thriving in 2015
In part one of Harvey's 2015 predictions column for CIOs, he singles out three trends that will continue to have big ramifications for the CIO role and enterprise IT next year. Here, he offers readers nine CIO tips for surviving and thriving in 2015, plus a cautionary compilation of quotes illustrating the danger of making technology predictions:
How to make applications resilient on AWS
Amazon provides different services to decouple systems and make them more reliable. One of the first services was Simple Queue Service (SQS). Amazon describes SQS as a distributed queue system that enables service applications to quickly and reliably queue messages that one component in the application generates to be consumed by another component. Later, other services such as Simple Notification Service (SNS) or Simple Workflow Service (SWF) followed. One of the main characteristics of the cloud is elasticity, which means not making any assumptions about the health, availability or fixed location of other components.
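The decoupling idea above can be shown without an AWS account: in a minimal local sketch, a producer and consumer share nothing except a queue, so neither component needs to know the other's health or location. (This uses Python's standard-library `queue` as a stand-in for SQS; real code would use an AWS SDK client and the message names here are invented for illustration.)

```python
import queue
import threading

# Local stand-in for an SQS queue: the only thing the two components
# share is the queue itself, so either side can scale or fail independently.
message_queue = queue.Queue()

def producer(n):
    """One component generates messages without knowing who consumes them."""
    for i in range(n):
        message_queue.put(f"order-{i}")

def consumer(results):
    """Another component drains the queue at its own pace."""
    while True:
        try:
            msg = message_queue.get(timeout=0.5)
        except queue.Empty:
            break  # no messages for a while; stop polling
        results.append(msg)
        message_queue.task_done()

results = []
worker = threading.Thread(target=consumer, args=(results,))
worker.start()
producer(3)
worker.join()
```

With SQS the queue additionally survives the failure of either process, which is what makes it useful for resilience rather than just decoupling within one program.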
Keeping Big Data Secure: Should You Consider Data Masking?
As Girard points out, one of the problems associated with traditional data masking is that, “every request by users for new or refreshed data sets must go through the manual masking process each time.” This, he explains, “is a cumbersome and time-consuming process that promotes ‘cutting corners’ -- skipping the process altogether and using old, previously masked data sets or delivering teams unmasked versions.” As a result, new agile data masking solutions have been developed to meet the new demands associated with protecting larger volumes of information.
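One way such tools avoid the manual-refresh problem is deterministic masking: the same input always produces the same token, so a freshly refreshed data set can be masked automatically and still join consistently with earlier extracts. A rough sketch of the idea (not any particular vendor's algorithm; the field names and salt are invented for illustration):

```python
import hashlib

def mask_value(value: str, salt: str = "masking-salt") -> str:
    """Deterministically mask a sensitive value. The same input always
    yields the same token, so joins across refreshed data sets still
    line up, but the original value is not directly recoverable."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "user_" + digest[:8]

def mask_records(records, sensitive_fields):
    """Mask the named fields in every record, leaving the rest untouched."""
    return [
        {k: mask_value(v) if k in sensitive_fields else v for k, v in rec.items()}
        for rec in records
    ]

records = [{"email": "alice@example.com", "region": "EU"}]
masked = mask_records(records, {"email"})
```

Because the process is automatic and repeatable, there is no reason to cut corners by reusing stale masked copies.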
5 Agile Ways to Achieve your New Year’s Resolutions
Perhaps we can use recent advances in software project management to get that success rate higher. Traditional software development efforts last one to two years and are managed by planning everything up front with what are called “waterfall” management practices. According to the Standish Group, the failure rate for waterfall projects from 2002 to 2011 was 29%. The costs for these failures can be measured in billions of dollars wasted. Agile management practices, which introduce frequent inspection and adaptation, have succeeded in reducing project failures to about 9%.
New report: DHS is a mess of cybersecurity incompetence
The report cautions about DHS's limited strategies, noting: "While patch management and cyber hygiene are clearly important, they are only basic security precautions, and are unlikely to stop a determined adversary, such as a nation state seeking to penetrate federal networks to steal sensitive information." The section on cybersecurity is titled: "The Department of Homeland Security is struggling to execute its responsibilities for cybersecurity, and its strategy and programs are unlikely to protect us from the adversaries that pose the greatest cybersecurity threat." One example in that section shows DHS departments effectively lying about performing critical and well-known security updates -- updates that DHS warned the public about via US-CERT.
New service wants to rent out your hard drive's extra space
The service works by first uploading a file-sharing application onto a user’s computer, then breaking file data into small 8MB or 32MB blocks, or “shards,” as Storj calls them. Each block of data is encrypted and identified by a unique hash, and then the pieces are distributed throughout the cloud network, according to a white paper the company published on its peer-to-peer storage technology. The file blocks get distributed throughout the network on nodes called “DriveShares” located all over the world. Storj uses Merkle trees, sometimes called hash trees, to verify the contents of a file after it has been broken up into blocks or “leaves” off of a master or root hash.
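The Merkle-tree verification above can be sketched in a few lines: hash each shard, then repeatedly hash pairs of hashes until a single root remains. Any change to any shard changes the root, so a node's stored block can be checked without fetching the whole file. (This is a generic Merkle-root construction under the usual duplicate-last-node convention, not Storj's exact scheme from its white paper; the shard size is shrunk to 8 bytes for the example.)

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks) -> bytes:
    """Compute the root of a Merkle tree whose leaves are the hashes of
    the file's blocks. Tampering with any block changes the root."""
    level = [sha256(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate the last node on odd-sized levels
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Split a small file into fixed-size "shards" (8 bytes here instead of 8 MB).
data = b"example file contents for sharding"
shards = [data[i:i + 8] for i in range(0, len(data), 8)]
root = merkle_root(shards)
```

The network only needs to store and compare the 32-byte root to detect a corrupted or substituted shard.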
Quote for the day:
"A budget tells us what we can't afford, but it doesn't keep us from buying it." -- William Feather