June 09, 2015

Are you prepared for the future of data centers?
Colocation requires a shift in data center skillsets, Koppy noted, not handing the data center over to a third party. Ask questions -- specifics about the colocation provider's network and power paths and so on -- and if the colocation provider is unwilling to share information your own facilities team would know, consider that a red flag, Courtemanche said. Also, talk to the provider's long-term customers to gauge what your own experience might be like. ... There are two problem areas data centers with more than 1,000 servers experience at a much higher rate than smaller ones, according to survey results from IDC: downtime due to human error and security breaches. As one AFCOM Symposium attendee put it, when you outsource, your job goes from managing the data center to managing the colocation provider.


The top 10 myths about agile development
Flexibility has become vital for businesses in today’s global markets, and the ability of IT systems to be equally flexible is therefore essential. The purpose of agile is to allow organisations to react to the increasingly dynamic opportunities and challenges of today’s business world, in which IT has become one of the key enablers. Agile is defined by four values and 12 principles found in the Agile Manifesto. The manifesto provides an umbrella definition, under which sit many other delivery and governance frameworks, such as Scrum or extreme programming.


Is Nepotism Undermining Your Business Technology Innovation?
We no longer do the break-fix relationship. We have a strategy manager that essentially acts as a CIO and manages technology as our clients grow and innovate. You need someone to be there every time you grow and change out a piece of technology, and that person needs to have extensive experience throughout your industry with companies of all sizes. A small company that is a family friend doesn’t have that kind of expertise. ... Most “family friend” businesses don’t have this in place and have no idea what sort of support their users are getting, what the response time is or which issues are being resolved and escalated. You don’t have the capital to pay your users to hang out waiting for a call back on an issue.


Erasure Coding For Fun and Profit
Erasure coding essentially uses maths to add a little bit of extra data to the end of the actual data, so that if you lose part of this new, bigger amount of data, you can still get all of the original data back. A simple version is a checksum: sum all the ones and zeros and put the result at the end. If you lose any one of the bits, you can figure out what it was by re-calculating the checksum and comparing it to the stored one; the difference tells you the missing bit. This is a vast over-simplification, but that’s basically it. ... There’s a downside (there’s always a downside). If you lose a disk, you have to rebuild all the data from the parity blocks scattered around the place, which reduces the performance of the array because some of its time is spent on the rebuild instead of serving up the data.
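The recovery idea in the excerpt can be sketched with the simplest possible scheme: a single XOR parity block, which is how basic RAID-5-style parity tolerates one lost block. This is a toy illustration under that assumption, not a real erasure code such as Reed-Solomon, and all names here are made up for the example.

```python
# Toy single-erasure recovery with XOR parity.
# Losing any ONE block (data or parity) is survivable: XOR the
# survivors together and you get the missing block back.

def add_parity(blocks):
    """Append a parity block that is the XOR of all data blocks."""
    parity = 0
    for b in blocks:
        parity ^= b
    return blocks + [parity]

def recover(surviving, lost_index):
    """Rebuild the block at lost_index by XOR-ing everything else."""
    rebuilt = 0
    for i, b in enumerate(surviving):
        if i != lost_index:
            rebuilt ^= b
    return rebuilt

data = [0b1011, 0b0110, 0b1110]
stored = add_parity(data)           # 3 data blocks + 1 parity block
# Simulate losing block 1 and rebuilding it from the rest:
assert recover(stored, 1) == data[1]
```

The rebuild loop also shows where the performance cost in the excerpt comes from: reconstructing one lost block requires reading every surviving block, which competes with normal reads for the array's time.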


Obama vows to boost U.S. cyber defenses amid signs of China hacking
"We have to be as nimble, as aggressive and as well-resourced as those who are trying to break into these systems," Obama told a news conference at the Group of Seven (G7) summit in Germany. U.S. officials, speaking on condition of anonymity, have blamed Chinese hackers for breaching the computers of the Office of Personnel Management and compromising the records of up to four million current and former employees in one of the biggest known attacks on U.S. federal networks. The mission of the intruders, the officials said, appears to have been to steal personal information for recruiting spies and ultimately to seek access to weapons plans and industrial secrets.


Rise of the Surveillance Platform
Hildyard likened a trade-surveillance platform to a buy-and-build hybrid. Such a system requires customization to effectively detect and prevent abuse, as each market ecosystem is unique. But at the same time, building the capability from the ground up is unrealistic. Delivering surveillance via a platform rather than an application gives developers leeway to develop code that’s unique to their organization and the types of behaviors they need to monitor. Sell-side banks “can’t rely on an application to do that,” Hildyard said. “The frequency with which regulatory hot topics emerge is increasing over time,” Hildyard said. Additionally, trade surveillers’ “goal should be to ‘create’ the next big scandal and make sure it doesn’t happen on their watch, in their bank. That requires that they understand behaviors they weren’t previously monitoring for.”


Transforming Text and Data Into a True Knowledge Base
One of the steps in text mining is “relationship identification.” Once entities are identified and enriched, they are connected to other entities; for example, “Foggy Bottom is in Washington, DC”, “Foggy Bottom is near The White House” and “Foggy Bottom is east of Georgetown.” What just happened? We used Linked Open Data (LOD) to verify Foggy Bottom as a neighborhood that exists in Washington DC while also connecting it to other entities. LOD knows that DC is a “District” (not a state) and that it is within the United States. Preexisting facts were combined with results from text analysis to expand the knowledge base.
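The merge the excerpt describes can be sketched with subject-predicate-object triples, the representation linked data is built on. This is a minimal illustration; the predicate names and the `facts_about` helper are invented for the example, and a real system would use RDF tooling and actual LOD sources.

```python
# Toy triple store: relations extracted from text are unioned with
# pre-existing linked-data facts to form one queryable knowledge base.

extracted_from_text = {
    ("Foggy Bottom", "locatedIn", "Washington, DC"),
    ("Foggy Bottom", "near", "The White House"),
    ("Foggy Bottom", "eastOf", "Georgetown"),
}

preexisting_lod_facts = {
    ("Washington, DC", "type", "District"),
    ("Washington, DC", "within", "United States"),
}

knowledge_base = extracted_from_text | preexisting_lod_facts

def facts_about(subject, kb):
    """Return all (predicate, object) pairs known for a subject."""
    return {(p, o) for (s, p, o) in kb if s == subject}

# The text never said DC is a District, yet the merged base knows it:
assert ("type", "District") in facts_about("Washington, DC", knowledge_base)
```

Because both sources share entity identifiers, a query about “Foggy Bottom” can now follow `locatedIn` to “Washington, DC” and pick up facts that never appeared in the analyzed text.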


APIs with Swagger: An Interview with Reverb’s Tony Tam
First, we don’t want to try to stuff every possible feature inside the specification itself. Early on, someone brought up embedding rate-limiting information into the spec. But it would be very difficult to generalize, and would pollute the spec over a feature that possibly many people wouldn’t care about. Next, one thing we learned through the initial versions of Swagger is that it’s easy to write invalid specifications without a simple and robust validator. We chose to use JSON Schema validations, and even built it directly into Swagger-UI. It is an important part of the tooling to help developers write valid Swagger definitions. Removing structural constraints from the spec AND having a robust validation tool would be very difficult.
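The kind of structural validation Tam describes can be sketched as follows. Real tooling validates against the full JSON Schema for Swagger 2.0; this toy checker, written with invented names, only enforces a few required top-level fields to show why a validator catches invalid definitions before they reach downstream tools.

```python
# Toy structural check for a Swagger-style document. The real Swagger
# 2.0 schema is far richer; this only illustrates the principle that
# a spec is machine-checkable data, so invalid ones can be rejected early.

REQUIRED_TOP_LEVEL = {"swagger": str, "info": dict, "paths": dict}

def validate_spec(spec):
    """Return a list of error messages; an empty list means the spec passed."""
    errors = []
    for field, expected_type in REQUIRED_TOP_LEVEL.items():
        if field not in spec:
            errors.append(f"missing required field: {field}")
        elif not isinstance(spec[field], expected_type):
            errors.append(f"field {field!r} must be a {expected_type.__name__}")
    return errors

good = {"swagger": "2.0", "info": {"title": "Pets", "version": "1.0"}, "paths": {}}
bad = {"info": "oops"}

assert validate_spec(good) == []
assert "missing required field: swagger" in validate_spec(bad)
```

Keeping the constraints in a declarative table rather than scattered through code is the same design choice that lets Swagger-UI embed validation directly: the rules travel with the tooling, not with each hand-written check.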


Case study: What the enterprise can learn from Etsy's DevOps strategy
“You have to be able to demonstrate to the larger business why it’s not just a buzzword and can add value to the business, and the only way to do that is to give them a concrete project and show them how it has positively affected the business,” he says. “The people who make the decisions at the top of the pile may be more business-minded than technically so, and you need to speak their language and demonstrate the impact it has had on key performance indicators or revenue that quarter. “You need to sell the idea to them in business terms because IT and development are service organisations that exist to fulfil the priorities of the business,” Cowie adds.


A Brief History of Big Data Everyone Should Read
Long before computers (as we know them today) were commonplace, the idea that we were creating an ever-expanding body of knowledge ripe for analysis was popular in academia. Although it might be easy to forget, our increasing ability to store and analyze information has been a gradual evolution – although things certainly sped up at the end of the last century, with the invention of digital storage and the internet. With Big Data poised to go mainstream this year, here’s a brief(ish) look at the long history of thought and innovation that has led us to the dawn of the data age.



Quote for the day:

"Every leader needs to look back once in a while to make sure he has followers." -- Kouzes and Posner