May 22, 2015

Fido Alliance launches authentication standards certification
“Where passwords are still used, the Fido authenticator supplants the security dependence on the password, which is then just an identifier,” said Fido Alliance executive director Brett McDowell. “Security shifts to the U2F device, and it is much easier to use than any other two-factor authentication method available before Fido 1.0,” he told Computer Weekly. Announcing the certification programme, the Fido Alliance said 31 suppliers have already passed Fido certification for existing products and services. These include Google’s login service that uses a USB security key as a simpler, stronger alternative to the six-digit, one-time passcodes (OTPs) used by its 2-Step Verification facility.
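The shift McDowell describes, from a shared password to a challenge-response exchange with a hardware key, can be sketched in miniature. Real U2F uses per-origin ECDSA key pairs held on the token; the HMAC below stands in for the signature only because Python's standard library has no ECDSA, and all names here are illustrative, not part of the FIDO specification:

```python
# Simplified sketch of a U2F-style challenge-response login.
# Real U2F signs with a per-origin ECDSA key pair on the token;
# HMAC stands in here since the Python stdlib lacks ECDSA.
import hashlib
import hmac
import os

def device_sign(device_secret: bytes, challenge: bytes, origin: str) -> bytes:
    """The token signs the server's challenge bound to the site origin,
    so a phishing site on another origin cannot replay the response."""
    return hmac.new(device_secret, challenge + origin.encode(), hashlib.sha256).digest()

def server_verify(registered_secret: bytes, challenge: bytes,
                  origin: str, response: bytes) -> bool:
    """The server recomputes the expected response for its own origin."""
    expected = hmac.new(registered_secret, challenge + origin.encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Registration stored a credential for this device; the password is
# now just an identifier -- security rests on the key exchange below.
secret = os.urandom(32)
challenge = os.urandom(16)  # fresh per login, so responses cannot be replayed

response = device_sign(secret, challenge, "https://example.com")
assert server_verify(secret, challenge, "https://example.com", response)
assert not server_verify(secret, challenge, "https://phish.example", response)
```

The origin binding is the part OTPs lack: a six-digit code can be phished and relayed, but a response bound to the wrong origin simply fails to verify.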

How Virtual Reality May Change Medical Education And Save Lives
Spio’s hope is that Next Galaxy’s virtual reality model will better educate and prepare health care providers–as well as consumers–for learning CPR, based on a more realistic learning environment. She advocates a paradigm shift, away from the current approach–which relies upon passively watching videos and taking written exams–to a method for learning that involves the use of gestures, voice commands and eye gaze controls, thereby transforming how medical providers and laypersons experience such situations. As a first step towards developing this new reality, Next Galaxy Corporation recently announced an agreement with Miami Children’s Hospital to engage Next Galaxy’s VR Model and develop immersive virtual reality medical instructional content to educate medical professionals as well as patients.

Americans’ Attitudes About Privacy, Security and Surveillance
Key legal decisions about the legitimacy of surveillance or tracking programs have hinged on the question of whether Americans think it is reasonable in certain situations to assume that they will be under observation, or if they expect that their activities will not be monitored. A federal appeals court recently ruled that a National Security Agency program that collects Americans’ phone records is illegal. In striking down the program, Judge Gerald Lynch wrote: “Such expansive development of government repositories of formerly private records would be an unprecedented contraction of the privacy expectations of all Americans. Perhaps such a contraction is required by national security needs in the face of the dangers of contemporary domestic and international terrorism. But we would expect such a momentous decision to be preceded by substantial debate, and expressed in unmistakable language.”

Bring your own cloud: Understanding the right policies for employees
By ignoring cloud policies, employees are also contributing to cloud sprawl. More than one quarter of cloud users (27%) said they had downloaded cloud applications they no longer use. Moreover, with 40% of cloud users admitting to knowingly using cloud applications that haven’t been sanctioned or provided by IT, it’s clear that employee behaviour isn’t going to change. So, company policies must change instead – which often is easier said than done. On the one hand, cloud applications help to increase productivity for many enterprises, and on the other, the behaviour of some staff is unquestionably risky. The challenge is maintaining an IT environment that supports employees' changing working practices, but at the same time is highly secure.

Description, Discovery, and Profiles: A Primer
Most of the approaches today support the API-First concept. You describe your API using a meta-language based on XML, JSON, or YAML and the resulting document (or set of documents) is used to auto-generate implementation assets such as server-side code, human-readable documentation, test harnesses, SDKs, or even fully-functional API clients. An example of the API-First approach is Apiary's API Blueprint format. It’s based on Markdown and has the goal of supporting human-readable descriptions of APIs that are also machine-readable. In the example below you can see there is a single resource (/message) that supports both GET and PUT. You can also see there is support for human-readable text to describe the way the API operates.
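A minimal API Blueprint document of the kind described, a single /message resource supporting both GET and PUT, looks roughly like this (the payload text is illustrative):

```
FORMAT: 1A

# Message API

# Message [/message]

A single resource holding a short text message.

## Retrieve a Message [GET]

+ Response 200 (text/plain)

        Hello World!

## Update a Message [PUT]

+ Request (text/plain)

        A new message body.

+ Response 204
```

Because the description is plain Markdown, the prose between the sections serves as the human-readable documentation while the bracketed URIs and response blocks remain machine-readable.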

How Big Data Can Drive Competitive Intelligence
The practice of selling data to the marketplace appears to be much more prevalent in Asia than in Europe or the United States, according to Tata. That may reflect regulatory considerations. U.S. data brokers generally ensure that big data sets have been stripped of individually-identifiable consumer information, both to ensure regulatory compliance and to prevent the inevitable public backlash. But it’s telling that China’s southwestern province of Guizhou is establishing an exchange, GBDex, to provide data cleaning, modeling, and data platform development. Alibaba is a partner in the exchange in Guiyang. A small firm with a progressive attitude toward analytics may be able to carve out a competitive advantage against a much bigger rival simply by understanding its niche in the market better.

CIO interview: Myron Hrycyk, CIO, Severn Trent Water
“A lot of organisations that run large asset bases are always looking for ways they can run that infrastructure more productively, ultimately giving customers a better service,” says Hrycyk.  “The two technologies that I see as key to driving the productivity and efficiency that are needed to drive bills down are improved telemetry and technologies related to the internet of things that can pull data back from the infrastructure so we can proactively manage it.  “That way, we can have a lower-cost infrastructure overall and avoid reactive work and outages by managing our assets to keep the flow of water to our customers going, and doing a lot more predictive and proactive maintenance.”

Why Skills Matter More than Ever in Our Data-Driven Economy
There are no easy solutions. Two well-known factors affecting employment decisions — compensation and culture — require flexible budgets and organizational change, neither of which plays to government’s strengths. But government should not give up. The UK’s Government Digital Service fundamentally rebuilt the nation’s public-sector strategy for IT, proving that disruptive innovation in government is possible. Moreover, government agencies do have an advantage in that many of the problems they’re working on — like increasing access to affordable health care, improving the quality of schools, and making cities safer and cleaner — are the types of problems that attract the sharpest minds. While they may not be able to match the pay or benefits of Silicon Valley, they offer the chance to improve the world.

Harnessing the power of your hidden leaders
To the naked eye, it may seem they are simply able to get things done. Look closer, and you’ll see that they are demonstrating strong leadership and influence by dint of relationships they’ve developed. Look closer still, and you’ll see that it isn’t simply niceness or collegiality that has earned them this influence. Too many people seek to establish trusting business relationships centering on likeability. ... Try identifying your Hidden Leaders. Who are they? What do they do differently? Ask yourself what kind of an impact it would have on your business if more employees behaved as they do — even 20% or 30% more? My bet is that you’ll see great power in cultivating more of them. And if you are reading this article, it is likely that is your job.

Here comes the future of application development: Treating infrastructure as code
Key to this approach is the idea of the immutable container. Containerization is perhaps best thought of as a way of adding more abstraction into our virtual infrastructure, though instead of abstracting virtual infrastructures from the physical, here we're making our applications and services their own abstraction layers. With immutable containers, a Docker or similar container wrapping an application or a service is the end of a build process. Deployment is then simply a matter of unloading the old container, installing the new, and letting your application run. The immutable container is an ideal model for a microservice world. Wrapping up a node.js service with all its supporting code in a container means we have not only a ready-to-roll service but also an element that can be delivered as part of an automated scale-out service.

Quote for the day:

"Whenever you find yourself on the side of the majority, it is time to pause and reflect." -- Mark Twain

May 21, 2015

Q and A on The Scrum Culture
Bluntly speaking, command and control is not compatible with Scrum. As soon as you allow Scrum to spread throughout the command and control enterprise, there is a clash of cultures and only one will survive. On the one hand command and control is more effective in a production line environment, and it is usually also the dominant approach in the organization. So it has the home field advantage and is the primary source of "gravity", drawing people back to the old way of doing things. The Scrum Culture on the other hand is more effective in development and research environments and is what more and more people demand from their employers.

Can OpenStack free up enterprise IT to support software-driven business?
Although it is often considered as a way to build a private cloud, OpenStack can also be used to provision datacentre hardware directly. Subbu Allamaraju, chief engineer for cloud at eBay, said he would like to use OpenStack as the API for accessing all datacentre resources at the auction site, but the technology is not yet mature enough. Walmart's Junejan added: "We aim to move more markets onto OpenStack and eventually offer datacentre as a service." OpenStack can also be used to manage physical, bare metal server hardware. James Penick, cloud architect at Yahoo, said the internet portal and search engine had been using bare metal OpenStack alongside virtualisation.

Certification, regulation needed to secure IoT devices
Xie explained in an interview with ZDNet that in traditional networks where components such as switches and routers were wired, there were well-established architecture frameworks that outlined where and how firewalls should be connected to switches, be it redundantly or as a single connection. These guidelines would no longer be effective with SDNs where these "wires" were now defined by software and where switches could be "relocated" by the stroke of a key, he said. Firewalls, for instance, would need to continue to operate the necessary policies to secure a database within an SDN, when that database is virtually relocated to a different city. "So all that becomes more intangible, and the big challenge is for security to be able to adapt to that kind of architecture," he noted.

Net Neutrality Rules Forcing Companies To Play Fair, ... Giant ISPs Absolutely Hate It
While the FCC's rules on interconnection are a bit vague, the agency has made it clear they'll be looking at complaints on a "case by case basis" to ensure deals are "just and reasonable." Since this is new territory, the FCC thought this would be wiser than penning draconian rules that either overreach or contain too many loopholes. This ambiguity obviously has ISPs erring on the side of caution when it comes to bad behavior, which is likely precisely what the FCC intended. ... And by "well functioning private negotiation process," the ISPs clearly mean one in which they were able to hold their massive customer bases hostage in order to strong-arm companies like Netflix into paying direct interconnection fees. One in which regulators were seen but not heard, while giant monopolies and duopolies abused the lack of last mile competition.

Leaderless Bitcoin Struggles to Make Its Most Crucial Decision
The technical problem, which most agree is solvable, is that Bitcoin’s network now has a fixed capacity for transactions. Before he or she disappeared, Bitcoin’s mysterious creator, Satoshi Nakamoto, limited the size of a “block,” or group of transactions, to one megabyte. The technology underlying Bitcoin works because a network of thousands of computers contribute the computational power needed to confirm every transaction and record them all in a permanent, publicly accessible ledger called the blockchain (see “What Bitcoin Is and Why It Matters”). Every 10 minutes, an operator of one of those computers wins the chance to add a new block to the chain and receives freshly minted bitcoins as a reward. That process is called mining.
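The "fixed capacity" follows directly from the two numbers in the paragraph: a one-megabyte block roughly every ten minutes. Assuming a typical transaction of about 250 bytes (an illustrative figure; real transaction sizes vary), the arithmetic gives a hard ceiling of around seven transactions per second:

```python
# Back-of-the-envelope throughput ceiling implied by the one-megabyte
# block limit and the roughly ten-minute block interval.
BLOCK_SIZE_BYTES = 1_000_000    # Nakamoto's block size cap
BLOCK_INTERVAL_SECONDS = 600    # a new block about every 10 minutes
AVG_TX_BYTES = 250              # assumed typical transaction size; varies in practice

txs_per_block = BLOCK_SIZE_BYTES // AVG_TX_BYTES
txs_per_second = txs_per_block / BLOCK_INTERVAL_SECONDS

print(txs_per_block)             # 4000 transactions per block
print(round(txs_per_second, 1))  # ~6.7 transactions per second
```

For comparison, a mainstream payment network clears orders of magnitude more transactions per second, which is why the block-size debate is so contentious.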

Machine learning as a fluid intelligence harvesting service
Developers are only human. They have limited capabilities, attention spans and so on. But data and the knowledge that can be gained from it are seemingly unlimited. Even the world’s data scientists and domain experts have to prioritize their efforts to extract insights from some relevant portion of the vast ocean of information that surges around them.  With only so many hours in the day, data scientists and analysts need to leverage every big data acceleration, automation and productivity tool in their arsenals to sift, sort, search, infer, predict and otherwise make sense of the data that’s out there. As a result, many of these professionals have embraced machine learning.

Software development skills for data scientists
You should learn a principle called DRY, which stands for Don't Repeat Yourself. The basic idea is that many tasks can be abstracted into a function or piece of code that can be reused regardless of the specific task. This is more efficient from a "lines of code" perspective, but also in terms of your time. It can be taken to an illogical extreme, where code becomes very difficult to follow, but there is a happy medium to strive for. A good rule of thumb: if you find yourself writing the same line of code with only minor changes each time, think about how you can turn that code into a function that takes the changes as parameters. Avoid hard-coding values into your code. It is also good practice to revisit code you've written in the past to see if the code can be made cleaner, more efficient, or more modular and reusable. This is called refactoring.
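The rule of thumb above can be made concrete with a small, hypothetical example: three near-identical lines collapse into one function that takes the varying value as a parameter, with nothing hard-coded:

```python
# Repetitive: the same line rewritten with only a minor change each time.
#   low  = [p for p in prices if p < 10]
#   mid  = [p for p in prices if p < 50]
#   high = [p for p in prices if p < 100]

# DRY: the varying threshold becomes a parameter.
def prices_below(prices, threshold):
    """Return the prices strictly below the given threshold."""
    return [p for p in prices if p < threshold]

prices = [5, 20, 60, 120]
print(prices_below(prices, 10))   # [5]
print(prices_below(prices, 100))  # [5, 20, 60]
```

Refactoring toward this shape also makes the later cleanup pass easier: one function is simpler to test, rename, and reuse than three scattered copies of the same logic.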

Marketing vs. IT: Data Governance Bridges the Gap
The key is to first understand how to govern information in the modern data era – not going back to the stone ages where marketers – and for that matter all business users – had to follow naming conventions, put everything into schemas and build their work into models. Today, IT teams can empower the data-driven marketing organization by providing better tools and automation across the entire analytic process, including a new class of self-service data preparation solutions, which simplify, automate and reduce the manual steps of the analytic process. This new self-service data preparation “workbench” empowers marketing, sales, finance and business operations analysts with a shared environment that captures how they work with data, where they get it from and ultimately what BI tool they use to analyze it.

Full Stack Web Development Using Neo4j
Neo4j is a Graph database which means, simply, that rather than data being stored in tables or collections it is stored as nodes and relationships between nodes. In Neo4j both nodes and relationships can contain properties with values. ... While Neo4j can handle "big data" it isn't Hadoop, HBase or Cassandra and you won't typically be crunching massive (petabyte) analytics directly in your Neo4j database. But when you are interested in serving up information about an entity and its data neighborhood (like you would when generating a web-page or an API result) it is a great choice, from simple CRUD access to a complicated, deeply nested view of a resource.
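The property-graph model the excerpt describes can be sketched in plain Python. This is illustrative only, not Neo4j's actual API (which is queried via Cypher or an official driver): nodes and relationships both carry key/value properties, and a one-hop "neighborhood" fetch yields exactly the entity-plus-context shape a web page or API response needs:

```python
# Plain-Python sketch of a property graph: nodes and relationships
# both carry properties. (Illustrative only -- real Neo4j is queried
# via Cypher or a driver, not dictionaries.)
nodes = {
    1: {"label": "Person", "name": "Alice"},
    2: {"label": "Person", "name": "Bob"},
    3: {"label": "Post",   "title": "Hello graphs"},
}
relationships = [
    {"from": 1, "to": 2, "type": "FOLLOWS", "since": 2014},
    {"from": 1, "to": 3, "type": "WROTE",   "at": "2015-05-21"},
]

def neighborhood(node_id):
    """Return a node with its outgoing relationships and neighbors --
    the 'entity and its data neighborhood' a page render would need."""
    out = []
    for rel in relationships:
        if rel["from"] == node_id:
            out.append((rel["type"], nodes[rel["to"]]))
    return {"node": nodes[node_id], "out": out}

view = neighborhood(1)
print(view["node"]["name"])         # Alice
print([t for t, _ in view["out"]])  # ['FOLLOWS', 'WROTE']
```

In a relational store the same page would typically take several joins; here the neighborhood is a single traversal from the starting node.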

Executive's guide to the hybrid cloud (free ebook)
Hybrid strategies have begun making inroads in several industries, including the financial sector, healthcare, and retail sales. In a widely cited report, Gartner predicted that nearly 50 percent of enterprises will have hybrid cloud deployments by 2017. Hybrid clouds can help ensure business continuity, allow provisioning to accommodate peak loads, and provide a safe platform for application testing. At the same time, they give companies direct access to their private infrastructure and let them maintain on-premise control over mission-critical data. Is hybrid an ideal strategy for all companies — or a panacea for all cloud concerns? ... This ebook will help you understand what hybrid clouds offer, and where their potential strengths and liabilities exist.

Quote for the day:

“It’s what you do in your free time that will set you free—or enslave you.” -- Jarod Kintz

May 20, 2015

Gartner Doubles Estimate Of Amazon Cloud Dominance
The revised Magic Quadrant kept Amazon in the desired top right of the leaders quadrant, with Microsoft also in the leader quadrant -- far below but moving a little closer to Amazon on the "completeness of vision" axis. On the second measure, the "ability to execute" axis, the companies remained basically the same as a year ago. Gartner put only those two vendors in the leaders quadrant, and that status is unlikely to change anytime soon. That's because the upper left quadrant next door, meant to illustrate the challengers to the leaders, was completely empty in this year's chart. No one is threatening Amazon as the dominant public cloud infrastructure provider, nor Microsoft as the runner up.

NoSQL for Mere Mortals: Designing for Document Databases
Redundant data is considered a bad, or at least undesirable, thing in the theory of relational database design. Redundant data is the root of anomalies, such as two current addresses when only one is allowed. In theory, a data modeler will want to eliminate redundancy to minimize the chance of introducing anomalies. ...  There are times where performance in relational databases is poor because of the normalized model. ... Document data modelers have a different approach to data modeling than most relational database modelers. Document database modelers and application developers are probably using a document database for its scalability, its flexibility, or both. For those using document databases, avoiding data anomalies is still important, but they are willing to assume more responsibility to prevent them in return for scalability and flexibility.
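The trade the excerpt describes, accepting redundancy for scalability and taking on anomaly prevention in the application, can be shown with a small hypothetical example. Each order document embeds a copy of the customer's address so a single read serves the whole page, but the application must now update every copy itself:

```python
# Denormalized document model: each order embeds the customer's
# address, so one lookup renders the page with no joins. The price
# is that the application owns consistency across the copies.
orders = [
    {"order_id": 101, "customer": {"name": "Dana", "city": "Austin"}, "total": 40.0},
    {"order_id": 102, "customer": {"name": "Dana", "city": "Austin"}, "total": 15.5},
]

def move_customer(name, new_city):
    """The anomaly-prevention work the modeler accepted: touch every
    embedded copy so no order is left with a stale address."""
    for order in orders:
        if order["customer"]["name"] == name:
            order["customer"]["city"] = new_city

move_customer("Dana", "Denver")
assert all(o["customer"]["city"] == "Denver" for o in orders)
```

Miss one copy in `move_customer` and you get exactly the "two current addresses" anomaly that normalization exists to rule out; the document modeler accepts that risk in exchange for cheap, join-free reads.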

How do you solve a problem like big data?
Knowing where to begin with all of this information is one thing; having the time to actually get to work on it is completely different. So much of the data mentioned above is useful to marketers, but sifting through to identify and collect the necessary parts is an extremely long-winded task; far from ideal in an industry where spare hours are a rarity. Unfortunately, this tedious aggregation process is a necessity for most marketers, despite the availability of so many useful tools. According to a January 2015 Econsultancy report, just over half (51 per cent) of organisations are using more than 20 digital marketing technologies at present. With such a collection of data sources to tend to, though, it’s no surprise that so much valuable time is being wasted.

Executive interview: Google's big data journey
“Google fundamentally rethought the practice of building bigger machines to solve these problems. We only build using commodity machines and we assume systems will fail. “We have done several iterations of almost every piece of technology we showed in the white papers.” The use of massively scalable low-cost commodity infrastructure is almost diametrically opposite to how the big four IT suppliers go about tackling big data. Yes, they do NoSQL and offer Hadoop in the cloud. But SAP, for example, wants customers to spend millions on S/4 Hana, Oracle pushes Exadata and its engineered appliance family, IBM sells the merits of the z13 mainframe, and Microsoft has SQL Server.

What a new survey on payment solutions reveals about your security leadership
“Companies in the payments industry face a huge challenge in securing emerging technologies like virtual currencies, mobile payments and e-wallets. While the industry has always prioritized the implementation of new technologies for customer convenience, in today’s landscape, it is critical that they equally emphasize the security of new technologies to protect customer data.” -- Michael Bruemmer, vice president of Experian Data Breach Resolution ... The challenge is the balance between customer convenience (especially when it comes to their ability to give your company money) and the appropriate level of protection. This survey underscores that we’re under pressure to adopt new systems without a clear understanding of the risks or methods to reduce those risks.

Back-end complexity slows XenMobile deployments
The problem for some organizations is that they don't have the expertise in house to handle a XenMobile implementation. Deploying XenMobile is very different from deploying Citrix virtual desktops, applications or cloud infrastructure, so the IT department's resident Citrix experts might not be able to easily transition, Gamble said. "Just because you're a good Citrix guy, doesn't equate to being a good XenMobile guy," he said. But it's not always the back-end complexity that makes XenMobile deployment difficult. Handling users is a challenge, too. "Once we did a pilot, the deployment wasn't that bad," said Noel Prevost, a services delivery manager at Ingalls Shipbuilding in Pascagoula, Miss.

SaaS and the Cloud are Still Going Strong
Aside from the prominent cost benefits of cloud computing, innovation and mobility are highlighted as key reasons for uptake. Cloud technology enables faster, cheaper software development, with key cloud usage scenarios including collaboration, file sharing, business productivity, CRM and marketing. Mobile applications including Vend applications, PayPal platforms and secure VPN access are some of the top requirements of businesses in 2015. ... QuoteColo’s infographic sums up many of the key stats and predictions for the future of cloud computing throughout the world, and highlights the importance of strong cloud infrastructure and application development through 2015 and into the future.

Celebrate mistakes: Creating a culture of forgiveness
When you encourage healthy risk-taking, you encourage innovative behavior in your team. Employees who know that they’ll have your help and support when problems arise feel empowered to integrate changes into new projects and daily operations. Those changes could save time, save money or bring in a big win for the organization — just the sort of behavior you want to encourage. But does your team know you’ll make it a learning opportunity and not a mark of shame if something doesn’t work? Of course we’re talking about reasoned risk, with plenty of planning. There are always ways to learn from a thought-out endeavor that failed.

Toward Omniscient Cybersecurity Systems
CISOs recognize this disjointed situation and many are undertaking cybersecurity integration projects to address this problem. This is certainly a step in the right direction, but I find that a lot of these projects are one-off point-to-point integration efforts. Good idea, but CISOs should be pushing toward an ambitious endgame – omniscient cybersecurity systems. ... In summary, CISOs need a single system or an integrated architecture that can tell them everything about everything – in real-time. This system must be smart enough to recognize patterns and offer user-friendly visual analytics interfaces enabling analysts to easily pivot from one data point to all others. Armed with this type of system, cybersecurity professionals could move on to the next task – automated remediation and security operations.

Finance and retail sectors struggle to detect cyber intrusions, study finds
Key findings in the financial services sector included that 71% of organisations polled view technologies that provide intelligence about networks and traffic as most promising at stopping or minimising advanced threats during all phases of an attack. But the study showed that only 45% have implemented incident response procedures, and only 43% have established threat-sharing agreements with other companies or government groups. More than half of financial services firms consider distributed denial-of-service (DDoS) attacks as an advanced threat, but only 48% say they are effective in containing DDoS attacks, and only 45% have established threat-sharing agreements to minimise or contain the impact of DDoS attacks.

Quote for the day:

"The measure of success isn't if you have a tough problem, but whether it's the same one you had last year." -- J.F. Dulles