July 04, 2015

Recovering from a Storage System Failure
Protecting against storage system failure requires more than just copying data from point A to point B. Meeting any reasonable recovery expectation requires that a secondary storage system be available on which to restart the application. The good news is that secondary storage is very affordable today, and these systems can play a larger role than simply standing by for the primary storage system. But before implementing the secondary storage system, the IT planner needs to understand the acceptable recovery point objective (RPO) and recovery time objective (RTO) for the applications counting on that system. Once the RPO and RTO are understood, IT planners will know what method they should use to get data to the secondary storage system and what type of secondary storage system they should buy.
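As a loose illustration (not from the article), the sketch below shows how agreed RPO/RTO targets might be turned into a choice of data-movement method and secondary-storage class; the thresholds, labels and the plan() helper are hypothetical.

```cpp
#include <iostream>
#include <string>

// Hypothetical decision helper: map an application's recovery point objective
// (how much data loss is tolerable) and recovery time objective (how quickly
// service must resume) to a data-movement method and a secondary-storage class.
// The thresholds and labels are illustrative, not prescriptive.
struct Recommendation {
    std::string data_movement;
    std::string secondary_storage;
};

Recommendation plan(double rpo_minutes, double rto_minutes) {
    if (rpo_minutes < 5 && rto_minutes < 15)
        return {"synchronous or near-synchronous replication",
                "high-performance array able to run production directly"};
    if (rpo_minutes < 60)
        return {"asynchronous replication or frequent snapshots",
                "mid-tier disk system sized to host critical applications"};
    return {"scheduled backup copies",
            "capacity-oriented secondary storage (disk or cloud)"};
}

int main() {
    Recommendation r = plan(2, 10);  // e.g. an order-entry database
    std::cout << "Move data via: " << r.data_movement << "\n"
              << "Buy: " << r.secondary_storage << "\n";
}
```

Tighter objectives push toward continuous replication and a secondary system fast enough to run production; looser objectives can be met with snapshots or backups landing on cheaper, capacity-oriented storage.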


When Big Businesses Collaborate in the Cloud, Rapid Innovation Will Follow
The industry can also expect to witness accelerated technological advancement and innovation — when big businesses collaborate, rapid innovation will inevitably follow. Competitive divides between managed cloud and commodity cloud are growing; businesses need to understand which is right for their needs. Those who have neither the in-house expertise nor the inclination to develop it are wise to choose a managed cloud solution, for efficiency and to guarantee uptime. Competitive collaboration, plus partnerships through channel programs up and down the stack, drives value for end customers. To drive success across the SMB-to-enterprise market, we all need to consider partnerships.


Coffee Shops and Home Routers Could Offer Nearby Phones a 4G Data Connection
It encodes data in the same way as the LTE technology used by cellular networks today, but it is designed to operate over the same part of the radio spectrum as Wi-Fi and has roughly the same range. The company says it can provide faster, less glitchy connections than Wi-Fi because LTE was developed for cellular networks, where performance and reliability are more critical. MuLTEfire opens new possibilities because the radio bands that Wi-Fi uses are not reserved for the exclusive use of any company; today LTE is used only by cellular networks, on radio bands licensed from governments at costs of millions or billions of dollars. MuLTEfire is also different in that—as with Wi-Fi—a MuLTEfire hotspot can serve any device, regardless of which cellular carrier the device subscribes to.


What’s the Difference between M2M and IoT?
To quote Marc Andreessen, “Software is eating the world” in industry after industry, and it’s no different in ours. Old-school SCADA (Supervisory Control and Data Acquisition) hardware is being consumed by Internet of Things software. People often ask, “What’s the difference between M2M and IoT?” Besides the scope of the networks – M2M lives on a local area network, or no network at all, while IoT lives on a wide area network – the other major difference is that SCADA consists of proprietary hardware such as programmable logic controllers, whereas the heart of IoT is software.


Why the Internet of Things won't be about the 'things'
“If you think, ‘Oh, I’ll never need that,’” McQuivey said, pointing to a slide depicting the Apple Watch, “get that out of your system.” Forrester surveyed consumers about the Apple Watch upon its release for preorder in April, and 26%—representing about 50 million people in the United States—said they expect to own a smartwatch someday. What’s more, “8% want their next car to be a self-driving car, and they don’t even exist yet,” McQuivey said. “This is a consumer base that has learned the lesson of the last 10 years: ‘New stuff is going to come out, and it’s going to benefit me,’” he added. “We are all early adopters now.”


Planning Cloud Analytics: conceptual architecture alternatives
Alternative architectures will result in cost variations for a cloud analytics solution. Choosing an optimal Cloud analytics architecture requires consideration of key tradeoffs described below. In addition, it requires consideration of factors such as skills, licensing, complexity, and maintenance. ...
In contrast to placing all essential components on a single CSP’s platform, distributing the components among multiple CSP platforms makes it possible to leverage the strengths of each CSP’s product offerings. This approach requires weighing the costs associated with multiple CSPs, as well as careful evaluation of the architecture, given the potential for increased complexity.
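To make that tradeoff concrete, here is a toy cost comparison with entirely hypothetical figures: every component on a single CSP versus the cheapest provider per component plus a cross-provider transfer/integration overhead. The prices and the overhead value are illustrative assumptions, not benchmarks.

```cpp
#include <algorithm>
#include <iostream>
#include <map>
#include <string>

// Illustrative-only numbers: monthly cost (arbitrary units) of each analytics
// component on two hypothetical providers, plus a rough cross-provider
// data-transfer/integration overhead incurred when components are split.
int main() {
    std::map<std::string, std::map<std::string, double>> price = {
        {"ingest",    {{"CSP-A", 400},  {"CSP-B", 550}}},
        {"storage",   {{"CSP-A", 900},  {"CSP-B", 700}}},
        {"warehouse", {{"CSP-A", 1500}, {"CSP-B", 1200}}},
        {"BI tool",   {{"CSP-A", 300},  {"CSP-B", 350}}},
    };
    const double cross_csp_overhead = 450;  // egress + extra ops effort (assumed)

    double single_a = 0, best_of_breed = 0;
    for (const auto& [component, by_csp] : price) {
        single_a += by_csp.at("CSP-A");
        best_of_breed += std::min(by_csp.at("CSP-A"), by_csp.at("CSP-B"));
    }
    best_of_breed += cross_csp_overhead;

    std::cout << "All components on CSP-A: " << single_a << "\n";        // 3100
    std::cout << "Best-of-breed across CSPs: " << best_of_breed << "\n"; // 3050
}
```

Whether the multi-CSP option wins depends entirely on how large that overhead (and the added operational complexity) turns out to be.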


Singapore wants to be a living lab for smart startups
“Singapore has always positioned itself as a great base for global companies,” he says when we meet for coffee on his visit to London. “We’re working hard to make it a great base, but for startups, and we’re talking about companies that have the opportunity to be big, because people often comingle ‘SME’ and ‘startup’ and really they’re night and day.” Leonard says that this is a great time to foster startups: “You can be anywhere in the world and you don’t need to fill it up with infrastructure.” However, he says he’s not interested in building some California clone convoy. “One thing we’re not trying to do is mimic Silicon Valley. We have some good raw materials, such as the intersection of universities and businesses, investment capital and high mobile use, and that gives us nice young talent and companies that want to tap into that.”


Why trust, rather than security, could be the bigger barrier to Google cloud adoption
“Once in the Google cloud, you have no idea where your data is. You don’t and won’t know when it’s being moved as Google looks to ensure that data is stored efficiently and regularly shifts data from one location to another to maximise resources,” he added. That constant movement is a concern for some organisations, Hall said, as there is a risk their data could fall under the jurisdiction of another country with all the shifting around. Another concern, aired by a Computer Weekly reader, was that a company the size of Google is likely to be a top target for hackers, despite its assurances about the security resources it has in place to protect its platforms. “I have no doubt Google will have better security resources, but it will also be one of the top companies in the world on that target list,” the reader said.


Conditional Compilation in Universal Apps
Conditional compilation is the process of defining compiler directives that cause some parts of the code to be compiled and others to be ignored. In a cross-platform development scenario, the technique can be used to mark the parts of the code that should be compiled only for a particular platform: the compiler skips the excluded sections based on the conditional clauses, so a single source file can serve every target. The #if, #elif, #else and #endif directives are used to add these conditions to parts of a source file.
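The passage discusses these directives in the context of C# Universal Apps; as a loose sketch rather than the article's own example, the snippet below uses the identical directive syntax in C++, where it also exists. The PLATFORM_PHONE and PLATFORM_DESKTOP symbols are hypothetical and would normally be defined by the project or build settings.

```cpp
#include <iostream>

// Hypothetical platform symbols: in a real build they would be supplied by
// the toolchain or project settings, e.g. -DPLATFORM_PHONE.
#if defined(PLATFORM_PHONE)
const char* PlatformName() { return "Phone"; }
#elif defined(PLATFORM_DESKTOP)
const char* PlatformName() { return "Desktop"; }
#else
const char* PlatformName() { return "Unknown platform"; }
#endif

int main() {
    // Only one PlatformName() definition above survives compilation; the
    // compiler ignores the branches whose symbols are not defined.
    std::cout << "Running on: " << PlatformName() << std::endl;
    return 0;
}
```

Compiling the same file once with -DPLATFORM_PHONE and once with -DPLATFORM_DESKTOP produces two platform-specific binaries, which is how a single source file can serve multiple targets.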


eBook: Problem Management
So what is problem management and what does it aim to achieve? And how is it different from incident management? This extract from Michael G Hall’s book, Problem Management, an implementation guide for the real world, explains how problem management differs from incident management and looks at the differences between reactive and pro-active problem management. The book draws on the principles of ITIL.



Quote for the day:

“Being a leader is making the people you love hate you a little more each day.” -- Patrick Ness

July 03, 2015

Ireland gears up for cyber war – new strategy to protect critical infrastructure
The Irish Government has established a National Cybersecurity Centre (NCSC) within the Department of Communications that will be tasked with securing government networks and critical national infrastructure. As well as being accredited to the CSIRT-IE, the NCSC will develop capabilities to respond swiftly when attacks occur and develop capabilities in the area of industrial control and SCADA systems, which are used to run electricity and water networks. The threat to such networks became clear when the Stuxnet worm, malware designed by Israel and the CIA to compromise Iranian nuclear facilities, began to roam wild and threaten utilities infrastructure worldwide.


Facebook wins first round in European privacy battle
The court said it lacked jurisdiction to hear the case seeking €500 compensation for each claimant, totalling €12.5m. "This litigation was unnecessary and we’re pleased that the court has roundly rejected these claims,” Facebook said in a statement. Commentators said the court’s ruling is a victory for Facebook, which argued the case was not legitimate, but Schrems has vowed to take the case to a higher regional court and appeal to the Austrian Supreme Court if necessary. The group led by Schrems is suing the social networking firm for several privacy violations, including tracking their data and Facebook’s alleged involvement in the US National Security Agency’s (NSA) Prism surveillance programme.


M-Disc optical media reviewed: Your data, good for a thousand years
As to that thousand-year claim, the U.S. Navy will back it up. It tested M-Disc DVD+Rs along with archival-quality DVD+R/RW and DVD-R/RW discs, subjecting them three times to a 185-degree Fahrenheit, 85-percent-humidity, full-spectrum-light environment for 26.25 hours. Every DVD failed—except the M-Discs, which suffered no noticeable degradation. The Department of Defense hasn’t tested the new M-Disc BD-R, but as the technology is largely the same, the results should be as well. The only failure point for the material used in the M-Disc data layer is oxidation, which, according to Millenniata materials scientists, shouldn’t be an issue for about ten millennia. Yikes. The comparative delicacy of the disc’s polycarbonate outer layer is why the media lasts “only” a thousand years.


Security threats, hackers and shadow IT still plague health IT
"You don't know what you don't know, so the first thing CIOs can do to help their employees adopt the cloud safely is to discover all the services in use across the organization," Rick Hopfer, CIO at Molina Healthcare, writes in an email. "Employees rarely have the information to determine whether a particular cloud application complies with organization's security and compliance policies." The average healthcare employee uses 26 different cloud services, Skyhigh found. And those applications often have very different levels of security protections, highlighting the importance of the IT department working with the business units to ensure that cloud services are deployed safely and managed by the CIO's team.


How Big Data Is Changing Recruitment Forever
It’s hard to say if this is true or not, but many people in positions with responsibility for hiring would probably admit that they had made appointments based on a “gut feeling” – simply whether or not they felt the person was the right fit for the vacancy. Well, all that is changing. Taking on a new employee represents a huge investment for most companies, particularly in a managerial or professional role. A large proportion (40% to 60% by most estimates) of a company’s revenue goes on staff salaries. So in an age where everything can be measured, quantified and analyzed, it makes sense to put a bit more planning and strategic thought into the recruitment process.


How to survive in the ‘Digital Amnesia’ world
It’s simply impossible to remember everything. To look into this, Kaspersky Lab has initiated research to analyse how digital devices and the Internet affect the way people recall and use information today – and what, if anything, they are doing to protect it. ... So, smartphones are ubiquitous companions for many of us. They have become an extension of the human brain, and just as the skull protects the brain, mobile phones need protection as well. The majority of motorcyclists put on helmets, but only a few of those surveyed adequately protect their phones with IT security. One of Kaspersky Lab’s previous studies also shows that women often do less than men to secure their devices.


Cyber-Espionage Nightmare
The failure of the companies’ supposed security technologies was stupefying. Lance Wyatt, the IT director for the steelworkers’ union, thought he ran a tight ship. An IT audit in 2010 had found no major deficiencies. His e-mail server screened all incoming messages for attachments that contained executable code. He had the latest antivirus software. His network checked IP addresses to avoid sites that contained malware. Yet Wyatt and the FBI eventually found infected computers, one of them used by the union’s travel manager. “None of those machines were on our radar as being infected or suspect,” he says.


Microservices, the Reality of Conway's Law, and Evolutionary Architecture
There are lots of monoliths that are very highly coupled – in fact most of them – and so it is not a trivial exercise to break them up. So, as a practical matter, here is what I recommend to the people I consult with now. First, we think it is a good idea to move to the new model, so we have to agree on that. Step zero is to take a real customer problem, something with real customer benefit – maybe a reimplementation of something you have or, ideally, some new piece of functionality that was hard to do in the monolith. Do that in the new way first, and what you are trying to do there is to learn from the mistakes you will inevitably make going through that first transition, right? You do it in a relatively safe way, but at the same time you do it with real benefit at the end.


India innovates on the Internet of Things
While India may not have that many successful products in the software space, it is making IoT devices that are comparable to any in the world. Its large talent pool of mobile app developers is helping create interfaces between products and users. It helps that companies such as Intel, Cisco, Broadcom and MediaTek are making open-source hardware that companies can use to launch IoT prototypes. Crowdfunding platforms such as Kickstarter and Indiegogo are also useful for start-ups. We looked at a number of IoT products being made in India and picked some of the best.


Software Still Playing Catch-Up to Flash Memory Advancements
At the data-tier level, we see some consistency as well. All major vendors integrate flash drives/modules as an option and have native interfaces to flash devices. Another strategy is to use high-bandwidth flash as a cache on the server node, while retaining disk storage in its traditional role. All these strategies give users access to flash technology using standard SQL methods. Unfortunately, at the middle/processing tier, platform support for flash is inconsistent. The few middle-tier caches that can persist to disk can easily transition to flash. However, if you need a data grid rather than a middle-tier cache, only a few data grid vendors offer programmable flash integrations.
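As a loose sketch of the server-side caching pattern described above (not any vendor's code), the example below reads from an in-memory map first, then from a file on a flash mount, and only then from the slow disk tier; the /mnt/flash path and the fetch_from_disk_tier() stand-in are assumptions for illustration.

```cpp
#include <fstream>
#include <iostream>
#include <optional>
#include <string>
#include <unordered_map>

// Minimal read-through cache sketch: RAM first, then a file on a flash
// mount, then the slow "disk tier" (simulated here by a function call).
std::unordered_map<std::string, std::string> ram_cache;

std::string fetch_from_disk_tier(const std::string& key) {
    return "value-for-" + key;  // stand-in for a slow back-end read
}

std::optional<std::string> read_flash_cache(const std::string& key) {
    std::ifstream f("/mnt/flash/cache_" + key);  // hypothetical flash mount
    if (!f) return std::nullopt;
    std::string value;
    std::getline(f, value);
    return value;
}

void write_flash_cache(const std::string& key, const std::string& value) {
    std::ofstream("/mnt/flash/cache_" + key) << value;
}

std::string get(const std::string& key) {
    if (auto it = ram_cache.find(key); it != ram_cache.end()) return it->second;
    if (auto v = read_flash_cache(key)) { ram_cache[key] = *v; return *v; }
    std::string v = fetch_from_disk_tier(key);   // slowest path
    write_flash_cache(key, v);                   // persist to the flash tier
    ram_cache[key] = v;
    return v;
}

int main() {
    std::cout << get("order-42") << "\n";  // cold read: falls through to disk
    std::cout << get("order-42") << "\n";  // warm read: served from RAM
}
```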



Quote for the day:

"Problems are only opportunities in work clothes." -- Henri Kaiser

July 01, 2015

Trusted Technology, Procurement Paradigms, and Cyber Insurance
From the customer's perspective, they need to be considering how they actually apply techniques to ensure that they are sourcing from authorized channels, that they are also applying the same techniques that we use for secure engineering when they are doing the integration of their IT infrastructure.  But from a development perspective, it’s ensuring that we're applying secure engineering techniques, that we have a well-defined baseline for our life cycle, and that we're controlling our assets effectively. We understand who our partners are and we're able to score them and ensure that we're tracking their integrity and that we're applying new techniques around secure engineering, like threat analysis and risk analysis to the supply chain.


10 things CIOs need to know about agile development
The full benefits of agile cannot be achieved without engaging with business leaders, management and the user community. If the rest of the business does not have an immediate appetite for working in a new way, careful planning and communication will be needed to bring different communities of managers and users on board. ... The basic organisational unit of delivery in agile development is a small team, typically expressed as "seven, plus or minus two" people — both developers and quality assurance. ... If people are moved too frequently, the teams fail to develop into highly productive units; if people are not moved between teams enough, then each team starts to become isolated and diverges from the other teams. It is important to note that physical location of teams is much more important with agile methods than with conventional approaches to development.


When the Toaster Shares Your Data With the Refrigerator, the Bathroom Scale ...
Everything will be connected, including cars, street lighting, jet engines, medical scanners, and household appliances. Rather than throwing appliances away when a new model comes out, we will just download new features. That is how the Tesla electric cars already work — they get software updates every few weeks that provide new features. Tesla’s latest software upgrades are enabling the cars to begin to drive themselves. But the existence of all these sensors will create many new challenges. Businesses have not yet figured out how to use the data they already have. According to McKinsey, for example, oil rigs have as many as 30,000 sensors, but their owners examine only one percent of the data they collect.


It’s not just the weather: Southern Europe’s startup ecosystem is heating up
Getting down to the nitty-gritty of growth metrics, each startup has shown tangible returns on investment. Bluemove has between 15,000 and 20,000 active users, compared to about 9,000 a year ago. According to González-Iglesias, “Each month we are triplicating what we did last year. We expect to close our year on a stand-alone basis between 1.5 and 2 million euros in revenues. Last year we closed a bit under 1 million in revenues.” ...  COO Christian Picard expects the revenue “coefficient will be tripled or multiplied by four” as they expand beyond Barcelona and Madrid into bigger cities with more business travellers, like London, Paris and Berlin.


Not So Fast: Questioning Deep Learning IQ Results
If the work truly shows that computers can now pass the written IQ exam with stronger scores than humans, it is definitely interesting. ... This system is hand-engineered to identify the specific patterns in formulaic standardized tests. It's hard-wired to know the types of questions that exist. This system may be a powerful demonstration of word2vec style distributed representations for words, but it is hardly a display of true human intelligence. If the format of the question were changed significantly, or were not formulaic, it would seem that this system couldn't cope. As with many standardized tests, the verbal reasoning section tests the breadth of a participant's vocabulary more than anything else, and it would hardly come as a surprise that the computer can maintain a larger vocabulary than a human.
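For readers unfamiliar with the term, here is a toy illustration (unrelated to the system under discussion) of what word2vec-style distributed representations buy you on analogy-type verbal questions. The hand-made 3-dimensional vectors are hypothetical; real models learn vectors of hundreds of dimensions from large corpora.

```cpp
#include <cmath>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Toy, hand-made 3-d "embeddings" purely for illustration.
std::map<std::string, std::vector<double>> vecs = {
    {"king",  {0.9, 0.8, 0.1}}, {"queen", {0.9, 0.1, 0.8}},
    {"man",   {0.1, 0.9, 0.1}}, {"woman", {0.1, 0.1, 0.9}},
    {"apple", {0.5, 0.5, 0.5}},
};

double cosine(const std::vector<double>& a, const std::vector<double>& b) {
    double dot = 0, na = 0, nb = 0;
    for (size_t i = 0; i < a.size(); ++i) {
        dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i];
    }
    return dot / (std::sqrt(na) * std::sqrt(nb));
}

int main() {
    // "man is to king as woman is to ?"  ->  king - man + woman
    std::vector<double> target(3);
    for (int i = 0; i < 3; ++i)
        target[i] = vecs["king"][i] - vecs["man"][i] + vecs["woman"][i];

    std::string best; double best_sim = -1;
    for (const auto& [word, v] : vecs) {
        if (word == "king" || word == "man" || word == "woman") continue;
        double s = cosine(target, v);
        if (s > best_sim) { best_sim = s; best = word; }
    }
    std::cout << "answer: " << best << "\n";  // prints "queen" for these toy vectors
}
```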


Big data in financial services begets chief data officer
If you look at the innovation side of things, it is very confusing. If you look at all the Hadoop distributions, all the ETL tools, all the SQL tools and all the data visualization tools, it is very confusing. In the traditional relational space, it was simpler. It's confusing, and some people are scared away by all these choices. What I see on the ground is that the financial industry, for several different reasons, is behind in adopting the new big data revolution. The first reason is that most of the companies have yet to find a killer app -- one that would move the dial. Other reasons are that there are lots of regulatory changes and lots of pressure on cost savings. The focus is not on innovation.


8 High Performance Apps You Never Knew Were Hybrid
Some of the top brands have recently ditched native and gone the hybrid way. With new hybrid frameworks like Ionic and PhoneGap becoming more mature, one can no longer assume that hybrid apps perform worse. In fact, a recent Gartner report says that by next year more developers will be going the hybrid way, and by another account, the average end-user ratings of hybrid apps are already 12 percent better than those of native apps. So where are all the hybrid apps? You use them a lot, probably without even realizing that you’re actually on “the web”. Well, that’s the beauty of it! Here are 8 very popular hybrid apps that you might never have guessed were hybrid:


Rebooting the Automobile
Cars are far more computerized than they might seem. Automakers began using integrated circuits to monitor and control basic engine functions in the late 1970s; computerization accelerated in the 1980s as regulations on fuel efficiency and emissions were put in place, requiring even better engine control. In 1982, for instance, computers began taking full control of the automatic transmission in some models. New cars now have between 50 and 100 computers and run millions of lines of code. An internal network connects these computers, allowing a mechanic or dealer to assess a car’s health through a diagnostic port just below the steering wheel. Some carmakers diagnose problems with vehicles remotely, through a wireless link, and it’s possible to plug a gadget into your car’s diagnostic port to identify engine problems or track driving habits via a smartphone app.
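As a rough sketch of what such a diagnostic-port gadget does with the data it reads (not from the article), the snippet below decodes two standard OBD-II mode-01 PIDs. The example byte values are made up; the decoding formulas (engine RPM = (256*A + B)/4, vehicle speed = A km/h) are the standardized ones.

```cpp
#include <cstdint>
#include <iostream>

// Decode two standard OBD-II mode-01 PIDs from their response data bytes.
// PID 0x0C (engine RPM):    value = (256*A + B) / 4
// PID 0x0D (vehicle speed): value = A km/h
double decode_rpm(uint8_t a, uint8_t b) { return (256.0 * a + b) / 4.0; }
int decode_speed_kmh(uint8_t a) { return a; }

int main() {
    // Example byte values (made up for illustration) as they might arrive
    // from the diagnostic port after "01 0C" / "01 0D" requests.
    uint8_t rpm_a = 0x1A, rpm_b = 0xF8;   // -> 1726 RPM
    uint8_t speed_a = 0x3C;               // -> 60 km/h

    std::cout << "Engine RPM: " << decode_rpm(rpm_a, rpm_b) << "\n";
    std::cout << "Speed: " << decode_speed_kmh(speed_a) << " km/h\n";
}
```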


The end of IT consumerization
Now cloud vendors are driving IT evolution. When Google realized that inefficient power supplies were costing it millions, its suppliers quickly fixed that problem. The rest of us, who never cared, and enterprises, which lacked Google's clout, benefited. Other web-scale technology is moving into the enterprise. The commodity scale-out compute and storage architectures that Google and Amazon pioneered are now being offered by companies such as Nutanix and Scality, and in open source software like OpenStack. Now networking giant Cisco is under attack because its costly, complex switches have thousands of features that cloud vendors don't need. So those vendors are building their own switches, at much lower cost, and enterprise IT pros are noticing.


Enterprise network disaggregation is inevitable
“Disaggregation will make incremental, steady progress within the broader Fortune 500, though that progress will by no means be immediate,” states IDC analyst Brad Casemore in an as-yet-unreleased report on network disaggregation. “Just as with software-defined networking, not everybody is ready to embrace change and be an early adopter.” The IDC report, though, also states that disaggregation is an inevitability in the enterprise -- more specifically, large enterprises -- because it offers a means of standardizing network resources while allowing for continuous software innovation “beyond the confines of vendor-specific product release schedules.” This is in addition to the capital and operational cost reduction often viewed as the primary driver of the trend.



Quote for the day:

“God uses imperfect people for impossible tasks” -- John Paul Warren