July 04, 2015

Recovering from a Storage System Failure
Protecting against storage system failure requires more than just copying data from point A to point B. Meeting any reasonable recovery expectation requires that a secondary storage system be available on which to start the application. The good news is that secondary storage is very affordable today, and these systems can play a larger role than just standing by for the primary storage system. But before implementing the secondary storage system, the IT planner needs to understand the acceptable recovery point objective (RPO) and recovery time objective (RTO) for the applications counting on that system. Once the RPO and RTO are understood, IT planners will know what method to use to get data to the secondary storage system and what type of secondary storage system to buy.


When Big Businesses Collaborate in the Cloud, Rapid Innovation Will Follow
The industry can also expect to witness accelerated technological advancements and innovation — when big businesses collaborate, rapid innovation will inevitably follow. The competitive divide between managed cloud and commodity cloud is growing, and businesses need to understand which is right for their needs. Those who have neither the in-house expertise nor the inclination to develop it are wise to choose a managed cloud solution, both for efficiency and to guarantee uptime. Competitive collaboration, plus partnerships through channel programs up and down the stack, drives value for end customers. To drive success across the SMB-to-enterprise market, we all need to consider partnerships.


Coffee Shops and Home Routers Could Offer Nearby Phones a 4G Data Connection
It encodes data in the same way as the LTE technology used by cellular networks today, but is designed to be used over the same part of the radio spectrum as Wi-Fi and has roughly the same range. The company says it can provide faster, less glitchy connections than Wi-Fi because LTE was developed for cellular networks, where performance and reliability are more crucial. MuLTEfire opens new possibilities because the radio bands that Wi-Fi uses are not reserved for the exclusive use of any company. LTE is used today only by cellular networks, on radio bands licensed from governments at costs of millions or billions of dollars. MuLTEfire is also different in that—as with Wi-Fi—a MuLTEfire hotspot can serve any device, regardless of which cellular carrier it subscribes to.


What’s the Difference between M2M and IoT?
To quote Mark Andreessen, “Software is eating the world” in industry after industry and it’s no different in ours. Old school SCADA (Supervisory Control and Data Acquisition) hardware is being consumed by Internet of Things software. Often people ask, “What’s the difference between M2M and IoT?” Besides the scope of the networks – M2M lives on a local area network or no network while IoT lives on a wide area network – the other major difference is SCADA consists of proprietary hardware like programmable logic controllers whereas the heart of IoT is software.


Why the Internet of Things won't be about the 'things'
“If you think, ‘Oh, I’ll never need that,’” McQuivey said, pointing to a slide depicting the Apple Watch, “get that out of your system.” Forrester surveyed consumers about the Apple Watch upon its release for preorder in April, and 26%—representing about 50 million people in the United States—said they expect to own a smartwatch someday. What’s more, “8% want their next car to be a self-driving car, and they don’t even exist yet,” McQuivey said. “This is a consumer base that has learned the lesson of the last 10 years: ‘New stuff is going to come out, and it’s going to benefit me,’” he added. “We are all early adopters now.”


Planning Cloud Analytics: conceptual architecture alternatives
Alternative architectures will result in cost variations for a cloud analytics solution. Choosing an optimal cloud analytics architecture requires consideration of the key tradeoffs described below. In addition, it requires consideration of factors such as skills, licensing, complexity, and maintenance. ...
In contrast to placing all essential components on a single CSP’s platform, distributing the components among multiple CSP platforms allows leveraging the strengths of product offerings available through each separate CSP. This approach necessitates consideration of costs associated with multiple CSPs as well as careful evaluation of architecture due to the potential for increased complexity.


Singapore wants to be a living lab for smart startups
“Singapore has always positioned itself as a great base for global companies,” he says when we meet for coffee on his visit to London. “We’re working hard to make it a great base for startups too, and we’re talking about companies that have the opportunity to be big, because people often comingle ‘SME’ and ‘startup’ and really they’re night and day.” Leonard says that this is a great time to foster startups: “You can be anywhere in the world and you don’t need to fill it up with infrastructure.” However, he says he’s not interested in building some California clone. “One thing we’re not trying to do is mimic Silicon Valley. We have some good raw materials, such as the intersection of universities and businesses, investment capital and high mobile use, and that gives us nice young talent and companies that want to tap into that.”


Why trust, rather than security, could be the bigger barrier to Google cloud adoption
“Once in the Google cloud, you have no idea where your data is. You don’t and won’t know when it’s being moved, as Google looks to ensure that data is stored efficiently and regularly shifts data from one location to another to maximise resources,” he added. This movement is a concern, Hall said, as there is a risk the data could fall under the jurisdiction of another country with all the shifting around. Another concern, aired by a Computer Weekly reader, was that a company the size of Google is likely to be a top target for hackers, despite its assurances about the security resources it has in place to protect its platforms. “I have no doubt Google will have better security resources, but it will also be one of the top companies in the world on that target list,” the reader said.


Conditional Compilation in Universal Apps
Conditional compilation is the process of defining compiler directives that cause some parts of the code to be compiled and others to be ignored. In cross-platform development, this technique can be used to mark parts of the code that should be compiled only for a particular platform. Conditional compilation allows the compiler to skip parts of the source code, based on conditional clauses, when compiling. The #if, #elif, #else and #endif directives are used for conditional compilation. These directives add conditions to parts of a source file. With the use of these directives we can keep a single source file.


eBook: Problem Management
So what is problem management and what does it aim to achieve? And how is it different from incident management? This extract from Michael G Hall’s book, Problem Management, an implementation guide for the real world, explains how problem management differs from incident management and looks at the differences between reactive and pro-active problem management. The book draws on the principles of ITIL.



Quote for the day:

“Being a leader is making the people you love hate you a little more each day.” -- Patrick Ness

July 03, 2015

Ireland gears up for cyber war – new strategy to protect critical infrastructure
The Irish Government has established a National Cybersecurity Centre (NCSC) within the Department of Communications that will be tasked with securing government networks and critical national infrastructure. As well as being accredited to the CSIRT-IE, the NCSC will develop capabilities to respond swiftly when attacks occur and develop capabilities in the area of industrial control and SCADA systems, which are used to run electricity and water networks. The threat to such networks became clear when the Stuxnet worm, malware designed by Israel and the CIA to compromise Iranian nuclear facilities, began to roam wild and threaten utilities infrastructure worldwide.


Facebook wins first round in European privacy battle
The court said it lacked jurisdiction to hear the case seeking €500 compensation for each claimant, totalling €12.5m. "This litigation was unnecessary and we’re pleased that the court has roundly rejected these claims,” Facebook said in a statement. Commentators said the court’s ruling is a victory for Facebook, which argued the case was not legitimate, but Schrems has vowed to take the case to a higher regional court and appeal to the Austrian Supreme Court if necessary. The group led by Schrems is suing the social networking firm for several privacy violations, including tracking their data and Facebook’s alleged involvement in the US National Security Agency’s (NSA) Prism surveillance programme.


M-Disc optical media reviewed: Your data, good for a thousand years
As to that thousand-year claim, the U.S. Navy will back that up. It tested M-Disc DVD+Rs along with archival quality DVD+R/RW and DVD-R/RW, subjecting them three times to a 185-degree, 85-percent humidity, full-spectrum light environment for 26.25 hours. Every DVD failed—except the M-Discs, which suffered no noticeable degradation. The Department of Defense hasn’t tested the new M-Disc BD-R, but as the technology is largely the same, the results should be as well. The only failure point for the material used in the M-Disc data layer is oxidation, which, according to Millenniata materials scientists, shouldn’t be an issue for about ten millennia. Yikes. The comparative delicacy of the polycarbonate outer layer of the disc is why the media lasts “only” a thousand years.


Security threats, hackers and shadow IT still plague health IT
"You don't know what you don't know, so the first thing CIOs can do to help their employees adopt the cloud safely is to discover all the services in use across the organization," Rick Hopfer, CIO at Molina Healthcare, writes in an email. "Employees rarely have the information to determine whether a particular cloud application complies with organization's security and compliance policies." The average healthcare employee uses 26 different cloud services, Skyhigh found. And those applications often have very different levels of security protections, highlighting the importance of the IT department working with the business units to ensure that cloud services are deployed safely and managed by the CIO's team.


How Big Data Is Changing Recruitment Forever
It’s hard to say if this is true or not, but many people in positions with responsibility for hiring would probably admit that they had made appointments based on a “gut feeling” – simply whether or not they felt the person was the right fit for the vacancy. Well, all that is changing. Taking on a new employee represents a huge investment for most companies, particularly in a managerial or professional role. A large proportion (40% to 60% by most estimates) of a company’s revenue goes on staff salaries. So in an age where everything can be measured, quantified and analyzed, it makes sense to put a bit more planning and strategic thought into the recruitment process.


How to survive in the ‘Digital Amnesia’ world
It’s simply impossible to remember everything. To look into this, Kaspersky Lab has initiated research to analyse how digital devices and the Internet affect the way people recall and use information today – and what, if anything, they are doing to protect it. ... So, smartphones are ubiquitous companions for many of us. They have become an extension of the human brain, and just as the skull protects the brain, mobile phones need protection as well. The majority of motorcyclists put on helmets, but only a few of those surveyed adequately protect their phones with IT security. One of the previous Kaspersky Lab studies also shows that women often secure their devices less than men do.


Cyber-Espionage Nightmare
The failure of the companies’ supposed security technologies was stupefying. Lance Wyatt, the IT director for the steelworkers’ union, thought he ran a tight ship. An IT audit in 2010 had found no major deficiencies. His e-mail server screened all incoming messages for attachments that contained executable code. He had the latest antivirus software. His network checked IP addresses to avoid sites that contained malware. Yet Wyatt and the FBI eventually found infected computers, one of them used by the union’s travel manager. “None of those machines were on our radar as being infected or suspect,” he says.


Microservices, the Reality of Conway's Law, and Evolutionary Architecture
There are lots of monoliths that are very highly coupled (in fact, most of them), and so it is not a trivial exercise to break them up. So, as a practical matter, here is what I recommend to the people I consult with now. First, we think it is a good idea to move to the new model, and so first we have to agree to that. Step zero is to take a real customer problem, something with real customer benefit: maybe a reimplementation of something you have, or ideally some new piece of functionality that was hard to do in the monolith. Do that in the new way first, and what you are trying to do there is to learn from the mistakes you inevitably will make going through that first transition, right? You do it in a relatively safe way, but at the same time you do it with real benefit at the end.


India innovates on the Internet of Things
While India may not have that many successful products in the software space, it is making IoT devices that are comparable to any in the world. Its large talent pool of mobile app developers is helping create interfaces between products and users. It helps that companies such as Intel, Cisco, Broadcom and MediaTek are making open-source hardware that companies can use to launch IoT prototypes. Crowdfunding platforms such as Kickstarter and Indiegogo are also useful for start-ups. We looked at a number of IoT products being made in India and picked some of the best.


Software Still Playing Catch-Up to Flash Memory Advancements
At the data-tier level, we see some consistency as well. All major vendors integrate flash drives/modules as an option and have native interfaces to flash devices. Another strategy is using high-bandwidth flash as a cache on the server node, while retaining disk storage in its traditional role. All these strategies give users access to flash technology using standard SQL methods. Unfortunately, at the middle/processing tier, platform support for flash is inconsistent. The few middle-tier caches that can persist to disk can easily transition to flash. However, if you are seeking not a middle-tier cache but a data grid, only a few data grid vendors have programmable flash integrations.



Quote for the day:

"Problems are only opportunities in work clothes." -- Henri Kaiser

July 01, 2015

Trusted Technology, Procurement Paradigms, and Cyber Insurance
From the customer's perspective, they need to be considering how they actually apply techniques to ensure that they are sourcing from authorized channels, that they are also applying the same techniques that we use for secure engineering when they are doing the integration of their IT infrastructure.  But from a development perspective, it’s ensuring that we're applying secure engineering techniques, that we have a well-defined baseline for our life cycle, and that we're controlling our assets effectively. We understand who our partners are and we're able to score them and ensure that we're tracking their integrity and that we're applying new techniques around secure engineering, like threat analysis and risk analysis to the supply chain.


10 things CIOs need to know about agile development
The full benefits of agile cannot be achieved without engaging with business leaders, management and the user community. If the rest of the business does not have an immediate appetite for working in a new way, careful planning and communication will be needed to bring different communities of managers and users on board. ... The basic organisational unit of delivery in agile development is a small team, typically expressed as "seven, plus or minus two" people — both developers and quality assurance. ... If people are moved too frequently, the teams fail to develop into highly productive units; if people are not moved between teams enough, then each team starts to become isolated and diverges from the other teams. It is important to note that physical location of teams is much more important with agile methods than with conventional approaches to development.


When the Toaster Shares Your Data With the Refrigerator, the Bathroom Scale ...
Everything will be connected, including cars, street lighting, jet engines, medical scanners, and household appliances. Rather than throwing appliances away when a new model comes out, we will just download new features. That is how the Tesla electric cars already work — they get software updates every few weeks that provide new features. Tesla’s latest software upgrades are enabling the cars to begin to drive themselves. But the existence of all these sensors will create many new challenges. Businesses have not yet figured out how to use the data they already have. According to McKinsey, for example, oil rigs have as many as 30,000 sensors, but their owners examine only one percent of the data they collect.


It’s not just the weather: Southern Europe’s startup ecosystem is heating up
Getting down to the nitty-gritty of growth metrics, each startup has shown tangible returns on investment. Bluemove has between 15,000 and 20,000 active users, compared to 9,000 about a year ago. According to González-Iglesias, “Each month we are tripling what we did last year. We expect to close our year on a stand-alone basis between 1.5 and 2 million euros in revenues. Last year we closed a bit under 1 million in revenues.” ...  COO Christian Picard expects the revenue “coefficient will be tripled or multiplied by four” as they expand beyond Barcelona and Madrid into bigger cities with more business travellers, like London, Paris and Berlin.


Not So Fast: Questioning Deep Learning IQ Results
If the work truly shows that computers can now pass the written IQ exam with stronger scores than humans, it is definitely interesting. ... This system is hand-engineered to identify the specific patterns in formulaic standardized tests. It's hard-wired to know the types of questions that exist. This system may be a powerful demonstration of word2vec style distributed representations for words, but it is hardly a display of true human intelligence. If the format of the question were changed significantly, or were not formulaic, it would seem that this system couldn't cope. As with many standardized tests, the verbal reasoning section tests the breadth of a participant's vocabulary more than anything else, and it would hardly come as a surprise that the computer can maintain a larger vocabulary than a human.


Big data in financial services begets chief data officer
If you look at the innovation side of things, it is very confusing. If you look at all the Hadoop distributions, all the ETL tools, all the SQL tools and all the data visualization tools, it is very confusing. In the traditional relational space, it was simpler. It's confusing, and some people are scared away by all these choices. What I see on the ground is that the financial industry, for several different reasons, is behind in the adoption of the new big data revolution. The first reason is that most of the companies have yet to find a killer app -- one that would move the dial. Other reasons are that there are lots of regulatory changes and lots of pressure on cost savings. The focus is not on innovation.


8 High Performance Apps You Never Knew Were Hybrid
Some of the topmost brands have recently ditched native and gone the hybrid way. With new hybrid frameworks like Ionic and PhoneGap becoming more mature, one can no longer assume that hybrid apps perform worse. In fact, a recent Gartner report says that by next year more developers will be going the hybrid way, and by another account, the average end-user ratings of hybrid apps are already 12 percent better than those of native apps. So where are all the hybrid apps? You use them a lot, probably without even realizing that you’re actually on “the web”. Well, that’s the beauty of it! Here are 8 very popular hybrid apps that you might never have guessed were hybrid:


Rebooting the Automobile
Cars are far more computerized than they might seem. Automakers began using integrated circuits to monitor and control basic engine functions in the late 1970s; computerization accelerated in the 1980s as regulations on fuel efficiency and emissions were put in place, requiring even better engine control. In 1982, for instance, computers began taking full control of the automatic transmission in some models. New cars now have between 50 and 100 computers and run millions of lines of code. An internal network connects these computers, allowing a mechanic or dealer to assess a car’s health through a diagnostic port just below the steering wheel. Some carmakers diagnose problems with vehicles remotely, through a wireless link, and it’s possible to plug a gadget into your car’s diagnostic port to identify engine problems or track driving habits via a smartphone app.


The end of IT consumerization
Now cloud vendors are driving IT evolution. When Google realized that inefficient power supplies were costing it millions, its suppliers quickly fixed that problem. The rest of us, who never cared, and enterprises, who lacked Google's clout, benefited. Other web-scale technology is moving into the enterprise. The commodity scale-out compute and storage architectures that Google and Amazon pioneered are now being offered by companies such as Nutanix and Scality, and in open source software like OpenStack. Now networking giant Cisco is under attack because its costly, complex switches have thousands of features that cloud vendors don't need. So those vendors are building their own, at much lower cost, and enterprise IT pros are noticing.


Enterprise network disaggregation is inevitable
“Disaggregation will make incremental, steady progress within the broader Fortune 500, though that progress will by no means be immediate,” states IDC analyst Brad Casemore in an as-yet-unreleased report on network disaggregation. “Just as with software-defined networking, not everybody is ready to embrace change and be an early adopter.” The IDC report, though, also states that disaggregation is an inevitability in the enterprise -- more specifically, large enterprises -- because it offers a means of standardizing network resources while allowing for continuous software innovation “beyond the confines of vendor-specific product release schedules.” This is in addition to the capital and operational cost reduction often viewed as the primary driver of the trend.



Quote for the day:

“God uses imperfect people for impossible tasks” -- John Paul Warren

June 30, 2015

How three years of Yammer has changed Microsoft for the better
Early Yammer use at Microsoft wasn't all positive, but even the arguments could be productive, Pisoni noted when we met last year. "Someone at Microsoft made a comment about a product in a mean-spirited way and the person who made the product popped up and said, 'Hey I make that product,' and they had a conversation. I think there were parts of the company so used to being in a silo that they had to realise they could communicate and get used to having constructive conversations and people got more constructive with each other." Fast forward to 2014 and if you worked at Microsoft and you wanted a new laptop, you had to use Yammer to request it. But it wasn't just a different way of filling in a form, Coplin told us. "It's not just another inbox, it's a culture. You're now measured on your contribution across the company."


The C-Suite Needs a Chief Entrepreneur
So if the CEO isn’t someone who can innovate, then who should be? It’s a question that I’ve discussed with Lean Startup founder Steve Blank and business thinkers such as Yves Pigneur, Henry Chesbrough, and Rita McGrath. We believe that CEOs need a partner for innovation inside their companies, someone who will create and defend processes, incentives, and metrics that encourage radical ideas and find new areas for growth. It’s an executive who can help large companies reinvent themselves while they’re still successful. And this new role needs to sit in the C-suite. You could call this person the Chief Entrepreneur (CE) — someone who can lead the future of the company while the CEO takes care of running the existing business. This is a huge divergence from the traditional norm for chief roles, but the CE is a necessary position of power to ensure that a company innovates.


Avoid culture shock in your next cloud computing project
"An engineer thinks, 'Oh, if you automate this… you're not going to have any work for me,'" said Jason Cornell, manager of cloud and infrastructure automation at Cox Automotive, an Atlanta-based provider of vehicle remarketing services. Enterprises must emphasize that cloud's automation will allow engineers to perform other, higher-value tasks for the business. "What I've been trying to do within my organization is to get us to the point where we can work higher and higher up the stack and get further and further away from managing things like patching [or the] maintenance of servers," Maddison said. "Those kinds of things we want to throw out the window, and get us to focus on stuff that is actually going to add business value."


CIOs warned of IT overspending risks as US dollar strength grows
“We’re looking at the US dollar being high through 2017, but the effect we’re seeing now is around product vendors re-pricing, using obfuscation and trying to keep the revenue they generate in line [with their in-house projections],” he said. “Next year it will be about recovering margin and the year after that, there will be a whole new world out there.” The latter point is in reference to the long-term effect the strong dollar will have on imports and exports arrangements between different countries, which could benefit some suppliers and harm others. “Printers are an interesting market right now, particularly in Europe, as it’s the only market where we have a vendor who has non-US based products and that’s Fujitsu,” said Lovelock.


Announcing the R Consortium
While the R Foundation continues its role as the maintainer of the core R language engine, the R Consortium will initiate projects to help the user community make even better use of R, and to help the R developer community further extend R via packages and other ancillary software projects. Projects already proposed include: building and maintaining mirrors for downloading R; testing and quality assurance platforms; financial support for the annual useR! Conference; and promotion and support of worldwide user groups. In general, the Consortium will seek the input of its members and the R Community at large for projects that foster the continuing growth of R and the community of people that drives its evolution.


OPM Chief's new Cyber Defense operation has potential
"Within almost any organization, there is a tendency for structure to drive behavior and for execution toward goals to be the ones that are measured by management," Harkins said. "By publicly demonstrating the leadership of accountability," OPM will surely "be able to stay on top of future risks because they will have the structure to drive prevention of issues and learn from incidents that may occur." Cylance late last year published an analysis labeling Iran a rising power in cyberspace, comparable to China, and specifically cited a campaign dubbed Operation Cleaver. On Friday, The Hill reported the group behind that series of attacks provided WikiLeaks with about 70,000 confidential cables from Saudi Arabia’s Foreign Ministry.


Cisco fleshes out its Internet of things system, portfolio
Cisco's IoT approach revolves more around a portfolio of products, reference architecture and ecosystem with the likes of Rockwell Automation and GE to name a few. Not surprisingly, Cisco's IoT rollout includes a heavy dose of infrastructure ranging from networking gear to security cameras. But Cisco also added tools for analytics, application management as well as "fog computing." Fog computing is an extension of the cloud designed to manage data from sensors and edge devices. For instance, a temperature reading every second doesn't need to be uploaded to the cloud. Fog computing techniques would take that real-time data, average it out based on parameters and upload it to the cloud every half hour or so. If temperature got out of whack the sensor would have enough intelligence to act quickly.


Re-imagine Master Data Management -- With Graph Databases
Current MDM solutions typically store their data in a Relational Database Management System (RDBMS), which makes it hard to see relationships within the data and leverage those insights in real time for competitive advantage. The new MDM needs a different type of data store, optimized to quickly discover new insights in existing data, provide a 360-degree view of master data, and answer questions about data relationships in real time. The good news is that this is not only possible but happening today, thanks to new technologies and approaches that transform the concept and execution of MDM, enabling companies to consolidate data from many channels into one and offer a highly related, true view of this data.


Cybersecurity’s future will require humans and machines to work symbiotically
Companies are building their own security solutions to detect and mitigate attacks at the earliest possible stages, but as time goes on, more devices get shared across contexts by multiple users. That means the methods by which attacks are perpetrated will multiply. The modern enterprise lives across the cloud, mobile devices, and the Internet of Things, which means the approaches we previously used to defend against cyber threats are no longer viable. ... In most enterprise settings, security data gets collected and correlated in SIEM (Security Information and Event Management) products made by Splunk, LogRhythm, and others, and it ends up overwhelming the security analysts tasked with making sense of it.


Compliance Is Now Calling the Shots, and Bankers Are Bristling
“These people are in great demand,” says Maurice Gilbert, founder of Conselium, a headhunting firm in Dallas. Gilbert used to do executive searches for all sorts of positions. “Then, about eight or nine years ago, we got a compliance search,” he says. “And then we got another one. And we said, ‘Is this the tip of the iceberg?’” It was. Now, compliance is all Gilbert does. His biggest payday usually comes when he places a chief compliance officer. These people sometimes report straight to the board of directors, and they make really good money, Gilbert says. In April, a very large pharmaceutical company had him looking for a compliance head to come aboard at $1.5 million a year.



Quote for the day:

"People not only notice how you treat them, they also notice how you treat others." -- Gary L. Graybill

June 29, 2015

Blending humans and technology in the workforce
A key enabler behind this co-operation lies in the interfaces. Advances in natural language processing (NLP) and speech recognition are making it a lot easier for humans to interact with machines in real time. Speech recognition is becoming more effective thanks to the growing capability of machines to “understand” unstructured conversations. This is aided by the ability of machines to make instant internet searches and to use contextual clues. At the same time they are becoming more effective at incorporating user feedback to improve their accuracy. Such is the interest in NLP technologies that the market for this application is expected to grow quickly to reach $10bn by 2018 from $3.7bn in 2013.


DevOps & Product Teams - Win or Fail?
Developers are expected to deliver product features or user stories, preferably in a predictable way. When unforeseen problems cause delays, developers - keeping the release date in sight - struggle frantically to compensate, releasing incomplete features (although some would argue that there’s no such thing as releasing too early). Operations is usually judged on availability. MTTR may be more DevOps-friendly than MTBF, but regardless of how it's measured, outages are more difficult to prevent in the face of constant change. This can cause engineers in operations to become over-cautious and too conservative. If lots of new product features are deployed to production, it’s the developers’ merit, but if any of those shiny new features causes an outage, it's the operations guys who will be waking up to fix it.
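As a side note on the metrics mentioned: MTTR and MTBF combine into the standard steady-state availability formula. A minimal sketch, with illustrative figures, shows why a team optimizing recovery time can match or beat one optimizing time between failures:

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability: fraction of time the service is up."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Two hypothetical teams: one optimizes for rare failures (high MTBF),
# the other for fast recovery (low MTTR).
slow_but_stable = availability(mtbf_hours=2000, mttr_hours=8)
fast_recovery = availability(mtbf_hours=500, mttr_hours=0.5)

print(f"{slow_but_stable:.5f}")  # ~0.99602
print(f"{fast_recovery:.5f}")    # ~0.99900
```

The numbers are invented, but the shape of the trade-off is why MTTR is often called the more DevOps-friendly metric: recovery time is under the team's control in a way that failure frequency, under constant change, is not.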


UC San Diego Researchers Amp Up Internet Speeds
The results of the experiment, performed at UC San Diego's Qualcomm Institute by researchers from the Photonics Systems Group and published in the June edition of the research journal Science, indicate that fiber information capacity can be notably increased over previous estimates by pre-empting the distortion effects that will happen in the optical fiber. The official name of the paper is "Overcoming Kerr-induced capacity limit in optical fiber transmission." "Today's fiber optic systems are a little like quicksand," Nikola Alic, a research scientist from the Qualcomm Institute, the corresponding author on the Science paper, and a principal of the experimental effort, wrote in a June 25 statement.


Urgency of Present and Past in IoT Analytics
The most basic obstacle to extracting value from this OT-generated data is connecting it to traditional Information Technology (IT) systems. This integration is problematic because IT and OT evolved in different environments and use different data types and architectures. While IT evolved from the top down, primarily focusing on serving business and financial needs, OT grew from the bottom up, with many different proprietary systems designed to control specific equipment and processes. IT systems are based on well-established standards that can integrate large volumes of information across different applications. In the world of OT, no single industry standard yet exists for Machine to Machine (M2M) connectivity.


Why is Virtualization creating Storage Sprawl?
Storage sprawl has become worse in organizations that have made a significant investment in virtualization technologies. For example, most virtual desktop infrastructures (VDI) have an all-flash array to handle desktop images, preventing boot and logout storms and maintaining acceptable performance throughout the day. But most VDI environments also need a file store for user home directories. There is little to be gained if this data is placed on the all-flash array, but data centers certainly need to provide storage to their users to support user-created data. As a result, most organizations end up buying a separate Network Attached Storage (NAS) device to support user home directories and other types of unstructured data.


Dealing With Data Privacy in the Cloud
“If the single biggest concern centres around the security of placing data in multi-tenant public clouds then this is a misjudgement," he said. "If anything, providers of public cloud managed hosting services know a lot more about system security than most individual firms. Second, privacy policy controls stipulated upon instances of private cloud may be harder to update than those held in public environments where service layers are stronger.” Indeed, confidence in cloud security is growing, according to the 2014 IDG Enterprise Cloud Computing Study. The survey found that the vast majority of enterprises were “very” or “somewhat” confident that the information assets placed in the cloud are secure.


Overcoming the business and technology barriers to DevOps adoption
"There was a degree of customer dissatisfaction with the service we provided, in the sense that we couldn’t keep up with the demand from developers, which meant there were long lead times in providing them with environments; and they were created in a manual way with semi-automated scripts," says Watson. "So, not only did it take us a long time to provide these individual environments, sometimes they weren’t always the same." This often led to disagreements between the software development and infrastructure building teams, as the environments they delivered didn’t always quite fit the bill. To rectify this, Watson created small groups of developers and infrastructure architects, while doing away with the ticketing system used to communicate requests between these groups.


Why is a cloud provider like a restaurant?
In my experience the thing that marks the unsuccessful restaurant is the belief that having the best kitchen equipment (or the cheapest, depending on their business model) – is the defining factor in their success. What rubbish! Any experienced restaurant patron can tell you that it’s all about the customer experience – and how the customer perceives the value delivered by the restaurant. The customer only has the ‘front-office’ experience – the location, parking valet, Maitre d’, bar, seating arrangement, ambience and of course, the menu and service. The restaurant might only be a drive-through fast-food outlet, but the same principles apply. The customer makes a choice of provider based on their required value proposition and expects that to be delivered.


The Road Ahead for Architectural Languages
MDE is a possible technological solution for successfully supporting the requirements of next-generation ALs. In MDE, architects use domain-specific modeling languages (DSMLs) to describe the system of interest. The concepts of a DSML - its first-class entities, relationships, and constraints - are defined by its metamodel. According to this, every model must conform to a specific metamodel, similar to how a program conforms to the grammar of its programming language. In MDE, it’s common to have a set of transformation engines and generators that produce various types of artifacts. Practitioners can take advantage of transformation engines to obtain source code, alternative model descriptions, deployment configurations, inputs for analysis tools, and so on.
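The metamodel/model conformance relationship described above can be sketched with plain classes, outside any real AL tooling. Everything here (a toy DSML with `Component` and `Connector` entities and one wiring constraint) is illustrative, not a specific architectural language:

```python
from dataclasses import dataclass, field

# Metamodel: the first-class entities of a toy architecture DSML.
@dataclass
class Component:
    name: str
    provides: set = field(default_factory=set)  # service names offered
    requires: set = field(default_factory=set)  # service names consumed

@dataclass
class Connector:
    source: Component
    target: Component
    service: str

def conforms(components, connectors) -> bool:
    """A model conforms if every connector joins components in the model
    and wires a required service to a component that provides it."""
    in_model = all(c.source in components and c.target in components
                   for c in connectors)
    wired = all(c.service in c.source.requires and c.service in c.target.provides
                for c in connectors)
    return in_model and wired

# A model that conforms to the metamodel.
ui = Component("ui", requires={"orders"})
api = Component("api", provides={"orders"})
print(conforms([ui, api], [Connector(ui, api, "orders")]))  # True
```

A transformation engine in the MDE sense would then walk a conforming model to generate artifacts (source code stubs, deployment descriptors, analysis-tool inputs), exactly as the excerpt describes.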


Cloud computing may make IT compliance auditing even cloudier
There may be several issues at work that reduce IT's preparedness for compliance audits. Time is likely the leading culprit, as tending to compliance reporting is one more thing to be squeezed into a busy day. Related to that is lack of resources -- short-staffed IT departments may find it difficult to put someone on the case more than a few hours a week. Some industry observers say cloud computing has made the matter of compliance even, well, cloudier. The results are "not surprising when you consider the degree to which cloud systems and mobile access have penetrated most enterprises," says Gerry Grealish, CMO of Perspecsys. "Cloud SaaS systems and BYOD policies take the control of enterprise data -- including sensitive data -- away from enterprise IT teams and put it in the hands of third party vendors."



Quote for the day:

"It's not about how smart you are--it's about capturing minds." -- Richie Norton

June 28, 2015

8 Ways Business Intelligence Software Improves the Bottom Line
"With quick access to your internal data, you can more efficiently use your time to analyze internal information and make decisions," says Ryan Mulholland, president, Connotate, a provider of Web data monitoring and extraction solutions. For example, "as president of my company, I can look at all trends in our sales cycle and see which are going to affect our business," he says. "Then I can decide our course of action much more quickly and efficiently." ... "A strong BI system, if well-configured, can help eliminate the time spent copying and pasting data and performing calculations," says Max Dufour.


Student Privacy at Risk in the Age of Big Data
The fear is that the multi-billion-dollar education technology (or “ed-tech”) industry that seeks to individualize learning and reduce drop-out rates could also pose a threat to privacy, as a rush to commercialize student data could leave children tagged for life with indicators based on their childhood performance. “What if potential employers can buy the data about you growing up and in school?” asks mathematician Cathy O’Neil, who’s finishing a book on big data and blogs at mathbabe.org.


The Future Of Algorithmic Personalization
Personalization should bring together collective intelligence and artificial intelligence. The connections become faster and the computers smarter and more efficient. To decrease the computing gap, the focus is on enhancing the information flow between humans and machines. Humans are (still) the best pattern-recognition systems in the known universe. We can help each other to find and discover meaningful signals. Artificial intelligence should empower this sense-making by powering adaptive interfaces and predictive learning systems. Human-centered personalization brings together human-curated signals and adaptive machine-learning solutions.


The Limits of 3D Printing
Creating printable files involves two steps: creating a three-dimensional volume model that can be printed, and “slicing” that volume model in the best possible way to avoid material wastage and prevent printing errors. Both steps require tacit knowledge. Following the printing, the parts produced have to be recovered, cleaned, washed (or sanded and polished, in the case of metal prints), and inspected. This, in turn, means that using 3D printing for the aftermarket services — an application where it makes a lot of sense — requires making a significant upfront investment in generating the printable files of the spare parts that would likely be needed.


How Big Data Affects Us Through the Internet of Things
Although big data is ever-present in our lives, it can be difficult to understand how much it really has changed our day-to-day living. Let’s take a closer look at how big data has weaved its way into the lives of many consumers today, via the Internet of Things (IoT). The Internet of Things can be thought of as the interconnectivity of everyday objects that use network connectivity to send and receive data. Whether we consciously realize it or not, we are surrounded by objects like these that depend on big data to make our lives better.


Coming soon: An API for the human genome
The genome is in many senses a database that we have constructed and curated and built new interfaces to. As a result, it will soon be the latest addition to what has come to be referred to as the API Economy. Computer programs themselves will be increasingly able to accommodate genomics, perhaps in ways no more remarkable than how Mint.com pulls together your bank balances. It’s this piece that I’m most interested in because a whole generation of software and hardware developers will soon be able to think about personalization at a molecular level, without the need for a bioinformatics team or a PhD. This will be the fourth and perhaps final wave.


Professionalism, waivers and the hard questions
Our role as enterprise-architects is mainly one of decision-support, not decision-making – we can, do and should give advice on architectural concerns, developed to the best of our ability, but unless we are explicitly asked to make decisions, the final decisions are not ours to make. And that distinction is crucial (not least for our continued employment…). If we’ve done our job well, we should have a pretty clear idea of what would work in the enterprise, and what won’t – what will support ‘things-working-together’, and what won’t – and our advice should indicate and describe that overall understanding, in terms appropriate for the respective audience.


Microsoft Cloud Meets Cisco’s Application Centric Infrastructure (ACI)
The path to fast and efficient IT requires more than just technology. Technology must enable a new process model for speeding up workflows across siloed organizations within the IT function. This session will introduce Cisco’s Application Centric Infrastructure and its tight integration into Microsoft Azure clouds. We'll show how you can deliver new tenant services while transforming your IT organization and workflows with a common policy model, centralized control, and simplified operational visibility across your data center. We’ll demonstrate how your applications, network and security teams can leverage a new operational model to generate compelling business outcomes for your enterprise.


Universal Limits on Computation
In 1965 Gordon Moore speculated that the number of transistors on a chip, and with that the computing power of computers, would double every year[10]. Subsequently this estimate was revised to between 18 months and 2 years, and for the past 40 years this prediction has held true, with computer processing speeds actually exceeding the 18 month prediction. Our estimate for the total information processing capability of any system in our Universe implies an ultimate limit on the processing capability of any system in the future, independent of its physical manifestation, and means that Moore’s Law cannot continue unabated for more than 600 years for any technological civilization.
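The 600-year figure is easy to sanity-check with simple arithmetic: doubling every 18 months for 600 years is 400 doublings, a growth factor of roughly 2.6 × 10^120, which is where extrapolated growth collides with any fixed physical ceiling:

```python
# Extrapolating Moore's law (18-month doubling variant) over 600 years.
years = 600
doubling_period_years = 1.5

doublings = years / doubling_period_years   # 400 doublings
growth_factor = 2 ** doublings              # ~2.6e120

print(doublings)      # 400.0
print(growth_factor)  # ~2.58e120 -- past any fixed physical capacity limit
```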


The Difference Between Culture and Values
If a company’s values are its bedrock, then a company’s culture is the shifting landscape on top of it. Culture is the current embodiment of the values as the needs of the business dictate. Landscapes change over time — sometimes temporarily due to a change in seasons, sometimes permanently due to a storm or a landslide, sometimes even due to human events like commercial development or at the hand of a good gardener. So what does it mean that culture is the current embodiment of the values as the needs of the business dictate? Let’s go back to the value of Transparency. When you are 10 people in a room, Transparency means you as CEO may feel compelled to share that you’re thinking about pivoting the product, collect everyone’s point of view on the subject, and make a decision together.



Quote for the day:

"Computer technology is so built into our lives that it's part of the surround of every artist." -- Steven Levy

June 27, 2015

Evolution of Continuous Delivery and DevOps
Ultimately engineering teams are employed to provide solutions to business problems. If your organisational structure is built around your technical platforms and your engineering teams are built around your organisational structure, the solutions they provide will in part solve business problems but mostly will try to compensate for organisational problems - this is by no means a new problem. When it comes to delivering value quickly and consistently there's nothing better than having everyone you need working together (say in the form of a cross-functional team) - as long as they have a consistent flow of work that they can all contribute to. The potential problem, however, is ensuring the work keeps everyone busy. Unless you have multi-skilled engineers (or engineers who are willing to simply muck in) things can become uneven and inefficient.


The Business Processes of Information Governance
Regardless of industry or line of business, an analogous quote can easily be made, be it selling shirts, producing pharmaceutical drugs, or buying and reselling products. The business processes that organizations put in place determine how efficiently and effectively companies conduct their work. These processes are often unique by industry, tailored to specific companies, and studied and improved via major initiatives and strategic investments. The business processes of information governance should be added to this list of differentiating, business value-driving processes. Indeed, highly effective organizations run highly effective business processes -- and information governance is no exception. To succeed, the business processes of information governance can be divided into six key areas:


How Will Businesses Change with Virtual Reality?
Virtual reality could also be meaningful to people with limited physical mobility as it would allow them to tour places they might not have otherwise been able to visit and do so by viewing actual video footage of real places. However, everyone might be interested in the ability of virtual reality to provide the realistic sights and sounds of faraway places. Combined with other sensory stimuli, such as a beach scent, light breeze, and some heat to simulate sunlight, virtual reality could allow you to create a fairly immersive, multi-sensory simulation of sitting on your favorite beach. While it would not be a real substitute, it might just be the next best thing.


TCS's Ignio can predict problems & automate: CEO N Chandrasekaran
It is a neuroscience-based self-learning platform. So if you take Ignio and you install it then the moment you put it into an environment, it sucks up the information about that environment and creates a context. If you take infrastructure context, it will know how many different kinds of hardware are there, how many different kinds of software are there. So it will learn about all the things about your infrastructure from the data that's already present in your network. From an infrastructure point of view it has knowledge of the different infrastructure pieces that are there. Then what it does, is that it can automate.


Big Data and the Rise of Augmented Intelligence
Each year computers are getting faster, but at the same time we as humans are getting better at using them. The top chess players in the world are not humans OR computers, but combinations of humans AND computers. In this talk, Sean Gourley examines this world of augmented intelligence and shows how our understanding of the human brain is shaping the way we visualize and interact with big data. Gourley argues that the world we are living in is too complex for any single human mind to understand and that we need to team up with machines to make better decisions.


The rise of SSDs over hard drives, debunked
If there's one upgrade a consumer can make to a desktop or laptop computer that will make the greatest difference in performance, it's swapping in an SSD. NAND flash manufacturers such as Samsung, Toshiba, Micron, and Intel have continued to shrink the lithography technology for making flash transistors. Last fall, at the Flash Memory Summit, Toshiba revealed its smallest lithography process for NAND flash with a 15-nanometer, 16GB MLC NAND wafer. The 15nm wafer was developed in partnership with SanDisk. Flash makers have also increased the number of bits -- from one to three -- that can be stored per NAND flash cell, all of which has increased density and reduced manufacturing costs.


Forget China, There’s An E-Commerce Gold Rush In Southeast Asia
The paucity of payment systems in the Southeast Asian region is a case in point. Thanks to Alibaba and WeChat, COD in China decreased from more than 70 percent of total payments in 2008 to less than 21 percent during last year’s Singles’ Day mega sales (think Cyber Monday / Black Friday for China). Southeast Asia, however, doesn’t have an Alibaba or WeChat yet. Due to their size and reach, Alibaba and Tencent were able to turn their online payment methods into the de-facto standard for e-commerce in China. Southeast Asia is still fragmented. Until local financial institutions and/or e-commerce players get their act together in terms of payment platforms, COD will remain the dominant payment method, covering over 80 percent of total payments.


The Internet was a mistake, now let’s fix it
In fact, with government regulation and support, mathematically secure communication is eminently possible. Crypto theory says that a truly random key that is as long as the message being sent cannot be broken without a copy of the key. Imagine a world where telecommunication providers working under appropriate regulations issued physical media similar to passports containing sufficient random digital keys to transmit all of the sensitive information a household would share in a year or even a decade. We would effectively be returning to the model of traditional phone services where telecommunication companies managed the confidentiality of the transmission and government agencies could tap the conversations with appropriate (and properly regulated) court supervision.
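The scheme described is essentially the classical one-time pad. A minimal sketch (XOR of message bytes with an equally long, truly random key, used once) shows the mechanics and why the key must be at least as long as the message:

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte with the corresponding key byte.
    Encryption and decryption are the same operation."""
    if len(key) < len(data):
        raise ValueError("one-time pad key must be at least as long as the message")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # truly random, used exactly once

ciphertext = otp_xor(message, key)
assert otp_xor(ciphertext, key) == message  # round-trips with the key
```

This is the information-theoretic security the excerpt invokes: without the key, every plaintext of the same length is equally consistent with the ciphertext. The practical catch, as the article's pre-issued "passport" of key material implies, is securely distributing and never reusing that much key.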


The Pitfalls that You Should Always Avoid when Implementing Agile
An agile transformation is only likely to succeed if heavily supported on an ongoing basis by the management. It is also a prerequisite for a robust agile implementation that the management level is acting according to agile principles, and not only the project teams. ... Adopting agile will impose significant changes to an organization that may have worked according to traditional principles for maybe even 10 – 20 years or more. It may look simple but it is not. ... Top-level managers often do not see the necessity to involve themselves in projects on an ongoing basis. Yes, they allocate (a lot of) money to projects, and yes, they do get upset, when projects overspend or get delayed, but usually they do not see themselves as more than an escalation point for their Project Managers.


Will Google Artificial Intelligence Run An IT Help Desk?
The system can respond to questions, and have long, complex conversations with people. In tests, it was able to help users diagnose and fix computer problems, including Internet browser and password issues. The AI also taught itself how to respond to inquiries about morality and philosophy. The answers were coherent enough that you might mistake them for something your stoner roommate from college once said. ...  The machine is able to do this because it was designed to come up with an appropriate response based on context. “These predictions reflect the situational context because it's seen the whole occurrence of words and dialogue that's happened before it,” Jeff Dean, a Google senior fellow, said at a conference in March.



Quote for the day:

"The signs of outstanding leadership are found among the followers." -- Max DePree

June 26, 2015

The Wait-for-Google-to-Do-It Strategy
When it comes to the current state of innovation and the economy, the implications of Google Fiber are complicated. On the one hand, it is a testament to the power of competition. Google’s willingness to invest the money in a new network threatened cable and telecommunications companies’ dominance, and took customers away from them. That shifted the economic calculus. It’s no coincidence that the cities and regions where cable companies first announced they were building fiber, and offering high-speed connections at affordable prices, have been the places where Google Fiber either is or is going. At the same time, though, it’s depressing that ensuring competitive broadband markets required the intervention of an outsider like Google.


Are you suffering from a cloud hangover?
Recent research from Sungard Availability Services found that the hangover is costing European businesses an average of more than £2 billion; with the overwhelming majority of businesses (81%) in the UK, Ireland, France and Sweden having encountered some form of unplanned cloud spending. Not only is each organisation within these countries paying an average of £240,000 per year to ensure cloud services run effectively, but they have also spent an additional £320,000 over the last five years thanks to unforeseen costs such as people, internal maintenance and systems integration. And despite being hyped as a way to reduce IT complexity, many early adopters of the cloud (43%) have found that the complexity of their IT estate has in fact increased since their initial cloud investment.


Why datacentre zero is an impossible dream, even with help from the cloud
Rob Fraser, CTO for cloud services at Microsoft UK, agrees with the assessment that eventually the price of cloud services will fall to a point where it becomes near impossible for firms to compete with the public cloud on computing cost. "Fundamentally, from an economic point of view, there must come a point at which the cost per unit of compute, unit of storage, unit of analysis becomes hard to compete with the scale of public cloud. Economically look at all the forces of commoditisation and that point will have to occur," he said. But to believe that businesses will move wholesale to the cloud on the basis of cost alone, he said, is to ignore a swathe of issues beyond price. "There are still going to be hugely valid reasons why on-premise infrastructure needs to run its own level of scale, even if it might be more pricey, because of other issues around the business."


How CISOs can create security KPIs and KRIs
"If I don't know what you're doing, how can I help you? I'm going to make some assumptions about what you're doing and I could be completely wrong," Durbin says. "Security guys are always talking about cost. If we realign this, the security guys can now go to the business and say, 'look, if this is what is important to you, this is the role I can play in helping you protect that, but I don't have the funding for a variety of reasons.' The business can then make the call as to whether to find the funding for that problem. It's no longer the security guy's problem, it's the business's problem."


Commodity Data Center Storage: Building Your Own
New architectures focusing on hyperscale and hyper-convergence allow you to directly abstract all storage and let you manage it at the virtual layer. These new virtual controllers can reside as a virtual machine on a number of different hypervisors. From there, it acts as an enterprise storage controller spanning your data center and the cloud. This kind of hyper-converged virtual storage architecture delivers pretty much all of the enterprise-grade storage features out there including dedup, caching, cloning, thin provisioning, file replication, encryption, HA, and DRBC. Furthermore, REST APIs can directly integrate with proprietary or open source cloud infrastructure management systems.


Red Hat builds on its open source storage portfolio
This is the first version of the software that does not require RAID (redundant array of inexpensive disks) technologies to ensure data integrity, meaning organizations could save on storage hardware by as much as 75 percent, given they would not have to buy the additional storage to make duplicate copies of the data. Gluster can now also guard against bit rot, or the gradual decay of files on disk that can, over time, render them impossible to read. Gluster can now also take the place of hierarchical storage management (HSM) systems, which offload old or less-consulted data to less costly, slower storage systems. Gluster now offers operators fine-grain control of where to store data.


Don't be afraid to give your strategic goals a tuneup
As is often the case with startups, strategic goals and objectives will change based on reasons such as changes in technology, desires and directions of customers' needs -- or just a shift in the end-game vision itself. For us, our long-term strategic goals were to provide our customers with a host of offerings that effect a total digital transformation. Yet with our current and growing client list, we were not landing the larger jobs that fit in to this end-state. Instead, we were working on many tactical projects related to Web presence redesign and re-definition (modernization) and customer engagement. What my astute business partner had realized was that our corporate messaging was saying one thing, yet what our new clients needed was something a little closer to the ground.


Computers Are Getting a Dose of Common Sense
Making computers better at understanding everyday language could have significant implications for companies such as Facebook. It could provide a much easier way for users to find or filter information, allowing them to enter requests written as normal sentences. It could also enable Facebook to glean meaning from the information its users post on their pages and those of their friends. This could offer a powerful way to recommend information, or to place ads alongside content more thoughtfully. The work is a sign of ongoing progress toward giving machines better language skills. Much of this work now revolves around an approach known as deep learning, which involves feeding vast amounts of data into a system that performs a series of calculations to help identify abstract features in, say, an image or an audio file.


Who is responsible for digital leadership in the boardroom?
Soon, every member of the C-suite will need to be leading digital. ... there needs to be a concerted effort to raise the digital IQ of the whole senior leadership team which, in turn, will ensure the broader organisation appreciates emerging digital opportunities and imperatives. Developing and, where necessary, recruiting digital talent across the functions will also be required to shape and deliver the desired transformation agenda. The alternative is not attractive. There are still too many firms where the status quo prevails – an enterprise IT organisation that is disconnected from or can’t keep up with emerging digital business activities, a C-suite where “technology is not my job” attitudes are still deemed acceptable, and isolated pockets of digital activity in marketing, engineering and elsewhere that have yet to coalesce into a real digital strategy.


The Rise and Risk of BYOD (INFOGRAPHIC)
The BYOD trend presents its own concerns when it comes to protecting confidential workplace data. As the trend continues to gain steam, CIOs and IT departments are faced with a whole different set of variables due to the variety of devices and the relative level of security that comes with each device. The push to be able to bring your own mobile device to work is also a push to house personal information and apps alongside company data. It can be a risky endeavor. Although convenient, the conglomeration of data in this way is like a treasure trove for potential thieves.



Quote for the day:

"To do one must set goals; to set goals one must have a dream; to dream one must have an opportunity." -- ‏@Orrin_Woodward

June 25, 2015

Refactoring with Loops and Collection Pipelines
A common task in programming is processing a list of objects. Most programmers naturally do this with a loop, as it's one of the basic control structures we learn with our very first programs. But loops aren't the only way to represent list processing, and in recent years more people are making use of another approach, which I call the collection pipeline. This style is often considered to be part of functional programming, but I used it heavily in Smalltalk. As OO languages support lambdas and libraries that make first class functions easier to program with, then collection pipelines become an appealing choice.
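The loop-to-pipeline refactoring described can be sketched in Python, where a comprehension plays the filter/map role (the data here is invented for illustration):

```python
orders = [
    {"customer": "ada", "total": 120, "paid": True},
    {"customer": "bob", "total": 80,  "paid": False},
    {"customer": "cy",  "total": 300, "paid": True},
]

# Loop style: accumulate into a result list by hand.
names_loop = []
for order in orders:
    if order["paid"] and order["total"] > 100:
        names_loop.append(order["customer"].upper())

# Collection-pipeline style: a filter step followed by a map step.
names_pipeline = [o["customer"].upper()
                  for o in orders
                  if o["paid"] and o["total"] > 100]

assert names_loop == names_pipeline == ["ADA", "CY"]
```

The pipeline version names the two operations (select, then transform) instead of interleaving them with bookkeeping, which is the readability gain the article is after; in languages with first-class functions the same shape appears as chained `filter`/`map` calls.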


Security should be enabling, says HP strategist Tim Grieveson
“Information security professionals have an important role in helping organisations build a culture of security, so that everyone can make a contribution because they understand the value of data and the associated security risks,” he said. By raising people’s situational awareness, he said, they are more likely to be self-policing when dealing with company data and wary of things like “shoulder surfing” or revealing personal and business information during phone calls on trains. “Like health and safety, information security should be a concern for everyone, but changing an organisation’s culture is challenging. Security needs to be built into every new product, application and business process,” he said.


Can LibreOffice successfully compete with Microsoft Office?
"If a customer has a problem opening your files (created in LibreOffice) then they can always download a free copy of LibreOffice," he says. But while this is true in theory, in practice it's probably more likely that a customer will expect you to get a copy of Office if you want to do business with them, rather than download some software they’ve probably never heard of themselves. Meeks also works at Collabora – a U.K.-based company that provides commercial support and maintenance for LibreOffice – and he says the company uses LibreOffice software without any interoperability problems. "We run a multimillion dollar business using LibreOffice and we routinely exchange documents with lawyers – and it works fine," he maintains.


Office 365 – Common Exchange Online Hybrid Mail Flow Issues
There are a number of symptoms that might indicate that hybrid mail flow is not working properly even when messages are routing. If messages between environments occasionally end up in a user’s junk mail, that’s definitely a good sign there is a misconfiguration. Issues booking conference rooms or receiving the wrong out-of-office message are also symptoms. It basically comes down to whether the messages between environments appear as “internal” or “external”. So just like a conference room is not going to accept a booking request from a random Internet user, it won’t accept a booking from your cloud user if that message appears as external. The quick way to check is to look at a message received in each environment and check the message headers.
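A minimal sketch of that header check, in Python. The X-MS-Exchange-Organization-AuthAs header is commonly used by Exchange to record whether a message was treated as internal, but treat the header name and values here as an assumption to verify against your own environment, not an authoritative reference:

```python
from email import message_from_string

# Classify a message as "internal" or "external" by inspecting its
# headers. Assumes the X-MS-Exchange-Organization-AuthAs header is
# present and meaningful in your hybrid configuration.
def classify_message(raw_headers: str) -> str:
    msg = message_from_string(raw_headers)
    auth_as = msg.get("X-MS-Exchange-Organization-AuthAs", "")
    return "internal" if auth_as.lower() == "internal" else "external"

sample = ("From: user@contoso.com\r\n"
          "X-MS-Exchange-Organization-AuthAs: Internal\r\n\r\n")
print(classify_message(sample))  # internal
```

If cross-premises messages come back "external" under a check like this, that is consistent with the junk-mail and conference-room symptoms described above.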


Succeeding with Automated Integration Tests
Automated tests that are never or seldom executed can even be a burden on a development team that still tries to keep that test code up to date with architectural changes. Even worse, automated tests that are not constantly executed are not trustworthy, because you no longer know whether test failures are real or just a result of the application structure changing. Assuming that your automated tests are legitimately detecting regression problems, you need to determine what recent change introduced the problem — and it's far easier to do that if you have a smaller list of possible changes and those changes are still fresh in the developer's mind. If you are only occasionally running those automated tests, diagnosing failing tests can be a lot like finding the proverbial needle in the haystack.


Dropbox Is Struggling and Competitors Are Catching Up
Stability may be in short supply in the company’s executive ranks. Ilya Fushman, the head of product for Dropbox for Business, became a venture capitalist in June. The company’s head of design, Gentry Underwood, has stepped down, too, though he remains with Dropbox in an unspecified capacity. Woodside hasn’t been able to hire an overall head of product management, the person who’d be trying to match the security and other features in place at Microsoft and Box. Box, which went public in January, is something of a cautionary tale for Houston and Woodside. Its total 2014 revenue was about 60 percent of Dropbox’s, according to IDC, but its market value is now only one-fifth of Dropbox’s private valuation, suggesting that the office cloud market may not grow fast enough to bridge the gap between investor fantasy and reality.


To 'fail fast,' CIOs need to think strategically from the get-go
It's almost like building out the framework. But unlike a house, where once you put a lot of the infrastructure in place, it's not so easy to tear apart and put new stuff in, with software, it is. We built an actual working product pretty darn quickly, and then we've iterated. As opposed to a pure design phase and build phase and then a customer test phase, it's almost been bunches of loops of that. Of course we had to think strategically in the beginning. What's going to be the broad framework for the technologies that we need to have in place to support what we want to do? But then when I saw that framework, I felt pretty confident that we could achieve the implementation. One of the challenges in failing fast, and what's important, is making sure that different languages are in place: the business language, the clinical language, the technology language.


Digital government 'a chance to build a new state'
Hancock also echoed his predecessor, Francis Maude, in emphasising the importance of building digital government around the “user” – that is, the citizen – rather than the mechanics of government. “In Finland, town planners will visit a local park immediately after a snowfall, because the footprints reveal the paths that people naturally choose to take. These ‘desire paths’ are then paved over the following summer. We too must pave the paths people travel,” he said. “Let’s take a small example. 'Registry offices' are officially known as 'register offices'. But everyone in real life calls them 'registry offices', so no-one ever really searches for ‘register offices’ online. We’ve paved the path that people travel, so the Gov.uk page comes top of the search results, even if you search ‘registry office’.”


Vitaly Kamluk tells how Interpol catches cyber criminals and other stories
Very often we use common techniques and tools for computer forensic examination: Encase, Sleuthkit, various data carvers, data format recognizers, and even standard binutils. We develop a lot of scripts and tools ourselves, sometimes just for a single case: unpackers, deobfuscators, custom debuggers, dumpers, decryptors, etc. Reverse engineering binaries takes quite a lot of time as well. We may also map infrastructure and scan networks and ports. Developing sinkholing software and log parsers is yet another important part of quality research ... The Internet is not owned by a single entity — it’s a network of equal participants. The solution is the union of all participants of the global network against cybercrime.


Phil Zimmermann speaks out on encryption, privacy, and avoiding a surveillance state
At a recent private viewing of the exhibition that features the Blackphone, Zimmermann pondered what the emergence of whistleblowers like Snowden says about the current state of privacy. "The moral problems with the behaviour of our intel agencies should give us pause, should get us to step back and question, 'What are we getting our intel agencies to do?' We should take another look at this. We should try to restrain them more," he told the audience. "This has been my motivation for my entire career in cryptography," he says. "The driving force is the human rights aspect of privacy and cryptography and ubiquitous surveillance, pervasive surveillance... We live in a pervasive surveillance society."



Quote for the day:

“Being confident and believing in your own self-worth is necessary to achieving your potential.” -- Sheryl Sandberg

June 24, 2015

Oracle's biggest database foe: Could it be Postgres?
Gartner, for example, forecasts that more than 70% of new in-house applications will be developed on an open-source database by 2018, and that 50% of existing commercial RDBMS instances will have been converted to open-source databases or will be in process. In other words, open-source databases are almost certainly cutting off Oracle's oxygen when it comes to new applications, but they may also be cutting into its hegemony within existing workloads. If true, that's new. Though from a biased source, an EnterpriseDB survey of Postgres users certainly suggests that Postgres users are running the venerable open-source database for increasingly mission-critical workloads, including those that used to pay the Oracle tax:


Infographic: Must Read Books in Analytics / Data Science
There are two attributes all the members of our team at Analytics Vidhya share: we are all voracious readers, and we all love to share our knowledge with people in a simplified manner, so that everyone gets access to it. These two attributes lead us naturally to share some of the best reads we come across. You can think of this infographic as an ideal list of books for the bookshelf of every data scientist / analyst. These books cover a wide range of topics and perspectives (not only technical knowledge), which should help you become a well-rounded data scientist.


Snowflake Launches Virtual Data Warehouses On AWS
Snowflake isn't a data warehouse of big data dimensions or routine enterprise data dimensions. Rather, it's a virtual data warehouse that will be sized to match the job sent to it. When the analytical tasks are finished, the warehouse shuts itself off to save overhead. "In other cloud data warehouses, you would have to unload the data to turn it off and then reload it [to use it again]," he said. Snowflake avoids that data movement task. Although Snowflake runs on AWS at its US West facility in Oregon, customers may use Snowflake without an AWS account. They also don't need to understand the ins and outs of Amazon virtual machine selection.


Report Template for Threat Intelligence and Incident Response
When handling a large-scale intrusion, incident responders often struggle with obtaining and organizing the intelligence related to the actions taken by the intruder and the targeted organization. Examining all aspects of the event and communicating with internal and external constituents is quite a challenge in such strenuous circumstances. The following template for a Threat Intelligence and Incident Response Report aims to ease this burden. It provides a framework for capturing the key details and documenting them in a comprehensive, well-structured manner. This template leverages several models in the cyber threat intelligence (CTI) domain, such as the Intrusion Kill Chain, Campaign Correlation, the Courses of Action Matrix and the Diamond Model.


Startup’s Lightbulbs Also Stream Music
The speaker bulb, which twists into a standard-size light socket, contains white and yellow LEDs, the brightness or dimness of which coordinates wirelessly with other Twist bulbs that contain just LEDs. Astro, which plans to ship the gadgets early next year, says a starter pack with two LED bulbs, a speaker bulb, and a handheld dimmer switch will cost $399, reduced to $249 for two months to encourage people to sign up. While companies like Philips Hue focus on automating and customizing the lights themselves, Twist is among a handful of companies thinking of the lightbulb as a conduit for wireless audio, too. The company says it plans to add more functions in the future.


Why It's Worth Divorcing Information Security From IT
Too often, when Security reports to IT, we find the IT mentality interferes with security processes and priorities. These days, there is little to no common ground between keeping IT systems up and running for authorized users and monitoring them for signs of compromise by smart, stealthy criminals. Identifying and securing an already compromised system requires the capability to differentiate malicious activity from normal behavior, and hackers are very good at making their activity look normal. The only way to find them is through a combination of new technologies and human judgment. Being a subdivision of the IT department makes security blind to important business processes and to decision making at the corporate and department level.


Aligning Private Cloud and Storage: 4 Considerations
Firstly, the private cloud offers a greater degree of control than the public cloud, especially with data. When you build a private cloud, you’re able to keep your data at your fingertips, establish the performance levels your organization demands to best serve end-users and customers, and set security policies that align with your customer responsibilities or industry regulations. Secondly, the private cloud gives you more control of applications. Most public clouds require apps to fit their cloud mould, but a lot of businesses have unique, custom-made applications, and recoding these applications to fit the public cloud is not a good solution.


Finance Hit by 300 Times More Attacks Than Other Industries
As can be expected, cyber-criminals are working hard to ensure their attacks are as successful as possible, firing a large volume of low level threats at their targets in order to distract IT security professionals while the main targeted attack is launched, Websense said. Obfuscation, malicious redirection and black hat SEO have become popular of late, although patterns apparently shift on a month-by-month basis – again to improve success rates. Targeted typosquatting is also making a comeback in the sector, usually in combination with social engineering as part of spear phishing attacks designed to compromise a host or trick a user into instigating a payment or transfer of money, the report claimed.


Mobile app testing for fun and profit
"If you're doing testing for a mobile website you can more or less use the same tools as you would when just testing out a normal website with your browser," Prusak said. "Ultimately, it should still work with your browser, and there are plug-ins and extensions for today's browsers with which you can modify HTTP headers or even the resolution and make the backend still think you're connecting from a mobile device." "I'm aware of a plethora of different solutions and all of them require either jail-breaking the device or installing software on your computer and then pointing your phone or device to your computer and using that as a proxy," Prusak said. He sees these solutions as rife with issues, inefficient, and too complex.
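The header-modification trick Prusak describes can be sketched without any browser plug-in at all. A minimal Python example using only the standard library; the User-Agent string below is a sample mobile UA, not tied to any real device inventory, and example.com is a placeholder URL:

```python
from urllib.request import Request

# Sample mobile User-Agent string (illustrative, not exhaustive).
MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X) "
             "AppleWebKit/600.1.4 (KHTML, like Gecko) Mobile/12A365")

# Build a request that presents a mobile User-Agent, so the backend
# serves its mobile variant even though we're testing from a desktop.
def mobile_request(url: str) -> Request:
    return Request(url, headers={"User-Agent": MOBILE_UA})

req = mobile_request("https://example.com/")
# urllib capitalizes stored header keys, hence "User-agent" here.
print(req.get_header("User-agent"))
```

Passing such a request to `urllib.request.urlopen` would fetch the page as a mobile client; the same idea is what the browser extensions automate.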


Why You Should Definitely Migrate Existing Apps to the Cloud
100% security is an illusion. If you have to make a decision based on the available choices, cloud services are in no way less secure than any of the existing systems in place. Cloud service providers are known for their innovations. It is apparent that at any point in time they would implement better physical and logical security practices than a standalone on-premise data center operation. Many cloud providers are now certified by ISO, PCI DSS, EU Model Clauses and other global security regimes. Moreover, not all applications require bank-grade security. If you have highly sensitive data, or your app is subject to specific security and privacy regulations (such as HIPAA and HITECH), you can opt for a hybrid cloud service.



Quote for the day:

"The time is always right to do what is right." -- Martin Luther King Jr.