June 24, 2016

Mobile Payments: Where’s the Benefit?

Mobile payments continue to be less common than mobile banking — more consumers are checking their bank account balances and paying utility bills online, for example, than paying for their Starbucks coffee with their phones. But there were some positive signs for mobile payments adoption: For example, 24% of all mobile phone owners reported having made a mobile payment in the 12 months prior to the March 2016 survey, up from 20% in 2015. In addition, of current mobile payments users, 10% had started using mobile payments in the six months prior to the survey, and 20% said they had started using mobile payments in the prior one to two years. Younger individuals are adopting mobile payments faster than average: 30% of mobile phone owners ages 18 to 29 had made a mobile payment, as had 32% of those ages 30 to 44.


How Smart Data Lakes are Revolutionizing Enterprise Analytics

Smart Data Lake solutions permit organizations to focus on the data that provides real business benefit. Currently, Smart Data Lakes are being adopted by pharmaceutical and financial institutions in use cases ranging from competitive intelligence and insider-trading surveillance to investigatory analytics and risk and compliance. For instance, the regulatory reporting environment for financial institutions is evolving quickly, placing unprecedented demands on legacy processes and technology. Two areas where new smart data solutions are already adding value for banks are report preparation and data and technology. Smart Data Lakes also improve the quality of your competitive intelligence by allowing subject-matter experts to curate, correct, and augment the data they know best.


Real Time Audit within the Capabilities of Blockchain

Some studies have even shown that firms are reporting downward pressure on audit fees due to clients questioning the value of audit services, especially given that they are now increasingly ‘commoditised’ as a result of being heavily regulated, and thus there is little differentiation among the services being offered by various auditors. Many believe that blockchain could transform this process, in part because the technology removes the need for auditing to depend on trust. Blockchain provides a globally distributed, decentralized ledger of which everyone has the exact same copy. Whereas auditing at present entails the confirmation of transactions and balances on a company’s accounting ledger at the end of the period, a transaction on the blockchain would provide a permanent and immutable record of the transaction almost immediately.


Five Reasons Traditional IAM Can’t Handle the Internet of Things

One particular problem area for IoT security is Identity Management. In many ways, the Internet of Things is fueled by the identities of things to enable connections between people, devices, and apps, all of which require Identity and Access Management (IAM). Indeed, managing identities and controlling access to this valuable information is a critical step in securing the Internet of Things (IoT), but legacy identity and access management (IAM) systems cannot handle the extreme scale and complexity that the IoT brings to the enterprise. The Identity of Things requires a new class of IAM system. The best practice for managing identity in the Internet of Things is to employ a next-generation IoT IAM platform. But what exactly does next-gen IoT IAM entail? And how does it succeed where traditional, workforce IAM fails?


How the blockchain could kill off national currencies

“Money isn’t valuable by itself,” Meiri says. “It’s what you can do with it that matters.” Neighbourhood community managers and municipalities around the world already agree with him and are using Colu to keep wealth circulating within their local communities. The startup also just closed a $9.6 million funding round, including investment from venture capital firms Spark Capital and Aleph. Technology requires a perfect storm of circumstances before an innovation really takes hold, Meiri adds. If Spotify had been pitching way back in 1995, it would have been a great idea, but it took the widespread growth of high-speed internet to turn music streaming into a reality.


Next-generation enterprise security architecture to combat cyber weaponry

What's interesting is there's almost an inverse correlation of those steps between genius required to do them and criminality of the steps. What we mean by that is it is not illegal to create an attack, create an exploit or discover a vulnerability. If it were, there would be entire universes of white hat attackers and cyber researchers who would no longer exist. That's their job. They're supposed to go around and look for vulnerabilities and see how they respond under attacks. It just happens to be a very hard thing to do, which is why the genius requirement is so high. The other end of this ecosystem, laundering money, is clearly illegal, regardless of whether it's cyber money or real money or anywhere in between. And monetizing that information -- selling things like Social Security numbers -- is obviously pretty illegal as well.


How Does It Work: IPTables

In the Linux ecosystem, iptables is a widely used firewall tool that interfaces with the kernel’s netfilter packet filtering framework. For users and administrators who don’t understand the architecture of these systems, creating reliable firewall policies can be daunting, not only because of the challenging syntax, but also because of the number of interrelated parts in the framework. The iptables firewall works by interacting with the packet filtering hooks in the Linux kernel’s networking stack. These kernel hooks are known as the netfilter framework. Every packet that enters the networking system (incoming or outgoing) triggers these hooks as it progresses through the stack, allowing programs that register with these hooks to interact with the traffic at key points. The kernel modules associated with iptables register at these hooks to ensure that the traffic conforms to the conditions laid out by the firewall rules.
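The first-match-wins evaluation iptables applies within a chain can be sketched as a toy model (for illustration only; `Rule`, `evaluate_chain`, and the sample chain are invented here, not kernel or iptables APIs):

```python
from typing import Optional
from dataclasses import dataclass

@dataclass
class Rule:
    protocol: Optional[str]  # None matches any protocol
    dport: Optional[int]     # None matches any destination port
    target: str              # "ACCEPT" or "DROP"

def evaluate_chain(rules, packet, policy="ACCEPT"):
    """Walk the chain in order; the first matching rule decides the
    verdict, and the chain's default policy applies if nothing matches."""
    for rule in rules:
        if rule.protocol is not None and rule.protocol != packet["protocol"]:
            continue
        if rule.dport is not None and rule.dport != packet["dport"]:
            continue
        return rule.target
    return policy

# A chain resembling: accept SSH, drop all other TCP, default-accept.
input_chain = [
    Rule("tcp", 22, "ACCEPT"),
    Rule("tcp", None, "DROP"),
]

print(evaluate_chain(input_chain, {"protocol": "tcp", "dport": 22}))  # ACCEPT
print(evaluate_chain(input_chain, {"protocol": "tcp", "dport": 80}))  # DROP
print(evaluate_chain(input_chain, {"protocol": "udp", "dport": 53}))  # ACCEPT
```

Real iptables adds tables (filter, nat, mangle), jump targets, and per-hook chains (PREROUTING, INPUT, FORWARD, OUTPUT, POSTROUTING), but the matching core within a chain follows this same first-match logic.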


Gartner's Top 10 Security Predictions

“By 2018, the need to prevent data breaches from public clouds will drive 20% of organizations to develop data security governance programs.” Data security governance will be promoted by insurance companies that will set cyber premiums based on whether businesses have these programs in place. Prediction: “By 2020, 40% of enterprises engaged in DevOps will secure developed applications by adopting application security self-testing, self-diagnosing and self-protection technologies.” Here Perkins looks to maturing technology called runtime application self-protection (RASP) as a way to avoid vulnerabilities in applications that might result from problems overlooked due to the rapid pace at which DevOps teams work. RASP does its work rapidly and accurately to provide protection against vulnerabilities that might be exploited, he says.


Hey ops teams, developers want control of the data center

Application teams are tired of integrating with each and every SDDC to make their apps portable. Instead of integrating with the SDDC, app teams can integrate with a container runtime like Kubernetes, Swarm, or Mesos. All of the "portability" work now happens in the container runtime. Crucially, the industry is making Kubernetes, Swarm, and Mesos work on top of all of the SDDCs so you don't have to. Ops teams and the vendors who love them (like VMware) are rushing to support these new system management tools. In turn, the tools, like Kubernetes, are starting to build in developer-oriented features. According to the Kubernetes blog: "Kubernetes defines not only an API for administrators to perform management actions, but also an API for containerized applications to interact with the management platform." This latter API is new to Kubernetes, but we should expect it (and more like it) to cater to developers.


Multi-cloud is a safety belt for the speed-freaks

In the case of "multi-cloud," then, it's your application and organizational-knowledge portability you're hedging by locking into the cloud native method of application development and delivery. What you get in return is the ability to run your cloud platform on any IaaS, public or private, that comes up. You'll probably want to choose a platform based on open source, like Cloud Foundry, which is ensconced in an irreversible foundation. Then, even if you use a commercial, "open-core" distro of it, you'll have a huge degree of portability because the same application packaging, services and microservices architectures, and overall runtime will remain similar no matter which distro you choose in the future. At this point, many people look to build their own platform to maximize their future freedom to leave.



Quote for the day:


"The truth will set you free, but first it will make you miserable." -- James A. Garfield


June 22, 2016

How Big Data Is Changing The Game For Backup And Recovery

In the world of today's next-generation databases -- where data is distributed across small machines -- it's not quite so simple. "There is no concept of a durable log because there is no master -- each node is working on its own stuff," Thakur explained. "Different nodes could get different writes, and every node has a different view of an operation." That's in part because of a trade-off that's been required to accommodate what's commonly referred to as the "three V's" of big data -- volume, velocity, and variety. Specifically, to offer scalability while accommodating the crazy amounts of diverse data flying at us at ever-more-alarming speeds, today's distributed databases have departed from the "ACID" criteria generally promised by traditional relational databases. Instead, they've adopted what are known as "BASE" principles.


How to hire for the right big data skill set

And before you can even determine what skills you need for data collection, it's important to first consider your audience and customer base. Polich gives the example of a bank, which can't withstand any down time or lag in data retrieval, so companies need to hire accordingly. That might mean hiring people who have worked in similar high-stress environments, where certain aspects of data matter more than in other industries. Alternatively, he also gives the example of a social media network, which can probably withstand a minimal amount of lag or inconsistency in data retrieval, especially if it results in cost-savings. That might mean you can hire someone with other skills that are important to your business or someone more accustomed to working in agile and innovative environments.


New life for residential Wi-Fi

The contrast between this modern approach and older residential Wi-Fi routers shows what happens when a market advances in linear fashion. The main selling features of a home router have for years been lower cost and higher headline speeds. This resulted in standard reference hardware implementations, packaged unimaginatively; the physical design and user-interface were contracted out to the lowest bidders, and we got what we paid for. With the exception of Apple (and Google, with a niche product), these are beastly products that even experts shrink from tinkering with. But the primary technical advance promoted by eero and Luma is, in the vernacular, “Wi-Fi that doesn’t suck.” These startups have seized on the universally recognized coverage issue: A single Wi-Fi router installed at the most convenient spot in the house is unable to provide reliable building-wide coverage in perhaps 10 percent of cases.


Expert panel explores the new reality for cloud security and trusted mobile apps delivery

When we look at mobile, we've had people who would have a mobile device out in the field. They're accustomed to being able to take an email, and that email may have, in our situation, private information -- Social Security numbers, certain client IDs -- on it, things that we really don't want out in the public space. The culture has been, take a picture of the screen and text it to someone else. Now, it’s in another space, and that private information is out there.  You go from working in a home environment, where you text everything back and forth, to having secure information that needs to be containerized, shrink-wrapped, and not go outside a certain control parameter for security. Now, you're having a culture fight [over] utilization. People are accustomed to using their devices in one way and now, they have to learn a different way of using devices with a secure environment and wrapping. That’s what we're running into.


Q&A with Roman Pichler about Strategize

Product strategy and product roadmap are neither agile nor anti-agile; it entirely depends on how we apply them. The challenge for any product is to first find a valid strategy—an approach that is likely to be effective—and then to review and adapt it on a regular basis so that the product becomes and stays successful. It would be a mistake to think of strategy as something static that merely has to be implemented: As the product grows and as the market and technologies change, the strategy and the roadmap have to change too. In some cases, these changes can be drastic—think of YouTube, for instance, which pivoted from a video-dating site to a video-sharing product. In other cases, the changes are incremental. Take the evolution of the iPhone, for example. When the first iPhone was launched in 2007, it did not allow people to take videos. But the latest generation uses video to set the product apart from the competition.


Top website domains are vulnerable to email spoofing

Of those vulnerable, 40 percent were news and media sites, and 16 percent were software-as-a-service sites, Detectify said in an email. A common way these domains try to prevent email spoofing is through a validation system called Sender Policy Framework, or SPF. It essentially creates a public record telling the Internet which email servers are allowed to use the domain. Ideally, any messages impersonating the domain will be detected as spam and rejected before delivery. In practice, however, the system can often come up short. SPF filters out spoofed email best when set to the so-called "hardfail" level, but many website domains instead implement it at the "softfail" level. Although this flags forged emails as suspected spam, the messages are still delivered to the recipient.
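The hardfail/softfail distinction comes down to the trailing "all" mechanism in the domain's SPF TXT record. A minimal sketch of how a receiver might classify it (`spf_all_policy` is a hypothetical helper written for illustration, not part of any SPF library; the domains and IP range are placeholders):

```python
def spf_all_policy(record):
    """Return the policy implied by the record's trailing 'all' mechanism."""
    qualifiers = {
        "-all": "hardfail",  # reject mail from non-listed servers outright
        "~all": "softfail",  # flag as suspected spam, but still deliver
        "?all": "neutral",
        "+all": "pass",      # effectively no protection
        "all":  "pass",      # bare 'all' defaults to '+'
    }
    for term in record.split():
        if term in qualifiers:
            return qualifiers[term]
    return "none"

print(spf_all_policy("v=spf1 include:_spf.example.com -all"))  # hardfail
print(spf_all_policy("v=spf1 ip4:192.0.2.0/24 ~all"))          # softfail
```

A real implementation would also resolve `include:` and `redirect=` mechanisms as specified in RFC 7208, but the qualifier on `all` is what decides the fate of mail from unlisted servers.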


Why I Prefer Merge Over Rebase

Why would anyone base their work on your work-in-progress branch? Because it happens. Sometimes tasks are not split that strictly and have dependencies – you write a piece of functionality, which you then realize should be used by your teammates who work on another task within the same story/feature. You aren’t yet finished (e.g. still polishing, testing), but they shouldn’t wait. Even a single person may want to base their next task on the previous one while waiting for code review comments. The tool shouldn’t block you from doing this from time to time, even though it may not be the default workflow scenario.
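Why this scenario favors merge can be shown with a toy commit model (illustrative only; commit IDs here are just hashes of a message plus its parents, not real git objects): rebasing rewrites a branch's commits with new IDs, stranding anyone whose work points at the old ones, while merging leaves them intact.

```python
import hashlib

def commit(message, parents):
    """A 'commit' is just an ID derived from its message and parent IDs."""
    payload = message + "".join(parents)
    return hashlib.sha1(payload.encode()).hexdigest()[:7]

main = commit("initial", [])
feature = commit("wip: new API", [main])      # your in-progress branch
teammate = commit("use new API", [feature])   # teammate builds on it

# Merge: a new commit with two parents; 'feature' keeps its identity,
# so the teammate's parent pointer still resolves.
merged = commit("merge feature", [main, feature])

# Rebase: 'feature' is re-created on a new base commit and gets a new ID;
# the teammate's work still points at the old, now-abandoned commit.
new_main = commit("hotfix", [main])
rebased_feature = commit("wip: new API", [new_main])

print(rebased_feature != feature)  # True: the rebase changed the commit ID
```

This is the same reason the git documentation warns against rebasing commits that others may have built on: the rewritten history and the old history diverge, and dependent branches must be manually transplanted.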


Infographic: CIOs reveal IT hiring trends for 2016

On Tuesday, Robert Half Technology released the results of a survey that showed 21% of US CIOs plan on adding more staff to their IT department and 63% plan on only filling open IT roles. The report forecast IT hiring trends in the second half of 2016. The data was collected from 2,500 phone interviews with CIOs. John Reed, senior executive director of Robert Half Technology, said that many organizations are getting the go-ahead on new technology projects, which is leading to more hires, but they are still running into problems. "Technology leaders continue to struggle to find highly skilled talent in a market with low unemployment," Reed said. "They seek IT professionals with specialized skills, especially in the areas of cloud computing, data analytics, mobile strategies and cybersecurity."


IT talent biggest roadblock to digital transformation

When looking at hiring for new and in-demand skills, you might first want to look within your own workforce and see if there is an employee who could be trained in that area, says Holland. Another option, he says, is to look to third-party services "who are selling a full solution without the need to unpack what specific skills are necessary." ... For Mark Troester, vice president of Solutions Marketing at software provider Progress, hiring for digital transformation is about striking a balance between skills. "From a leadership perspective, look for individuals that live in the middle of business and technology -- individuals that are entrepreneurial in spirit and have the ability to apply technology in new and creative ways. If that skillset is lacking then you may need to go outside the organization," he says.


Power of Teamwork In Data Science

Data science can be summarized as the interplay of data, statistics, technology and business. So by default, doing data science is about collaboration, teamwork and combining different skill sets. It does include but is not limited to statistics and mathematics. It would also include skills like computer science, machine learning, industry expertise (banking, insurance, retail etc.) and expertise on functional domains like sales, customer service or marketing, communication and presentation skills and last but not least, data visualization. A wide set of skills will not necessarily make it easier for organizations to find and recruit data scientists in the war for talent that has already started. My colleague Bhima Auro recently wrote a blog on how organizations can hire talent.



Quote for the day:


"Do not go where the path may lead; go instead where there is no path and leave a trail." -- Ralph Waldo Emerson


June 20, 2016

Analytics Chalk Talks: Data Warehouses—Past, Present, and Future

We’ve got cloud, we’ve got mobile, we’ve got social media data, we’ve got machine data, sensor data. All these different sources. The data’s growing. In so many presentations on big data, you’ll see slides talking about the petabytes and zettabytes of data that you’ve got to be able to process to be able to handle the new world. But to me, that’s missing the point. You want to know why data warehousing is really, really relevant going forward, and why 70 percent of CIOs are increasing their data warehousing spending. It’s because it’s not the volume and velocity of data that’s the problem. As that grows, we can just increase our hardware, increase our scale to be able to handle that. The problem is the increasing complexity that comes with the variety, the heterogeneity of data. We’re being asked to relate and integrate this data.


Writing for a Mobile Audience: 4 Things to Remember

Studies show that the experience of reading on a mobile device is different from the experience of reading on a larger screen. On traditional screens, eye-movements tend to start at the upper left corner and move right and then down. On mobile devices, eye-movements tend to stay in the center of the screen. So the way people are engaging your text on a mobile device is fundamentally different than the way they engage content on a larger screen. Your mobile content writing needs to reflect this reality. Long headlines can take up most of the screen on a small mobile device. You want to create short and strong headlines that grab attention without taking up the whole screen. Short headlines are easy to scan and digest on a mobile device.


Machine learning is the new Big Data

Now that Big Data has landed, people are trying to figure out: How do I store it and manage it? What am I going to do with it, and at what velocity and to what end? “That’s why at this Discover, we really need a big bet across the company and shiny neon lights around what we are referring to as applied machine learning,” Veis said. “And it’s not an if; it’s a when and how, because there’s no other way to tackle the amount of data hitting organizations.” ... “In that the technology itself, the computational assets that you can put at it and the ability to make it accessible, so it’s not just for the few but for the many,” Veis explained. “We are really taking ML and AI and making it for the developer. It’s not that the data scientist is still not key, but it’s now about going from fine china to everyday ware.”


What happens to those free Windows 10 upgrades after July 29, 2016?

The free upgrade offer never really applied to large businesses that run Windows Enterprise editions. For those customers who also have purchased Software Assurance for those volume licenses, the Windows 10 upgrade offer is, if not free, at least already paid for. The decision of whether and when to upgrade is driven by business needs, not by the cost of an upgrade license. In the new "Windows as a Service" model, Microsoft says it plans to deliver two or three new releases each year. The Anniversary Update is the first release in the Redstone update series, and it's scheduled to arrive at more or less the same time that the original upgrade offer ends. Another Redstone feature update is scheduled to arrive in the first half of 2017.


IT Talent Biggest Roadblock to Digital Transformation

IDT reports that cross-functional knowledge is important for digital transformation. In the study, 88 percent of respondents said "extensive business-related knowledge on the IT side is crucial for developing a digital transformation strategy," but they're experiencing major gaps across departments. Fifty-eight percent say their IT executives have the right knowledge for digital transformation, while only 27 percent said their business executives had the same level of technical knowledge. "The business and IT leaders need to come together to create a single strategy driven by business goals that provides overall guidance which the entire organization can support," says Troester. As a result, new job titles are emerging, targeted at blurring the line between business and IT, according to Harry Osle.


Why IT Must Pursue an Information Governance Plan

The "State of Information Governance" report defines IG as "the activities and technologies that organizations employ to maximize the value of their information while minimizing associated risks and costs." To support this, the research reveals that most companies are issuing formal data use policies and requiring employees to identify data that is confidential. They're also training staffers on data storage and archiving. In addition, findings break down organizations into those which are "high performing" on IG, and those which are not. While overall adoption rates among both are strong, high performers are more likely to deploy email and file archiving, while issuing formal use policies. "Information is both the lifeblood and the bane of any business, no matter its size, industry or location," according to the report. "Enterprises collect and analyze data from a myriad of internal and external sources to improve business efficiencies and decision-making processes."


Will Fintechs Kill the FICO Score?

"The rumors of the demise of the credit score have been greatly exaggerated," credit analyst John Ulzheimer said. "It's still the best risk-assessment tool in the market and is used ubiquitously across all mainstream lending-decision platforms." Yet the discussion at the Future of Fintech conference, and other public comments lately by industry players, leave the strong impression that change is underway. Mike Cagney, the chief executive of the online lender Social Finance, announced in January that SoFi has become a “FICO-Free Zone,” meaning it no longer factors FICO scores into its loan-qualification process. Instead, the company considers potential borrowers’ employment history, track record of meeting financial obligations and monthly cash flow to determine if they are creditworthy.


4 Popular Fintech Topics and What You Should Not Miss

Blockchain, as a pure platform technology, may be able to cut out the middlemen (or middle companies) everywhere. Because of that, many fintech entrepreneurs and experts constantly discuss topics such as: Why does Blockchain matter? What are the various Blockchain concepts – sidechains, hyperledger, public or private blockchain? Who are the major players? What are they building or what sectors are they targeting? What is the investors’ perspective on blockchain startups? What are the latest notable blockchain startups? It’s time to become part of the digital revolution and join the network and platform-emerging world.


1 in 5 European banks would buy fintech startups

Regardless of which approach the banks take, there is no denying that we’ve entered the most profound era of change for financial services companies since the 1970s brought us index mutual funds, discount brokers and ATMs. No firm is immune from the coming disruption and every company must have a strategy to harness the powerful advantages of the new fintech revolution. The battle already underway will create surprising winners and stunned losers among some of the most powerful names in the financial world: The most contentious conflicts (and partnerships) will be between startups that are completely reengineering decades-old practices, traditional power players who are furiously trying to adapt with their own innovations, and total disruption of established technology and processes.


HPE looks to move data between computers at the speed of light

HPE calls its photonics chip module X1, and it is still in early testing. In the future, attaching a fiber optic cable to computers will be as easy as attaching Ethernet cables, said Michael McBride, director at HPE's silicon design lab. Ultimately, the connector technology and cables will be used in The Machine, HPE's new server design that centers processing on memory and storage. The transfer range of X1 is about 30 to 50 meters. HPE also showed off other silicon photonics technology that can transfer data at distances up to 50 kilometers at 200Gbps (gigabits per second). Light is already being used as a long-range data transfer mechanism in large telecommunications networks. Intel is also working on silicon photonics modules and is expected to ship them later this year. It isn't clear if HPE is working with Intel on its technology.



Quote for the day:


"An approximate answer to the right problem is worth a good deal more than an exact answer to an approximate problem." -- John Tukey


June 17, 2016

Hiring Your First CISO: A How-to

A track record and trusted network of industry relationships are keys to successful CISO searches. The hiring company should be confident in the recruiting firm’s knowledge of market data on compensation, its ability to understand their culture and its network to provide a diverse slate of qualified candidates. With extreme demand for well-qualified candidates, an inverse relationship exists between the length of the interview process and likelihood of acceptance. Organizations should streamline the process by ensuring interviewers understand the CISO role and responsibilities, and remember to sell the benefits of joining the team. Our firm sets up a launch call with the hiring manager and key stakeholders, provides a slate of spot-on candidates within the first 15 days, has biweekly update calls and partners to find the best possible candidate in a timely manner.


Bring on the blockchain future

Crucial functions — such as payments and trading — remain concentrated in large, undercapitalized banks or other central hubs; despite regulators’ efforts, losses at those institutions could still have economywide repercussions. To make matters worse, the authorities don’t yet have a clear real-time picture of what’s happening in financial markets or where risk is concentrated. Blockchain technology is capable of addressing both issues. Finance is all about trust: Essentially, financial institutions evolved to enable transactions with strangers. Centralized intermediaries of various kinds solved that problem, keeping track of who owns what and who owes whom. But centralized intermediaries also create points of systemic vulnerability. Regulators continue to wrestle with this underlying — and hitherto unavoidable — dilemma.


Networking the Cloud for IoT – Pt 3 Cloud Network Systems Engineering

Using this model, security issues must be addressed through a multi-layered approach. From a system engineering point of view, users must be forced to implement complex passwords, and Public Key Infrastructure (PKI) certificates must be a minimum requirement for operating across the IoT network. The article, “How to protect Wearable Devices Against Cyberattacks,” in IEEE Roundup online magazine, postulated that, where there are devices with limited functionality, they can be linked to the user’s smartphone, which can act as a conduit for the device’s information, thus securing it from the outside world. Most important of all, though, is ensuring that the proper amount of Systems Engineering design rigor has been exercised in the development process. This makes defects easier to find and much less costly than a multimillion-dollar security breach.


Data Science of Variable Selection: A Review

The fact is that when confronted with massive numbers of candidate predictors and multiple possible targets or dependent variables, the classic framework neither works, holds nor provides useful guidance – how does anyone develop a finite set of hypotheses with millions of predictors? Numerous recent papers delineate this dilemma from Chattopadhyay and Lipson's brilliant paper Data Smashing: Uncovering Lurking Order in Data (available here) who state, "The key bottleneck is that most data comparison algorithms today rely on a human expert to specify what ‘features’ of the data are relevant for comparison. Here, we propose a new principle for estimating the similarity between the sources of arbitrary data streams, using neither domain knowledge nor learning."


UNIX®: Lowering the Total Cost of Ownership

TCO is greatly reduced because a UNIX certified operating system lowers acquisition, maintenance and updating costs. The benefits of UNIX mentioned above also hint at reduced administrative, training and operational costs, which should likewise be considered in evaluating solution cost. IT decision makers should consider how choosing an operating system that is UNIX certified will benefit the TCO profile of their solution(s). This is especially true because making standards a requirement during acquisition costs so little yet can have such substantial benefits to TCO, enabling accelerated innovation and demonstrating good IT governance.


Berkholz on how DevOps, automation and orchestration combine for continuous apps delivery

IT is going through this kind of existential crisis of moving from being a cost center to fighting shadow IT, fighting bring your own device (BYOD), trying to figure out how to bring that all into the fold. The way we think about how they do so is this transition toward IT as a service: IT becoming more like a service provider in their own right, pulling in all these external services and providing a better experience in house. If you think about shadow IT, for example, you think about developers using a credit card to sign up for some public cloud or another. That’s all well and good, but wouldn’t it be even nicer if they didn’t have to worry about the billing, the expensing, the payments, and all that stuff, because IT already provided that for them. That’s where things are going, because that’s the IT-as-a-service provider model.


Hack the hackers: Eavesdrop for intel on emerging threats

“Obviously the time between vulnerability recognition and vendor patch release or workaround is valuable for threat actors, but when detailed exploit guides are available in multiple languages, that time delta can be disastrous for businesses,” Gundert says. The OPcache Binary Webshell vulnerability in PHP 7 is another example of attackers jumping ahead of the game. Security firm GoSecure described the new exploit on April 27, and Recorded Future uncovered a tutorial explaining how to use the proof of concept referenced in GoSecure’s blog post on April 30. As GoSecure noted, the vulnerability didn’t universally affect PHP applications. But with the resulting tutorial, attackers could have an easier time finding servers with potentially dangerous configurations that make them vulnerable to the file upload flaw.


Being a great communicator and facilitator is key to CISO role

In this day and age, CISOs need to be good security communicators. So what security professionals should do is try to communicate the threats, the security risks, in a non-threatening way, and also in [terms] that a business or a board member can understand. I would put it as loss of revenue because of application downtime, or loss of brand or image because of a breach, because we lost X amount of records. That communication piece is a big part of the CISO role now: making security part of the business and acting as its facilitator. [CISOs need to make it clear that security is not just] something that they need to do because they want to be compliant, or because they don't want to be featured in The Wall Street Journal.


The Evolution of Code Review [Infographic]

Though we love tool-based code reviews, we’ve also seen that these other forms of code review are still the go-to method for a number of organizations. In fact, when we surveyed more than 600 software developers, testers, and IT professionals earlier this year, we found that: 72% of teams are doing ad-hoc, or “over-the-shoulder,” code review; 63% are doing tool-based code review; and 53% are doing meeting-based code review. No matter what form of code review your team is involved in, it’s important that your team does code reviews regularly. In the infographic below, we’ll take a closer look at the different types of peer review and the benefits, as well as drawbacks, of each. Find out which code review method is right for your team.


Virtual Panel on (Cloud) Lock-In

It is important to separate switching out implementations from switching out interfaces. Standard interfaces (even at the conceptual level) reduce the risk: users can deal with problems without having to rewrite everything. The core concepts around VMs and object stores are well understood enough that the switching cost is low. Conversely, the more unique a system is at the conceptual level, the harder it will be to switch. Unfortunately, these unique features often provide enormous value. The nastiest surprises are systems that appear to be a safe bet but end up becoming a nightmare in production. Developer-focused storage systems are notorious for this: they can be super easy to get started with and provide a great experience at the start, but issues with performance, stability and operability often only show up after the application is launched and taking significant traffic.
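The interfaces-versus-implementations point can be made concrete with a minimal sketch: code written against a small object-store interface (here just `put`/`get`, names invented for the example) can swap backends without a rewrite, which is exactly what keeps switching costs low.

```python
# Stand-in backend; an S3- or Swift-backed class exposing the same two
# methods could be dropped in unchanged.
class MemoryStore:
    def __init__(self):
        self._blobs = {}

    def put(self, key, data):
        self._blobs[key] = data

    def get(self, key):
        return self._blobs[key]

def archive(store, key, data):
    store.put(key, data)   # the caller depends only on the interface
    return store.get(key)

print(archive(MemoryStore(), "report.txt", b"q2 numbers"))
```

A system whose concepts have no such common-denominator interface offers no equivalent seam, and that is where lock-in bites.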



Quote for the day:


"Every thought you have can be energetically calibrated, along with its impact on your body and your environment." -- @DrWayneWDyer


June 16, 2016

Here's How AI Is About To Make Your Car Really Smart

By 2020, Gartner predicts, there will be 250 million cars connected to each other and to the infrastructure around them via Wi-Fi systems that will allow vehicles to communicate with each other and the roadways. As the amount of information being fed into IVI units or telematics systems grows, vehicles will be able to capture and share not only internal systems status and location data, but also changes in surroundings in real time, according to Gartner analyst Thilo Koslowski. ... Speaking at the New England Motor Press Association Technology Conference at MIT, Pratt said automakers will, for years to come, focus more on assisting drivers than on producing fully autonomous vehicles that take the steering wheel from drivers. A lot of the discussion among automakers and within their R&D organizations involves how much control the car should have.


Blockchain Technology Successfully Piloted by Allianz Risk Transfer

ART and Nephila have worked with a number of firms to develop the proof of concept and see extensions of this technology having relevance across the insurance industry: for example, in optimizing the payment processes involved in international fronting for captive insurers, where multiple process steps are involved in transferring premium from a corporate to its own subsidiary. Laura Taylor, Managing Principal at Nephila, adds: “We believe technology will drive the future of insurance. We have invested a great deal accordingly and are pleased to extend our long-standing strategic partnership with ART to use of the Blockchain.” “In our journey to become more digital, Blockchain promises to help us create more transparent, more convenient and faster services for our customers,” says Solmaz Altin.


Putting Purpose-Built Performance in NFV

Let’s start with a look at why NFV environments elicit performance anxiety, and where it comes from. The core technology for NFV – virtual machines (VMs) running on x86-based servers – emerged from the enterprise world. VMs are designed to “spin up” instances of an operating system that can run applications for an enterprise customer, and then “scale out” by adding more VMs and, if necessary, servers, to keep up with new subscribers. Certain applications in a service provider environment – for example, mobile services – require the capability to handle millions of subscribers. In addition, real-time communications applications have more stringent requirements than, say, a Web server. If a VM fails, there is a process for replacing it or moving it to another server, a move that might take seconds or even minutes in standard cloud environments.
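The "scale out" model described above amounts to simple capacity arithmetic: add VM instances as subscribers grow. The per-VM capacity figure below is invented for illustration; real NFV sizing depends heavily on the workload.

```python
import math

# How many VM instances does a subscriber base need at a given per-VM capacity?
def vms_needed(subscribers, per_vm_capacity=50_000):
    return math.ceil(subscribers / per_vm_capacity)

print(vms_needed(1_200_000))  # 1.2M subscribers at 50k each -> 24 VMs
```

The paragraph's caution is that this arithmetic says nothing about real-time requirements: spinning up VM number 24 in "seconds or even minutes" may be fine for a web tier and unacceptable for live calls.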


Neural Network Architectures

Christian Szegedy from Google began a quest aimed at reducing the computational burden of deep neural networks, and devised GoogLeNet, the first Inception architecture. By the fall of 2014, deep learning models were becoming extremely useful in categorizing the content of images and video frames. Most skeptics had conceded that deep learning and neural nets were back to stay this time. Given the usefulness of these techniques, internet giants like Google were very interested in efficient, large-scale deployments of architectures on their server farms. Christian thought a lot about ways to reduce the computational burden of deep neural nets while obtaining state-of-the-art performance (on ImageNet, for example), or to keep the computational cost the same while offering improved performance. He and his team came up with the Inception module:
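(The colon above introduced a diagram of the module in the original post.) As a hedged numeric sketch of why the Inception design lowers the computational burden: its 1x1 "bottleneck" convolutions shrink the channel count before the expensive large filters. The layer sizes below follow the 5x5 branch of the published GoogLeNet "3a" block, which takes 192 input channels:

```python
# Weight count for a k x k convolution layer (biases ignored).
def conv_params(in_ch, out_ch, k):
    return in_ch * out_ch * k * k

direct = conv_params(192, 32, 5)                            # plain 5x5 branch
reduced = conv_params(192, 16, 1) + conv_params(16, 32, 5)  # 1x1 bottleneck, then 5x5
print(direct, reduced, round(direct / reduced, 1))          # ~10x fewer weights
```

The module then concatenates the 1x1, 3x3, 5x5 and pooling branches along the channel axis, so the savings repeat in every branch of every block.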


Developing the Next Wave of Data Scientists

With Cortana Intelligence Suite, students had access to a rich set of tools such as Azure ML, Jupyter notebooks with R and Python, and rich visualization capabilities with Power BI. In each city, students were given access to a collection of local data sets and challenged to develop a useful predictive analytical application. Students picked from a wide range of areas including healthcare, environment, smart city design and more. With the support and mentorship of university faculty and Microsoft technical staff, students wrestled through the creative problem solving and technical implementation aspects of data science. Many high-performing teams even published their models in app stores. One example is Live London, the winners from UC London, who developed a safe neighbourhood tracker app available on the Android app store.


Nokia announces horizontal IoT platform called Impact

The IMPACT platform is modular in its approach, Ploumen said, allowing entities to "mix and match" services like device management or analytics, depending on what third-party components they may already use. It also includes a new edition of Nokia's Motive Connected Device Platform (CDP), which supports more than 80,000 device/sensor models and already has connected and managed more than 1.5 billion devices. Nokia has been in a rocky transitional period since its acquisition of Alcatel-Lucent in April last year, slashing jobs and reporting a loss of €613 million for the first quarter of 2016. However, the deal has allowed the company to focus on more forward-looking revenue streams like IoT. In April, Nokia announced its plans to acquire wearable and health-monitoring company Withings, adding to Nokia's portfolio in one of the fastest-growing IoT segments.


Singapore banks adopt voice biometrics for user authentication

“Conceptually, this is an attractive proposition that will allow financial institutions to introduce a completely new safeguard that reinforces existing authentication processes, while leveraging the growing availability and sophistication of consumer mobile hardware and merchant point-of-sale devices.” The hurdle for biometric authentication today is its strong hardware and software dependency, according to Ho. “Not all consumers have compatible devices to support fingerprint scanning, the quality of sound capture varies from phone to phone and is not regulated by a common industry standard, and merchant biometric requires investment in specialised tools which most stakeholders may be hesitant to bear,” said Ho.


Deep Learning Isn’t a Dangerous Magic Genie. It’s Just Math

Deep learning is a subfield of machine learning, which is a vibrant research area in artificial intelligence, or AI. Abstractly, machine learning is an approach to approximating functions based on a collection of data points. For example, given the sequence “2, 4, 6, …” a machine might predict that the 4th element of the sequence is 8, and that the 5th is 10, by hypothesizing that the sequence captures the behavior of the function 2 times X, where X is the position of the element in the sequence. This paradigm is quite general. It has been highly successful in applications ranging from self-driving cars and speech recognition to anticipating airfare fluctuations and much more. In a sense, deep learning is not unique.
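The sequence example is small enough to carry out directly: fit the hypothesis y = a * x to the observed points by least squares, then extrapolate.

```python
# "Approximating a function from data points" in miniature.
xs = [1, 2, 3]   # position of each element in the sequence
ys = [2, 4, 6]   # observed values "2, 4, 6, ..."

# Least-squares slope for a line through the origin: a = sum(x*y) / sum(x*x)
a = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
print(a * 4, a * 5)  # predicts 8.0 and 10.0 for the 4th and 5th elements
```

Deep learning does the same thing with vastly more flexible function families and millions of data points, but the underlying operation — fitting a function to data — is identical.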


Building Your Big Data Infrastructure: 4 Key Components Every Business Needs To Consider

Big data can bring huge benefits to businesses of all sizes. However, as with any business project, proper preparation and planning is essential, especially when it comes to infrastructure. Until recently it was hard for companies to get into big data without making heavy infrastructure investments (expensive data warehouses, software, analytics staff, etc.). But times have changed. Cloud computing in particular has opened up a lot of options for using big data, as it means businesses can tap into big data without having to invest in massive on-site storage and data processing facilities. In order to get going with big data and turn it into insights and business value, it’s likely you’ll need to make investments in the following key infrastructure elements: data collection, data storage, data analysis, and data visualization/output. Let’s look at each area in turn.


BMW exec says industry ready to battle hackers and make move to 5G

It matters a lot to connected cars. We have been working with major telcos (telecommunications companies) and also with some equipment manufacturers to shape 5G and make it useful for connected cars. In the past a lot of the connectivity was related to classical entertainment services. In the future a lot of the functionality will be more "serious." For example, automated driving will require the car to be entirely safe even without a mobile connection. On the other hand, a lot of the services that 5G can enable will help make that a really good product. Automated cars will move based on maps and sensors, relating what they see to what’s in the map. Updating that map is going to be something done through mobile connections.



Quote for the day:


"It is not that I'm so smart. But I stay with the questions much longer." -- Albert Einstein


June 15, 2016

IaaS demand soars: Why this is great news for Amazon and IoT startups

The big question is: Can they keep up with the demand? Here's the thing: Amazon is dominating that space in ways that have every other company (even Google) shaking their heads. So when you have the likes of Google, Microsoft, and IBM playing a serious game of catchup with the big 'Zon, you know the supply is nowhere near the demand. No surprise, right? What is surprising, however, is that Google does not rank at the top of the heap. Considering the global domination of the Android platform, one would think Google would lead the top seven providers, but Google doesn't come close to Amazon's IaaS revenue. In 2015, Amazon Web Services drew in over $7 billion, compared to a mere $281 million for Google Compute Engine. Google knows it is lagging behind Amazon and is doing everything it can to shrink the margin.


How IT4IT Helps Turn IT into a Transformational Service for Digital Business Innovation

The next BriefingsDirect expert panel discussion examines the value and direction of The Open Group IT4IT initiative, a new reference architecture for managing IT to help business become digitally innovative. ...  This panel, conducted live at the event, explores how the reference architecture grew out of a need at some of the world's biggest organizations to make their IT departments more responsive, more agile. We’ll learn now how those IT departments within an enterprise and the vendors that support them have reshaped themselves, and how others can follow their lead. The expert panel consists of Michael Fulton, Principal Architect at CC&C Solutions; Philippe Geneste, a Partner at Accenture; Sue Desiderio, a Director at PricewaterhouseCoopers; Dwight David, Enterprise Architect at Hewlett Packard Enterprise (HPE); and Rob Akershoek, Solution Architect IT4IT at Shell IT International.


Machine-Vision Algorithm Learns to Transform Hand-Drawn Sketches Into Photorealistic Images

A more promising approach is to use machine-vision algorithms that rely on neural networks to extract features from an image and use these to produce a sketch. In this area, machines have begun to rival and even outperform humans in producing accurate sketches. But what of the inverse problem? This starts with a sketch and aims to produce an accurate color photograph of the original face. That’s clearly a much harder task, so much so that humans rarely even try. Now the machines have cracked this problem. Today, Yagmur Gucluturk, Umut Guclu, and pals at Radboud University in the Netherlands have taught a neural network to turn hand-drawn sketches of faces into photorealistic portraits. The work is yet another demonstration of the way intelligent machines, and neural networks in particular, are beginning to outperform humans in an increasingly wide variety of tasks.


Ransomware Attacks Taking Huge Toll On Healthcare Resources

Traditional reliance on policies, procedures and training to promote confidentiality is also no longer effective when data integrity is threatened because the data is not accessible, says Paul Bond, a partner in the Reed Smith law firm who specializes in IT and privacy issues. With the availability of health data in peril, organizations must have contingency plans in place so they have an action plan for what to do when facing a ransom incident. Should they pay the ransom and get their data back? Some organizations may not have an alternative if their data back-up processes were not optimal. Some hospitals have paid ransom. For example, Hollywood Presbyterian Medical Center in Los Angeles struggled for 10 days to regain its data, then paid $17,000 in Bitcoin—an Internet currency—to get access back to its data. Kansas Heart Hospital paid an undisclosed amount of ransom, but did not get back all its data after the attackers demanded another ransom, and the hospital refused.


Blockchain’s Benefits Extend Beyond The Financial Services Sector

“Any multi-party process where shared information is necessary to the completion of transactions and the coordination of activity and the exchange of value — that’s where blockchain technology can be put to good use,” Ms. Masters told attendees of The Wall Street Journal’s CFO Network in Washington D.C. “It’s one of the great opportunities, I think, in the financial services sector,” Ms. Masters said. “We’re talking about billions of dollars in annual savings for the banking industry.” ...  Blockchain can help companies in all industries manage the movement of money in exchange for goods and services across multiple different parties in a secure, timely and coordinated way. Instituting a centralized, encrypted repository for such information can help companies make complicated transactions more efficiently, she explained.


What is probabilistic programming?

A probabilistic programming language is a high-level language that makes it easy for a developer to define probability models and then “solve” these models automatically. These languages incorporate random events as primitives, and their runtime environment handles inference. Modeling thus becomes a matter of programming, with a clean separation between modeling and inference. This can vastly reduce the time and effort associated with implementing new models and understanding data. Just as high-level programming languages transformed developer productivity by abstracting away the details of the processor and memory architecture, probabilistic languages promise to free the developer from the complexities of high-performance probabilistic inference. What does it mean to perform inference automatically? Let’s compare a probabilistic program to a classical simulation such as a climate model. A simulation is a computer program that takes some initial conditions as input, such as historical temperatures, estimates of energy input from the sun, and so on.
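A toy illustration of that separation, with plain Python standing in for a real probabilistic language (the model, the data, and the choice of naive rejection sampling are all invented for the example): the "program" only states the prior and the data-generating process, and inference is performed mechanically.

```python
import random

random.seed(0)  # deterministic for the example

def model():
    # Prior: unknown coin bias p ~ Uniform(0, 1); simulate ten flips.
    p = random.random()
    heads = sum(random.random() < p for _ in range(10))
    return p, heads

# Condition on the observation "9 heads in 10 flips" by keeping only the
# simulated runs that reproduce it -- the crudest possible inference engine.
posterior = [p for p, heads in (model() for _ in range(100_000)) if heads == 9]
print(sum(posterior) / len(posterior))  # posterior mean of p, ~0.83
```

A real probabilistic language replaces this brute-force conditioning with efficient inference algorithms, but the division of labor — developer writes the model, runtime does the inference — is the same.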


Air Force uses event-driven framework and SOA to support warfighters

The primary reason is really to expose information that's kind of "stove piped" in all our legacy systems and make that available [while also] protecting it from our adversaries. We're moving into the SOA environment precisely for that reason. Most of the legacy systems ... were built on a client-server framework. ... Data is kind of bottled up in those databases. And with the SOA middleware layer, we're exposing that data and making it available to other users without building custom interfaces that pretty quickly become expensive to manage. The success of [these] money-saving and time-saving innovations is critical to the Air Force's ability to operate, particularly in a fiscally constrained environment. We can show case after case of reuse of the SOA environment where we've been able to transition quickly to another operational need, make connections and make data available very rapidly.


Who’s The Digital Transformation Boss, and Why Should it be the CIO?

More and more, the CIO is taking a leadership role in digital strategy. When it comes to digital transformation, however, the CIO can’t go it alone. Digital transformation requires collaboration, and a joint set of initiatives that combine business and technology. “We’re not just talking about IT for IT’s sake, but about innovation with the business around business capabilities,” says Snyder. Digital disruption, he explains, is no longer just about developing new business models — which was the biggest expectation last year. In 2016, expectations have shifted to focus on digital transformation in the form of new and innovative products and services, as well as new forms of customer engagement.  “That’s why digital transformation must be done collaboratively,” says Snyder. “You can’t do this without the rest of the business...it is the business.”


Service Wiring with Spring Cloud

For simple applications, external configuration for dependency addresses may well be sufficient. For applications of any size though, it's likely that we'll want to move beyond simple point-to-point wiring and introduce some form of load-balancing. If each of our services depends directly on a single instance of its downstream services, then any failure in the downstream chain is likely to be catastrophic for our end users. Likewise, if a downstream service becomes overloaded, then our users pay the price for this through increased response times. What we need is load balancing. Instead of depending directly on a downstream instance, we want to share the load across a set of downstream service instances. If one of these instances fails or becomes overloaded then the other instances can pick up the slack. The simplest way to introduce load balancing into this architecture is using a load-balancing proxy.
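In Spring Cloud this role is typically played by a client-side load balancer (Ribbon, behind a `@LoadBalanced` RestTemplate); the round-robin idea itself is small enough to sketch language-neutrally. The instance addresses below are invented:

```python
import itertools

# Client-side load balancing: rotate requests across a set of downstream
# instances instead of hard-wiring a single address.
class RoundRobinBalancer:
    def __init__(self, instances):
        self._cycle = itertools.cycle(instances)

    def choose(self):
        return next(self._cycle)

lb = RoundRobinBalancer(["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"])
print([lb.choose() for _ in range(4)])  # wraps back to the first instance
```

A real balancer would also drop instances that fail health checks, which is what turns "one downstream failure is catastrophic" into "the other instances pick up the slack."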


How to build an effective ransomware defense

Make sure all systems are promptly updated with the latest operating system security patches. Enforce anti-malware scanning across all departments, and ensure your malware signature databases are up to date. Implement content-based scanning and filtering on email servers, particularly where access to cloud services such as Gmail, Yahoo Mail, and Outlook.com is permitted from the enterprise network. Restrict users’ access to only those systems that are necessary for their roles, and avoid “access sprawl.” Use two-factor authentication, so a stolen password isn’t enough to grant access. Ensure user accounts are de-provisioned promptly: there should be no orphaned accounts of former employees, especially if they served in a technical role. Finally, deploy and maintain a comprehensive backup system, including offsite storage, in the event that files need to be restored.



Quote for the day:


"Be decisive. A wrong decision is generally less disastrous than indecision." -- Bernhard Langer


June 14, 2016

How To Make A Digital Risk Plan And Sell It To The Board

The plan should find the top half-dozen risks that threaten the business, and those are not necessarily the same as the ones that affect IT, says Gartner analyst Jeff Wheatman. The question to address is, “What are the top IT-related risks that could lead to business risks becoming real?” he says. That’s what the corporate decision makers care about. Security executives have to create controls that balance the need to protect the business with the need to keep it running efficiently. To do that, the security experts have to talk to the business leaders while they are creating the plan, he says. That acts as a trial run of what might fly when the plan is presented to the board. Reactions from business group leaders can go three ways: We never thought of that; we worry about something else that’s not on your list; your list has items we don’t care about.


No robots required: AI will eliminate these jobs first

"AI doesn't have to pretend to be a person to have a huge value to the world," says Scott Crowder, CTO and vice president of technical strategy and transformation at IBM Systems. "It's about providing information and insight to humans, so we can do a better job." That's one reason why IBM prefers the term "intelligence augmentation" -- IA, not AI -- and defines its "Jeopardy" champion Watson supercomputer as "a cognitive computing technology that extends and amplifies human intelligence, working in partnership with professionals." AI is already serving on the front lines of service and support via voice-enabled virtual customer agents like Amelia. But because it also excels at analyzing massive amounts of unstructured data, the technology is ideally suited for identifying potential security threats or helping drive business decisions.


Blockchain Is Not Going To Change The World

The haggling process does not affect our ledger balances. But it does affect our messaging. We are establishing a relationship of some kind of trust. If we don’t trust the other person – for example, if the trader thinks the coins are debased, or I suspect the sword has been stolen (and I think the real owner might turn up to claim it) – we are unlikely to agree to trade. Money issued by a trusted source reduces the need for personal trust: if the trader trusts that the money is real, he may agree to sell me a sword even if he suspects I am a jihadi. (I am not advocating this, by the way). But even so, he isn’t going to hand over the sword until he knows for certain that I have the money to pay for it.


What’s Next for Artificial Intelligence

Deep learning, modeled on the human brain, is infinitely more complex. Unlike machine learning, deep learning can teach machines to ignore all but the important characteristics of a sound or image—a hierarchical view of the world that accounts for infinite variety. It’s deep learning that opened the door to driverless cars, speech-recognition engines and medical-analysis systems that are sometimes better than expert radiologists at identifying tumors. Despite these astonishing advances, we are a long way from machines that are as intelligent as humans—or even rats. So far, we’ve seen only 5% of what AI can do.


We’ve hit peak human and an algorithm wants your job. Now what?

Bank executives know what’s coming. So they’re setting up coder labs and investing in startups, teaming up with digital competitors or buying them outright. JPMorgan Chase, the biggest U.S. lender by assets, is using AI to identify potential equity clients. And it’s marshaling OnDeck Capital’s client-vetting algorithm to speed lending to small businesses. Both Bank of America and Morgan Stanley, which together employ more than 32,000 human financial advisers, are developing automated robo-advisers. More than 40 global banks have joined forces with startup R3 to develop standards to use blockchain, software that allows assets to be managed and recorded through a distributed ledger, to overhaul how assets are tracked and transferred.


How To Foster Curiosity And Creativity In The Workplace

Leaders, like their teams, must become more curious or risk becoming irrelevant. In fact, if they are going to create a corporate climate of curiosity, it is a must for them. So why don’t they embrace this? Mostly because they have to work at it. Curiosity is a developed skill and one that has to be nourished constantly. This takes time and effort, and unfortunately most leaders want to rely on what has put them in their jobs versus what is going to keep them relevant and in their jobs. Liz Wiseman sums it up best in her book, “Rookie Smarts”: “In a growing company everyone is under qualified every day. In business today, it is not about what you know but how fast you learn.” As such, leaders must become experts at learning and practicing curiosity.


The future of the IoT job market

In a nutshell, IoT will do exactly what technology does everywhere — it supplants low-skill jobs with high-skill jobs. Eventually, the Internet of Things will lead to widespread replacement of simple and repetitive jobs in areas such as manufacturing, administration, quality control and planning. But more importantly, IoT will lead to the creation of new jobs that will help organizations champion and pioneer not only their personal success with IoT, but the success of the business as well. So what are these jobs, and how should you rework your resume to be prepared for them? Many of these opportunities are new enough that they don’t even have titles yet. But don’t worry, we made some up.


Is it time to buy cyber insurance?

Cyber insurance is coverage that public- and private-sector organizations can buy to help manage the costs of cyber incidents -- costs that can be astronomical both in terms of dollar figures and loss of reputation. For example, the Office of Personnel Management has spent at least $133 million just on credit monitoring services. Studies last year of the per-record costs of data breaches ranged from $154 to $964. Cybersecurity insurance has been available for nearly a decade, but it’s only recently begun to catch on.  “Now you have like 60, 70 carriers writing policies, you have annual premiums of $2 billion and growing, which is I think big. I think that’s sizable,” Sasha Romanosky, a policy researcher at Rand Corp., said. “That’s not the level of car insurance or health, but it’s still significant.”


Forget the Cloud, Microsoft/LinkedIn Deal Is All About the Data

“What makes the most sense is the untapped value of all that content,” Laney says. “What makes the least sense is -- well, perhaps the price tag, but I'm no financial analyst -- is whether Microsoft is, or can be, positioned quickly enough to monetize all this content.” Laney takes issue with fellow analysts who claim this deal is all about Microsoft’s move to the cloud. “Financial analysts I listened to this morning yapping about "cloud this" and "platform that" have totally missed the big picture. It's all about the data,” Laney stresses. “What can Microsoft do with all this content? Almost anything,” Laney says. “LinkedIn Ts & C's are pretty clear (just like WhatsApp's and every other social media co) that they can do almost whatever they want with the content, including transfer it.”


Hiring Disrupters in the Age of Disruption Part 1: Re-imagining Executive Search

Unless you’ve been living under a rock in recent times, you must be well aware of the waves of disruption sweeping through the world. Uber is upending the taxi industry and to think they are just a software tool and don’t even own any cars! Airbnb is practically the biggest hotel company in the world although they don’t own any properties. Google, Tesla, Netflix, Apple, Amazon. The list is endless – there’s never a dearth of newer business models and new technologies, or companies finding new ways to exploit existing technologies. Disruptive players are coming out of nowhere and toppling empires. Remember Kodak? In 1998, they had over 170,000 employees and sold 85% of photo paper worldwide. But in just a few years, their business model practically disappeared, and they were soon relegated to the has-been list.



Quote for the day:


"One of the hardest things in life to learn is which bridges to cross and which bridges to burn.” -- @Oprah


June 13, 2016

Machine Learning Could Help Companies React Faster To Ransomware

Exabeam's Analytics for Ransomware, a new product that was announced today, uses the company's existing behavior analytics technology to detect ransomware infections shortly after they occur. The product uses data from a company's existing logs to build behavior profiles for computers and users. This allows it to detect previously unknown ransomware without pre-existing detection signatures by analyzing anomalies in the file and document behavior of employees. To avoid false positive detections, the technology flags incidents as ransomware when the combined risk score of multiple suspicious activities that could indicate this type of threat reaches a certain threshold.
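The thresholding described in the last sentence can be sketched in a few lines. The anomaly names, scores, and threshold below are invented for illustration; they are not Exabeam's actual model:

```python
# Flag an incident only when the combined risk score of multiple suspicious
# activities crosses a threshold, reducing false positives from any single one.
THRESHOLD = 90

def is_ransomware(anomalies):
    return sum(score for _, score in anomalies) >= THRESHOLD

session = [("mass file renames", 40),
           ("entropy spike in file writes", 35),
           ("shadow-copy deletion", 25)]
print(is_ransomware(session))        # combined score 100 -> flagged
print(is_ransomware(session[:1]))    # one anomaly alone stays below threshold
```

The behavioral-profile part of the product supplies the scores; the combination step is what lets no single noisy signal trigger an alert on its own.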


Blockchain as a Service – The New Weapon in the Cloud Wars?

Setting up an environment to test and research blockchain is not a trivial undertaking. Blockchain is a distributed, peer-to-peer technology. It requires an ecosystem with multiple systems in order to be able to develop, research, and test. I recently wrote about the benefits of leveraging the public cloud for test environments. One of the big benefits is the ability to stand up, deploy, test, and break down environments. No large hardware investments are needed, nor any capital investment. The cost involved is only during the time the environments are up and being used. From a cost perspective, this is a definite plus. We still have the complexity of setting up and configuring the blockchain ecosystem. This is where the concept of offering Blockchain as a Service (BaaS) can provide added value.


Linux Mint 18: Hands on with the Cinnamon and MATE betas

The Linux Mint developers were in a particularly difficult position, because they have two desktops that they had to adapt the Gnome utilities for (Cinnamon and MATE). This not only made for a lot of work, it created a significant support burden. The Mint developers finally decided to solve this problem in pretty much the same way that they solved the original Gnome 3 Shell problem - they just gave up on following the Gnome utilities, and they took it upon themselves to develop and maintain an equivalent set of utilities - which are now known as the X-apps. The X-apps are based on older, stable, and well-known versions of the Gnome utilities. Finally, the Mint developers have said that the X-apps will be developed and maintained in such a way that they will always be compatible with both Cinnamon and MATE.


Responding to climate change risks: the role of financial services firms

The pressure on business was ramped up following the formation of the Financial Stability Board (FSB) Task Force on Climate-Related Financial Disclosures (TCFD) at the end of last year, announced by Mark Carney and chaired by Michael Bloomberg. With the publication of its first report in April 2016, the Taskforce set the course for an intensive nine months of work, at the end of which it will produce detailed guidelines for companies to enhance how they disclose to their investors and lenders the climate risks they are financing. Again, this is hugely challenging, but there is a good degree of alignment amongst Taskforce members and a shared ambition to fundamentally improve disclosure on climate risks, so we shouldn’t underestimate the outcome. While the guidelines will be voluntary, it is possible that some countries might choose to make them mandatory over time, and even if they don’t, pressure from investors …


Why every organization needs Business Process Management

Within every organization there are business processes designed to meet objectives. But for any number of reasons they may have become slow, inefficient or come to the end of their days. BPM basically puts all these processes under the microscope, using various metrics and analyses to identify where processes can be improved for maximum performance. When implementing new processes, BPM can ensure they are running as smoothly as possible. Organizations are increasingly acknowledging the need to improve their business processes and understand the advantages that come from process automation. BPM can help reduce paper handling and inefficiencies in areas such as contracts and invoicing, and also improve the performance of both people and systems by giving remote workers access to the same ‘user experience’ as those working within the walls of the organization.
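The “processes under the microscope” idea usually starts with something as simple as cycle-time metrics per process step. Here is a minimal sketch of that kind of analysis; the invoicing steps and the durations are invented for the example.

```python
# Hypothetical cycle times (hours) observed for each step of an
# invoice-handling process across three process instances.
invoice_steps = {
    "data entry":        [0.5, 0.4, 0.6],
    "approval":          [8.0, 24.0, 12.0],  # waiting on a manager
    "payment execution": [1.0, 0.8, 1.2],
}

# Average duration per step; the slowest step is the improvement candidate.
avg_hours = {step: sum(t) / len(t) for step, t in invoice_steps.items()}
bottleneck = max(avg_hours, key=avg_hours.get)
print(bottleneck)  # -> approval
```

Real BPM suites gather these timings from workflow logs automatically, but the analysis they surface (which step dominates end-to-end time, and is therefore the automation candidate) follows this pattern.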


How to build a data-driven culture with emotion

The challenge is that it's often difficult to sell the workforce on a nirvana that sits somewhere out on the strategic horizon. That's why it's important to shoot for quick wins. The quicker you can produce some evidence that your analytic prowess is working, the quicker hope is reinforced with strong belief. Look for an opportunity to run a pilot, and deploy your best tiger team to get through it as quickly as possible. And don't make your life more difficult than it needs to be: segment your opportunities by impact and ease of implementation; hopefully, you'll have at least one opportunity that has a high impact and an easy implementation. Take that opportunity and move through your pilot as quickly as possible so you can demonstrate — with evidence — how much better life will be when your company is more data-driven.
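Segmenting opportunities by impact and ease of implementation, as suggested above, can be as lightweight as a two-axis score. The opportunities and 1-5 ratings below are made up purely to show the mechanics of picking the quick-win pilot.

```python
# Hypothetical candidate pilots, scored 1-5 on business impact and on
# ease of implementation by the team.
opportunities = [
    {"name": "churn-risk alerts",     "impact": 5, "ease": 4},
    {"name": "inventory forecasting", "impact": 4, "ease": 2},
    {"name": "report automation",     "impact": 2, "ease": 5},
]

def pilot_priority(opp):
    # Favour opportunities that are both high impact AND easy to ship;
    # the product is a simple first cut, with impact as the tie-breaker.
    return (opp["impact"] * opp["ease"], opp["impact"])

best = max(opportunities, key=pilot_priority)
print(best["name"])  # -> churn-risk alerts (impact 5 x ease 4)
```

The exact scoring scheme matters less than doing the segmentation at all: it forces the quick-win conversation the article recommends before the tiger team is committed.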


Meet the 'number one prevalent' new ransomware: Crysis

The strain copies files and pulls them from the network, placing organizations into the "territory of an actual data breach," says one security expert. "Especially in HIPAA-compliant organizations, (that's) an area no one wants to be." It can be hard to keep tabs on these types of ransomware strains, Sjouwerman said. "They compete; they come and go. We were expecting, with the sudden demise of TeslaCrypt (a ransomware Trojan), that Locky would take over. But no." "If you look at the majority of ransomware attacks," he added, "Crysis, at the moment, is the number one prevalent attack." These attacks first began at financial institutions, and then moved to healthcare. While the next big target is the manufacturing industry, according to Sjouwerman, cybercriminals still have healthcare in their crosshairs and "this is unfortunately going to get a lot worse before it gets better."


Don Quixote and the Philosophy of Data

“The Don Quixote never getting to the windmill, the truth, isn’t a bad thing,” said Sherman. “It’s that more and more truths or contexts are being applied today, which just means more and more expansive use of data.” When data moves around, and the context in which it was generated is not maintained, meaning gets lost. In the case of business policies, practices such as undocumented hand coding, be it from ETL or application integration, can lead to what Sherman calls “data shadow systems.” It’s the age-old scenario where the left hand doesn’t know what the right hand is doing, and they both go on doing their own thing. The result? Inconsistency and inaccuracy. “The business person who probably does understand policy and business processes doesn’t understand the technology and the data integration and the consistency of data and how to create that,” said Sherman.


Nordic CIO interview: Johnny Bröms, Swedish fast-food chain Max Burger

To maximise the efficiency of Max’s small IT team (eight people in-house and eight consultants), its role has been divided into two distinctive parts: one works with support and operational issues, while the other focuses on project delivery, agreements and compliance. Bröms does not hide the fact that getting the whole company behind business-orientated IT has not been easy. But he sees it as the role of a modern CIO to get involved with other business functions and ensure everybody understands one another. “I have a technical background. My challenge has been to develop the business side so I can speak to the business the way they want and translate that back to my organisation,” says Bröms.


Data visualisation will drive enterprise Big Data analytics usage

Data visualisation technologies will need to keep pace with a broader scope and tool set. The industry has already seen some disruptive technologies in the form of Tableau and QlikView. There are also a few upcoming open source tools such as Datawrapper, Chart.js, D3 (Data-Driven Documents), Dygraphs and more, but the space still needs more maturity, and these tools aren’t yet causing major shake-ups. Another implication is the need for newer processes and skills that allow you to create better data models. A decade ago, the concept of storytelling through data didn’t exist, so there is an emerging demand for people with the skillsets to create a powerful story. Capabilities including animations, speech bubbles and auto-suggest will be woven into visualisations to create compelling propositions. By opting for expensive data management tools, organisations are underestimating the importance of people skills and the imperative to develop them.



Quote for the day:


"One of the tests of leadership is the ability to recognize a problem before it becomes an emergency." -- Arnold Glasow