June 13, 2015

Is customer experience management the new CRM?
Dailes says there are some things companies can do to improve customer experience. “The first thing is to listen to your customers. Understand who your clients are. Understand their needs. You can do this through surveys or by watching users. Another thing that is really important is to have a policy of constant improvement. Rather than look for major changes, look for slow and constant improvement over time, based on feedback. You are always going to have people who shout quite loudly about what they want, but that’s not always the most useful feedback. Look for feedback that represents the majority of users,” she says.


Improving One Process Affected 18 Processes Before it Improved Business
So, every time you look at process improvement, you need to focus on changes in an incremental manner rather than a big-bang approach. Instead of starting with process changes in 12 departments, start with changes in three departments, then seven, then all 12. Every time you initiate process improvement, you must understand the impact on all the departments. Identify which of these are low-impact, medium-impact and high-impact departments. Secondly, identify the processes in each department. ... So, what are the processes we are talking about? Here we go: “Customer acquisition”, “Customer relationship” and “Requirement management”. “Requirement management” in turn links to the “Marketing” department, as marketing conducts surveys with existing customers as well as prospects.


Steve Wozniak Says The Internet of Things Is in ‘Bubble Phase’
In the tech world, there’s no concrete definition for a “bubble.” However, one way to gauge bubble-like growth is through irrational industry hype. It’s easy to find bullish forecasts on the IoT market. Cisco believes the number of connected devices worldwide will double from 25 billion in 2015 to 50 billion in 2020. IDC claims the global IoT market will grow from $1.9 trillion in 2013 to $7.1 trillion by 2020. That’s why tech giants such as Google and Apple are pushing into smart homes, connected cars, wearable devices, and mobile payments. Spotting that trend, start-ups are flooding the market with IoT and wearable devices for even the silliest niches. A fart-analyzing wearable, a sex-tracking wearable, and a smart bra that detects binge eating all indicate developers are getting carried away with connecting things to the Internet.


Build your own supercomputer out of Raspberry Pi boards
The RPiCluster provides another option for continuing development of projects that require MPI [Message Passing Interface] or Java in a cluster environment. Second, RPis provide a unique feature in that they have external low-level hardware interfaces for embedded systems use, such as I2C, SPI, UART, and GPIO. This is very useful to electrical engineers who need to test embedded hardware at scale. Third, having user-only access to a cluster is fine if the cluster has all the necessary tools installed. If not, however, you must work with the cluster administrator to get things working. Thus, by building my own cluster I could outfit it with anything I might need directly. Finally, RPis are cheap! The RPi platform has to be one of the cheapest ways to create a cluster of 32 nodes.
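
For flavor, here is the sort of MPI job such a cluster is built to run: a minimal sketch using the Python mpi4py bindings (mpi4py and a working MPI installation are assumed; this is not code from the RPiCluster project).

```python
# Each node sums a slice of a big range; rank 0 gathers the grand total.
# Launch across the cluster with, e.g.: mpirun -n 32 python partial_sum.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()          # this process's ID, 0..size-1
size = comm.Get_size()          # number of processes in the job

N = 10000000                    # assume size divides N evenly for simplicity
chunk = N // size
local = sum(range(rank * chunk, (rank + 1) * chunk))

total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print("total =", total)
```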


Naomi Lefkovitz explains what NIST's privacy risk framework means for agencies
With the Cybersecurity Framework for Improving Critical Infrastructure, we were directed by the executive order to include a methodology for privacy and civil liberties. And that methodology was derived from the consensus-based document. And, essentially, at a high level it says, well, 'Consider the privacy implications when you're doing your cybersecurity measures.' That's a very high-level paraphrase, but that's sort of the concept. What we're doing with the risk management framework, which is aimed at federal systems but nonetheless, the concept is, 'OK, how do you consider those privacy implications? How do you go about identifying privacy risk?' Because we've never seen that process laid out.


Keep it simple and risk-based to secure collaboration
Having identified risks, an organisation should then analyse and treat them. The key to this process is proportionality. If the risk treatment becomes too expensive, in terms of time, resources or money, is it worth doing based on the risk? Equally, if the treatment makes doing the job, such as collaborating with a fellow employee, unwieldy and difficult, then the treatment has also failed. It may make the process safer in security terms, but it has also made it more difficult and less efficient in achieving operational and business objectives. Security should enable, not inhibit, and should always take the user experience into consideration. While the risk treatment of a system or process will always be different, there are common themes which form the foundation of a well-managed, and ultimately secure, approach.


Content blocking via geolocation takes world wide out of the web
This is a subject of renewed interest for the European Commission, which formally announced the Digital Single Market initiative last month; the initiative is intended to identify and address issues related to the digital and physical delivery of goods and services across the 28 EU member countries. According to The Guardian, presently only 15% of online shoppers in the EU buy products from another country, while only 7% of small and medium-sized businesses sell products across national borders. ... Interestingly, in an effort to limit the need for users to rely on a VPN to access content, Netflix monitors file-sharing traffic to identify what films and TV programs are locally popular, and the company acquires the rights for those programs in order to provide a legal (and sanctioned) means to view that content in that country.


Cisco New Intercloud Services Focus on Next Generation Internet of Things Market
Organizations are demanding new ways to manage the exponential growth of data and the ability to obtain real-time analysis. To meet this need, Cisco collaborates with leading big data vendors such as MapR, Hortonworks and Cloudera, as well as the Apache Hadoop community. Working with these partners, Cisco securely extends on-premises Hadoop solutions to the cloud, providing a true hybrid deployment. It also enables end customers to maintain the same policies, control and security in their big data implementations, while gaining greater flexibility and virtually unlimited scalability.


8 New Big Data Projects To Watch
The big data community has a secret weapon when it comes to innovation: open source. The granddaddy of big data, Apache Hadoop, was born in open source, and its growth will come from continued innovation done by the community in the open. Here are eight open source projects generating buzz now in the community. ... Zeppelin essentially provides a Web front-end for Spark. The mighty Zep brings a notebook-based approach to data discovery, exploration, and visualization of Spark apps in an interactive manner. The software, which is modeled on the IPython notebook, supports Spark and other frameworks, such as Flink, Tajo, and Ignite.
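
For a sense of that interactivity, the snippet below is the kind of paragraph a user might run in a Zeppelin note: a minimal PySpark word count, assuming the notebook has already bound a SparkContext as sc (as Zeppelin does) and that the hypothetical log path exists.

```python
# Count word frequencies in a log file and pull back the ten most
# common for inspection or charting.
counts = (sc.textFile("hdfs:///logs/app.log")        # hypothetical path
            .flatMap(lambda line: line.split())
            .map(lambda word: (word, 1))
            .reduceByKey(lambda a, b: a + b))

for word, n in counts.takeOrdered(10, key=lambda kv: -kv[1]):
    print(word, n)
```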


Cloud tech can make a Supreme Court decision against Obamacare irrelevant
Healthcare.gov is illegal, argue lawyers for Obamacare’s opponents. Their argument against the Affordable Care Act splits hairs about the law’s construction. The underlying legislation designates federal subsidies to be paid through tax credits to the buyers of health insurance purchased on state-operated exchanges. The ACA is silent about the eligibility for subsidies of purchases on the federally operated exchange healthcare.gov. The plaintiffs argue that the subsidies can’t be given to buyers using the federal exchange healthcare.gov, and can only be given to buyers using state exchanges such as Cover Oregon.



Quote for the day:

"You must expect great things of yourself before you can do them." -- Michael Jordan

June 12, 2015

Cyber Essentials made mandatory by the Welsh Government
Quoted by SCMagazineUK.com, a Welsh Government spokesperson said: “From 1 April 2015, Cyber Essentials is required for all relevant Welsh Government contracts involving the handling of personal or sensitive information. This will also apply to National Procurement Service collaborative frameworks.” The Welsh Government has identified five levels of risk from 0 to 4. Level 0 is ‘low risk’, and means that no special arrangements are needed when minimal amounts of non-sensitive personal data are processed. ... “The CES defines a set of controls which, when properly implemented, will provide organisations with basic protection from the most prevalent forms of threat coming from the internet. Evidence of holding a Cyber Essentials (or equivalent) certificate is desirable before contract award, but essential at the point when data is to be passed to the supplier.”


3 Accidental Whistleblowers (Fired for Doing their Jobs Well)
As Adam Turteltaub, SCCE VP of Membership Services, puts it: “Whistleblowers are courageous, principled heroes, unless they are on my team, in which case they are dirty rotten traitors.” Whistleblowers are like the foreign body in the organization being attacked by its white blood cells. Or the nail sticking out of the board, begging to be hammered. The modern compliance program has as its stated goal to find, fix and prevent problems. Whistleblowers are a key resource in achieving this goal. But still, the white blood cells remain vigilant. So what happens when the whistleblower is a senior manager, head of a control function or even a CEO, who happens upon a problem – sometimes a very large problem – in the ordinary course of doing their job well?


When Big Hearts Meet Big Data: 6 Nonprofits Using Data to Change the World
When people think of big data, they often think of machines, robots and things that might be generally impersonal. But when you couple data with an altruistic mission, the results can be astounding. As we sink deeper into the digital era, nonprofits are now presented with new opportunities. For example, 56% of people donated to an organization because they read a story via social media. Fundraising sites such as DonorsChoose.org, Causes.com and Network for Good allow organizations to raise money with a simple click of a button. But this is only the beginning. Here we’ll take a look at which organizations have upped the ante by becoming not only socially-driven, but data-driven as well. See how these 6 nonprofits are using data to empower others and make a genuine difference in the world.


Twitter's next CEO faces four challenges
Perhaps the biggest problem Twitter has is that many people who aren't tech enthusiasts still don't understand what it's for or why they should use it. For every occasion Twitter is referred to as a social network, it's also identified as a news source, a publishing system, a feed of real-time events and a microblog. Perhaps it's all those things, but that doesn't help sell it to people who aren't yet on the service. If it's a social network, why use it when Facebook's around? If it's a microblog, why not use a proper blog like Tumblr instead? ... The company has tried to address these issues with new tools. Earlier this year, it began rolling out a feature called "instant timeline" that uses a variety of signals, including the contacts on a person's smartphone, to see who they might want to follow and automatically create a list.


Cybersecurity Firm Rapid7 Files For $80 Million IPO
The cybersecurity industry is booming as breaches and nation-state attacks continue to dominate headlines. While VC investment in cybersecurity is on the rise, cybersecurity IPOs in the United States have been few and far between. Since November 2009, there have only been 17 IPOs in the security space (seven of which happened in 2012), according to research done by Pitchbook. The most recent security IPO was MobileIron’s $100 million exit almost a full year ago in July 2014. FireEye had the biggest security IPO in the past five years at $349 million in September 2013.


Big Data Systems House Sensitive Data, Security Exposures
The result is an exposure that companies may not have counted on as they initiated their pilot big data projects, according to the survey report, "Enabling Big Data By Removing Security and Compliance Barriers" (registration required). Cloudera, the supplier of the Hadoop system Cloudera Enterprise, sponsored the SANS survey. Many times, those projects demonstrate the utility of bringing together diverse data that was previously hard to assemble, given the radically different data types. Big data systems gain utility as more data is brought in. The result is a slow brew of gathering risk without sufficient safeguards, the study warns.


Data as currency: Balancing risk vs. reward
At the heart of good IG is good recordkeeping, and therefore the senior records manager must be a key player in the IG initiative. Also vital to the program are compliance officers to help ensure the recordkeeping practices are satisfying the demands of such laws as Sarbanes-Oxley for the financial industry and the Health Insurance Portability and Accountability Act; IT executives to provide the right tools and to help effect proper protection policies; legal counsel to help assure the defensibility of the program; and senior managers from the business units to provide realistic guidance on how the information is created and used. Organizations wishing to monetize their big data should work to mitigate the security risks by implementing an IG program that treats records as the strategic assets they really are.


Mobility brings new ways to tackle IT security threats
The unique nature of mobile operating systems themselves has also provided new security opportunities. For example, mobile devices have managed to avoid many of the antivirus concerns that threaten Windows PCs, thanks to more closed operating systems such as Apple iOS, said Chris Hazelton, research director for enterprise mobility at 451 Research. OS vendors can still do more to help, including allowing IT to turn off specific app permissions and ensuring third-party apps can't collect employee data, he said. "A developer can sell and monetize your information if they can track your location," he added.


Why Data Lakes Require Semantics
According to Nick Heudecker, research director at Gartner, “Data lakes typically begin as ungoverned data stores. Meeting the needs of wider audiences requires curated repositories with governance, semantic consistency and access controls.” Heudecker also says that “…without at least some semblance of information governance, the lake will end up being a collection of disconnected data pools or information silos all in one place.” ... Adding semantic technologies can address many of the issues inherent in data lakes if an organization needs to rapidly answer complex, real-world questions that require the fusion of data in many dimensions. A Semantic Data Lake (SDL) is a semantically integrated, self-descriptive data repository based on a graph (network) representation of multi-source, heterogeneous data, including free-text narratives.
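
To make the graph-representation idea concrete, here is a minimal sketch in Python with the rdflib library (invented example data; not how any particular SDL product is implemented). Two records from very different sources land in one graph, and shared identifiers, not a common schema, tie them together.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/")
g = Graph()

# Source 1: a row from a structured CRM table.
g.add((EX.cust42, RDF.type, EX.Customer))
g.add((EX.cust42, EX.livesIn, EX.WashingtonDC))

# Source 2: an entity extracted from a free-text note.
g.add((EX.note17, EX.mentions, EX.cust42))
g.add((EX.note17, EX.text, Literal("Customer asked about roaming rates")))

# Both sources are now queryable as one graph, with no up-front schema needed.
for s, p, o in g.triples((None, None, EX.cust42)):
    print(s, p, o)
```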


Q&A with Claudio Perrone on PopcornFlow / Evolve and Disrupt
In lean, we often talk about value streams. Yet, it's not what we do, but rather what we learn by doing it that matters. When I look at a typical scrum or kanban board, however, all I see is a snapshot of the outcome of the thinking behind it. Perhaps we are missing an opportunity. PopcornFlow accelerates, sustains and brings to the surface the reasoning (how and what we learn), specifically through a continuous stream of small and traceable change experiments. This is a vivid example of what I call a "learning stream". Value streams and learning streams work together and help us make progress like rails on a ladder. The trick is to make both visible. Most teams use two separate boards. But some teams who adopted this approach now split their single visual board horizontally.



Quote for the day:

"A man must be big enough to admit his mistakes, smart enough to profit from them, and strong enough to correct them." -- John Maxwell,

June 11, 2015

Q&A: Nina Bjornstad, Country Manager For The UK And Ireland, Google For Work
The core shift has to be us taking those principles of how we interact with our mobile phones on a daily basis and bringing them into the work environment. Introducing a greater sense of play into what we can accomplish, just like we would with an app; or leveraging cloud platform technologies so that we are able to stand a service up, try it out, shut it back down if it's not impacting the business, or grow it further if it is. Businesses need to have their “wow” moment; their ability to have this sense of play and the ability for them to understand what type of consumer-level behaviour is actually possible to bring into the business environment today.


Cloud storage survey highlights security concerns
Cloud storage gateways are replacing and augmenting traditional file servers and tape storage, particularly in remote or branch offices (ROBO). One-third of all organisations with more than 50 ROBOs have implemented on-premises cloud storage gateways that support both the private cloud and public cloud, and 27 per cent of all companies have implemented them. Enterprises are coming under pressure to establish contemporary cloud storage solutions that provide the visibility and control required to meet enterprise needs and industry regulations. In the more heavily regulated financial services, government and life sciences industries, 42 per cent prefer a completely private cloud that does not rely on externally hosted infrastructure, as do 40 per cent of organisations with 10,000 employees or more.


Redefining Loyalty Programs with Big Data and Hadoop
There’s no such thing as too much data when it comes to this sort of analytics work. Every business analyst and data scientist agrees that expanding the data for any given model will typically produce dramatic improvements in analysis. And that data will obviously come in a wide variety of formats—structured, semi-structured, and unstructured, big and small, near- and real-time, as well as historical. Try to store it all in a traditional data warehouse, and you may wipe out all of the profits gained from segmentation. You will almost certainly have availability issues, and you will spend a lot of time waiting for IT to massage the data into a form that can be analyzed.


Dutch Cyber Security Council boosts focus on privacy
“The Netherlands aims at being an open, secure and economically promising digital society. A society that is innovative and entrepreneurial, but which is also strong enough to face the risks that go hand-in-hand with our great dependency on IT,” the council says in its 2015 briefing document. “The cyber world is a world full of unknown possibilities and opportunities, but there is also a darker side to it. “Our lives are becoming more comfortable and less tied to times and places, but our privacy is also coming under increasing pressure and cyber crime is on the increase as well. To continue to stimulate our prosperity and economy, cyber security is therefore of crucial importance,” the document says.


Cyber-Espionage Nightmare
It’s not a surprise that such systems are relatively easy to co-opt for nefarious purposes. Ideas for making the Internet more secure have been around for decades, and academic and government labs have churned out interesting proposals. ... “You don’t hear about rebuilding the Internet anymore,” says Greg Shannon, chief scientist at the CERT division of Carnegie Mellon’s Software Engineering Institute. What’s a company to do? Wyatt tightened things at United Steelworkers; among other things, he now gives fewer employees so-called administrative privileges to their computers, and he searches the network for the telltale signs of communications by malware. But none of this would have prevented the intrusions. Wyatt says it “might have slowed them down.”


Under pressure: enterprises want better software, delivered faster and cheaper
Successful applications increasingly require greater technical complexity and sophistication -- 51 percent, for instance, believe that the mobile and web application user experience will become significantly more sophisticated in the next year. Enterprises expect these more sophisticated apps to be created over a shorter timeline without a corresponding increase in developer capacity. For example, 65 percent of companies successfully managing these processes say they need to release new features or bug fixes for their applications at least once a month. Another six percent are even pumping out new releases every other week. The proliferation of different devices also creates development and deployment challenges, the survey finds.


Ciena Builds Advanced Network for NOAA Environmental Research
The new network will enable NOAA to support bandwidth-intensive applications and programs such as the Geostationary Operational Environmental Satellite series R (GOES-R), the next-generation national weather observation satellite program, which is working to advance weather and climate science and services. NOAA’s mission is to understand and predict changes in climate, weather, oceans and coasts. Its N-Wave science network, initially founded via funding through the American Recovery and Reinvestment Act, is a nation-spanning network that provides intra-NOAA connectivity, including communication and data transfer (5 petabytes per month) between NOAA programs, line offices, research facilities and other scientific centers across CONUS, Alaska and Hawaii.
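
As a back-of-envelope check (my arithmetic, not Ciena's or NOAA's), 5 petabytes per month corresponds to a substantial sustained line rate:

```python
# 5 PB/month expressed as an average bit rate.
petabyte = 10**15                 # bytes, decimal definition
bits = 5 * petabyte * 8
seconds = 30 * 24 * 3600          # roughly one month
print(bits / seconds / 1e9)       # ~15.4 Gbit/s, sustained around the clock
```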


Predicting the next decade of tech
There's another complication in that CIOs increasingly don't control the budget dedicated to innovation, as this is handed to other business units (such as marketing or digital) that are considered to have a more entrepreneurial outlook. CIOs tend to blame their boss's conservative attitude to risk as the biggest constraint on making riskier IT investments for innovation and growth. Although CIOs claim to be willing to take risks with IT investments, this attitude does not appear to match up with their current project portfolios. Another part of the problem is that it's very hard to measure the return on some of these technologies. Managers have been used to measuring the benefits of new technologies using a standard return-on-investment measure that tracks some very obvious costs -- headcount or spending on new hardware.


IT continues to struggle to find software developers, data analysts
Part of the problem may lie with candidates' perceptions of a company's brand, says Tejal Parekh, HackerRank's vice president of marketing. "We work with a lot of customers in areas that aren't typically thought of as technology hotspots. For instance, in the finance sector we have customers facing a dearth of IT talent; they're all innovative companies with a strong technology focus, but candidates don't see them as such. They want to go to Facebook or Amazon," says Parekh. Another challenge lies with the expectations hiring companies have of their candidate pool, says Ravinskar. "There's also an unconscious bias issue with customers who sometimes limit themselves by not looking outside the traditional IT talent pool. They're only considering white, male talent from specific schools or specific geographic areas," says Ravinskar.


Managing Technology with CORE Strategy & Architectural C’s & P’s
What do you do to pursue the opportunities in an organization? In my opinion, these opportunities can be realized by taking four actions – Consolidate, Optimize, Refresh and Enable (CORE) – on an organization’s technology and system portfolios. These four opportunity vectors form the basis of a handy planning tool, which I call the CORE strategy. Inspired by Kim and Mauborgne’s Four Actions Framework [1], the idea of CORE came from my experience as an IT manager and architect. CORE is all about managing technologies, prioritizing IT spending and allocating resources on the basis of raising or creating capabilities (represented on the right-hand side of the diagram), and reducing or eliminating cost and risk.



Quote for the day:

"When you innovate, you've got to be prepared for everyone telling you you're nuts." -- Larry Ellison

June 09, 2015

Are you prepared for the future of data centers?
Colocation requires a shift in data center skillsets, Koppy noted, not simply handing the data center over to a third party. Ask questions -- specifics about the colocation provider's network and power paths and so on -- and if the colocation provider is unwilling to share information your own facilities team would know, consider that a red flag, Courtemanche said. Also, talk to the provider's long-term customers to gauge what your own experience might be like. ... There are two problem areas that data centers with more than 1,000 servers experience at a much higher rate than smaller ones, according to survey results from IDC: downtime due to human error and security breaches. As one AFCOM Symposium attendee put it, when you outsource, your job goes from managing the data center to managing the colocation provider.


The top 10 myths about agile development
To be flexible has become vital for a business in today’s global markets, and therefore, the ability for IT systems to be equally flexible is essential. The purpose of agile is to allow organisations to react to the increasingly dynamic opportunities and challenges of today’s business world, in which IT has become one of the key enablers. Agile is defined by four values and 12 principles found in the Agile Manifesto. The manifesto provides an umbrella definition, in which there are many other delivery and governance frameworks, such as Scrum or extreme programming, for example.


Is Nepotism Undermining Your Business Technology Innovation?
We no longer do the break-fix relationship. We have a strategy manager that essentially acts as a CIO and manages technology as our clients grow and innovate. You need someone to be there every time you grow and change out a piece of technology and that person needs to have extensive experience throughout your industry with companies of all sizes. A small company that is a family friend doesn’t have that kind of expertise. ... Most “family friend” businesses don’t have this in place and have no idea what sort of support their users are getting, how the response time is or which issues are being resolved and escalated. You don’t have the capital to pay your users to hang out waiting for a call back on an issue.


Erasure Coding For Fun and Profit
Erasure coding essentially uses maths to add a little bit of extra data to the end of the actual data so that if you lose part of this new, bigger amount of data, you can still get all of the original data back. A simple version is a checksum: sum all the ones and zeros and put that at the end. If you lose any one of the bits, you can figure out what it was by recalculating the checksum and comparing it to the stored checksum. The difference is what the bit was, basically. This is a vast oversimplification, but that’s basically it. ... There’s a downside (there’s always a downside). If you lose a disk, you have to rebuild all the data from the parity blocks scattered around the place, which reduces the performance of the array because some of the time is spent on the rebuild instead of serving up the data.
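
A toy version of the idea in Python, using single parity, the simplest erasure code (RAID-4/5 style; production systems typically use Reed-Solomon codes that survive multiple losses): XOR the data blocks into a parity block, and XOR-ing the survivors reproduces any one lost block.

```python
from functools import reduce

def parity(blocks):
    """XOR equal-sized byte strings together into one parity block."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

data = [b"AAAA", b"BBBB", b"CCCC"]    # three data blocks on three "disks"
p = parity(data)                      # the extra block written alongside them

# Simulate losing block 1: XOR of the remaining blocks plus parity recovers it.
survivors = [data[0], data[2], p]
recovered = parity(survivors)
assert recovered == data[1]           # b"BBBB" is back
```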


Obama vows to boost U.S. cyber defenses amid signs of China hacking
"We have to be as nimble, as aggressive and as well-resourced as those who are trying to break into these systems," Obama told a news conference at the Group of Seven (G7) summit in Germany. U.S. officials, speaking on condition of anonymity, have blamed Chinese hackers for breaching the computers of the Office of Personnel Management and compromising the records of up to four million current and former employees in one of the biggest known attacks on U.S. federal networks. The mission of the intruders, the officials said, appears to have been to steal personal information for recruiting spies and ultimately to seek access to weapons plans and industrial secrets.


Rise of the Surveillance Platform
Hildyard likened a trade-surveillance platform to a buy-and-build hybrid. Such a system requires customization to effectively detect and prevent abuse, as each market ecosystem is unique. But at the same time, building the capability from the ground up is unrealistic. Delivering surveillance via a platform rather than an application gives developers leeway to develop code that’s unique to their organization and the types of behaviors they need to monitor. Sell-side banks “can’t rely on an application to do that,” Hildyard said. “The frequency with which regulatory hot topics emerge is increasing over time,” he added. Additionally, trade surveillers’ “goal should be to ‘create’ the next big scandal and make sure it doesn’t happen on their watch, in their bank. That requires that they understand behaviors they weren’t previously monitoring for.”


Transforming Text and Data Into a True Knowledge Base
One of the steps in text mining is “relationship identification.” Once entities are identified and enriched, they are connected to other entities; for example, “Foggy Bottom is in Washington, DC”, “Foggy Bottom is near The White House” and “Foggy Bottom is east of Georgetown.” What just happened? We used Linked Open Data (LOD) to verify Foggy Bottom as a neighborhood that exists in Washington, DC while also connecting it to other entities. LOD knows that DC is a “District” (not a state) and that it is within the United States. Preexisting facts were combined with results from text analysis to expand the knowledge base.
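
Facts like these map naturally onto triples. A small sketch with Python's rdflib (toy URIs rather than the actual LOD identifiers):

```python
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.FoggyBottom, EX.isIn, EX.WashingtonDC))
g.add((EX.FoggyBottom, EX.isNear, EX.WhiteHouse))
g.add((EX.FoggyBottom, EX.isEastOf, EX.Georgetown))
g.add((EX.WashingtonDC, EX.isIn, EX.UnitedStates))

# Ask the knowledge base everything it holds about Foggy Bottom.
q = "SELECT ?p ?o WHERE { ex:FoggyBottom ?p ?o }"
for p, o in g.query(q, initNs={"ex": EX}):
    print(p, o)
```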


APIs with Swagger: An Interview with Reverb’s Tony Tam
First, we don’t want to try to stuff every possible feature inside the specification itself. Early on, someone brought up embedding rate-limiting information into the spec. But it would be very difficult to generalize, and would pollute the spec with a feature that possibly many people wouldn’t care about. Next, one thing we learned through the initial versions of Swagger is that it’s easy to write invalid specifications without a simple and robust validator. We chose to use JSON Schema validations, and even built it directly into Swagger-UI. It is an important part of the tooling to help developers write valid Swagger definitions. Removing structural constraints from the spec while still providing a robust validation tool would be very difficult.
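
To see why a validator matters, here is a minimal sketch in Python with the jsonschema package, using a toy, drastically simplified schema rather than the real Swagger 2.0 JSON Schema:

```python
from jsonschema import validate, ValidationError

# Toy subset of a Swagger-like schema: a spec must carry a version
# string and a paths object.
schema = {
    "type": "object",
    "required": ["swagger", "paths"],
    "properties": {
        "swagger": {"type": "string"},
        "paths": {"type": "object"},
    },
}

spec = {"swagger": "2.0"}   # invalid: no "paths" section

try:
    validate(instance=spec, schema=schema)
except ValidationError as e:
    print("invalid spec:", e.message)   # 'paths' is a required property
```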


Case study: What the enterprise can learn from Etsy's DevOps strategy
“You have to be able to demonstrate to the larger business why it’s not just a buzzword and can add value to the business, and the only way to do that is to give them a concrete project and show them how it has positively affected the business,” he says. “The people who make the decisions at the top of the pile may be more business-minded than technically so, and you need to speak their language and demonstrate the impact it has had on key performance indicators or revenue that quarter. “You need to sell the idea to them in business terms because IT and development are service organisations that exist to fulfil the priorities of the business,” Cowie adds.


A Brief History of Big Data Everyone Should Read
Long before computers (as we know them today) were commonplace, the idea that we were creating an ever-expanding body of knowledge ripe for analysis was popular in academia. Although it might be easy to forget, our increasing ability to store and analyze information has been a gradual evolution – although things certainly sped up at the end of the last century, with the invention of digital storage and the internet. With Big Data poised to go mainstream this year, here’s a brief(ish) look at the long history of thought and innovation that has led us to the dawn of the data age.



Quote for the day:

"Every leader needs to look back once in awhile to make sure he has followers." -- Kouzes and Posner

June 08, 2015

Using Blocker Clustering, Defect Clustering, and Prioritization for Process Improvement
Although teams and tools often track defects differently from blockers, defects can be clustered like blockers to investigate their root causes and to solve them in an economically sensible way. Defects not only impede work in progress, they also block the team from starting other work by tying up the developers and testers who are correcting the problems. The root causes of defects can be analyzed at the same time as blockers, allowing prioritization to consider both sources of impediments ... The first rule is to avoid fixes that aren’t cost-effective. For example, some work might be waiting for a clean test environment, but if this is a rarity then buying, building, and managing a complex environment may not be a cost-effective solution.


Adding a Second Ethernet Port to an Intel NUC via Mini PCIe
Just buy a half-length Mini PCIe Ethernet adapter like this Syba Realtek device on Amazon for just $16 and slap it in there. It works perfectly and there are drivers available for most operating systems, including VMware ESX! The issue is that the Syba card is too tall to also install an mSATA SSD in the NUC, and the bulky triple cable it uses doesn’t fit nicely along with a SATA HDD in the little NUC body. The first problem is solved through the use of solder. I de-soldered the 10-pin Ethernet and 4-pin LED headers on the tiny NIC with my handy de-soldering iron and then pulled the pins out with pliers. I then soldered the Syba cable directly to the board, cutting off the connectors and tinning each little lead. I was careful to solder the wire low and slanted towards the back to give the cable space in the cramped NUC, and cut off the excess afterwards.


2015 Roundup Of Analytics, Big Data & Business Intelligence Forecasts And Market Estimates
85% believe that big data will dramatically change the way they do business. 79% agree that ‘companies that do not embrace Big Data will lose their competitive position and may even face extinction.’ 83% have pursued big data projects in order to seize a competitive edge. The top three areas where big data will make an impact in their operations include: impacting customer relationships (37%); redefining product development (26%); and changing the way operations is organized (15%). The following graphic compares the top six areas where big data is projected to have the greatest impact in organizations over the next five years. Source: Accenture, Big Success with Big Data: Executive Summary (free, no opt in).


Linda Powell, on Data Governance for Finance Industry
The Chief Data Officer role oversees the CFPB's governance, acquisition, documentation, storage, analysis, and distribution of data. There are several advantages to having the life cycle of data centrally managed. The first is the increased ability to ensure strong internal controls and adherence to best practices. There are also economies of scale related to data management. Therefore, having data management centralized creates efficiencies and helps to ensure consistency across the Bureau. An advantage for this role at a new agency is that we don’t have legacy systems or processes that we need to accommodate. ... An ontology is a dictionary where the definition is derived in part by relationships. I like to use the analogy that an ontology is like a forest of family trees.


A.I. is too hard for programmers
Computer programmers define data structures to represent general requirements. This follows from Alan Turing’s 1936 design emulating human computers. By keeping track of calculations on paper, human intelligence can make infinitely complex calculations using carefully designed data structures. In brains, the opposite is true. We learn from experience (specific) and generalize from there. In a future post, I will have a lot more to say about how brains store patterns and learn them, but for now, let’s focus on why this difference is the significant roadblock inhibiting our 1956 A.I. objectives.


C# 6.0 Gets More Concise with Expression Bodied Properties, Dictionary initializer
C# 6.0 is slated for official release with Visual Studio 2015 later this year. As of now, the first release candidate is available at https://www.visualstudio.com/downloads/visual-studio-2015-downloads-vs.aspx. The sixth iteration of C# brings many small improvements to the language syntax that, when combined, will make your code more concise and easier to read. Today I'll be covering a handful of these improvements, such as expression-bodied properties and functions, the using static directive, string interpolation, and the new dictionary initializer syntax. First let's take a look at expression-bodied properties and functions. An expression-bodied property is declared as a lambda expression.
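
For readers who don't write C#, a rough Python analogue of two of the features (an analogy only; the actual C# syntax differs, and the class here is invented): an expression-bodied property is a member defined by a single expression, much like a property built from a lambda, and C# string interpolation resembles Python's f-strings.

```python
class Rectangle:
    def __init__(self, w, h):
        self.w, self.h = w, h

    # Analogue of a C# expression-bodied property: one expression, no full body.
    area = property(lambda self: self.w * self.h)

r = Rectangle(3, 4)
print(f"area = {r.area}")   # analogue of C#'s $"area = {r.Area}"
```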


Gartner's 19 In-memory Databases for Big Data Analytics
Amid the big data boom, the in-memory database market will enjoy a 43 percent compound annual growth rate (CAGR) – leaping from $2.21 billion in 2013 to $13.23 billion in 2018, predicts Markets and Markets, a global research firm. What’s driving that demand? Simply put, in-memory databases allow real-time analytics and situation awareness on "live" transaction data – rather than after-the-fact analysis on "stale data,” notes a recent Gartner market guide. Here are 19 in-memory database options mentioned in that Gartner market guide.
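
That headline growth rate is easy to sanity-check from the two endpoints:

```python
# CAGR = (end / start) ** (1 / years) - 1, over the five years 2013-2018.
print((13.23 / 2.21) ** (1 / 5) - 1)   # ~0.43, i.e. the quoted 43 percent
```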


Enterprises opting for converged infrastructure as stepping stone to hybrid cloud
Practically, however, IT leaders are right now less inclined to wait for the promised benefits of hybrid cloud. They want many of the major attributes of what the cloud models offer – common management, fewer entities to procure IT from, simplicity and speed of deployment, flexibility, automation and increased integration across apps, storage, and networking. They want those, but they're not willing to wait for a pan-enterprise hybrid cloud solution that would involve a commitment to a top-down cloud dictate.  Instead, we’re seeing an organic, bottom-up adoption of modern IT infrastructure in the form of islands of hyper-converged infrastructure appliances (HCIA).


Analytics, Machine Learning, and the Internet of Things
Intelligent, connected devices require organizations to reexamine how and where they create value in the marketplace and how that value will be enhanced or diminished as the competitive environment and information ecosystem evolves. Analytics will help validate some decisions (for example, getting real-time usage data regarding changes to features or added services and functions); however, business models might be so vastly transformed by new entrants and value-chain structures that analytics based on the company’s traditional business models will no longer be relevant. Products or services might be based on data stream exhaust from legacy products rather than revenue from the products themselves. New business models might extend far beyond the product and into upstream suppliers or downstream consumers.


New Ping Identity Platform Includes Apple Watch Authentication
This is all driven by two-factor authentication, which can come in a variety of guises, including the traditional text-based approach. You sign on using your PC or laptop, then get a text with your second sign-on code. You enter the second ID and you’re good to go. Ping has also come up with a new way using the Apple Watch. You sign on to Ping, then your watch buzzes. You activate it and tap the sign-on card on your Watch. It’s a clever way of using the Watch to simplify security. ... Ping is trying to redefine itself to offer a more comprehensive policy-based approach to security with authentication at its core. While it is not necessarily breaking new ground (except perhaps that Apple Watch piece), it has put together a broad approach to authentication.
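
The text-based flow described here is one way to deliver a one-time second factor. As a generic illustration of the underlying mechanism (not Ping's implementation), here is a time-based one-time password round trip using the Python pyotp library:

```python
import pyotp

# Enrolment: the server generates a secret and shares it with the user's device.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Sign-on: the device shows a six-digit code that rotates every 30 seconds...
code = totp.now()

# ...and the server checks it as the second factor.
print(totp.verify(code))   # True
```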



Quote for the day:

"Leaders must see the dream in their mind before they will accomplish the dream with their team.” -- Orrin Woodward

June 07, 2015

Video: Parallel Algorithms Reconsidered
In this video, Peter Sanders from Karlsruhe Institute of Technology presents “Parallel Algorithms Reconsidered.” From the abstract: “Parallel algorithms have been a subject of intensive algorithmic research in the 1980s. This research almost died out in the mid 1990s. In this paper we argue that it is high time to reconsider this subject since a lot of things have changed. First and foremost, parallel processing has moved from a niche application to something mandatory for any performance-critical computer application. We will also point out that even very fundamental results can still be obtained. We give examples and also formulate some open problems.”


Privacy Risk Management for Federal Information Systems
This publication introduces a privacy risk management framework (PRMF) for anticipating and addressing privacy risk that results from the processing of personal information in federal information technology systems. In particular, this publication focuses on the development of two key pillars to support application of the PRMF: privacy engineering objectives and a privacy risk model. In so doing, it lays the foundation for the establishment of a common vocabulary to facilitate better understanding of, and communication about, privacy risks and the effective implementation of privacy principles in federal information systems. The set of privacy engineering objectives defined in this document provides a conceptual framework for engineers and system designers to bridge the gap between high-level principles and implementation.


Interview: Mike Lamble, CEO at Clarity Solution Group
“Better, cheaper, faster” is good. Schema-less writes, fitness for all data types, commodity hardware and open source software, limitless scalability – also good. That said, out-of-the-box Hadoop-based Data Lakes are not industrial strength. It’s not as simple as downloading the Hadoop software, installing it on a bunch of servers, loading the Data Lake, unplugging the enterprise Data Warehouse (EDW) and — voila. The reality is that the Data Lake architecture paradigm – which is a framework for an object-based storage repository that holds data in its native format until needed – oversimplifies the complexity of enabling actionable and sustainable enterprise Hadoop. An effective Hadoop implementation requires a balanced approach that addresses the same considerations with which conventional analytics programs have grappled for years: establishing security and governance, controlling costs and supporting numerous use cases.


Google Create Kubernetes-based VM/Docker Image Building Framework
The Google Cloud Platform team have released a technical solution paper and open source reference implementation that describes in detail how to automate image builds via Google Compute Engine (GCE) using open source technology such as Jenkins, Packer, and Kubernetes. The reference implementation can be used as a template to continuously build images for GCE or Docker-based applications. Images are built in a central project, and then may be shared with other projects within an organisation. The Google Cloud Platform blog proposes that ultimately this automated image build process can be integrated as a step in an organisation's continuous integration (CI) pipeline.


Why “Agile” and especially Scrum are terrible
Under Agile, technical debt piles up and is not addressed because the business people calling the shots will not see a problem until it’s far too late or, at least, too expensive to fix it. Moreover, individual engineers are rewarded or punished solely based on the completion, or not, of the current two-week “sprint”, meaning that no one looks out five “sprints” ahead. Agile is just one mindless, near-sighted “sprint” after another: no progress, no improvement, just ticket after ticket. ... “Agile” and Scrum glorify emergency. That’s the first problem with them. They’re a reinvention of what the video game industry calls “crunch time”. It’s not sustainable. ... People will tolerate those changes if there’s a clear point ahead when they’ll get their autonomy back.


Big Data and the Future of Business
The point of Big Data is that we can do novel things. One of the most promising ways the data is being put to use is in an area called “machine learning.” It is a branch of artificial intelligence, which is a branch of computer science—but with a healthy dose of math. The idea, simply, is to throw a lot of data at a computer and have it identify patterns that humans wouldn’t see, or make decisions based on probabilities at a scale that humans can do well but machines couldn’t until now, or perhaps someday at a scale that humans can never attain. It’s basically a way of getting a computer to do things not by explicitly teaching it what to do, but by having the machine figure things out for itself based on massive quantities of information.
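
A minimal example of that idea with scikit-learn: the machine is never told the rule, only shown labeled examples, and it infers the pattern itself (toy data invented for illustration).

```python
from sklearn.tree import DecisionTreeClassifier

# Toy training data: [hour_of_day, megabytes_transferred]; label 1 = suspicious.
X = [[2, 900], [3, 850], [14, 40], [15, 35], [4, 950], [13, 30]]
y = [1, 1, 0, 0, 1, 0]

model = DecisionTreeClassifier().fit(X, y)

# Nobody coded "late night + large transfer = suspicious";
# the model inferred it from the examples.
print(model.predict([[3, 880], [14, 25]]))   # [1 0]
```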


Datameer adds governance tools for Hadoop analytics
Data silos are one potential consequence, as are regulatory-compliance risks when sensitive data sets are being used. Datameer’s new governance module is designed to give businesses transparency into their data pipelines while providing IT with tools to audit diligently for compliance with internal and external regulations. New data-profiling tools, for example, let companies find and transparently fix issues like dirty, inconsistent or invalid data at any stage in a complex analytics pipeline. Datameer’s capabilities include data profiling, data statistics monitoring, metadata management and impact analysis. Datameer also supports secure data views and multi-stage analytics pipelines, and it provides LDAP/Active Directory integration, role-based access control, permissions and sharing, integration with Apache Sentry 1.4, and column and row anonymization functions.


How UPS uses analytics to drive down costs
Putting it in perspective, the advanced math around determining an order of delivery is incredible. If you had a 120-stop route and you plotted out how many different ways there are to deliver that 120-stop route, it would be a 199-digit number. It’s so large that mathematicians call it a finite number that is unimaginably large. It’s in essence infinite. So our mathematicians had to come up with a method of determining an order of delivery that takes into account UPS business rules, maps, what time we need to be at certain places and customer preferences. It had to be an order of delivery that a driver could actually follow to not only meet all the business needs, but with fewer miles than they’re driving today. And this is on top of the 85 million miles we’ve already reduced.
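
The 199-digit claim checks out: the number of possible orderings of 120 stops is 120 factorial, and a quick check confirms its size.

```python
import math

orderings = math.factorial(120)   # ways to sequence a 120-stop route
print(len(str(orderings)))        # 199 -- the "199-digit number" quoted above
```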


CTO interview: Customer data analytics driving revenue growth at British Medical Journal
The analytics plans have involved investing in a number of tools. Among others, this includes Google Analytics and AppDynamics, which is used to monitor user behaviour, as well as a back-office monitoring tool, Cooper said. “We are using that a lot for performance and to be able to look at not just what an application is doing, but how we are using that to see what people are doing in the application,” she said. ... “Right now we are not in such a mess, but what we have got is so fragmented and we are just trying to work out what it is we need to track, what is the important data, what do we need to measure, because we have a lot of very industry-specific data models that come with being an academic publisher.”


Safe Big Data
Data privacy has historically concentrated on protecting the systems that manage data rather than the actual data. Since these systems have proven to be vulnerable, a new approach that encapsulates data in cloud-based environments is necessary. New algorithms must also be created to provide better key management and secure key exchanges. Data management concerns itself with secure data storage, secure transaction logs, granular audits and data provenance. This aspect must be concerned with validating and determining the trustworthiness of data. Fine-grained access controls along with end-to-end data protection can be used to verify data and make data management more secure.



Quote for the day:

"When you have exhausted all possibilities, remember this: You haven't." -- Thomas Edison

June 05, 2015

Co-operation driving progress in fighting cyber crime, say law enforcers
FBI assistant legal attaché Michael Driscoll said information security professionals in the private sector often see the evidence of cyber-enabled crime far quicker than law enforcement. He said it is important to engage with information security professionals as law enforcement becomes increasingly reliant on what they do on a daily basis for gathering the evidence they need. Driscoll said private organisations can help broaden law enforcement’s view and understanding of cyber-enabled crime. “Around 22,000 reports are made to the FBI’s internet crime complaint centre each month, but we think that is about 10% of what actually goes on. The volume is unbelievable,” he said.


FBI official: Companies should help us ‘prevent encryption above all else’
"Privacy, above all other things, including safety and freedom from terrorism, is not where we want to go," Steinbach said. He also disputed the "back door" term used by experts to describe such built-in access points. "We're not looking at going through a back door or being nefarious," he argued, saying that the agency wants to be able to access content after going through a judicial process. But many technical experts believe that building intentional vulnerabilities into the systems that people around the world rely on reduces the overall security of the entire digital system, even if done to comply with legal requirements.


The Inner Workings of a Security Operations Center
The SOC does not just consume data from its constituency; it also folds in information from a variety of external sources that provides insight into threats, vulnerabilities, and adversary TTPs. This information is called cyber intelligence (intel), and it includes cyber news feeds, signature updates, incident reports, threat briefs, and vulnerability alerts. As the defender, the SOC is in a constant arms race to maintain parity with the changing environment and threat landscape. Continually feeding cyber intel into SOC monitoring tools is key to keeping up with the threat. In a given week, the SOC likely will process dozens of pieces of cyber intel that can drive anything from IDS signature updates to emergency patch pushes.


Project Seeks to Combine Sustainable Fish Farm and Data Center
This is a rare example of a project that attempts to combine a data center with a completely unrelated facility in a way that is mutually beneficial. Because a data center is a massive power and water consumer and a huge source of excess heat, people are often compelled to look for creative ways to utilize those aspects of mission-critical facilities. Another example is a project in California’s drought-stricken Monterey County, where a group of entrepreneurs wants to combine a data center with a water desalination plant. The first initiative is the aquaculture facility, a fish farm that will produce 500,000 pounds a year of Mediterranean sea bass. A tech incubator is also planned for the site.


Uber CEO admits company is not perfect
Uber, Kalanick said, provides not just a cheaper, more efficient form of transportation that bests owning a car, regular taxis, or even public transit. The company’s technology can also improve cities by getting more cars off the road and reducing pollution, he said. Uber’s service, which lets people hail a ride from their smartphones, is now active in more than 310 cities and nearly 60 countries around the world. In some countries, like Germany and India, Uber has wrestled with regulators over its legality. Kalanick also used the event to make a plea to mayors across the U.S., asking them not to deprive people of the right to drive for Uber because of “some outdated regulation.” In the years ahead, Uber will continue to make changes to its service, particularly around the company’s low-cost UberX option, so that using Uber is cheaper than owning a car, Kalanick said.


Put microservices, cloud at heart of your IoT strategy
Users still have to collect IoT data, but also index and store it for easy access. Additionally, this model requires organizations to address IoT security at the cloud level, rather than the network level. Cloud assets growing underneath applications without direct application involvement -- as IoT assets do, since sensors are not part of user applications -- also require special planning to address data currency and to support synchronized analysis of multiple IoT sources. While current practices can likely address this, IoT application scale may prove challenging. A database and microservice IoT approach also offers better support for privacy and public policy limits. Because query patterns are directly visible, IoT systems based on microservices and queries make it easier to detect attempts to track a person's location.
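
A sketch of what "directly visible query patterns" can mean in practice: a hypothetical sensor-query microservice in Python/Flask that logs every query, so repeated requests targeting one person's locations would stand out in the audit trail (names and endpoint are invented for illustration).

```python
import logging
from flask import Flask, jsonify, request

app = Flask(__name__)
logging.basicConfig(level=logging.INFO)

SENSOR_DB = {"lobby-1": {"occupancy": 3}, "garage-2": {"occupancy": 1}}  # stand-in

@app.route("/sensors/<sensor_id>")
def read_sensor(sensor_id):
    # Every query is auditable: who asked for which sensor, from where.
    app.logger.info("query sensor=%s client=%s", sensor_id, request.remote_addr)
    reading = SENSOR_DB.get(sensor_id)
    if reading is None:
        return jsonify(error="unknown sensor"), 404
    return jsonify(reading)

if __name__ == "__main__":
    app.run()
```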


"Arrogant" datacentre operators blasted by users for poor customer service approach
“If we’d asked that same question three years ago, the answer would have been cost or location, but the reason for that is because no matter what the datacentre service is – whether it be co-lo, hosting, cloud or managed service – people’s understanding of the market is so much greater now and their expectations are higher.” Because of the contractual and technology complexities involved in moving to a new datacentre supplier, users have traditionally felt inclined to make do with the service they receive, but that’s not necessarily the case anymore. “It’s difficult and disruptive to move, because moving a sizeable IT estate is complex and businesses can’t take the downtime, and it’s very expensive,” said Rabbetts.


Sharing Data, but Not Happily
Companies that are more transparent about why they collect certain customer details and how they use them may find it easier to maintain customer trust. Certainly, millions of people have signed up for store loyalty cards and frequent-flier programs that offer deals or upgrades based on consumers’ purchases. And for the many people who relish personalized services, the idea that Amazon, Facebook, Google Maps or Pandora may remember and learn from their preferences represents an advantage, not a problem. “People are always willing to trade privacy and information when they see the direct value of sharing that information,” said Mike Zaneis, the chief counsel for the Interactive Advertising Bureau, an industry group in Washington.


Flocker Tutorial: Migrating a Stateful Dockerized ElasticSearch-Logstash-Kibana Stack
Flocker is an open-source data volume manager designed exclusively for the purpose of managing stateful services running in containers. As of this writing, Flocker is at release 0.4, meaning that the project is under active development. In the coming weeks and months, we will be adding an API so that you can do everything described in this tutorial programmatically. Additionally, we are building Flocker from the ground up to work with other popular container management tools, such as Docker Swarm, Google Kubernetes and Apache Mesos. We’ve recently published a tutorial on how to use Flocker with Docker Swarm and Kubernetes, so if you are a user of either of those tools, I’d encourage you to try out our integration demos.


How to hire for personality and train for skills
"Of course you need people who know the fundamentals of their job, but when your people come across problems, it's important that they see them as just obstacles and roadblocks on the way to overall success; conceptual thinking and abstraction is at the core of this," Jersin says. As important as it is for talent to focus on their own contributions to your products and services, it's also critical that they can see how their part fits into the larger whole. "You want people who can hit their own personal targets, but also keep the big picture -- the company's overall success, development and growth -- in mind as well," says Labourey.



Quote for the day:

“Instead of focusing on how much you can accomplish, focus on how much you can absolutely love what you’re doing.” -- Leo Babauta

June 03, 2015

12 Quick Tips about Application Level Performance Testing and More
In an economy where apps have become the very heart and soul of almost any business, you have less than one second to impress your user. With such a brief window to make an impression, application performance is essential to the quality of your customers' digital experience and to their loyalty. Application Performance Management tools and methods are indispensable for ensuring application performance in real, live environments. Performance testing should begin as early as possible in the application development lifecycle to avoid poor performance and to protect customer retention.
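
One way to act on the "test early" advice is to put a response-time budget directly into the build. The snippet below is a minimal sketch, not a substitute for a real APM tool; the URL, sample size and one-second budget are assumptions for illustration.

    # Fail fast if the 95th-percentile response time blows a sub-second budget.
    import statistics
    import time
    import urllib.request

    BUDGET_SECONDS = 1.0  # "less than one second to impress your user"

    def p95_latency(url, samples=20):
        timings = []
        for _ in range(samples):
            start = time.perf_counter()
            urllib.request.urlopen(url).read()
            timings.append(time.perf_counter() - start)
        return statistics.quantiles(timings, n=20)[18]  # 95th percentile

    if __name__ == "__main__":
        assert p95_latency("http://localhost:8080/") < BUDGET_SECONDS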


16 cool things to try with the new Google Photos
Scroll down below the faces in that same search screen, and you'll find a list of locations in which your photos have been taken. What's particularly remarkable about this is that it works even if you don't have location reporting activated, as is the case for me. How? Google says its technology is able to recognize known geographical landmarks from photos and then use logic (and the laws of physics) to infer your location in other nearby photos. If you took a snapshot of the Eiffel Tower on February 9th at 2 p.m., for instance, Google can safely assume you were still in Paris in that selfie you took in front of a bakery 45 minutes later. The accuracy and level of detail may surprise you.
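
Google has not published how this works, but the Eiffel Tower example suggests a simple time-proximity rule. The toy sketch below illustrates that rule only; it is not Google's method, and the travel window and data layout are invented.

    # Toy illustration: photos with no location borrow the location of the
    # nearest landmark-tagged photo taken within a plausible travel window.
    WINDOW_MINUTES = 60  # assumed travel window

    def infer_locations(photos):
        """photos: dicts with 'timestamp' (minutes) and optional 'place'."""
        located = [p for p in photos if "place" in p]
        for p in photos:
            if "place" in p:
                continue
            nearest = min(located, default=None,
                          key=lambda q: abs(q["timestamp"] - p["timestamp"]))
            if nearest and abs(nearest["timestamp"] - p["timestamp"]) <= WINDOW_MINUTES:
                p["place"] = nearest["place"]
        return photos

    photos = [{"timestamp": 840, "place": "Eiffel Tower, Paris"},  # 2 p.m.
              {"timestamp": 885}]                                  # 45 min later
    print(infer_locations(photos))  # the selfie inherits the Paris location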


When stolen data turns up on the dark web, this tech can find it fast
"There will always be a path out of your network through an advanced or insider threat," said co-founder Danny Rogers in a phone call last week. "There is no defense that's perfect. If you can't stop everything, what else can you do? That's when we started to focus on immediate threat detection," he said. Rarely do like red flags appear on a screen inside a company's firewall warning that its systems have been breached. In reality, most data breaches are discovered because someone stumbles across stolen data in an underground forum, up for sale to the highest bidder. Rogers, and his co-founder Michael Moore, said that using large-scale cloud-based automation to search for this data can considerably cut down on how long it takes to discover breaches.


CIOs future-proof the data center with hybrid strategies
"Over the next three to five years, I'd say 80% of our services will be in the cloud. But there will always be a need for services on campus," he said. For example, he plans to keep the door-locking and fire alarm systems in his own data center, where he isn't reliant on a connection to a cloud provider for them to work. His closed-circuit security system, with its high demand on bandwidth, will stay in his data center, too. Hybrid strategies like Haugabrook's also typically require staff training and reassignment of IT roles. Haugabrook said he plans to transition his staff to roles focusing on automating and integrating systems.


Preparing Data for the Self-Service Analytics Experience
Frequently, all users have to work with are spreadsheets and limited reports based on disparate, application-specific databases. Self-service BI and data discovery tools can deliver much better visualization and data exploration, but the sources often remain limited to spreadsheets and siloed application-specific databases. At larger firms, even if there is an enterprise BI standard, users grow tired of waiting: waiting for IT to find development time to address requests for BI reports and dashboards and waiting for IT to find systems time to run the reports and queries. Of course, once this is all set up, users frequently decide that they want different data or different queries and visualizations and the process must start over.


Private Cloud: A Secure Alternative to Public Clouds?
Private clouds do have challenges, especially when on-premises IT is responsible for managing them, since that requires the same staffing, management, maintenance and capital expenses as a traditional data center. However, a common misconception is that private clouds always run on client premises, in the client’s own data center. In reality, there are many providers that deploy, host and manage private cloud infrastructure and solutions. A business might also choose a mix of private and public cloud services, called “hybrid” cloud deployment. In fact, Gartner predicts that the majority of private cloud deployments will eventually become hybrid clouds, meaning they will leverage public cloud resources.


Apple Watch Fails To Ignite Wearables Market, Yet
Fitbit ranked as the number one wearable maker by volume. It shipped 3.9 million devices, giving it 34.2% of the market. Xiaomi followed with 2.8 million devices and 26.4% of the market. Garmin rounded out the top three with 700,000 devices and 6.1% of the market. All three companies make low-cost devices (~$100) meant to help track health and fitness. "Bucking the post-holiday decline normally associated with the first quarter is a strong sign for the wearables market," Ramon Llamas, an IDC research manager for wearables, wrote in the June 2 report. "It demonstrates growing end-user interest and the vendors' ability to deliver a diversity of devices and experiences. In addition, demand from emerging markets is on the rise and vendors are eager to meet these new opportunities."


Big Data, Bigger Responsibility
“Companies of all sizes and in virtually every industry are struggling to manage the exploding amounts of data,” says Neil Mendelson, vice president for big data and advanced analytics at Oracle. “But as both business and IT executives know all too well, managing big data involves far more than just dealing with storage and retrieval challenges—it requires addressing a variety of privacy and security issues as well.” In a talk at the Technology Policy Institute’s 2013 Aspen Forum, Federal Trade Commission chairwoman Edith Ramirez described some big data pitfalls to be avoided. Though many organizations use big data for collecting non-personal information, there are others that use it “in ways that implicate individual privacy,” she noted.


White collar automation will bring new industrial revolution, says CEO
Change is hard, and we shouldn't be naïve about it. But there are two components here. One is productivity, which we all understand will be there, and the other is about progress. I often give people the example of a construction site. If you pass any construction site today, you will see highly specialized machines: cranes, forklifts, bulldozers, and people working alongside them. Ultimately it is going to be about robot-human partnership. And if you look at that construction worker, productivity is through the roof, and that allows them to construct things we never thought possible. That's progress.


Why We Fail to Change: Understanding Practices, Principles, and Values Is a Solution
There is no simple answer to the question of why we fail to change – at least not in the form of a recipe. In fact, we have plenty of recipes, and they are one of the key reasons why we’ve kept repeating the same mistakes for more than 40 years. It’s not only that we have plenty of recipes but also how we’ve codified them – and then, of course, started certifying people. The end result is that it is easy for organizations to simply choose a method from a menu and expect everyone to comply with it – and expect to repeat a success story. It’s not much of a surprise that it doesn’t work.



Quote for the day:

“Always forgive your enemies; nothing annoys them so much.” -- Oscar Wilde

June 02, 2015

4 tips to help CEOs find their CIO soulmate
Businesses can't be successful without a strategy, and neither can a CIO. When finding the right CIO for your company, you should spend time discussing the overall strategy of your company to make sure your ultimate goals align. Exceeding expectations as a CIO five years ago might qualify as simply meeting expectations today. As Sagalov puts it, "a CIO is someone who does more than just keep the Wi-Fi on -- he is responsible for strategically growing a company's information capabilities." A CIO needs to have a strategy that will allow the company to thrive and adapt. A strategic CIO is a proactive CIO, and it is important to ensure that the person you hire is willing to plan for the future rather than react as it happens.


Security breaches a monthly headache for firms
Virus attacks were the most common type of security issue, reported by 81 percent of large companies. But over half (57 percent) had been targeted by phishing attempts, more than a third (37 percent) had seen a denial of service attack, and nearly one in four (24 percent) said their networks had been breached by hackers. "Considering all breaches, there was a noticeable 38 percent year-on-year increase of unauthorised outsider attacks on large organisations, which included activities such as penetration of networks, denial of service, phishing and identity theft," the report noted. Businesses are pessimistic about their ability to keep crooks out: over half expected to see more breaches in future.


Empower Your Application Teams
Application teams are the backbone of the revenue generating capabilities of all companies. These creative people spend their off-hours working with the latest tools to keep their skills sharp. Increasingly, these teams are going into the public cloud to develop their applications. Why? Developers need resources in days. Procurement and IT quote resource delivery in weeks. As the pace of business increases, developers need tools that help them accelerate the design and deployment of applications. They want resources quickly, on-demand. Watch this video to learn how Cisco meets the need for on-demand resources.


7012 Regs and Cyber insurance on collision course with small business
The 7012 regulations also require immediate reporting of any incident or threat to UCTI that is carried on or held in an IT system. NIST is the cognizant agency for Classified standards and operational regulations. The regulations themselves are a part of, and a driver to, a set of complex problems for industry — at present, risk is being transferred away from DoD to its contractors, who will find it rebounding to them via their “cyber” insurance policies. This two-part article isn’t intended to fan the flames, but rather to give the context behind the regs, provide meaningful definitions for practical use, offer probable implications for industry, and set out why the seemingly most reasonable solution for businesses may be the most dangerous to them.


The drivers and inhibitors of cyber security evolution
“Coupled with the coming regulations that will require mandatory breach notification, it is surprising that many are still prioritising the same things they have always done, rather than evolving to ensure they can respond to threats that get through their current defences,” says FireEye European vice-president and chief technology officer (CTO) Greg Day. “Many organisations talk the talk, but want to walk the walk at a very slow and steady pace, while the most enlightened organisations are already spending more than 50% of their security budgets on progressive detection and response capabilities,” he says. The study found organisations are, on average, spending only 23% of their IT security budgets on detection and response, although this is expected to increase to 39% in the next two years.


MongoDB Gets SQL-Reporting Capability
MongoDB engineers have built a connector that takes a standard SQL query from Business Objects, Tableau, Cognos, or other SQL-based analysis systems, and translates it into a query that MongoDB understands. MongoDB is popular as a JSON document database, capable of storing email, reports, comments, and other forms of text as objects in a database. It has used its own query and reporting methods, which in the past have been incompatible with SQL reporting systems, the ones most data managers are familiar with, explained Kelly Stirman, vice president of strategy at MongoDB.
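
The article does not show the connector's internals, but the kind of translation involved is easy to illustrate. The collection and field names below are made up; the MongoDB side uses standard pymongo syntax.

    # SQL as a BI tool such as Tableau or Cognos might send it:
    #   SELECT name FROM customers WHERE age > 30
    #
    # The equivalent native MongoDB query the connector would need to produce:
    from pymongo import MongoClient

    db = MongoClient("mongodb://localhost:27017")["crm"]
    for doc in db.customers.find({"age": {"$gt": 30}}, {"name": 1, "_id": 0}):
        print(doc["name"])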


The Power of RAML
RAML, or the RESTful API Modeling Language, is a relatively new spec based on the YAML format, making it easily read by both humans and machines. But beyond creating a more easily understood spec (one that could be handed to the documentation team without worry), Uri Sarid, the creator of RAML, wanted to push beyond our current understandings and create a way to model our APIs before writing even one line of code. ... All of the RAML tools are open source and freely available at http://RAML.org/projects.... The API Notebook, on the other hand, takes interactivity and exploration a step further by letting developers use JavaScript to call your (and other) APIs. In the API Notebook they are also able to manipulate the data to see how it would work in a real-world use case.
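
To make "model the API before writing code" concrete, here is a minimal RAML sketch. The API itself is hypothetical, invented for illustration; it simply shows the shape a RAML 0.8 spec takes.

    #%RAML 0.8
    title: Bookstore API            # hypothetical API
    version: v1
    baseUri: http://api.example.com/{version}
    /books:
      get:
        description: List all books.
        responses:
          200:
            body:
              application/json:
    /books/{bookId}:
      get:
        description: Fetch a single book by its id.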


CIOs need to plan and prepare for disruption
Whichever continent, whichever industry and whichever demographic or social group you look at, the signs of disruption are everywhere, and disruption is accelerating at a frenetic pace, driven by a new wave of global 21st-century entrepreneurs who last year registered over 100 million new companies and who are all being powered by the same new democracy, ... Today, technology is becoming increasingly ubiquitous, and as it does so, new social, mobile and cloud-based technology platforms are helping people around the world collaborate and communicate more easily and quickly than ever before, find funding, information and expertise faster than ever before, and create, build, distribute and sell new products and services faster than ever before.


Customer-obsessed technology platforms: If you don't know, you're doing IT wrong
Unfortunately this is a terrible way to create applications, regardless of whether it's on the web, mobile, or any other emerging digital channel. The data is good, but we cannot start with our data in mind -- instead we must start with our customers' needs in mind. But why this change, and why now? Our customers (and increasingly our employees) are being presented with many more options from our competitors, both those known today and tomorrow's digital startups. Simply put, the barrier to creating new software solutions is approaching zero. Making this transformation is central to the BT Agenda -- applying technology to win, serve, and retain customers.


The Secret to Data Lake Success: A Data First Strategy
So why is the data warehouse failing to deliver on these requirements? The organization spent a lot of time and money to create a “slice and dice” environment that should give the business what it needs. Unfortunately, in today’s environment, accounting for every question in one model is impossible. New data sources are emerging at a breakneck pace. New questions are sprouting up even faster. A highly engineered environment that only takes the data it needs upfront is going to have difficulty adapting to rapidly changing requirements. Faced with the complexity of data loading and transformation processes, as well as a highly intricate data model, the average data warehouse change takes nine months and costs over $1 million to complete. When you build a complex system with one purpose in mind, don’t expect agility.



Quote for the day:

“If you cannot do great things, do small things in a great way.” -- Napoleon Hill

June 01, 2015

5 ways to find and keep customer-focused IT pros
One option is to bring in workers with industry experience who can take on business-focused IT roles. ... Among people with those types of backgrounds, he said he looks for “attitude and passion to go after unsolved problems.” Another way to find people who will excel at working with customers is to get a sense of how they would solve a business problem with technology. Aaron Gette, CIO of Bay Clubs, a luxury fitness and country club company, said he cares less about titles and hot IT roles and more about intangible qualities. “I’m looking for nontraditional IT people. They like to talk to people, not just on social media, but actually socializing and being involved in initiatives,” he said. “They need to be involved in member forums and understand what’s working with our programs.”


Organisations are changing how they spend their cyber security budget
“Firms are coming to terms with the inevitability of a cyber breach,” said Duncan Brown, research director at PAC. “Rather than spending a majority of security budget on prevention, firms will apply a more balanced approach to budgeting for cyber attacks.” ... It’s vital that organisations find the right balance between prevention and response. An organisation that puts all its eggs in one basket and solely spends on prevention will find itself in a tough situation when it inevitably suffers a breach; ditto for those that spend solely on response. To find the right balance, organisations need to implement a framework that combines prevent and protect, and detect and respond – and enables them to work together.


Deep Learning Catches On in New Industries, from Fashion to Finance
“One of the things Baidu did well early on was to create an internal deep learning platform,” Ng said. “An engineer in our systems group decided to apply it to decide a day in advance when a hard disk is about to fail. We use deep learning to detect when there might’ve been an intrusion. Many people are now learning about deep learning and trying to apply it to so many problems.” Deep learning is being tested by researchers to glean insights from medical imagery. Emmanuel Rios Velazquez, a postdoctoral researcher at the Dana-Farber Cancer Institute in Boston, is exploring whether deep learning could help to more accurately predict a patient’s outcome from images of his or her cancer.
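
Baidu's platform is internal, so the sketch below only illustrates the general shape of the disk-failure idea: learn from labeled telemetry (here, fabricated SMART-style counters) to flag drives likely to fail within a day. A simple logistic regression stands in for the deep network, and every number is invented.

    # Toy stand-in for failure prediction; data and features are fabricated.
    from sklearn.linear_model import LogisticRegression

    # features: [reallocated sectors, pending sectors, read-error rate]
    X = [[0, 0, 1], [2, 0, 3], [180, 40, 90],
         [220, 35, 70], [1, 0, 2], [160, 20, 85]]
    y = [0, 0, 1, 1, 0, 1]  # 1 = failed within 24h of the reading

    model = LogisticRegression().fit(X, y)
    print(model.predict([[150, 30, 60]]))  # flag for pre-emptive replacement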


Private Cloud: Insurers' Secure Solution
“The insurance industry has often been slower to adopt public cloud than many other industries because of regulations of how data needs to be managed,” says Jeffrey Goldberg, vice president of research and consulting at Novarica. “Some of those are legitimate concerns about data security and some of it is also fear-based — they would like to get the advantages of public cloud but want to maintain control.” Private cloud models — which are either implemented on-premises behind a corporate firewall, or off-premises but within the client’s firewall and dedicated solely to the client — have begun to address insurers’ security concerns. This sector is growing fast: Research firm Technology Business Research forecasts 35% growth in the private cloud sector in 2015.


The Internet of Things Will Give Rise To The Algorithm Economy
Data is the oil of the 21st century. But oil is just useless thick goop until you refine it into fuel. And it’s this fuel – proprietary algorithms that solve specific problems and translate into actions – that will be the secret sauce of successful organizations in the future. Algorithms are already all around us. Consider the driverless car. Google’s proprietary algorithm is the connective tissue that combines the software, data, sensors and physical asset into a true leap forward in transportation. Consider high-frequency trading. It’s a trader’s unique algorithm that drives each decision and generates higher returns than their competitors’, not the data that it accesses. And while we’re talking about Google, what makes it one of the most valuable brands in the world? It isn’t data; it’s their most closely guarded secret, their algorithms.


Q&A with Benjamin Wootton on DevOps Landscape in 2015
DevOps from an automation perspective is now a big enough field to be a specialist area. There are so many tools, and the space is moving so quickly, that people can concentrate on it full time and deliver competitive advantage to their businesses. Having a team of people working with these tools on these types of activities can really work and help all of the developers and testers go faster, leveraging up their value to the business. I like to see senior engineers in this DevOps team who bring a DevOps mindset and career experience across dev, test and operations. They can then go out and coach other staff members onto the central automation platform, ideally giving those teams an increasing amount of ownership of the automation.


Cloud computing more about agile development than cost
"For CIOs, the message is clear: Shift into the driver seat, or others will," Forrester said in releasing its cloud forecast. "A lot of enterprises are voting with their budgets and they're adopting cloud across the board," Rymer says Small wonder then, that for many organizations, the first question about the cloud is a settled matter -- not a question of if, but when, and how. ... "The bottom line here is ... cloud is the next platform," Rymer says. "We don't get a lot of questions from clients anymore about whether or not they're going to go to public clouds. It's really how do we get there." So how do they get there? To Forrester, it is essential to bridge the gap between the IT shop and the business lines of an organization.


The enterprise technologies to watch in 2015
The new technologies on the list include a few that aren't well-known but I believe represent either key advances likely to grow in strategic importance (machine learning, data science), or new developments that offer very significant benefits tactically with relatively little effort to realize (containers, instant app composition, machine-to-machine systems). There are also a few long-standing categories which have re-emerged recently as leading areas of technology focus for most organizations, with new approaches, or have actually developed into parallel tracks with different levels of impact, often with a clear separation of efforts within many companies (hybrid cloud and commercial public cloud, for example). I've also consolidated some of last year's items, as explained above.


Salesforce teams up with Google and others to break down big data tech barriers
“Salesforce Wave for Big Data connects the Analytics Cloud to the industry’s most comprehensive ecosystem of big data innovators. Now every company can extend any data source to business users to transform every customer relationship,” he added.  Google’s contribution will tackle the volume piece of the big data equation by allowing users to run advanced queries on their datasets, while Cloudera will provide users with a centralised hub where their information can be stored and analysed securely. Meanwhile, New Relic’s software analytics platform is being introduced to tackle velocity, by providing users with a means of deriving real-time information about the performance of a company’s web and mobile apps.


AI Supercomputer Built by Tapping Data Warehouses for Their Idle Computing Power
Data centers often have significant numbers of idle machines because they are built to handle surges in demand, such as a rush of sales on Black Friday. Sentient has created software that connects machines in different places over the Internet and puts them to work running machine-learning software as if they were one very powerful computer. That software is designed to keep data encrypted as much as possible so that what Sentient is working on–perhaps for a client–is kept confidential. Sentient can get up to one million processor cores working together on the same problem for months at a time, says Adam Beberg, principal architect for distributed computing at the company. Google’s biggest machine-learning systems don’t reach that scale, he says.
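
Sentient's software is proprietary, so the snippet below is only a toy stand-in for the scatter-gather pattern it describes: farm independent evaluation tasks out to otherwise-idle workers and collect the results. Local processes play the role of machines scattered across data centers.

    # Toy scatter-gather; ProcessPoolExecutor stands in for remote idle machines.
    from concurrent.futures import ProcessPoolExecutor

    def evaluate(candidate):
        # placeholder for a real machine-learning evaluation task
        return candidate, sum(i * i for i in range(candidate))

    if __name__ == "__main__":
        candidates = range(10_000, 10_010)
        with ProcessPoolExecutor(max_workers=4) as pool:  # the "idle" capacity
            for cand, score in pool.map(evaluate, candidates):
                print(cand, score)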



Quote for the day:

“No great manager or leader ever fell from heaven; it’s learned, not inherited.” -- Tom Northup