January 11, 2016

RedMonk analysts on how best to navigate the tricky path to DevOps adoption

It's the idea that Hilton International or Marriott would be worrying about Airbnb. They weren’t thinking like that. Or transport companies around the world asking what the impact of Uber is. We've all heard that software is eating the world, but what that basically says is that the threats are real. We used to be in an environment where, if you were a bank, you just looked at your four peer banks and thought that as long as they don’t have too much of an advantage, we're okay. Now they're saying that we're a bank and we're competing with Google and Facebook. Actually, the tolerance for stability is a little bit lower than it was. I had a very interesting conversation with a retailer recently. They talked about the different goals that organizations have.


A disaster recovery/business continuity plan for the data breach age

The need to manage and protect both business and personal data (as clearly differentiated from the software) has never been more important. A disaster recovery/business continuity plan that does not account for our dependence on data puts the enterprise, its employees and customers at risk. ...
A good disaster recovery/business continuity (DR/BC) plan is not an IT plan; it is a business plan that has significant IT components. As discussed above, more and more focus needs to be placed upon data recovery, beyond ensuring that programs and processes are returned to operational status. The plan should be scenario-based and aligned to the likelihood of varying levels and types of risks as specified by documented business impact analyses and business risk assessments.


Why the customer is not always right

There are two fatal flaws in this model, both having to do with managing expectations. First, clients need to understand that they are unlikely to get every deliverable without some compromise – particularly in custom software, where nobody knows exactly what’s involved until the project is more than half done. Second, the project lead on the consultant side must actively manage expectations during every client meeting. If the project lead on the client side is weak – technically or politically – s/he will not successfully propagate the realities of prioritization and negotiation to executives in the client organization. This means the project is in trouble before it starts … and, worse, the trouble can be totally invisible to the client until it’s way too late.


How tech giants spread open source programming love

Programming languages and technologies that were developed by industry and Internet giants – specifically to meet the unique challenges they faced operating at massive scale – have been open sourced and are now being adopted by regular-sized enterprises for everyday use. Part of the reason for this is a natural technology trickle-down effect, according to Mark Driver, a research director at Gartner. "Today's leading edge super high tech is tomorrow’s standard product," he says. "Also, large companies (like Google and Facebook) understand the collaborative nature of open computing and the dynamics that drive the Internet. So it's natural that they share these technologies and strengthen the industry around them."


Six Transformations From 2015 That Will Reshape The World

Looking at the list of finalists for the Crunchies, you could get the impression that the greatest advances of 2015 were sharing and delivery apps, software platforms, and pencils. Yes, these are cool. But much bigger things happened last year. A broad range of technologies reached a tipping point, from science projects or objects of convenience for the rich, to inventions that will transform humanity. We haven’t seen anything of this magnitude since the invention of the printing press in the 1400s. And this is just the beginning. Starting in 2016, a wider range of technologies will begin to reach their tipping points. Here are the six amazing transformations we just saw.


Britain is on the verge of an IT crisis

This shortage will boil over in the coming years as a generation of IT workers, who built the systems and databases that still power critical functions, begin to retire. This is especially worrying in finance, where large institutions, which have repeatedly merged and sold off parts of their businesses, have back-end systems that have been hastily thrown together. As those that created them leave the workforce, disasters will be more difficult and take longer to recover from. Companies have responded to the problems with hiring IT workers by outsourcing more work. But having done this, says Tate, many have made poor decisions, found contractors to be inadequate, and moved operations back inside. The alternative is simply to pay more for the best talent, but a swell in demand across the board is making this increasingly expensive.


3 Guiding Principles for Innovation in Managed Services

We simplify what has become complicated, we create dashboards of the automation and single-pane-of-glass displays of the coordinators, and we start the cycle over again. It sure seems a little reversed to me. Am I issuing a wake-up call to our industry? Absolutely! I have begun to initiate brainstorming sessions with colleagues that challenge the status quo. Our technology is now using Fully Automated Storage Tiering, multiple alerting consolidation engines, automatic load balancing, pooled resource rebalancing, and the list goes on and on. It is fantastic and exciting beyond belief to talk about, explore, and work with these technologies. However, I am involved in services. We are the pilots of the automation, and we must aviate, navigate and communicate our way through the technology hierarchy.


The Dark Side of The Wearables

As wearable devices make their way into the workplace and corporate networks, they bring a host of security and privacy challenges for IT departments and increase the amount of data that data brokers have to sell about an individual. Jeff Jenkins, chief operating officer and co-founder of APX Labs, talked about the security and privacy of wearables during a panel interview with Tech Pro Research at CES 2015. Because wearable devices are designed to be small and portable, Jenkins said, "you have to make sure you're thinking security first and you're thinking about the information that's being generated by them. You have situations where it's no longer just personal data that may be exposed or compromised, but also potentially operational data, that could be sensitive in nature."


The Emerging Data Design: Bitemporal Data

Simply defined, bitemporal data means storing current and historical data, corrected and adjusted data, all together in the same place. Bitemporal means you are using two time dimensions simultaneously – one to represent business versions and one for corrections. For example, let’s say you have a database table of customers; in a bitemporal world, you would store changes (versions) of the customer’s data, over time, as well as any corrections, as new rows in the same table. Customer data changes include attributes like the customer’s name, address or buying preferences. Corrections (some people like to call it adjustments) represent restatements of data that people or systems make to record the right value. Human typing errors or software errors create data that may get corrected.
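The two time dimensions described above can be sketched in ordinary SQL. The schema, column names, and data below are illustrative assumptions, not from the article: one pair of columns (`valid_from`/`valid_to`) tracks business versions, and a second pair (`recorded_at`/`superseded_at`) tracks corrections, so the same table can answer "what did we believe then?" as well as "what was actually true then?".

```python
import sqlite3

# Toy bitemporal customer table (names invented for illustration).
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE customer (
        id            INTEGER,
        name          TEXT,
        city          TEXT,
        valid_from    TEXT,  -- when this version became true in the business
        valid_to      TEXT,  -- when it stopped being true ('9999-12-31' = open)
        recorded_at   TEXT,  -- when we recorded this row
        superseded_at TEXT   -- when a correction replaced this row
    )""")

rows = [
    # Original entry: typo in the city name.
    (1, "Ada Lovelace", "Lonodn", "2015-01-01", "9999-12-31",
     "2015-01-01", "2015-06-01"),
    # Correction: same business version, spelling fixed.
    # (A full implementation would also record the closing of valid_to
    # as its own correction; kept simple here.)
    (1, "Ada Lovelace", "London", "2015-01-01", "2016-01-01",
     "2015-06-01", "9999-12-31"),
    # Business change: the customer moved, a genuinely new version.
    (1, "Ada Lovelace", "Paris",  "2016-01-01", "9999-12-31",
     "2016-01-05", "9999-12-31"),
]
con.executemany("INSERT INTO customer VALUES (?,?,?,?,?,?,?)", rows)

def as_of(valid_date, record_date):
    """City that was true at valid_date, as recorded at record_date."""
    cur = con.execute("""
        SELECT city FROM customer
        WHERE valid_from <= ? AND ? < valid_to
          AND recorded_at <= ? AND ? < superseded_at""",
        (valid_date, valid_date, record_date, record_date))
    return cur.fetchone()[0]

print(as_of("2015-03-01", "2015-03-01"))  # Lonodn (what we believed then)
print(as_of("2015-03-01", "2016-02-01"))  # London (corrected history)
print(as_of("2016-02-01", "2016-02-01"))  # Paris  (new business version)
```

Querying along both axes is what distinguishes a correction (same valid period, new recording) from a version (new valid period).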


DDoS: 4 Attack Trends to Watch in 2016

Most businesses are ill-prepared for DDoS attacks, which is why it costs them so much to recover, Meyerrose says. The cost of recovering from a DDoS attack can be more than $50,000 for small businesses, he notes, quoting data from security firm Kaspersky Lab. That cost includes business lost to downtime and technology expenses and investments associated with site recovery. So what can be done to defend against the growing DDoS threat? "My main strategy for defense would be making sure I could quickly detect and block all types of DDoS attacks, e.g. application or network layer, and be able to quickly redirect my users to a backup duplicate, albeit streamlined, site to keep my business running without interruption," Litan says.
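The "quickly detect" half of the strategy quoted above can be illustrated with a naive per-client rate detector. This is a sketch only: the window size and threshold are made-up numbers, and real DDoS defense inspects far more signals (and volumetric floods must be absorbed upstream, not in the application).

```python
from collections import defaultdict, deque

# Made-up thresholds for illustration only.
WINDOW_SECONDS = 10
MAX_REQUESTS = 100

hits = defaultdict(deque)  # client_ip -> timestamps of recent requests

def allow(client_ip, now):
    """Return True while the client stays under the sliding-window limit."""
    q = hits[client_ip]
    # Evict timestamps that fell out of the window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    q.append(now)
    return len(q) <= MAX_REQUESTS

# A client firing 200 requests in one second trips the limit partway through.
verdicts = [allow("203.0.113.9", t * 0.005) for t in range(200)]
print(verdicts[0], verdicts[-1])  # True False
```

On detection, an operator would then apply the second half of the quoted strategy: block or rate-limit the offender and fail users over to the streamlined backup site.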



Quote for the day:


"Once we rid ourselves of traditional thinking we can get on with creating the future." -- James Bertrand


January 10, 2016

Open Source as a Driver of Internet of Things

The zero entry barrier provided by the use of open source, with several toolkits, libraries, and open source hardware like Arduino and Raspberry Pi, is the foundation for it turning up in small devices sprinkled all over the globe, from home security to energy management systems, from automobile telematics to health monitors. Because open source helps lower the cost of the device itself, companies can now experiment and stitch together solutions that would otherwise have been ignored because they would have required upfront purchasing of expensive licenses for development tools and environments, specific libraries and software components. Open source is a very effective way to ride the IoT wave at high speed while keeping the risks and costs to do so under control.


Cisco's global cloud projections may blow your mind

Annual global cloud IP traffic is expected to reach 8.6 ZB by the end of 2019, up from 2.1 ZB per year in 2014. In an interesting glimpse into how new technologies are helping drive efficiencies in spite of this massive increase in traffic, networking technologies such as SDN and NFV are expected to streamline data center traffic flows such that the traffic volumes reaching the highest tier (core) of the data center may fall below 10.4 ZB per year, and lower data center tiers could carry over 40 ZB of traffic per year. In terms of how this traffic looks on a regional basis, perhaps unsurprisingly North America will have the highest cloud traffic volume (3.6 ZB) by 2019, followed by Asia Pacific (2.3 ZB) and Western Europe (1.5 ZB). North America will also have the highest data center traffic volume (4.5 ZB) by 2019, followed by Asia Pacific (2.7 ZB) and Western Europe (1.8 ZB).
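The headline numbers imply a striking growth rate: going from 2.1 ZB in 2014 to 8.6 ZB in 2019 works out to roughly a 33% compound annual growth rate, as a quick check confirms.

```python
# Implied compound annual growth rate for global cloud IP traffic,
# from 2.1 ZB in 2014 to 8.6 ZB in 2019 (five years).
start, end, years = 2.1, 8.6, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 32.6% per year
```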


What makes a great company? Let’s talk information flow

The flow of information between employees is important across all levels and titles. Too often, executive teams hold intelligence close to their chest in fear of having competitive knowledge or financial earnings exposed outside of the company. We want to lead by example — and transparency and trust are huge components. With that goal in mind, we host a Datameer Radio session each month so everyone can get an update on the company and participate in a candid Q&A with the executive team. We’ve found that not only do our employees respect the confidentiality of the information that is shared, but also knowing what is going on strengthens their commitment to being a part of helping us grow. It’s clear that the workplace is in need of disruption with new models of motivation to drive inspiration and enhance well-being.


Internet Of Things Extends Business' Ability To Sense And Respond

Call it fallout from the Google effect, he explains. “One of the things that Google and search has done for us is it has infinitely expanded the capacity of the human memory,” says Hoover. “I don’t have to memorize all the facts in the world. I can go out and look them up if I want to learn about reinforcement learning. It’s expanded my brain, my memory to nearly infinite capacity.” By analogy, the billions of sensors across the planet are expanding our awareness of our surroundings. The Internet of Things is “about Googling reality,” Hoover explains. “I see things, I hear things, I sense the world around me. To sense something at the time it occurs, it no longer has to be near my body. I want to understand the state of pollution in Beijing; I go and find it on the internet.”


Dutch government says no to 'encryption backdoors'

The Netherlands began reviewing its policies after the recent Paris terrorist attacks. But this week it said "restrictive" measures would put citizens at risk. Encryption is a way of protecting communications or data so that it is incomprehensible without the correct passcode or key. Advocates say it protects users by preventing criminals and spies from prying into private conversations. But security agencies have said they struggled to bypass encrypted messaging platforms used by groups such as so-called Islamic State to plan attacks. "We are not some kind of maniacs who are ideologues against encryption," FBI director James Comey said in November: "But we have a problem that encryption is crashing into public safety and we have to figure out, as people who care about both, how to resolve it."


Big Data Security and Compliance Issues in the Cloud

It’s a quandary. Businesses want to be able to conduct deep, flexible analytics on complete data sets. That’s the essence of big data. You don’t want to omit any data that might contribute to finding business-facing insights. You want the cloud for flexibility and economics. But, you also don’t want to run afoul of compliance regimens or increase your exposure to security risks. What can you do? Don’t worry. As I said, it can be worked out. Getting on top of public cloud big data security and compliance challenges takes effort on two fronts. First, there has to be a coherent, disciplined set of data governance policies at work in the cloud. Platforms also matter. The two work together, with the platform enabling the definition and enforcement of governance policies.


Symantec Adds Deep Learning to Anti-Malware Tools to Detect Zero-Days

Until recently, deep learning had been locked away in the software development labs. A few companies have realized that they can spot malware by its components and its behavior, ferreting out most zero-day attacks before they have a chance to cause damage. Because of this, deep learning is now being deployed on the cyber-security battleground. ... Symantec has its sights set on bigger goals in the enterprise. The next target will be enterprise email, especially cloud-based email. "We process a lot of the world's email," Gardner said. "A lot of attacks enter the enterprise through email. They're insidious." He said that by attacking company email systems, cyber-criminals are able to seize critical information and, in addition, steal a lot of money through phishing schemes that install malware on company networks.


The DIFA Framework for evaluating data science projects

A situation that many CIOs or data science departments face today is that the list of possible analytics projects is seemingly endless. The range of project candidates usually begins with analyzing customers and ends somewhere around utilizing social media data. Needless to say, not all project candidates will make sense from a business and, especially, an ROI perspective. Also, some projects will run into dead ends because some fundamental bits turn out to be missing. Even though experimentation and some vagueness about eventual monetary success are normal in the context of data science projects, there are some hard facts that heavily influence the success of a data science project. These facts are structured in the DIFA framework, which is explained below.


Tracking Cloud Services: An Essential Security Step

Not knowing who's responsible within an enterprise for managing cloud services contracts could result in the inventorying of cloud services falling through the cracks. "The decentralized procurement model of cloud creates situations where individuals and business units may use a cloud service outside of the purview of the central IT organization," says Jim Reavis, CEO of the Cloud Security Alliance, a not-for-profit that promotes use of cloud security best practices. "Organizations fail to inventory their cloud services and other cloud-accessible devices because they fail to appreciate that cloud computing is not a technology decision," says Kevin Jackson, founder of the cloud computing consultancy GovCloud Network.


How Digital Disruptors use Data Science

Digital disruptors are fast and relentless. They are constantly releasing new functionality. They try things; they experiment. In order to do this, digital disruptors need a feedback loop. They use the data from their customers’ use of their product to get fast and accurate feedback on how these products are being used. Let’s look at examples of how they do this using data science. On race days, NASCAR analyzes all the tweets and fan site activity relating to the race. It uses data science to bucket this human interaction data into topics, and for each topic it automatically determines sentiment. It then addresses any concerns through information on its fan site, information at the event, or feeds to its broadcast partner, Fox Broadcasting.
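The topic-bucketing and sentiment step described above can be shown with a deliberately tiny keyword-matching toy. The topic names, keyword lists, and sample messages are all invented; NASCAR's actual pipeline would use far more sophisticated NLP models.

```python
# Toy topic bucketing plus per-topic sentiment scoring (all data invented).
TOPICS = {
    "concessions": {"food", "beer", "line", "concession"},
    "racing":      {"crash", "lap", "driver", "pit"},
}
POSITIVE = {"great", "love", "awesome"}
NEGATIVE = {"terrible", "slow", "hate", "long"}

def analyze(messages):
    """Assign each message to matching topics and accumulate sentiment."""
    scores = {topic: 0 for topic in TOPICS}
    for msg in messages:
        words = set(msg.lower().split())
        sentiment = len(words & POSITIVE) - len(words & NEGATIVE)
        for topic, keywords in TOPICS.items():
            if words & keywords:
                scores[topic] += sentiment
    return scores

tweets = [
    "the concession line is terrible and slow",
    "love that last lap great driver",
]
print(analyze(tweets))  # {'concessions': -2, 'racing': 2}
```

A negative score on a topic like "concessions" is the kind of signal that would trigger the follow-up actions the article mentions (fan-site information, on-site fixes, broadcast feeds).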



Quote for the day:



"Technological change is not additive; it's ecological. A new technology does not merely add something; it changes everything." -- Neil Postman


January 09, 2016

Antivirus software could make your company more vulnerable

Antivirus vendors don't seem too concerned about the potential for widespread attacks against their consumer products. For the most part, researchers agree that such attacks are unlikely for now because typical cybercriminal gangs have other, more popular, targets to attack such as Flash Player, Java, Silverlight, Internet Explorer or Microsoft Office. However, the creators of those widely used applications have increasingly added exploit mitigations to them in recent years, and as more people update to newer and better protected versions attackers might be forced to find new targets. Therefore, future attacks against antivirus products used by tens of millions or hundreds of millions of consumers can't be ruled out, especially if cybercriminals get their hands on previously unknown -- zero-day -- vulnerabilities, as they have done from time to time.


XL Catlin Analytics Strategy: Quality Over Quantity

Getting internally and externally sourced data ready for modeling is the bulk of the work, she explains. “Coming up with a cohesive data set probably takes four times longer than creating the model,” she says. “I would say we spend 45 percent of our time on the data, 10 percent on the model, and 45 percent on change management.” The data is housed in SAS and SQL databases, linked through ODBC connections. The team writes code in the SAS programming language to run the data analysis, relying on a code library. SQL and R are also used to a lesser extent: SQL to extract data from source systems, and R, a language and environment for statistical computing and graphics, for exploratory data analysis.


The transition from cloud back to a data center migration

Groupon began its move out of the cloud in 2011, three years after the online deal website was launched. "The biggest driver was cost," Chatha said. "It was not economically feasible for us to stay in the cloud." A motivating factor for some companies to move to the cloud is the ability to pay for it as an operating expense, rather than a capital expense. Today, base rent and utilities payments for colocation space can be treated as Opex, and hardware can be financed and paid as Opex, too, he said. Groupon's needs are more diverse than those of Netflix, for example, which has generated headlines about its complete move to Amazon Web Services (AWS). Netflix's greatest need is for storage, Chatha said, while Groupon needs everything, from virtual machine hosts to databases.


One-on-One Coaching Misses the Mark

Traditional coaching works with an executive one-on-one and helps him find new approaches. Believing this approach too limited, we facilitated a meeting with the executive and his team to share the feedback we had gathered. This eliminated secrecy and impressed the team. The executive had made himself vulnerable, and the team began thinking about how they could help him. Then we moved the conversation away from the executive to how the team could improve. It began discussing how better to define its collective goals, redesign meetings to make them more productive, and address issues before they became problems. The challenges the team identified, and the solutions proffered to improve performance, never would have emerged in private, one-on-one coaching sessions.


DDoS attack on BBC may have been biggest in history

A group calling itself New World Hacking said that the attack reached 602Gbps. If accurate, that would put it at almost twice the size of the previous record of 334Gbps, recorded by Arbor Networks last year. "Some of this information still needs to be confirmed," said Paul Nicholson, director of product marketing at A10 Networks, a security vendor that helps protect companies against DDoS attacks. "If it's proven, it would be the largest attack on record. But it depends on whether it's actually confirmed, because it's still a relatively recent attack."


Banks, don’t wait for your competition to become data driven

First, the upside of leveraging the potential of data science and analytics and developing data-driven business models is not only a measure to increase internal process efficiency but, especially, a way to attract customers and maintain a sustainable business. Second, the risk of a “sit tight and wait” strategy is truly suicidal. Establishing a data-driven business culture cannot be done overnight and needs time for training and developing people, leaving aside the effort and time needed to choose and set up the systems and infrastructure. Recall how Google disrupted the search industry. Yahoo, Lycos and all these almost forgotten dinosaurs could never catch up or come even close to Google’s success after they had been disrupted.


The Dying Technologies Of 2016

Thinking of antique technologies, vinyl has made a comeback but CDs, DVDs, and Blu-Ray? They’re all marching to the media graveyard. Today, we stream everything we can. I still buy and own CD and DVD players, but I’m an old guy. Also, call me a Luddite, but I like having my music, videos and books in my hand, not in some distant cloud. There aren’t many of us left. Fewer and fewer PCs and laptops come with a CD/DVD player. We used to use CD/DVD drives to install software too. I rarely do that anymore. That’s not just because we download almost all our software today. It’s also because stand-alone PC software is on its way out. Accounting, office suites, customer-relationship management — you name it, we do it on the cloud now.


Project Alignment, Hiring Shortfall Top 2016 Big Data Challenges

This isn’t a new problem. Looking back even a few years ago, when it became clear that data was essentially currency, people predicted significant shortfalls in data scientists. In 2015, companies throughout the industry felt the sting acutely. Every day, new job postings go up looking for qualified data scientists. It takes time to find candidates, and not every data genius is the perfect fit for every company. There may be some relief coming, with specialty certificate and alternative education programs for big data popping up from universities and other educational institutions, but it’s not an immediate fix for 2016. If companies want to fill their teams with more data scientists, my advice is to hire people in accordance with the nature of the problems they want to solve; not all problems require advanced data science.


Why your cyber insurance investment may not pay off

If you are considering cyber insurance, you are in my opinion doing the right thing. The cost of a data breach can be staggering, and many small and medium companies suffering one will not even survive. That being said, the purchase of a policy without establishing and following appropriate information security policies and procedures may well be a waste of money. Attorney Eran Kahana, a guest on episode 172 of the Down the Security Hole podcast, put it quite simply: "If you don't do security well, the courts will kill you." Since a strong security posture is necessary anyway to protect your business, the ability to meet the requirements for cyber insurance is just a bonus. The following are some of the general things you will need to have in place prior to seeking insurance.


The Search For The Killer Bot

As 2016 dawns, there’s a sense in Silicon Valley that the decades-old fantasy of a true digital assistant is due to roar back into the mainstream. If the trend in past years has been assistants powered by voice — Siri, Alexa, Cortana — in 2016 the focus is shifting to text. And if the bots come, as industry insiders are betting they will, there will be casualties: with artificial intelligence doing the searching for us, Google may see fewer queries. Our AI-powered assistants will manage more and more of our digital activities, eventually diminishing the importance of individual, siloed apps, and the app stores that sell them. Many websites could come to feel as outdated as GeoCities pages — and some companies might ditch them entirely.



Quote for the day:


"Deal with the world the way it is, not the way you wish it was." -- John Chambers


January 07, 2016

Connecting Big Data Project Management with Enterprise Data Strategy

Ideally a portfolio of projects will support an organization’s strategic plan and the goals or missions the organization is charged with pursuing. We may also need to “get tactical” by delivering value to the customer or client as quickly as possible, perhaps by focusing on better-controlled and better-understood product centric data early on via a “data lake” approach. Doing so will be good for the customer and will help create a relationship of trust moving forward. Such a relationship will be needed when complications or uncertainties arise and need to be dealt with. In organizations that are not historically “data centric” or in organizations where management and staff have a low level of data literacy, an early demonstration of value from data analysis is especially important. 


Hybrid Cloud, Microservices and The API Economy: Looking To 2016

The drive toward cloud and the drive toward hybrid environments. If you look back, containers are not surprising because the need for portability became very critical. We ended up with a truly hybrid environment. Along with that, you see this movement towards an API Economy, a movement towards microservices, the movement of DevOps, of rapid transmission, of rapid delivery of small batches of changes–all those made containers very attractive. To me, not only is this something we’ll look back on and say 2015 is where the traction began, but it’s going to gain even more traction and transformation in 2016.


Evaluating your need for a data warehouse platform

A data warehouse platform is typically based on a relational DBMS, and the data in it is structured and generally originates from an organization's operational and transactional systems. Data warehouses are accessed by business executives and analysts using BI dashboards, OLAP and reporting tools, and ad hoc SQL queries. Big data analytics, on the other hand, is typically supported by nonrelational technologies such as Hadoop, Spark and NoSQL DBMSes. The data can be both structured and unstructured, and can originate from every type of internal system plus external data sources, such as social media. Analytics are performed on big data for discovery and insight.
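The ad hoc SQL access pattern mentioned above is easy to illustrate. The fact table, column names, and figures below are invented; this is just the shape of query an analyst might fire at a warehouse (shown here against an in-memory SQLite database for self-containment).

```python
import sqlite3

# Invented warehouse-style fact table for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, quarter TEXT, revenue REAL)")
con.executemany("INSERT INTO sales VALUES (?,?,?)", [
    ("EMEA", "2015-Q4", 120.0),
    ("EMEA", "2015-Q4", 80.0),
    ("APAC", "2015-Q4", 95.0),
])

# An ad hoc aggregate query of the kind a BI analyst might run.
results = list(con.execute("""
    SELECT region, SUM(revenue) AS total
    FROM sales
    WHERE quarter = '2015-Q4'
    GROUP BY region
    ORDER BY total DESC"""))
for row in results:
    print(row)
# ('EMEA', 200.0)
# ('APAC', 95.0)
```

This kind of structured, schema-driven aggregation is what the relational warehouse optimizes for, in contrast to the exploratory processing of raw, mixed-format data on the big data side.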


Best practices in HDFS authorization with Apache Ranger

Apache Ranger offers a federated authorization model for HDFS. The Ranger plugin for HDFS checks for Ranger policies, and if a policy exists, access is granted to the user. If a policy doesn't exist in Ranger, then Ranger defaults to the native permissions model in HDFS (POSIX or HDFS ACLs). This federated model applies to the HDFS and YARN services in Ranger. ... The federated authorization model enables customers to safely implement Ranger in an existing cluster without affecting jobs that rely on POSIX permissions. We recommend enabling this option as the default model for all deployments. Ranger's user interface makes it easy for administrators to find the permission (Ranger policy or native HDFS) that provides access to the user.
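The federated decision flow described above can be sketched as a small decision function. The data structures and function names here are illustrative assumptions, not the actual Ranger plugin API: Ranger policies are consulted first, and only when no policy covers the resource does the check fall through to native HDFS permissions.

```python
# path -> {user: set of permissions}; an absent path means "no Ranger policy".
RANGER_POLICIES = {
    "/apps/finance": {"alice": {"read", "write"}, "bob": {"read"}},
}

# Native HDFS POSIX/ACL permissions, grossly simplified to per-path grants.
HDFS_NATIVE = {
    "/apps/finance": {"alice": {"read", "write"}},
    "/user/bob":     {"bob": {"read", "write"}},
}

def check_access(user, path, perm):
    """Return (deciding model, allowed?) for the requested access."""
    policy = RANGER_POLICIES.get(path)
    if policy is not None:
        # A Ranger policy exists for this resource: it is authoritative.
        return ("ranger", perm in policy.get(user, set()))
    # No Ranger policy: defer to the native POSIX/ACL model.
    native = HDFS_NATIVE.get(path, {})
    return ("hdfs-native", perm in native.get(user, set()))

print(check_access("bob", "/apps/finance", "write"))  # ('ranger', False)
print(check_access("bob", "/user/bob", "write"))      # ('hdfs-native', True)
```

Returning which model made the decision mirrors what Ranger's UI surfaces to administrators: whether access came from a Ranger policy or from native HDFS permissions.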


Why the Cloud Is Taking Over Traditional IT Systems

Data is flowing over the unsecured public data highway, so security is critical, particularly as more workers switch to remote and mobile work. Infrastructure and applications are exposed to the outside world. At this point in the cloud evolution, most new cloud 2.0 applications are architected specifically for the cloud. This means that performance and response times are better than in the first generation of cloud applications, which were just old client/server applications retrofitted with web interfaces. Around 2013, Moore’s Law started to run into the constraints of the laws of physics. At very small sizes, transistors are less reliable. Consumers of computing power have enjoyed riding the wave of inexpensive computing power in increasingly smaller devices.


Data Center Design: Which Standards to Follow?

Best practices mean different things to different people and organizations. This series of articles will focus on the major best practices applicable across all types of data centers, including enterprise, colocation, and internet facilities. We will review codes, design standards, and operational standards. We will discuss best practices with respect to facility conceptual design, space planning, building construction, and physical security, as well as mechanical, electrical, plumbing, and fire protection. Facility operations, maintenance, and procedures will be the final topics for the series. Following appropriate codes and standards would seem to be an obvious direction when designing new or upgrading an existing data center.


Design Thinking Is Taking Hold At IBM

IBM's adoption of Design Thinking is important for the company's sheer market heft, but there is another reason, said Coleman. "IBM exists in the gap between the reality of a situation ('We've always succeeded this way.') and what could be," he said. "Design Thinkers say, 'Sure, that's great, but I have a vision,'" Coleman said. "Most people fear [a vision] because there's risk involved." He said that if Design Thinking succeeds at IBM, a company that for decades has typified how big business in the US works, large numbers of companies will likely follow suit. Indeed, IBM's Cutler said the company gives tours of its studio in Austin three times a week. Pushback still happens among employees and customers, of course. "Anytime something smells like a new process," Cutler admitted, some people are going to get defensive.


The 10 biggest startup opportunities in 2016

This year, venture capitalists and industry observers say the tech world should expect more of the same. "Most hot startups in 2016 won't be trying to lead revolutions or usher in whole new industries," says Igor Shoifot, an investment partner with TMT Investments. "Instead, they'll be enhancing existing technologies, products, services, or transactional ecosystems by saving users time, money, effort, and helping them make better choices more easily." However, the New Year has a few potential technology surprises in store, including the "Uberization" of manufacturing and mobile ecommerce in emerging markets. Here are 10 of the hottest technology startup categories, trends and opportunities (ranked in no particular order) experts expect to see in 2016.


How Goldman Sachs and Bank of America use the cloud and containers

For Thomas, containers represent a way to get the company’s developers and infrastructure workers to focus on the highest-value work. Too much time is spent on managing middleware systems and messaging buses that don’t add value for the bank. “Simplifying that and really flipping the ratios of people who are just maintaining, supporting, managing applications, to people who are pushing the applications forward and bringing more value for our customers is the foundation of the goal,” he says. “It’s not about cost reduction, it’s about reinvesting the people and the talent we have in really business-value-added things for our customers.” Simplification means consolidation too. Thomas says Bank of America has condensed from 64 data centers last year to 31 this year. It plans to have only eight data centers by the end of 2016.


Governance Challenges When Gatekeepers are “Chilled”

The primary board concern is that for certain potentially controversial initiatives, some gatekeepers may become “gun-shy”; i.e., they may engage in self-protective conduct that frustrates valid board strategic initiatives and other appropriate efforts. This, despite the fiduciary or employment risks a gatekeeper may assume by acting in what may be perceived as his/her own interests, as opposed to the legitimate business interests of the company. Note that this is a concern separate and distinct from the concern, expressed by some knowledgeable observers, that the new DOJ policy will have a chilling effect on employees’ willingness to cooperate in their companies’ internal investigations. We’re talking here about a different kind of “chill.” Such self-protective conduct may manifest itself in both obvious and subtle ways.



Quote for the day:



"The sharpest criticism often goes hand in hand with the deepest idealism and love of country." -- Robert F. Kennedy


January 06, 2016

5 Predictions for Trends in Data, Analytics and Machine Learning in 2016

Applications will be designed to discover self-improvement strategies as a new breed of log and machine data analytics at the cloud layer, using predictive algorithms, enables continuous improvement, continuous integration and continuous deployment. The application will learn from its users; in this sense, the users will become the system architects, teaching the system what they want and how the system is to deliver it to them. Gartner views advanced machine learning among the top trends to emerge in 2016, with “advanced machine learning, where deep neural nets move beyond classic computing and information management to create systems that can autonomously learn to perceive the world, on their own ... this is what makes smart machines appear 'intelligent.'”


CES 2016: Sneak Peek At Emerging Trends

CES 2016 in Las Vegas came to life for media attendees with a preview event -- CES Unveiled -- on Monday night. The event was set up for vendors to show off their products in hopes of attaining media attention. It also served as a glimpse of the broader products and trends we'll be talking about throughout 2016 and beyond. After walking around a crowded ballroom for a couple of hours checking out gadgets of all shapes, sizes, and functionality, here are the major trends I observed that I think are likely to have long-term impact on our lives and businesses.


EU privacy watchdog to set up ethics advisory group

Outlining his plans for an ethics advisory group, Buttarelli said the group will “advise on a new digital ethics that allows the EU to realise the benefits of technology for society, whether for security or economic reasons, in ways that reinforce the rights and freedoms of individuals while retaining the value of human dignity”.  Buttarelli said that as the understanding that dignity is important spreads, people will want more opportunities to protect their privacy. “But we also need to be clear about exchanging personal data for incentives, whether those incentives relate to increased security or consumer benefits,” he said. According to Buttarelli, the internet has evolved such that the tracking of people’s behaviour has become routine for many intelligence agencies and an essential revenue stream for some of the most successful companies.


How IBM's Watson Takes On The World

Cognitive computing, according to Vice President of IBM Watson Steve Gold, essentially marks the arrival of a new “era” in computing. What started with his own company’s development of tabulation computing, to process US census data at the dawn of the 20th century, developed into programmatic computing in the middle of the century, with the arrival of transistors, relational databases, magnetic storage and eventually microprocessors. Now, the enormous growth in unstructured data we have experienced in recent years, and the sophisticated methods that have been developed to help us make sense of, understand and learn from this data, have given rise to cognitive computing. Cognitive computers don’t need to be programmed – they can learn for themselves.


Manage Your Emotional Culture

This playful spirit at the top permeates Vail. Management tactics, special outings, celebrations, and rewards all support the emotional culture. Resort managers consistently model joy and prescribe it for their teams. During the workday they give out pins when they notice employees spontaneously having fun or helping others enjoy their jobs. Rather than asking people to follow standardized customer service scripts, they tell everyone to “go out there and have fun.” Mark Gasta, the company’s chief people officer, says he regularly sees ski-lift operators dancing, making jokes, doing “whatever it takes to have fun and entertain the guest” while ensuring a safe experience on the slopes.


Data centers seek creative skills to drive innovation in IT

Creativity and innovation have more to do with the hierarchical, logical IT world than people may think, said James Stanger, senior director of products for CompTIA Inc., a nonprofit IT industry association involved in training and certifications. Innovation and creativity enhance an IT pro's ability to troubleshoot, design architectures and optimize performance to meet traditionally important metrics, such as reliability, stability and efficiency of IT operations. Stanger calls it the ability to make an informed choice. Combine an ability to see "the spaces between the systems" -- how everything interconnects and works in your environment -- with a deep knowledge of the protocols and procedures in use, and the IT worker can create the best architecture and operations possible for their business, he said.


Apple’s convergence will be about input not interface

It’s not the interface that’s changed, it’s the input. A Surface-style touch keyboard was a given as soon as the iPad Pro was confirmed, but Apple did more than move the keys to a more comfortable position. Snapping a Smart Keyboard to the iPad Pro instantly creates a bridge between the desktop and mobile realms, not just with the quick keystrokes and onscreen shortcut bar, but also in how integral it is to the whole experience. With the Air and the mini, Bluetooth keyboards are highly optional and occasional accessories that add little more than convenient typing, but the Smart Keyboard is absolutely necessary to the iPad Pro, so much so that I’m surprised Apple didn’t charge $200 extra and just include it in the box. It may be a baby step, but it’s an important one in the evolution of iOS.


“Just 14% said the IT operations department was the main sponsor of the migration project, and 11% said a business leader with no knowledge of the cloud was the main instigator,” said Rackspace’s Anatomy of a Cloud Migration report. “Overall, this means that CEOs, business leaders and boards of directors drive six out of 10 (61%) cloud migrations,” it added. In cases where the move to cloud was being led by a business leader rather than a tech one, it is far more common to see companies employ a third-party organisation to oversee the process, the report said. As for why business leaders, rather than IT decision-makers, are leading organisations’ cloud charge, it could be because adopting cloud is seen as a way of cutting costs from the business.


Microsoft's New Security Approach

Nadella emphasized that the tools to protect, detect and respond to threats have existed for many years. The seeds for this were planted more than a year ago as Microsoft combined Intune, Azure Rights Management and Azure Active Directory Premium into its Enterprise Mobility Suite, and the company doubled down on technologies such as authentication and identity management. "What is new is that posture," Nadella said. ... What Microsoft is trying to build, he said, is an "intelligent security graph" that brings together, in real time, virtually all of the company's security intelligence from streams throughout Microsoft, its customers, partners and security operations centers around the world, along with that of select partners tied into the graph.


Are Data Scientists Doing the Job They Were Hired For?

In a perfect world, data scientists would be free to access and manipulate all enterprise content quickly and fluidly without impairment. But the reality of the data environment for most businesses is a scattered and messy ecosystem of multiple systems and software, each used for different content and management functions. Data is duplicated, disconnected, and disjointed. There is no single portal or platform for search. When data scientists are forced to gather content from IT systems that are sprawled across innumerable platforms and departments, they are left grasping at straws and with little more than a flawed convenience sample. Garbage in, garbage out. The data scientist won’t likely answer all of our problems, but data management just might – if given enough time and planning. As 2016 dawns, business leadership is starting to realize that analytics skills alone will do little to make sense of enterprise-scale content.



Quote for the day:


"Executive ability is deciding quickly and getting somebody else to do the work." -- John G. Pollard


January 05, 2016

The Open Trusted Technology Provider™ Standard (O-TTPS)

Information Technology supply chains depend upon complex and interrelated networks of component suppliers across a wide range of global partners. Suppliers deliver parts to OEMs or component integrators, who build products from them and in turn offer those products to customers directly or to system integrators, who integrate them with products from multiple providers at a customer site. This complexity leaves ample opportunity for malicious components to enter the supply chain and leave vulnerabilities that can potentially be exploited. As a result, organizations now need assurances that they are buying from trusted technology providers who follow best practices every step of the way.


Another Step Toward an Open NFV Ecosystem

ETSI NFV ISG Chairman Steven Wright of AT&T, commenting on the continued momentum, observed: “Among our most important 2015 goals was to foster interoperable implementations rather than creating new standards activity. Exiting our final meeting for the year, we are pleased with the progress and entertaining proposals for 2016 work items, which will continue to guide the entire industry on the direction for NFV.” A significant outcome resulting from NFV#12 was the completion of “Report on SDN Usage in NFV Architecture Framework.” This study was conducted over the past 12 months, with 40 contributors from across the industry, and motivated 35 recommendations for the ISG. The report analyzed SDN use cases for NFV, highlighting lessons learned from 14 ETSI NFV PoCs using SDN and NFV, along with open source SDN controllers.


Is Agile Costing You Too Much?

There is a question whether the product owner is a value-adding role or not. In many organizations it appears that it is not: the product owner is a middle-man with authority over prioritization. While sequencing and scheduling, when done well, will generate value, the act of sequencing one item over another does not add value to the items themselves, and from a Lean value stream mapping perspective it is a non-value-adding role. It is even more curious, then, that Agile requires you to create non-value-adding positions. So in some extreme cases, Agile methods appear to have added 2 new people in non-value-adding positions for every 6 in value-adding positions. Put another way, after the Agile transition, 25% of the workforce is additional "Agile" overhead for operating the Agile method.
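The arithmetic behind that 25% figure is easy to check with a one-line sketch:

```python
def agile_overhead(non_value_adding: int, value_adding: int) -> float:
    """Fraction of the total workforce sitting in non-value-adding roles."""
    return non_value_adding / (non_value_adding + value_adding)

# 2 overhead roles added per 6 value-adding roles -> 2 / (2 + 6) = 0.25
print(agile_overhead(2, 6))
```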


Can Your Business Survive a Data Center Outage?

ECS’s geo-replication ensures that the data is protected against site failures and disasters. ECS gives customers the option to link geographically dispersed systems and bi-directionally replicate data among these sites across WAN. Several smart strategies such as geo-caching are used to reduce WAN traffic for data access. That leads to the next natural question: If data is replicated to multiple sites, will I incur a large storage overhead? In order to reduce the number of copies in a multi-site deployment, ECS implements a data chunk contraction model, which dramatically reduces storage overhead in multi-site ECS environments. In fact, storage efficiency increases as more sites are added for geo-replication!


Spark and big data discovery: An evolutionary perspective

To enable a workflow that truly leverages the advantages of a Hadoop-based data lake, businesses need a set of tools that can open up all the assets in the data lake to everyone in the organization who needs them. They need to make analysis accessible and iterative. And they need a workflow that reduces the need for many specialized resources, placing core analytical capability into the hands of power business analysts. Businesses also need to empower these analysts to be citizen data scientists, who can free actual data scientists to pursue complex analysis rather than spending their time performing data preparation. With Spark, a data lake can become a true big data discovery environment. Spark’s emergence as a processing framework for big data is a game changer because its advanced analytics capabilities allow for large-scale data analysis across the enterprise.


Big Data Predictions For 2016

This past year was no exception. Everybody talks about the promise and the potential of big data. Yet there's a sense of disenchantment as CIOs search for use-cases to inspire change inside their own companies. They want to be shown, not told. They want the signal and not the noise. We noticed that 2015 was a noisy year, and 2016 seems like it will be equally loud. It's not something that CIOs can afford to tune out. With digital transformations and pure-play startups disrupting established industries -- Uber is the example everyone mentions first -- the pressure is on to leverage data in new ways for competitive advantage. CIOs need to straddle two different worlds -- satisfying their existing customer base while moving fast to deliver instant, data-driven services to customers, or they risk losing ground to market upstarts.


Public vs. Private Cloud: How to Integrate Your Data Across Both

Another way to avoid data issues across public and private clouds is to simply choose one or another based on workload type and not have any particular workload straddle both. Some workloads have steady demand or sensitive data, which makes them better suited for the firewalled, fixed capacity confines of a private cloud. Financial analytics and Human Resources workloads are good examples. Other workloads see wide variations in demand and have publicly viewable data that make them a great fit for the elasticity of the public cloud. A customer-facing marketing website or customer analytics that have been sanitized to remove Personally Identifiable Information are typical candidates.
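The placement rule described here can be sketched as a simple function. The field names and the rule itself are illustrative, distilled from the paragraph above rather than taken from any real cloud scheduler:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    sensitive_data: bool   # e.g. un-sanitized PII, financial records
    steady_demand: bool    # predictable load with little burstiness

def place(workload: Workload) -> str:
    """Route a workload to a cloud type per the rule in the article:
    sensitive data or steady demand -> private; bursty, public-facing -> public."""
    if workload.sensitive_data or workload.steady_demand:
        return "private"
    return "public"
```

For example, an HR payroll workload lands in the private cloud, while a sanitized customer-analytics job is free to use public-cloud elasticity.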


Measuring Change Readiness

One other concept we should stop and discuss briefly is the idea of change saturation. This concept captures the idea that organizations in general, and certain individuals in particular, can only absorb so much change at one time. One frequent occurrence with change efforts is the situation where more than one project or larger change effort may require the same human, financial, physical, information or other resources at the same time. To become aware of this situation, and to mitigate the effects of change saturation, you will want to build a heat map identifying the timing, duration, and intensity of the demands that all of the projects and change efforts will place on each type of resource within the organization. This too is a prerequisite.
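A minimal sketch of such a heat map, assuming demand is recorded as (period, resource, fraction-of-capacity) triples; the data shapes and the capacity threshold are invented for illustration:

```python
from collections import defaultdict

def build_heat_map(demands):
    """Sum every project's demand on each resource type per period.
    demands: iterable of (period, resource, demand) triples."""
    heat = defaultdict(float)
    for period, resource, demand in demands:
        heat[(period, resource)] += demand
    return dict(heat)

def saturated(heat, capacity=1.0):
    """Return (period, resource) cells where combined demand exceeds capacity."""
    return sorted(cell for cell, total in heat.items() if total > capacity)

# Two projects both need most of the DBA team in Q1 -> Q1 is saturated.
heat = build_heat_map([("Q1", "DBA", 0.6), ("Q1", "DBA", 0.7), ("Q2", "DBA", 0.4)])
print(saturated(heat))
```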


Ransom32: First-of-its-kind JavaScript-based ransomware spotted in the wild

NW.js allows for much more control and interaction with the underlying operating system, enabling JavaScript to do almost everything ‘normal’ programming languages like C++ or Delphi can do.” Ransom32 is being sold as a service, but ransomware-as-a-service is not new; for example the Tox ransomware developer wanted 30% of the ransom payment and the FAKBEN Team requested a 10% cut of the profit. Ransom32 falls somewhere in-between, with the crypto malware authors wanting a 25% cut for customized versions of its currently undecryptable ransomware. Like other crypto-malware campaigns, wannabe bad guys sign up on a hidden server on the Tor network and can get their own customized Ransom32 ransomware after inputting the Bitcoin address where the ransom is to be delivered.


NVIDIA To Equip Self-Driving Cars With Water-Cooled Super Computer

Right now the focus of PX 2 is to detect and recognize objects, but Nvidia wants self-driving cars to also recognize circumstances. For example, a self-driving car may be able to distinguish an ambulance from a truck, and slow down. A car may also recognize snowy conditions and operate on a road in which the lanes are hidden. But such learning patterns are complex, and it could be a while until self-driving cars can handle such situations. The Drive PX 2 has 12 CPU cores, offers 8 teraflops of floating-point performance, has two Pascal GPUs and draws 250 watts. It is the equivalent of "150 MacBook Pros in your trunk," said Jen-Hsun Huang, Nvidia's CEO, during the press conference at CES.



Quote for the day:


"Products are made in the factory, but brands are created in the mind." -- Walter Landor


January 04, 2016

Why DevOps Needs To Embrace the Database

With all the new innovation, technology and methods that enable IT to move at the speed of the digital enterprise, DBAs are being asked to keep up using the same old methods. The life of a DBA has become a constant state of barely treading water, if staying above it at all. As the pressure mounts, DBAs risk deploying changes that perpetuate the cardinal sin of application release – breaking the application. When erroneous database changes are not detected, the application(s) can go down and cause an unenviable ripple effect. DBAs have to track down and remedy the change, developers are annoyed that they have to go back and fix their app, C-level execs are likely livid that the business is losing money, and end users are thrown into a tailspin when they can’t access the apps they rely on.


GM and Lyft Are Building a Network of Self-Driving Cars

The partnership with Lyft, though, signifies ambitions far beyond Super Cruise. While we have no details on the proposed “network of on-demand autonomous vehicles”—such as how it will work or when it will arrive—we can assume it will require a far more advanced take on autonomous driving than Super Cruise will offer. Lyft, like other ride-sharing services, does the bulk of its work in cities, which are devilishly hard for robots to navigate. Urban areas are full of complicated intersections, pedestrians, cyclists, and other hard-to-predict variables. More to the point, it’s hard to see the benefit Lyft gets from partnering with GM on a car that sometimes needs a human driver: If you’re going to pay a person to drive, you might as well have them drive all the time.


Tech Salary Guide 2016

Technology remains a fast-growing and competitive field, and the salary changes for 2016 reflect that. Robert Half Technology reported on the salary changes for job titles across 10 tech verticals, and the average salary range has increased for every title. Some of the titles with the highest raises in salary include chief security officer, developer, business analyst, big data engineer, data scientist and wireless network engineer. Check out the salary ranges for popular technology jobs across a number of verticals and see how much they've increased since 2015.


Are You Blind To Your Organization’s Culture?

Creating and changing a culture is a lot like losing weight. It takes hours of exercise, weeks of eating right and getting plenty of rest to drop 10-15 pounds. But it seems like those same 10 pounds (and then some) can get packed right back on in only one weekend of too much pizza, nachos, and beer. Culture can be like this. It may take a year or more for leaders to nurture the baseline of a constructive, healthy culture. And, just like that (he said with a snap) things can change. It starts to change when the CEO starts to talk badly about one of the members of their leadership team in the presence of the rest of the team. From there it continues to devolve when the members of that leadership team start doing the same to some of their own managers. In a matter of weeks, a new climate of back-stabbing and disengagement has emerged.


DBaaS to be the backbone of business transformation: Oracle

DBaaS is a paradigm in which end users (database administrators, developers, QA engineers, project leads, etc.) can request database services, use them for the lifetime of the project, and then have them automatically de-provisioned and returned to the resource pool. This provides companies with a shared, consolidated platform from which organisations can easily provision database services. It also has the elasticity to scale database resources up and back down, with chargeback based on database usage, thus increasing cost efficiency. DBaaS follows the definition laid down by the National Institute of Standards and Technology (NIST) for cloud computing – self-service, rapid elasticity, measured service (metering and chargeback) and resource pooling.
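The provision/use/de-provision lifecycle described above can be sketched as a toy resource pool; the class, names and capacity figures are invented purely for illustration:

```python
class DatabasePool:
    """Toy model of a shared DBaaS resource pool."""

    def __init__(self, capacity: int):
        self.available = capacity
        self.leases = {}  # project -> database instance name

    def provision(self, project: str) -> str:
        """Hand a database service to a project for its lifetime."""
        if self.available == 0:
            raise RuntimeError("resource pool exhausted")
        self.available -= 1
        self.leases[project] = f"db-for-{project}"
        return self.leases[project]

    def deprovision(self, project: str) -> None:
        """Return the project's database to the shared pool."""
        del self.leases[project]
        self.available += 1
```

A metering layer hung off `provision`/`deprovision` timestamps would give the usage-based chargeback the article mentions.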


With the Calendar Turned to 2016, Will 'CIO' Become a Part of History?

IT can lead the creation and rollout of an online portal for employees to do everything from submitting IT help desk requests and requesting a contract review from legal to selecting healthcare benefits. This is why the CIO is the logical person to assume the role of CPO. The combination of increasing adoption of cloud computing and analyzing Big Data to help the enterprise reach its broad business objectives will enable the CIO-turned-CPO to lead these new service-oriented initiatives. According to the new Verizon Enterprise Solutions’ “2016 State of the Market: Enterprise Cloud” report, 84 percent of businesses surveyed said their cloud use has increased in the past year, and half of enterprises say they will use cloud for at least 75% of their workloads by 2018.


IoT Trends for 2016. Everything connected everywhere

Smart Cities are the powerhouse of the IoT, within which all the other technology settings usually take place. All successful Smart Cities tend to have three aspects in common: non-partisan long-term objectives, combined public and private effort, and open data based on open standards. Successful open ecosystems rely on cost-effective cloud-based open infrastructures, which are becoming the chosen solutions in many regions that are already working or experimenting with these platforms. To ensure Smart City development is deployed in an orderly manner, local governments are attracting major technological partners through funding to put in place best-in-class Smart City services and achieve the long-term success that is key in this setting.


Ideas for WebRTC Implementation

WebRTC (Web Real-Time Communication) is an open source technology for implementing multimedia communication capabilities in real time directly in your web browser. It sets up a peer-to-peer connection between two or more people, which is perfect for transferring media (audio and video streams). The technology is supported by the following browsers: Google Chrome, Mozilla Firefox and Opera. You do not need any additional plug-ins for these browsers; just open a web page and start a conversation. There is no native support in Safari and IE, but special plug-ins can be added. The idea is simple. First, the browser sends a signal to the WebRTC server that the user wants to initiate a call. After getting the link from the server, the user sends it to his companion.
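The signaling step described here can be modeled with a toy in-memory relay. In a real application the SDP blobs come from the browser's `RTCPeerConnection.createOffer()`/`createAnswer()`; the dicts below are stand-ins that only illustrate the offer/answer message flow through a signaling server:

```python
class SignalingServer:
    """Toy in-memory relay standing in for a real signaling channel
    (which would typically run over WebSockets)."""

    def __init__(self):
        self.inboxes = {}  # user -> queued messages

    def send(self, to: str, message: dict) -> None:
        self.inboxes.setdefault(to, []).append(message)

    def receive(self, user: str) -> dict:
        return self.inboxes[user].pop(0)

def offer_answer_exchange(server, caller, callee):
    # 1. Caller posts an offer (in reality, an SDP blob from createOffer()).
    server.send(callee, {"type": "offer", "from": caller, "sdp": "<caller-sdp>"})
    # 2. Callee picks it up and replies with an answer.
    offer = server.receive(callee)
    server.send(offer["from"], {"type": "answer", "from": callee, "sdp": "<callee-sdp>"})
    # 3. Caller receives the answer; media then flows peer-to-peer.
    answer = server.receive(caller)
    return offer["type"], answer["type"]
```

Once both sides hold the remote description, the audio/video streams travel directly between peers, not through the server.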


Insurers Look to Tighten Cybersecurity Before Innovation

Insurers have realized that not only do they have a responsibility to monitor their activities, but they need to make sure that basic housekeeping is being carried out, says Steve Durbin, a former Gartner analyst and managing director of the Information Security Forum. “You may not be able to detect the malware, but you should be able to spot unusual activity; you may not be able to protect every piece of data, but you should have implemented encryption.” In a state filled with banks and insurers, it’s no surprise that New York regulators are currently considering a variety of cybersecurity requirements for banks and insurers. In a letter to other regulators, New York financial services superintendent Anthony Albanese said his agency has surveyed more than 150 banks and 43 insurers since 2013 and has concluded that “robust regulation” is needed.


The missing piece of the cloud security jigsaw

The adoption of the cloud presents additional issues that are snowballing into the unmanageable, specifically with respect to identity management. It used to be that my Active Directory allowed me to control access to most of my systems through domain and system credential access. However, the cloud doesn't necessarily conform to my existing network standards. In addition, with smaller cloud apps, often IT isn't even really being consulted about user access or application operation or controls. In general control audits we do assess whether users are properly activated and deactivated, but there still exists a lack of visibility, which makes me feel uncomfortable in terms of who has what access when. Standards such as SAML are good but are not necessarily always available.



Quote for the day:


"Truly successful decision making relies on a balance between deliberate and instinctive thinking." -- Malcolm Gladwell


January 03, 2016

Enterprise Architecture - Guiding Principles

The usefulness of principles is in their general orientation and perspective; they do not prescribe specific actions. A given principle applies in some contexts but not all contexts. Different principles may conflict with each other, such as the principle of accessibility and the principle of security. Therefore, applying principles in the development of EA requires deliberation and often tradeoffs. The selection of principles to apply to a given EA is based on a combination of the general environment of the enterprise and the specifics of the goals and purpose of the EA. The application of appropriate principles facilitates grounding, balance, and positioning of an EA. Deviating from the principles may result in unnecessary and avoidable long-term costs and risks.


How to Flush DNS

There is a wide array of DNS issues that can arise at the network administrator or power user level. For the end user, however, the majority of DNS problems arise from either bad configuration entries or the local computer's DNS storage requiring flushing. Independent of the type of operating system, many home computer users will input the DNS server for their respective Internet Service Provider (ISP) incorrectly, resulting in a failed Internet connection. Each ISP will have a slightly different configuration process; however, the IP address of the DNS server for your home network to use will be provided on registration for service. In many cases the ISP will use the address of its actual DNS server, while in others it will be the same as the gateway IP for the service.
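The flush commands differ by operating system. As a sketch, the commonly documented ones can be mapped per platform; note the Linux entry assumes a systemd-resolved system, macOS versions vary (some also need `killall -HUP mDNSResponder`), and most of these require elevated privileges to actually run:

```python
import platform
from typing import List, Optional

# Commonly documented DNS cache-flush commands per OS (see caveats above).
FLUSH_COMMANDS = {
    "Windows": ["ipconfig", "/flushdns"],
    "Darwin":  ["dscacheutil", "-flushcache"],    # macOS
    "Linux":   ["resolvectl", "flush-caches"],    # systemd-resolved only
}

def dns_flush_command(system: Optional[str] = None) -> List[str]:
    """Return the cache-flush command for the given (or current) OS."""
    system = system or platform.system()
    if system not in FLUSH_COMMANDS:
        raise ValueError(f"no known DNS flush command for {system!r}")
    return FLUSH_COMMANDS[system]
```

A wrapper script would hand the returned list to `subprocess.run`, with administrator rights.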


The Disciplined Agile Framework

IT departments are complex adaptive organizations. What we mean by that is that the actions of one team will affect the actions of another team, and so on and so on. For example, the way that your agile delivery team works will have an effect on, and be affected by, any other team that you interact with. If you’re working with your operations teams, perhaps as part of your overall DevOps strategy, then each of those teams will need to adapt the way they work to collaborate effectively with one another. Each team will hopefully learn from the other and improve the way that they work. These improvements will ripple out to other teams. The challenge is that every area within IT has one or more bodies of knowledge, and in some cases published “books of knowledge”, that provide guidance for people working in those areas.


Designing the Business of IT

One of the core benefits that organisations can expect is a more cost-efficient IT environment. Senior IT leaders from MunichRe, Shell and Achmea, as well as research from Gartner, predict that IT4IT will help organisations manage an increasingly complex IT estate in a more cost-effective fashion. It will also free up time and budget for innovation and new products. They feel the Reference Architecture provides a strong framework for managing multi-sourcing approaches, which are becoming more prominent in organisations around the world. Another key benefit of IT4IT is that it is not being introduced as an alternative to methodologies or frameworks such as TOGAF and ITIL.


Google's 'Lego' Smartphone, Smarter TVs: What We're Excited About In 2016

The Internet of Things should continue to provide the foundation for the technology industry's ambitions next year, framed by machine learning, analytics, networking, and ever-smaller devices. Connected sensors will proliferate. Intelligent software agents will learn new tricks that automate discrete tasks in a way that's similar to Gmail's Smart Reply service. Robots will emerge from private businesses to begin grocery deliveries on public sidewalks. If regulatory approval can be secured, drones will begin lawful package deliveries, following in the footsteps of flying contraband couriers.


TLS Client Authentication

Why TLS client authentication? Because that’s the most standard way to authenticate a user who owns a certificate. Of course, smartcard certificates are not the only application – organizations may issue internal certificates to users that they store on their machines. The point is to have an authentication mechanism that is more secure than a simple username/password pair. It is a usability problem, especially with smartcards, but that’s beyond the scope of this post. So, with TLS clientAuth, in addition to the server identity being verified by the client, the client identity is also verified by the server. This means the client has a certificate that is issued by an authority, which the server explicitly trusts.
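On the server side, requiring a client certificate looks roughly like this with Python's standard `ssl` module; treat it as a minimal sketch, where the CA bundle path is a placeholder and a real server must also load its own certificate chain:

```python
import ssl
from typing import Optional

def make_mutual_tls_context(ca_file: Optional[str] = None) -> ssl.SSLContext:
    """Build a server-side context that demands a client certificate
    issued by a CA the server explicitly trusts."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    # Reject any client that does not present a valid certificate.
    ctx.verify_mode = ssl.CERT_REQUIRED
    if ca_file:
        # CA bundle that issued the client certificates (placeholder path).
        ctx.load_verify_locations(cafile=ca_file)
    # A real server also needs its own identity, e.g.:
    # ctx.load_cert_chain("server.crt", "server.key")
    return ctx
```

Wrapping a listening socket with this context makes the handshake itself enforce the mutual authentication the article describes, with no username/password exchange at the application layer.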


Market Police Deploy New Algorithm Weapons Against Spoofers

“We have to capture every trade now,” O’Brien said. “In today’s markets it’s all about analyzing patterns and contexts.” Yet given how rapidly fraudsters can change their methods to hoodwink human beings, outwitting surveillance software could be even easier. Algorithms are sophisticated but they’re incapable of determining whether a flurry of buy and sell orders are legitimate or unlawful. “The surveillance tools are merely the first line of defense,” said Haim Bodek, founder of Decimus Capital Markets, a New York-based algorithmic investing firm. “These tools can help bring suspicious activity to the attention of regulators, trading venues and brokers, but they’re a poor substitute for a compliance program that monitors activity across affiliated accounts and groups of traders.”


2025: the five key attributes for your business surviving the next ten years in tech

The two make-or-break traits that rose to the top for these leaders were being able to spot new opportunities predictively and being able to innovate in an agile way. The survey also asked these leaders how prepared they believe their organisations are in each of these two dimensions. The gaps were quite remarkable. While 62% of those surveyed identified predictively spotting opportunities as being very important for their businesses, only 12% thought that their businesses had this capability. And only nine percent believed their organisations were capable of innovating extremely well in an agile way.


Podcast: Portfolio Management & The Agile Extension

In agile, we need to be prepared to constantly adapt our plans. That approach works extremely well at the project or initiative level, but at an organizational level, budgets and plans tend to be longer term and less adaptable. The current rate of change often means that those plans are negated and organizations find it difficult to adapt quickly to changing market conditions. We need to take the concept of backlog management and apply it at a higher level to programs and portfolios so that we are able to adaptively respond to changes in the world around us. The traditional definition of project success has been on time, on scope, and on budget. Those constraints still exist, but they are not the driving factors today.


Cybersecurity in 2016: will it come down to luck or leadership?

Unfortunately in most respects, 2016 won’t change much: users will still unknowingly click on malicious links; IT departments will still be bad at staying up to date with patching; the bad guys will continue to attack; and the tide of misery from breaches will persist. What matters most is whether your organisation will be a victim or not. Of course you could do nothing, and be lucky. But the only way to control your fate is to lead your organisation to the high ground based on a well-considered, security-first strategy. It is important to remember that, despite their claims, most security vendors cannot help you. Within the market we see too many 'me too' vendors, whose main focus is on the staple of detection.



Quote for the day:

"It is literally true that you can succeed best and quickest by helping others to succeed." -- Napoleon Hill

January 02, 2016

5 Aspects of Cloud Computing to Watch Out For in 2016

The best part is that there is no need to buy expensive software licenses. Software upgrades and security features are the sole responsibility of the cloud service provider. When it comes to efficiency, flexibility, reliability, and scalability, the cloud undoubtedly triumphs over the rest. Staying competitive in today’s world demands high levels of investment in technology in its various avatars. But enter cloud computing and all your worries go away, thanks to the ‘rent, don’t buy’ policy it so strikingly espouses. It is easy to stay on top of market opportunities and retain your competitive edge with cloud computing. The future of cloud computing will only get brighter.


Technology Will Create New Models for Privacy Regulation

The average cost per user of a data breach is now $240 … think of businesses looking at that cost and saying “What if I can find a way not to hold that data, but still capture its value?” When we do that, our concept of privacy will be different. Our concept so far is that we should give people control over copies of data. In the future, we will not worry about copies of data, but about uses of data. The paradigm of required use will develop once we have really simple ways to hold data. If I were king, I would say it’s too early. Let’s muddle through the next few years. The costs are high, but the current model of privacy will not make sense going forward. If I ping a service, and it tells me someone is over 18, I don’t need to hold that fact.
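
A small sketch of that "ping a service" model, with everything here invented for illustration: the merchant asks a trusted attribute service a yes/no question and never receives or stores the underlying birth date at all.

```python
from datetime import date

def attribute_service(user_id, today=None):
    """Stand-in for a hypothetical identity provider's API. Only the
    provider holds the birth date; callers get back a single boolean."""
    # Pretend registry -- in reality this data never leaves the provider.
    birthdays = {"alice": date(1990, 5, 1), "bob": date(2010, 5, 1)}
    today = today or date.today()
    b = birthdays[user_id]
    age = today.year - b.year - ((today.month, today.day) < (b.month, b.day))
    return age >= 18

def can_purchase(user_id):
    # The merchant holds no copy of the birth date -- only the answer
    # it needs for this one transaction.
    return attribute_service(user_id)
```

A breach of the merchant's systems in this design leaks at most a stale yes/no flag, not the personal data itself, which is the cost calculus the quote describes.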


Artificial Intelligence Finally Entered Our Everyday World

Ng’s prototype relies on a technology called deep learning. Inside the massive computer data centers that underpin Baidu’s online services, the company runs vast neural networks—networks of hardware and software that approximate the web of neurons in the human brain. By analyzing enormous collections of digital images, these networks can learn to identify objects, written words, even human faces. Feed enough cat photos into a neural net, and it can learn to identify a cat. Feed it enough photos of a cloud, and it can learn to identify a cloud.
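
The "feed it enough examples" idea can be shown in miniature with a single artificial neuron (a logistic unit) trained by gradient descent – a stand-in, vastly simplified, for the deep networks the article describes. The two features and all the sample values below are invented for the sketch.

```python
import math

def train(samples, labels, epochs=500, lr=0.5):
    """Fit one logistic neuron to labelled feature vectors."""
    w, b = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            err = p - y  # gradient of the log-loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b))) > 0.5

# Two made-up features per image, e.g. "pointy ears" and "whisker density".
cats   = [[0.9, 0.8], [0.8, 0.9], [0.95, 0.7]]
clouds = [[0.1, 0.2], [0.2, 0.1], [0.05, 0.3]]
w, b = train(cats + clouds, [1, 1, 1, 0, 0, 0])
```

Real deep networks stack many thousands of such units in layers and learn the features themselves from raw pixels, but the training loop – predict, measure error, nudge the weights – is the same in spirit.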


The CEO of Google's £400 million AI startup is going to discuss ethics

Defending why Google hasn't revealed the members of its AI ethics committee yet, Hassabis said "it's very early days" and "there's lots of scrutiny on this". He said he'd like to get everyone "up to speed" on artificial intelligence first. "We wanted to have a calm, collected debate first," he said. "At some point we will reveal who these people [on the ethics committee] are and what issues are being discussed." Hassabis also assured the audience that he will not allow DeepMind technology to be used in military applications. Hassabis also revealed that he spoke with Hawking on the topic of AI a few months ago. "I think [Hawking] was quite reassured about how we specifically were approaching AI," said Hassabis.


Ten digital trust challenges

It’s not just the challenge of keeping up to speed with technological developments. It’s whether and how the current design of our private and public institutions needs to adapt to cope with these changes and to restore the trust of society – digital trust. In June, to mark the 800th anniversary of the Magna Carta, we looked at how institutions (organised and purposeful interactions of people based on contract, law or culture) must create and maintain trust through legitimacy, effectiveness and transparency, and how global megatrends like technology are driving the need for a design transformation and a bold new charter for our digital world. Here we take a look at 10 digital trust issues that in our view institutions must grow new capabilities to address.


Security threats, hackers and shadow IT still plague health IT

As a starting point, Hopfer suggests that CIOs take an inventory of the cloud services running within their organizations to assess their security posture. The exercise of evaluating what types of applications employees are running can shed light on the tools they need to support the business objectives of the enterprise. ... As hackers grow more sophisticated and attacks mount, security is a primary concern for CIOs in all industries, but it carries a special importance in healthcare owing to the sensitivity of the data involved. Moreover, much of the information contained in health records is unalterable, and, taken in composite, makes for a remarkably full profile that criminals can put to use for all manner of fraudulent ends.


How to defend your business against the worst hackers have to offer in the New Year

When it comes to crystal ball gazing in the tech world, the rule of thumb is not to do it, as you only end up looking like an idiot in 12 months’ time. This time, though, the sad truth of the matter is that predicting the shape of the IT security threatscape for next year really isn't that hard: 'more of the bloody same' pretty much sums it up. But while exploit kits, DDoS attacks, and ransomware will all continue marching into the enterprise and doing damage, these threats will also evolve to become more dangerous. Here are five IT security predictions to mull over as you recover from your New Year's Eve party.


Big Data Industry Predictions for 2016

In 2016, automated personalization will be a critical business benefit that big data analytics begins to deliver. Companies will continue to seek competitive advantage by adopting new big data technologies and allowing machines to interpret subjective ‘squishy’ data – including human communication cues such as nonverbal behavior, facial expressions, and tone of voice. Big data analytics makes this possible by assimilating vast amounts of information, including the types of data that were too slow and expensive to collect and analyze in the past, such as communications and case records for knowledge workers. As the machines get better at interpreting a variety of data types and collating it with vast quantities of structured data, they can begin to improve and accelerate both employee-owned business processes and customer-facing experiences.


Using Redis As a Time Series Database: Why and How

One of the first questions brought up when talking about Redis and its use as a time-series database is “what is the use or purpose of a time-series database?” The use cases for a time-series database are defined by the data involved – specifically, data structured as a series of events or samples of one or more values or metrics over time. A few examples include (but are not limited to): sell price and volume of a traded stock; total value and delivery location of an order placed at an online retailer; actions of a user in a video game; and data gathered from sensors embedded inside IoT devices. We could keep going, but basically if something happened or you made a measurement, you can record that with a timestamp.
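
One common Redis idiom for this (mentioned in the original article's family of techniques) is a sorted set whose score is the event timestamp: `ZADD` to record, `ZRANGEBYSCORE` to query a time window. The sketch below is a pure-Python in-memory model of that pattern, so it runs without a Redis server; the sample trade data is made up.

```python
import bisect
import json

class TimeSeries:
    """In-memory stand-in for a Redis sorted set keyed by timestamp:
    record() mirrors ZADD (score = event time), range() mirrors
    ZRANGEBYSCORE. With redis-py the equivalent calls would be
    r.zadd(...) and r.zrangebyscore(...)."""

    def __init__(self):
        self._items = []  # (timestamp, json payload), kept sorted

    def record(self, ts, event):
        bisect.insort(self._items, (ts, json.dumps(event)))

    def range(self, start, end):
        lo = bisect.bisect_left(self._items, (start, ""))
        hi = bisect.bisect_right(self._items, (end, "\xff"))
        return [json.loads(payload) for _, payload in self._items[lo:hi]]

# Fabricated stock-trade events, timestamped in seconds.
trades = TimeSeries()
trades.record(1451692800, {"sym": "ACME", "price": 101.5, "vol": 200})
trades.record(1451692860, {"sym": "ACME", "price": 101.7, "vol": 50})
trades.record(1451692920, {"sym": "ACME", "price": 101.6, "vol": 75})
```

The sorted-set approach keeps inserts and range queries cheap, which is exactly what the "something happened, record it with a timestamp" workload needs.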



Quote for the day:


"Unless you try to do something beyond what you have already mastered, you will never grow." --Ralph Waldo Emerson