January 24, 2016

Thinking Outside-In: How APIs Fulfill the Original Promise of SOA

Outside of the pure technology reasons, APIs have gained traction due to their inherent focus on simple, practical deployment. This, in turn, made it easier for technology leaders to convince their bosses that the investment was worthwhile, simply because it was easy to deliver tangible results very quickly. The API deployment model – that is, where and how APIs are deployed, executed, and accessed by consumers – is often referred to as “microservices”: decomposing the business workflow into a set of extremely fine-grained services, each of which does only one thing and does it well. A microservice is typically no bigger than 100-1000 lines of code, beyond which it is time to split it into two separate services.
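As a concrete illustration of the “does one thing and does it well” idea, here is a minimal sketch of a single-purpose service in Python's standard library. The service, its data, and its endpoint are invented for illustration; a real microservice would add configuration, logging, and health checks.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import urlopen

# A hypothetical "price lookup" microservice: it answers one question
# (the price of a SKU) and nothing else - well under the 100-1000
# line ceiling described above.
PRICES = {"sku-1": 9.99, "sku-2": 24.50}

class PriceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        sku = self.path.lstrip("/")
        if sku in PRICES:
            body = json.dumps({"sku": sku, "price": PRICES[sku]}).encode()
            self.send_response(200)
        else:
            body = b'{"error": "unknown sku"}'
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def start_service():
    # Port 0 lets the OS pick a free port; a daemon thread serves requests.
    server = ThreadingHTTPServer(("127.0.0.1", 0), PriceHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = start_service()
    port = server.server_address[1]
    with urlopen(f"http://127.0.0.1:{port}/sku-1") as resp:
        print(json.loads(resp.read()))
    server.shutdown()
```

Splitting a workflow into many such services then becomes a matter of composition: each service exposes one small HTTP contract and can be deployed, scaled, and replaced independently.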


Artificial Intelligence is Closer -- and Less Awesome -- than Most Realize

“AI is making the biggest advances in things like speech recognition, computer vision problems and processing millions of images very fast,” Baveja said. “A lot of it’s driven by much faster processing, much cheaper processing and having much more data.” Within a year, the team hopes to have an early version of the tool that students can use to receive a customized list of classes they should take based on their unique circumstances. Human advisers will remain essential, Baveja said, but humans suffer from constraints such as limited time and availability. And while human advisers are good at recognizing contextual information like a student’s emotional state, even the most experienced adviser doesn’t have in mind a statistical overview of all student and class data enriched by concomitant patterns and trends.


AI, Machine Learning Rising In The Enterprise

Facebook AI Research (FAIR), which had already released to open source its deep-learning modules for the open source development environment Torch in Jan. 2015, last month announced another move. This time Facebook said it would release its server hardware design that's been optimized for machine learning to open source. Facebook has submitted the GPU-based system design materials to the Open Compute Project. The company said that the system is designed for greater energy and heat efficiency, as well as ease of maintenance. Digital tech giants such as Facebook, Google, and Amazon that have large data center operations have long designed their own hardware rather than use the designs from others, such as HP Enterprise and Dell. So, why the big wave of open source releases for machine learning-related development by these big companies?


Cloud app security: How not to fail

Developers often have a knee-jerk response and encrypt everything everywhere. While that sounds like the safest route, there are trade-offs you should consider first. For instance, if you implement encryption in flight, you'll have to encrypt and decrypt data before you place it on the network for remote consumption, and before any of your applications can consume it. That requires processing time and imposes a performance penalty that, in the cloud, can add up to a major cost. Make sure you have clear requirements for encryption in flight and that the level of risk and potential loss, calculated in dollars, is worth the cost of using encryption in flight. Encryption at rest is the most practical to apply, because you're ensuring that the data can't be read as it sits on a cloud-based storage system.
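The “calculated in dollars” comparison above can be sketched as simple expected-value arithmetic. The numbers below are invented placeholders; plug in your own breach-risk estimates and cloud pricing.

```python
# Hypothetical figures for illustration only.
def encryption_worth_it(breach_probability, loss_if_breached,
                        annual_encryption_cost):
    """Compare the expected annual loss avoided by encrypting in
    flight against the extra compute cost it imposes, in dollars."""
    expected_loss_avoided = breach_probability * loss_if_breached
    return expected_loss_avoided > annual_encryption_cost

# A 2% annual chance of a $500,000 loss vs. $8,000/year of extra
# processing: the expected loss avoided is $10,000, so encrypting pays.
print(encryption_worth_it(0.02, 500_000, 8_000))   # True
# A 0.1% chance of the same loss avoids only $500/year - not worth it.
print(encryption_worth_it(0.001, 500_000, 8_000))  # False
```

The same framing works for encryption at rest, which, as the excerpt notes, usually carries a much smaller performance cost and so clears the bar more easily.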


What 2015 Taught Me About The London Tech Scene

Add the very obvious talent shortage, and you have a recipe for spiraling costs. If I learned one thing in 2015, it was that surrounding myself with good people who want to work hard is paramount, yet I ended up contracting outside the U.K. based on recommendations. If there’s one reason London is awash with so many one- and two-man bands, it’s that the cost of finding and hiring a team is so high. Sure, you can go to Lithuania or Pakistan or Timbuktu and find someone who can get the job done, but that’s no replacement for in-house talent — and without a good team, you’re basically nowhere. The government’s plan to loosen up visa requirements for suitably qualified candidates should help in the short and medium term, but only a greater emphasis on training will sort out the problem in the long term.


Businesses need to place higher priority on cyber security

For too long have smaller companies adopted the attitude that they are too small or too low value to be targeted, and for too long has cyber security taken a back seat. As this research shows, the outsourced approach is increasingly a viable alternative to the "go it alone" status quo. It opens the door to a world of experienced MSPs, the best of which offer comprehensive, lightweight security solutions that are affordable, easy to install and provide real-time protection against modern threats. These small businesses are often targeted by advanced and persistent threats because of their partnerships with bigger fish. Without addressing these security capabilities, SMBs will find it increasingly difficult to work with larger enterprises.


Accelerating Change in Data Analytics

The daily lives being impacted most by these changes are those of the people responsible for running data centers and delivering those analytics services. The role of the IT professional has shifted from a proactive, annual-planning set of responsibilities to a reactive “how do I find compute and storage resources really quickly” list of requirements. It’s impossible to predict capacity needs, and speed is required to respond to opportunities and risks as they happen. In addition, the ability to leverage other resources on the cloud to ease the risk of predicting capacity improperly has resulted in a whole host of governance challenges. This huge amount of change has even led some to refer to 2015 as “one of the most radically disruptive and transformative periods in IT industry history.”


Data Scientists: The Myth and the Reality

The data scientist of 2016 has been described as “part analyst, part artist.” She combines an analytical mind with the ability to interpret data to spot trends in business that are otherwise unseen. This skill requires an innate sense of creativity and thinking outside of the box. A solid foundation in math, analytics, computer science, and applications, as well as computer programming, is among the skills needed to succeed in a career in data science. The all-star hybrid data scientist jobs that are often advertised online are something to take with a pinch of salt. As with most careers in tech, it is not likely that you have had extensive experience in all areas that are required on the job spec. A recent article on the topic states that it is important to look beyond the definition of the “unicorn” data scientist.


Three Reasons You're Underestimating The Impact Of Digital On Business

The digitally-enabled platform economy is often cited as the greatest opportunity for growth, but one that is off limits to companies outside Silicon Valley. It is true that ‘born digital’ companies dominate platform business models today, offering new value by bringing millions of consumers and service providers together. They can act fast because their model doesn’t rely on owning assets or producing goods. However, we now see classic manufacturing companies exploiting the value of data and digital platforms to build similar ecosystems of partners and customers on top of their asset heavy business models. In healthcare, engineering, even agricultural equipment, long established brands are embracing digital disruption by creating entirely new services and streams of revenue.


Are health hackers the new cyber security threat?

The market is flooded with stolen credit card details, he said, so “healthcare records attract the premium now”. Investigators do not believe information from the Anthem breach has been sold on black markets. However, other hackers have targeted victims of the Anthem attack with fake emails that appear to be from Anthem or offer credit protection. Those emails aim to steal data that could be sold to criminals, people familiar with the case say. Anthem plans to spend $130m over two years to better protect its networks from breaches. The company has assured regulators that it has strengthened its system, taking steps such as changing administrator passwords every 10 hours and hiring 55 cyber security experts.




Quote for the day:

"Mindful leaders cultivate the ability to think clearly & to focus on the most important opportunities" -- @Bill_George


January 20, 2016

Semantic Data Technology and Innovations in Client Lifecycle Management

Ontologies can be used to support KYC/AML in the following areas:
a) Ontology-based information extraction – extracting relevant information from unstructured documents (for example, monitoring a website to detect people involved in money laundering);
b) Ontology-based information discovery through inference – detecting money-laundering schemes or establishing connections between people and organizations that are on criminal watch lists;
c) Ontology-based compliance rule verification;
d) Seamless integration of external and internal data – for example, integrating internal watch lists across businesses.
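The inference step in discovery (point b) reduces, at its simplest, to following typed relationships through a knowledge graph. Real ontology-based systems use RDF/OWL stores and reasoners, but the core check can be sketched as a bounded graph search; all names and relationships below are invented.

```python
from collections import deque

# A toy knowledge graph: each entity maps to the entities it is
# related to (director_of, parent_of, transacts_with, ...).
EDGES = {
    "Alice": ["Acme Ltd"],           # director_of
    "Acme Ltd": ["Shell Co"],        # parent_of
    "Shell Co": ["Watchlisted Org"], # transacts_with
}
WATCH_LIST = {"Watchlisted Org"}

def connected_to_watch_list(entity, max_hops=3):
    """Breadth-first search: is `entity` linked to a watch-listed
    organization within `max_hops` relationships?"""
    frontier = deque([(entity, 0)])
    seen = {entity}
    while frontier:
        node, hops = frontier.popleft()
        if node in WATCH_LIST:
            return True
        if hops == max_hops:
            continue
        for neighbor in EDGES.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, hops + 1))
    return False

print(connected_to_watch_list("Alice"))              # True (3 hops)
print(connected_to_watch_list("Alice", max_hops=2))  # False
```

An ontology adds what this sketch lacks: typed, machine-readable semantics for each edge, so a reasoner can infer new connections rather than only traverse stated ones.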


Internet of Things in 2016: 6 Stats Everyone Should Know

The Internet of Things, which connects cars, homes, wearables, and everyday objects to the cloud, is a hot tech topic these days. Chipmakers such as Qualcomm and Intel are expanding into the space to diversify away from their core chip businesses. Smartphone makers such as Samsung and Apple are entering the wearables and smart home markets to expand their mobile ecosystems. Yet according to Accenture, around 87% of mainstream consumers still don't understand what the IoT market is. Therefore, let's take a moment and review six key statistics that everyone should know about the Internet of Things.


JavaScript’s Creator Is Building A Browser For The Ad-Blocked Future

“At Brave, we’re building a solution designed to avert war and give users the fair deal they deserve for coming to the Web to browse and contribute,” Eich wrote. And in an interview, he told me, “We’re doing something bigger than an ad blocker.” At a basic level, Brave is, yes, a browser that blocks ads, as well as a variety of data collection technologies, such as analytics scripts and impression-tracking pixels — as Eich put it, “We clear the whole swimming pool of algae.” But there are some important nuances here. For one thing, Eich said Brave won’t block all ads, because native, trackerless ads that only use the publisher’s own data will appear to the browser as normal content, and won’t be blocked.


8 Cheat Sheet Sites To Ace Tech Job Interviews

"You should do anything you can that's legal to prepare for an interview. That includes looking at these sites, talking to people you know who work at the company or used to work there and talking to recruiters who help the company find people," said Jon Holman, founder of the executive recruiting firm The Holman Group. ... Holman stressed, "You especially don't want to assume that the statements on the blog are true or current. Companies aren't stupid. If they know that a blogger has posted their "standard" questions ... (the) questions will get changed. And if you're flummoxed in the interview because you didn't think more broadly than the list of questions on the blog, well, you don't deserve the job."


DevOps: Tear Down the Wall!

“You build it, you run it” is one of the key principles of DevOps. The premise is based on the reluctance of a developer to pass defects downstream if there is a chance they’ll get paged later that night to fix a self-induced production incident. As developers embrace codifying resilient operational considerations into their delivery pipeline, they’ll begin to appreciate the heavy-lifting required to ensure their environments are well-managed and secure. ... Just as Development is inclusive of a myriad of interconnected disciplines and functions, Operations is also an overloaded term. The subtle complexities of infrastructure, network, and security need to be considered carefully before you remove “The Last Few Bricks”.


How to manage integrated testing for CI, CD and DevOps

With time, the need to get to market quickly has pushed test automation into the early stages of the development process. More and more organisations are realising that writing test code or scripts deserves the same rigour as writing development code. ... An Integrated Test Management framework equipped with multi-tool integration capabilities can support continuous integration, automated triggering of builds, automated testing and results reporting, ensuring continuous delivery and rapid deployment practices - the roadmap to achieving DevOps.


Democratizing Big Data value

“Almost every company nowadays is growing so rapidly with the type of data they have,” adds Saso. “It doesn’t matter if you’re an architecture firm, a marketing company, or a large enterprise getting information from all your smaller remote sites—everyone is compiling data to [generate] better business decisions or create a system that makes their products run faster.” There are now many options available to people just starting out with using larger data set analytics. Online providers, for example, can scale up a database in a matter of minutes. “It’s much more approachable,” says Saso. “There are many different flavors and formats to start with, and people are realizing that.”


Getting Ready for IoT’s Big Data Challenges with Couchbase Mobile

Couchbase Mobile handles security in five areas:
For User Authentication, we support pluggable authentication. Out of the box we have support for popular public login providers like Facebook, or you can write your own custom provider;
For Data Read/Write Access, there are fine-grained policy tools that allow controlling data access for individual users and roles;
Data Transport on the Wire, for data in motion, is over TLS;
Data Storage on Device, for data at rest on the device, uses the device's built-in file system encryption and, additionally, data-level encryption;
For Data Storage in the Cloud, for data at rest in the cloud, you can configure Couchbase Server to use file system encryption.
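The fine-grained read/write policies described here can be pictured as a role-to-channel access table. This is not the real Couchbase API — roles, channels, and the `allowed` helper below are a hypothetical sketch of the per-user, per-role model the excerpt describes.

```python
# Hypothetical role-based access policies: which data channels each
# role may read or write. All names are invented for illustration.
POLICIES = {
    "nurse":   {"read": {"vitals"}, "write": {"vitals"}},
    "doctor":  {"read": {"vitals", "prescriptions"},
                "write": {"vitals", "prescriptions"}},
    "auditor": {"read": {"vitals", "prescriptions"}, "write": set()},
}

def allowed(role, action, channel):
    """Return True if `role` may perform `action` on `channel`.
    Unknown roles get no access by default (deny-by-default)."""
    return channel in POLICIES.get(role, {}).get(action, set())

print(allowed("auditor", "read", "vitals"))   # True
print(allowed("auditor", "write", "vitals"))  # False
```

The deny-by-default lookup is the important design choice: an unrecognised role or action grants nothing, which is the safe failure mode for data access control.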


Analytics Investments Often Depend on How the CIO Views Their Legacy

What was interesting was that, as we delved deeper into the data, we discovered three very distinct patterns of how CIOs deliver value to their organization. The patterns were “trusted operator”, “change instigator” and “business co-creator”. There is no right or wrong pattern. It is just dependent on the need of the business at that time. Cloud was a really big one for trusted operators. They viewed that as a way for them to think about and engage with internal business stakeholders and drive either some of the cost efficiencies or reliability, or whatever security issues were important to them. Cloud was important, but not as important as digital or analytics or business intelligence. But cloud seemed to be resonating really well with the trusted operators.


The trouble with being SMART

Respect for expertise, not centralized authority, coordinates the open source communities that create great technologies. Innovative companies give employees off-the-clock time and free resources, and benefit from their tinkering. Such environments thrive on decentralized action. SMART goals cannot add to, and inevitably subtract from, these structures. Second, companies no longer compete individually but as members of networks: Apple couldn’t create the iPhone, or Airbus the A350 aircraft, without collaborating with outsiders. Complexity, uncertainty, and ambiguity are unavoidably present in such networks, since their members are geographically dispersed and have varying strategies, processes and cultures. These conditions allow problems and opportunities to propagate with blinding speed.



Quote for the day:


"Talent hits a target no one else can hit; Genius hits a target no one else can see." -- Arthur Schopenhauer


January 17, 2016

Enterprises may eye supercomputing

Barry Bolding, chief strategy officer at Cray, said large companies are increasingly utilizing scientific computing, and workloads that are consistent are often more cost effective on-premise. The cloud provides an HPC option for smaller and midsized companies. "There are a lot of great uses for the cloud and it's more flexible," explained Bolding. "But when you have continuous work the cloud can be more expensive because there's so much volume. You can't afford to do spot pricing all the time and have to reserve instances at a premium. If you're doing weather forecasting 7 days a week, 24 hours a day and 365 days a year, on-prem is less expensive." Instead, Bolding said high-end Cray systems are more complementary to the cloud, which is used for bursting.
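Bolding's continuous-workload argument is a simple break-even calculation: a steady 24x7 job accumulates pay-per-hour cloud charges against a fixed on-prem cost. The prices below are invented for illustration, not real Cray or cloud rates.

```python
# Illustrative numbers only; substitute real quotes for a decision.
def cheaper_on_prem(hours_per_month, cloud_rate_per_hour,
                    on_prem_monthly_cost):
    """For a given utilization, is the metered cloud bill higher than
    the fixed (amortized) monthly cost of owning the hardware?"""
    return hours_per_month * cloud_rate_per_hour > on_prem_monthly_cost

# A 24x7 forecasting job (~720 h/month) at $4/h costs $2,880/month in
# the cloud; if the amortized on-prem system costs $2,000/month, the
# continuous workload is cheaper on-prem.
print(cheaper_on_prem(720, 4.0, 2_000))  # True
# A bursty job that runs only 100 h/month flips the answer.
print(cheaper_on_prem(100, 4.0, 2_000))  # False
```

The second call is the "bursting" case Bolding mentions: low, spiky utilization favors the cloud because the fixed cost never pays for itself.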


Hadoop, Open Source Adoption and Growth to Accelerate in 2016

A look at the technology environment today will show that the tables have totally turned. Even in large, conservative organizations, we see the desire for open-source software over even the tried-and-true commercial, closed-source options. After emerging from their bubble, organizations have begun objectively questioning whether licensing and support costs are providing true ROI. They are becoming increasingly concerned about vendor lock-in and see the strategic limitations of the inability to fix bugs or enhance these commercial offerings. If companies want to change something in an open-source project, or accelerate a bug fix, they now have the opportunity simply to do it themselves.


The second digital revolution: A cloud of clouds

For years, CIOs have struggled to avoid duplicating IT services, or creating silos of data that cannot be shared. A potential downside of a shift to the cloud, especially when you have business units autonomously introducing cloud solutions, is that it might perpetuate that same problem of isolated services and information. The CIO ends up with a patchwork of cloud solutions, over which he or she has little control. One emerging solution is for the CIO to combine all the organisation’s cloud services into a single cloud — a cloud of clouds — that can be managed and secured centrally. With such a strategy, the CIO has more visibility and control, and is better able to identify risky services, tighten up security and ensure a fair allocation of corporate resources.


Presenting a Roadmap for Digital Transformation at NRF 2016

Life in the Digital Vortex is challenging—especially for retailers. In an environment where, averaged across industries, four of the top 10 incumbents will be displaced by digital disruption in the next five years, retail ranks as the third most vulnerable out of 12. Retailers are also being squeezed between online-only retailers and traditional competitors that are further along with their digital transformations. But with the threat also comes opportunity. Cisco’s most recent Digital Value at Stake research highlights specific digital use cases that industries can implement now to drive new sources of value. According to the research, six industries—manufacturing, financial services, retail, service provider, healthcare, and oil and gas—will account for 71 percent of the total private sector Digital Value at Stake over the next decade.


5 reasons most outsourcing projects fail

Outsourcing is an integral part of today’s work culture. Companies across a wide range of sizes and industries are choosing to outsource some or all of their software development. As David Berry, CIO of Daymon, says, outsourcing is no longer about saving money, but primarily about flexibility and getting to scale. While outsourcing has many benefits, it also brings some operational challenges. To get a better sense of the roadblocks that could derail an outsourced project, I interviewed the people who take responsibility for outsourcing software projects - CIOs.


10 things the tech world should leave behind

Like many of you, I saw Star Wars this past month and thought, "Man, it's good to be back among the lightsabers, TIE fighters, droids, and of course, the Millennium Falcon!" It didn't feel anachronistic to return to the Star Wars universe, even though it had the same elements as when I was six years old. Some old-school technology (both on and off the big screen) will always remain fun and interesting. But other technological elements have worn out their welcome and need a swift kick to the curb. This article looks at 10 examples. Now, this list is subjective and I can't promise all these things are headed for the dustbin of history.


2016 and Beyond: Technologies and Trends that Will Change the Future of IT

The race to connect the unconnected will continue as well, whether we speak about connecting the next 4 billion people, introducing more wearables, creating body implants or enabling the Internet of Things (IoT), where billions of sensors are changing the way we live our lives. In the coming year, we will continue to march toward IoT with more than 11.5 billion mobile-ready devices and connections – 4 billion more than there were in 2014. As in years past, we’ve leaned on the Cisco Technology Radar to spot the next innovations that could benefit our customers, challenge the status quo of existing product portfolios or even address technology gaps.




Quote for the day:


"The task of leadership is not to put greatness into humanity, but to elicit it, for the greatness is already there." -- John Buchan


January 15, 2016

Digital Influencer: Catching Up With David Linthicum, Sage Of The Cloud

It’s unusual for techies with no particular writing background to have the discipline to write such a book. Linthicum’s secret? “You have to give up stuff. Spend less time on things you enjoy,” he says. “It’s tough to get the discipline to write one to two thousand words per day, but you have to do so to crank out a book.” In spite of putting in such an effort, the book proved too early to market – another repeating theme in Linthicum’s background. “I wrote the EAI book for Addison Wesley,” Linthicum explains, “But only a few people were following EAI at the time. In 1996 there was no interest in the book.” Then in 1997, Software AG spun off their American efforts as SAGA Software, and Linthicum joined as CTO. “They had no technology at the time, so we had to create our own,” Linthicum says. “By 1997 EAI was still not hot, so we called it ‘solution-oriented middleware’.”


IT efficiency is a moving target; here's how to hit the bull's-eye

IT staying behind closed doors or locked in the data center is no longer an option. Technology continues to integrate deeply into every business and job function of the organization. All leaders, including the chief information officer and directors, should attend and be involved with activities on campus. How can IT develop an internal strategic plan if we are not fully engaged with business units, academics, faculty senate, staff senate and student government? Without that engagement we are only focusing on what we think is best or listing the next evolution of our favorite technologies. It is easy to like a particular technology and be biased toward the true impact it will have at an institution.


What Remix OS means for Android on PCs

What Remix means for Android is unclear. While Chrome OS has taken over the education market, it hasn't seen the same level of success that Android has in mobile. In particular, the selection of Chrome apps is pretty paltry, while development of Android apps is thriving. If Android were Google's laptop-and-desktop operating system, the app gap would cease to be an issue. But Google's ultimate plans don't really matter. Android's fundamental open-source nature means it can be hacked and modified into a desktop operating system even if Google never wants to go in that direction. The Remix OS proves that. Even if Android does become more of a desktop operating system, Google probably won't start offering it for download onto any PC. With some work and polish, Jide's Remix OS could become a more compelling alternative for average computer users than traditional desktop Linux. This is a project to keep an eye on.


Why the lack of women in IT is bad for tech, bad for the economy

According to the report the problem starts at school. It points to a 2012 survey which found that only 17 percent of girls had been taught any computer coding in school, while almost twice as many boys had (33 percent). "And some argue that girls are often steered away from science and math courses in primary school," the report says. "Other experts go earlier still, stressing the role parents need to take in encouraging girls younger than school age to be interested in science and technology." According to Regina Moran, the CEO of Fujitsu UK & Ireland, the gender imbalance is bad news for the UK. "Women make up a large proportion of our customers both professionally and personally," she said. "Neglecting women in the workforce will be a costly mistake."


When To Upgrade PCs: 4 Tips For A Smart IT Strategy

It's equally important to monitor user behavior, to see if their PC usage is changing over time. Smartphone and tablet use, driven by progressive BYOD policies within many organizations, is pushing users away from the PC and onto mobile devices. Additionally, applications themselves are being reworked so that they can operate on lower-performance mobile devices. In many cases, the only option for PC users is to access applications using a Web front-end that requires very few computing resources. The corporate-issued PC is becoming nothing more than a simple keyboard, mouse, and monitor portal, which connects to backend servers that handle the bulk of the processing power. That's why we see companies waiting for a catalyst event -- such as a major OS upgrade -- in order to justify new hardware.


Smartwatches in transition as smartglasses rule

In 2015, industry analysts expected smartglasses to realize $1 billion in annual cost savings in the field services industry, and they estimate the market will grow to $6 billion in 2016, Ballard said. "I think monocular smartglasses are going to really fly this year in enterprise. I think you're going to see true augmented reality glasses like Microsoft HoloLens and cousins like Epson's glasses bridging the gap between the two," Ballard said. "We are seeing some early adoption trends that we think will take off around HoloLens, biometric authentication, token, and wearable IoT sensors." AR glasses are particularly beneficial in the enterprise. Dan Ledger, principal at Endeavour Partners, said, "We're getting to a point now where Google Glass came out a few years ago and ran its course, and Google is working on its new iteration of that, and Microsoft has some incredible products."


Why thinking like a criminal is good for security

A combined focus on technical and human surveillance is good security practice. “Have employees be aware. Lock doors and windows. There are a lot of technology things you can do. Bad guys have as good of technology as the good guys. We scan and find, but bad guys do too, but they act before the hole is fixed,” Stolte said.  A slight shift in language when talking about security and data can also help security teams think like a criminal. Erlin said, “It’s a very common best practice for organizations to identify sensitive data. Using the term valuable instead twists perception away from what organizations feel is sensitive to what might be valuable to a criminal.” Regardless of what other information criminals might find valuable, the crown jewels will always remain sensitive and top priority.


Data virtualization tools move into strategic IT realm

Well, like a lot of things in analytics, things have been around for a long time but the business need for them and the ability of the environment that we're in -- in terms of the amount of memory people have, network bandwidth -- [wasn't conducive to effective use of its capabilities]. The technology … has existed for a while but the business demand (social media, IoT, sensor devices, machine learning, Web data and a lot of the cloud data) [did not]. A lot of companies use cloud applications … so there's much more demand for this virtualization of the data and there's much more data that's scattered out there. Even though the technology existed before, the need for it has exploded and then the capabilities for that kind of technology to go after the volumes of data -- the unstructured data and structured data -- have all sort of grown based on the demand.


2016: Cyber-Crime Becomes Big-Time

"2016 will see cybercrime finally find its place in our official statistics," says KPMG's cyber security technical director, David Ferbrache, "extortion attacks have been making a comeback with criminals demanding significant sums for suspending denial of service attacks against targets; not going public with stolen data; and of course providing a ‘service’ which grants access to a ‘client’s data which they had previously hacked and encrypted." “While phishing attacks, banking Trojans and large scale low value cash outs have characterised the last 10 years of cybercrime, new techniques are becoming part of the criminal arsenal while firms invest more and more in cyber threat intelligence in the hope of keeping up," adds Ferbrache, "in 2016 we predict that organised crime groups will become increasingly selective in targeting high net worth individuals, corporate treasuries and commercial bank accounts."


State CIOs agenda targets cybersecurity

"Our CIOs have to manage a lot of federal data, and they all have to be managed differently, even though the CIO is attempting ... to establish an enterprise vision," she says. "These federal regulations are standing in the way of consolidation and optimization, to put it simply." So NASCIO is asking for relief from federal regulations, generally (a tall order, Cooke admits), and in particular is trying to call attention to the challenge of sharing information, both within different state agencies and with outside entities like federal and local government groups, other states and the private sector. Too often, Cooke says, federal programs administered by the states don't afford CIOs or agency administrators the explicit flexibility to share information and collaborate across the siloes in which those programs reside.



Quote for the day:


"The worst part of success is trying to find someone who is happy for you." -- Bette Midler


January 14, 2016

Big Data Goes Mainstream: What Now?

Organizations today are often pursuing those goals by implementing big data environments that coexist with the data warehouse, according to Bean. Organizations are currently looking at what information is better suited to what environment. "There are certain things that a data warehouse is suited for, like data compliance" or operational reporting. "Big data is more about discovery environments and looking for patterns … Right now there is a value to both environments." Another factor that comes into play between the data warehouse environment and the big data initiatives is cost. "One of the original premises of big data was it was much more cost-effective than traditional data environments," Bean said. "And that will likely be the case."


Microsoft R Server Is Free for Developers and Students

A Developer Edition of Microsoft R Server, "with all the features of the commercial version," will be available to coders as a free download. It will also be included in the Microsoft Data Science Virtual Machine, a Windows Server 2012-based virtual machine that includes tools for data scientists and developers. Microsoft is also making Microsoft R Server available free for students under the company's DreamSpark technology in education program. "Providing even more students with access to Microsoft R Server is a pretty big deal," wrote Microsoft Program Manager Joseph Rickert. "Microsoft R Server extends the reach of R into big data, distributed processing environments by providing a framework for manipulating large data sets so that all of the data being analyzed does not have to fit into memory simultaneously."
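The "doesn't have to fit in memory" framework Rickert describes is proprietary to Microsoft R Server, but the underlying chunked (out-of-core) processing pattern is generic. A minimal Python sketch of the idea, with the chunk size and data invented for illustration:

```python
# Illustrates out-of-core analytics: stream a data set through memory
# one block at a time, keeping only small running aggregates.
def chunked_mean(numbers, chunk_size=3):
    """Compute a mean while holding at most `chunk_size` values of
    input at once, plus two running scalars."""
    total, count, chunk = 0.0, 0, []
    for x in numbers:
        chunk.append(x)
        if len(chunk) == chunk_size:
            total += sum(chunk)
            count += len(chunk)
            chunk = []
    total += sum(chunk)  # flush the final partial chunk
    count += len(chunk)
    return total / count if count else 0.0

print(chunked_mean(range(1, 11)))  # 5.5
```

Real engines apply the same decomposition to regressions and other statistics: each chunk contributes sufficient statistics, which are combined at the end, so the full data set never needs to be resident.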


Big Data Still Requires Humans To Make Meaningful Connections

Perhaps it’s because we put so much faith into technology to solve our problems. We have been led to believe big data is going to help businesses make smarter and more informed decisions. In healthcare, it will help our doctors and medical professionals make better diagnoses and find the most appropriate treatments. In sports, it will help our favorite teams pick the best players. In government, it will open up information and lead us to the transparency promised land where no corrupt government official can hide. And it will help root out those people who are planning to do us harm. As we learned in the recent horrific attacks in Paris, sometimes it doesn’t matter how much information we collect.


Truly Wireless Headphones Arrive, But With a Few Strings Attached

It works for keeping the ears in sync. The Dash, sold online and coming soon to shops, kept going even when I wore it in the shower. Bragi says the waterproof earbuds work even when you are swimming, playing music stored directly on them. The audio, again, sounded fine for working out, although a bit more compressed than the Earin buds. Bragi says it will continue to fine-tune audio quality. The Dash can last more than three hours on a single charge, which it also gets from a companion battery case. Each Dash bud is roughly double the size of the Earin, but still lies flat inside my ear. (Silicone sleeves of multiple sizes help fit different-size ears.) I almost never felt like the Dash would fall out, whether I was running or doing my best impression of Animal from the Muppets.


Hope in a Glove for Parkinson’s Patients

GyroGlove’s design is simple. It uses a miniature, dynamically adjustable gyroscope, which sits on the back of the hand, within a plastic casing attached to the glove’s material. When the device is switched on, the battery-powered gyroscope whirs to life. Its orientation is adjusted by a precession hinge and turntable, both controlled by a small circuit board, thereby pushing back against the wearer’s movements as the gyroscope tries to right itself. While the initial prototypes of the device still require refinements to size and noise, Alison McGregor, professor of musculoskeletal biodynamics at Imperial College, who has been a mentor to the team, says the device “holds great promise and could have a significant impact on users’ quality of life.”


Emerging: DataOps and three tips for getting there

CIOs know the typical wave of adoption -- technology or otherwise -- starts with early adopters. But even before the early adopters, CIOs will need to find their innovators -- employees who are, essentially, change agents. "In order to build a culture, we needed to identify not only the people who have technical skills or the business skills, but those who also are fearless. They want to go out to an organization and actually change things -- they want to change the way government works," she said. ... Before Jin arrived, a basic dashboard was designed for Mayor Martin Walsh, the first of its kind for the city. A year later, the mayor's dashboard has not only become a more sophisticated administration window into Boston doings, it also acts as a constituent-facing information portal.


How CIOs will refine digital transformation in 2016

“The traditional IT security defense is completely broken,” says Russell. “Most CIOs and senior leadership and boards are realizing that when you wake up every day and see another breach of some kind … the existing model does not work.” He’s well into a four-year IT security roadmap, which includes adding vArmour software to identify and flag anomalous traffic flowing across the company’s computer network. It’s designed to find the type of threat that hit Target, in which an intruder crawled into the network through a third-party vendor and began moving data. “That’s a huge transition from saying ‘we have a barrier nobody can get through.’” The tech also provides fodder for conversation with his board, which wants details on what he is doing to buttress corporate defense.


People are the biggest source of vulnerability

People are the biggest point of vulnerability in any organization and the endpoint is where they interact with whatever an attacker is after: intellectual property, credentials, cyber ransom, etc. Further, people are responsible for the policies and procedures that are in place at the enterprise, whether forced upon them by regulatory bodies or voluntarily for proper security hygiene. Securing the endpoint would be less difficult if we were willing to accept policies and procedures that could help reduce the attack surface. But, no enterprise, in practice, wants to put employees through having separate systems for outside/inside network access. Employees want to and will use their corporate equipment for personal things: checking email, syncing music with their phones, and engaging others on social networks.


Automakers tap mobile software experts in search of premium cachet

"Younger customers demand the latest connectivity features, and German premium automakers need to develop new offerings in the digital arena which cater to this," said Thilo Koslowski, vice president of the automotive practice at technology market research firm Gartner. BMW's Chicago team helped to develop 'Bumper Detect' a new system unveiled last week which uses BMW's onboard camera and sensors to photograph potential thieves or vandals. "The car can take photographs of another vehicle which may have left a dent in your parked car, and send pictures to your mobile device," Robertson said. The Bavarian automaker already has several software development centres in Munich and elsewhere and said it will continue to recruit staff in 2016 to help "the advancement of new technologies, including the ever-increasing scale of digitalization."


3 Lessons From The Graveyard of Fintech Start-Ups

Every tech field involves legal complexities. While big corporations have their own lawyers to maneuver complicated legal regulations, start-ups are on their own. And it’s a big deal. While some financial technologies may be far less intrusive, some could face intense quagmires. GoCardless, a UK-based online direct debit provider, has been sponsored by RBS (Royal Bank of Scotland) and handles $1 billion of transactions a year. Even their founder, Hiroki Takeuchi, has noted the difficulty in understanding regulations, as well as penetrating the bank-owned financial infrastructure. “To get access, you need to set up some sort of arrangement with a bank that moves at a glacial pace.” They didn’t go it alone, and it took a great deal of effort to work with the famously glacial pace of traditional banks.



Quote for the day:


"Respect for people is the cornerstone of communication and networking." -- @susanroane


January 13, 2016

Board Governance: Higher Expectations, but Better Practices?

Although Banks have made significant progress toward meeting regulatory expectations in this area, necessary changes have not yet been implemented at several institutions. For example, 30% of Banks have not formalized in their board charters a requirement for the risk committee to approve the Bank’s risk governance framework, as required under EPS. A similar 30% are yet to require their board (or the risk committee) to perform an annual self-assessment, as expected by the OCC. Finally, about 20% of Banks have not yet formalized the requirement that their board (or the risk committee) annually approve the institution’s risk appetite statement (a key component of the risk governance framework), as required by the OCC.


Data Privacy Reform Is Wreaking Havoc

From a legislative viewpoint, the matter of “where data resides” is critical as these new data privacy rules roll out. The Ovum research underscores that when it comes to the physical location of data, there is uncertainty and confusion. Until now, a key benefit of the cloud was that businesses no longer needed to concern themselves with the physical location of their data. It was stored off-site, for all to share, as needed. Now, with the European Union (EU), Israel and the United States beefing up regulations with the goal of stopping the flood of data leaks and stolen information, businesses must shift their approach to the cloud in a fundamental way. Suddenly, the location controlling the physical path of data matters.


4 Ways To Be A More Resilient Leader

Why resiliency? Last year I wrote about employer brand and candidate experience. Subsequent conversations with friends, family and clients made me realize that organizations, employees, and people need more than a strong brand and the intent to engage and create a positive experience for employees and potential talent. We need resilient organizations with flexible, resourceful leaders to create the most productive work culture for people. Most organizations make a plan and figure that will get them where they need to go. But much of the time things don’t go according to plan, and people lose heart and focus. Employees start asking the same questions every day, betraying their unease and uncertainty.


Learn any of these 16 programming languages and you'll always have a job

"Software is eating the world," venture capitalist Marc Andreessen famously declared. Someone has to write that software. Why not you? There are thousands of programming languages, but some are far more popular than others. When a company goes out to find new programming talent, they're looking for people familiar with the languages and systems they already use - even as new languages like Apple Swift start to make a splash. Here are the programming languages you should learn if you always want to have a job, as suggested by the popular TIOBE Index.


The best web browser to replace obsolete Internet Explorer is...

The easiest way to get a new, supported browser is to simply upgrade to IE 11. You can do that in two ways: download the installer from Microsoft -- be wary of getting it from third-party websites -- and simply install it. Or, you can simply update your system. Either way works perfectly well whether you're moving from IE 8, 9 or 10 to 11. ... Chrome, at 501, barely edged out Opera, at 500, for the top spot. Firefox took third with 448. And, once more eating the dust of the others, came IE with 336. The numbers make it obvious. When you replace IE 8, 9 or 10 on Windows 7, Chrome is easily the best choice. Opera, which has become the forgotten browser, also deserves some attention. Firefox, which has had more than its fair share of troubles, doesn't appear to be a good choice.


The Internet of Things is wasted without risk-taking

Productive data analysis requires an open mind. While it is undoubtedly important to improve business efficiency and utilise data for maximising profitability, the greatest innovations are typically born out of business opportunities created in completely new markets or sectors. The most lucrative jackpots are ideas that cannot be foreseen before data analysis. Let us discuss a few examples. Elevator companies provide services to large masses of people on a daily basis, which means they possess a large amount of data on the movements of their users. This data could be utilised in planning parking facilities, developing restaurant services or ensuring security. Another example could be a crane company with a hundred active cranes operating in the middle of a large city. 


2015 was a tipping point for six technologies

Smartphones with the capabilities of today’s iPhone will cost less than $50 by 2020. By then, the efforts of Facebook, Google, OneWeb, and SpaceX to blanket the Earth with inexpensive Web access through drones, balloons, and microsatellites will surely bear fruit. This means we will see another 3 billion people come on line. This will be particularly transformative for the developing world. Soon, everyone will have access to the ocean of knowledge on the Internet. They’ll be able to learn about scientific advances as they happen. Social media will enable billions of people to share their experiences and help one another. Farmers will be able to learn how to improve crop yields. And those are but a few examples.


Global telecommunications: 2016 outlook

Wireless spectrum is essential to all wireless networks for over-the-air transmission of analog and digital signals including voice, video and data. The value of spectrum in the FCC’s latest auction rose significantly above prior auctions and the secondary market, underscoring the need for more of this resource. ... Revenue growth for Europe’s telecom industry in 2016 will depend largely on the ability to stimulate and monetize demand for data amid a tepid economic recovery in Europe and regulatory uncertainty. Fixed-mobile convergence will set the tone of competition with varying levels of promotional activity across countries. Potential consolidation in Italy and the U.K., even with stricter remedy requirements, supports pricing power. Capital spending will moderate as 4G networks near completion.


For the First Time, More Are Mobile-Banking Than Going to a Branch

For the first time ever, there are more of you than people who actually walked into a branch in 2015, according to a new survey by Javelin Strategy & Research, a unit of financial-industry research firm Greenwich Associates. Last year, roughly 30% of adults in the U.S. used a mobile banking service weekly, while just 24% availed themselves of a physical branch service as often, Javelin’s survey of 3,100 people found. That’s the first time in the history of the survey that mobile users (and that means just smartphones and tablets, not via desktop computers) outpaced branch users, Javelin said. In 2015, one in ten consumers used mobile banking for the first time, or roughly 25 million people. Since 2010 the number of smartphone bankers has doubled, while the number of people using a tablet has jumped nearly 10 times, Javelin found.



Quote for the day:


"Progress is impossible without change and those who cannot change their minds cannot change anything." -- George B. Shaw


January 12, 2016

List of data breaches and cyber attacks in 2015 – over 480 million leaked records

There have been breaches of highly sensitive data (including that of children), targeted attacks on government agencies such as the US’s OPM and Germany’s Bundestag, and an alarming number of well-orchestrated DDoS attacks. Money has been stolen, data has been swiped and lives have been ruined. However, I must not fail to mention the fantastic work law enforcement agencies around the world have been putting in to bring justice down on the cyber criminals causing havoc this year. As Stuart Winter-Tear recently called it, 2015 has been the year of collaboration, and we can only hope to see the same in 2016.


Malware on the Smart TV?

So in this case, it’s not a new type of malware specifically targeting Smart TVs, but a common threat to all internet users. There are also reports that this scam has hit users on Apple MacBooks; and since it runs in the browser, it can run on Smart TVs and even on smartphones. These kinds of threats often get combined with exploits and may take advantage of vulnerabilities in the browser, Flash Player or Java. If successful, they may install additional malware on the machine or change DNS settings of your system or home router which may lead to similar symptoms. Such behaviour could not be observed in this case, since the malicious pages have already been removed. Keep in mind, there might be vulnerabilities in the software on your TV! Therefore it’s important to check if your device is up to date. Make sure you installed the latest updates for your Smart TV!


Red Hat's Ansible 2.0 brings new power to devops

Blocks also provide a way to perform exception handling, so that if something goes wrong during the course of a block, it can be handled. Existing scripts that don't use blocks will run as-is, but legacy scripts could only implement the same kind of functionality by way of a lot of boilerplate code. A new addition called strategies controls how playbooks execute, with the default for existing scripts being a "linear" strategy -- i.e., all hosts have to finish one task before any of them can begin the next one. A "serial" strategy, meanwhile, ensures one group of hosts finishes its work before another group can begin, and another strategy named "free" allows all hosts to run independently of each other. Strategies are not hard-wired into Ansible, either; they can be defined by plug-ins.
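The ordering semantics of the "linear" and "free" strategies can be sketched with a toy Python scheduling model (an illustration of the semantics described above, not Ansible's actual code; host and task names are made up):

```python
# Toy model of Ansible 2.0 execution strategies (illustrative only).
hosts = ["web1", "web2", "db1"]
tasks = ["install", "configure", "restart"]

def linear(hosts, tasks):
    """Linear: all hosts finish a task before any host starts the next one."""
    return [(task, host) for task in tasks for host in hosts]

def free(hosts, tasks):
    """Free: each host runs through its tasks independently of the others."""
    return [(task, host) for host in hosts for task in tasks]

print(linear(hosts, tasks)[:3])  # every host runs "install" before anything else
print(free(hosts, tasks)[:3])    # web1 finishes all three tasks before web2 starts
```

The real scheduler runs hosts concurrently, of course; the point here is only the ordering constraint each strategy imposes.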


Will LiFi Take Big Data And The Internet Of Things To A New Level?

LiFi is a category of Visible Light Communication; an LED light flickers at speeds undetectable to the naked eye to transmit data — a bit like high tech morse code. In fact, scientists have demonstrated in a lab that they can transmit information at as much as 224 gigabits per second, the equivalent of 18 movies of 1.5 GB each being downloaded every single second. In an office setting, they were able to achieve speeds up to 100 times faster than average WiFi speeds. The LED lights require so little energy, they can be powered by a standard ethernet cord. Inventor Harald Haas has also suggested that the smart lights could be powered by solar cells charging batteries. In addition, LiFi does not create electromagnetic interference the way WiFi does, meaning it could have important applications in sensitive locations like healthcare facilities.
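The article's throughput claim holds up to a quick back-of-the-envelope check (assuming decimal units, 1 GB = 10^9 bytes):

```python
# Back-of-the-envelope check of the 224 Gbps LiFi lab result.
bits_per_second = 224e9                   # 224 gigabits per second
bytes_per_second = bits_per_second / 8    # 28 GB per second
movie_size_bytes = 1.5e9                  # one 1.5 GB movie
movies_per_second = bytes_per_second / movie_size_bytes
print(round(movies_per_second, 1))        # ~18.7, consistent with "18 movies ... every single second"
```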


8 Things I Learned About SDx in 2015

This is, of course, the concept known as virtualization, whereby applications are detached or disaggregated from the underlying hardware. The most important aspect of this is on the development level, because it has ushered in the era of agile development in which software can be designed, deployed, moved, and updated on the fly. ... Bubbles have value in themselves even though there is going to be pain and carnage along the way. Even as a herd of startups is culled, the bubble can accelerate innovation in specific markets. What’s interesting about the original Internet bubble, which resulted in a crash and many failed companies, is that it created the largest economic engine in the world — and some of the world’s most valuable companies, including Amazon and Google. The same thing is happening in SDx and cloud-based security.


Tech innovations that will transform healthcare in the next five years

Healthcare providers are no strangers to the impact of technology on their operations. Over the past several years, for example, the move to electronic health records, though painful, has helped organizations develop the IT capabilities to pursue other innovations, with an eye toward better outcomes and improved operational efficiency. While many technology advancements hold tremendous potential to transform the industry, their timing and viability are unclear, particularly since promising technologies must often go through years of testing to obtain approval for use. Industry regulations, such as safeguarding patient information, can further cloud the timeline.


NAS vs object: Which one for large volumes of unstructured data?

Object storage enables enterprises and service providers to manage multi-petabyte secondary storage with relative ease. This does not directly compete with traditional file and block storage for serving frequently-accessed data and transactional workloads. In addition, when we refer to storage performance we usually think in terms of speed, latency and throughput in the datacentre. This is very different to the cloudy world of distributed applications and clients, where mobile devices typically access data over long distances and from widely disparate locations. The second differentiator is geographic scale. In the distributed world we need distributed storage performance and throughput.


Exposing the Lucene Library as a Microservice with Baratine

The ability to expose an existing application or library as a web service without any code modifications is a most appealing concept. Using Baratine, an open-source framework for building a platform of loosely coupled microservices, this can be accomplished in two steps: first, implement a service portion (SOA); then, implement a client library for communication. Using this approach, Baratine can transform an existing library or application into a standalone web service. The Baratine services will communicate with the existing library, and the Baratine clients will service requests from the outside world. ... The Apache Foundation describes Lucene as: “a high-performance, full-featured text search engine library written entirely in Java. It is a technology suitable for nearly any application that requires full-text search, especially cross-platform.”


Scale-Out Storage and the Virtualized Data Center

Scale-out, as opposed to scale up, has the promise of allowing a solution to grow with the number of hosts in the cluster, but very often we see solutions that fail to live up to this promise. Why is scale-out hard? Well, there are multiple reasons why scale-out is hard and although the specifics of each solution are different, the common theme is that multiple hosts means multiple copies of data, and multiple copies means they need to be kept coherent or consistent. The price of keeping the copies coherent, henceforth referred to as doing “cache coherency”, goes up as you traverse down the following list: A. immutable objects; B. mutable objects with a single reader and single writer (Single-RW); C. mutable objects with multiple readers and multiple writers (Multi-RW).
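To see why the Multi-RW case at the bottom of that list is the expensive one, here is a toy write-invalidate sketch (the class, names, and protocol are illustrative assumptions, not any product's implementation): reads fill a per-host cache, and every write to a shared mutable object forces the other hosts' cached copies to be dropped. An immutable object, by contrast, can be cached everywhere and never needs invalidating.

```python
# Toy write-invalidate cache coherency model (illustrative sketch).
class Cluster:
    def __init__(self, n_hosts):
        self.caches = [dict() for _ in range(n_hosts)]  # one cache per host
        self.invalidations = 0  # coherency traffic we had to pay for

    def read(self, host, key, backing):
        cache = self.caches[host]
        if key not in cache:
            cache[key] = backing[key]  # cache miss: fill from the shared store
        return cache[key]

    def write(self, host, key, value, backing):
        backing[key] = value
        self.caches[host][key] = value
        # Multi-RW: every other host holding a stale copy must be invalidated.
        for i, cache in enumerate(self.caches):
            if i != host and key in cache:
                del cache[key]
                self.invalidations += 1

backing = {"obj": 0}
cluster = Cluster(3)
for h in range(3):
    cluster.read(h, "obj", backing)   # all three hosts cache the object
cluster.write(0, "obj", 1, backing)   # one write invalidates the other two copies
print(cluster.invalidations)          # 2
```

The cost scales with the number of hosts caching each hot object, which is why Multi-RW workloads are the hardest to scale out.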



Quote for the day:



"The secret to success is doing the stuff other people won't do & doing it for a really long time." -- John Jantsch


January 11, 2016

Redmonk analysts on best navigating the tricky path to DevOps adoption

It's the idea that Hilton International or Marriott would be worrying about Airbnb. They weren’t thinking like that. Or transport companies around the world asking what the impact of Uber is.  We've all heard that software is eating the world, but what that basically says is that the threats are real. We used to be in an environment where, if you were a bank, you just looked at your four peer banks and thought that as long as they don’t have too much of an advantage, we're okay. Now they're saying that we're a bank and we're competing with Google and Facebook. Actually, the tolerance for stability is a little bit lower than it was. I had a very interesting conversation with a retailer recently. They had talked about the different goals that organizations have.


A disaster recovery/business continuity plan for the data breach age

The need to manage and protect both business and personal data (as clearly differentiated from the software) has never been more important. A disaster recovery/business continuity plan that does not account for our dependence on data puts the enterprise, its employees and customers at risk. ...
A good disaster recovery/business continuity (DR/BC) plan is not an IT plan, it is a business plan that has significant IT components. As discussed above, more and more focus needs to be placed upon data recovery, beyond ensuring that programs and processes are returned to operational status. The plan should be scenario-based and aligned to the likelihood of varying levels and types of risks as specified by documented business impact analyses and business risk assessments.


Why the customer is not always right

There are two fatal flaws in this model, both having to do with managing expectations. First, clients need to understand that they are unlikely to get every deliverable without some compromise – particularly in custom software, where nobody knows exactly what’s involved until the project is more than half done. Second, the project lead on the consultant side must actively manage expectations during every client meeting. If the project lead on the client side is weak – technically or politically – s/he will not successfully propagate the realities of prioritization and negotiation to executives in the client organization. This means the project is in trouble before it starts … and, worse, the trouble can be totally invisible to the client until it’s way too late.


How tech giants spread open source programming love

Programming languages and technologies that were developed by industry and Internet giants – specifically to meet the unique challenges they faced operating at massive scale – have been open sourced and are now being adopted by regular-sized enterprises for everyday use. Part of the reason for this is a natural technology trickle-down effect, according to Mark Driver, a research director at Gartner. "Today's leading edge super high tech is tomorrow’s standard product," he says. "Also, large companies (like Google and Facebook) understand the collaborative nature of open computing and the dynamics that drive the Internet. So it's natural that they share these technologies and strengthen the industry around them."


Six Transformations From 2015 That Will Reshape The World

Looking at the list of finalists for the Crunchies, you could get the impression that the greatest advances of 2015 were sharing and delivery apps, software platforms, and pencils. Yes, these are cool. But much bigger things happened last year. A broad range of technologies reached a tipping point, from science projects or objects of convenience for the rich, to inventions that will transform humanity. We haven’t seen anything of this magnitude since the invention of the printing press in the 1400s. And this is just the beginning. Starting in 2016, a wider range of technologies will begin to reach their tipping points. Here are the six amazing transformations we just saw.


Britain is on the verge of an IT crisis

This shortage will boil over in the coming years as a generation of IT workers, who built the systems and databases that still power critical functions, begin to retire. This is especially worrying in finance, where large institutions, which have repeatedly merged and sold off parts of their businesses, have back-end systems that have been hastily thrown together. As those that created them leave the workforce, disasters will be more difficult and take longer to recover from. Companies have responded to the problems with hiring IT workers by outsourcing more work. But having done this, says Tate, many have made poor decisions, found contractors to be inadequate, and moved operations back inside. The alternative is simply to pay more for the best talent, but a swell in demand across the board is making this increasingly expensive.


3 Guiding Principles for Innovation in Managed Services

We simplify what has become complicated, we create dashboards of the automation and single pane of glass displays of the coordinators, and we start the cycle over again. It sure seems a little reversed to me. Am I issuing a wake-up call to our industry? Absolutely! I have begun holding brainstorming sessions with colleagues that challenge the status quo. Our technology is now using Fully Automated Storage Tiering, multiple alerting consolidation engines, automatic load balancing, pooled resource rebalancing, and the list goes on and on. This is fantastic and exciting beyond belief to talk about, explore, and work with these technologies. However, I am involved in services. We are the pilots of the automation, and we must aviate, navigate and communicate our way through the technology hierarchy.


The Dark Side of The Wearables

As wearable devices make their way into the workplace and corporate networks, they bring a host of security and privacy challenges for IT departments and increase the amount of data that data brokers have to sell about an individual. Jeff Jenkins, chief operating officer and co-founder of APX Labs, talked about the security and privacy of wearables during a panel interview with Tech Pro Research at CES 2015. Because wearable devices are designed to be small and portable, Jenkins said, "you have to make sure you're thinking security first and you're thinking about the information that's being generated by them. You have situations where it's no longer just personal data that may be exposed or compromised, but also potentially operational data, that could be sensitive in nature."


The Emerging Data Design: Bitemporal Data

Simply defined, bitemporal data means storing current and historical data, corrected and adjusted data, all together in the same place. Bitemporal means you are using two time dimensions simultaneously – one to represent business versions and one for corrections. For example, let’s say you have a database table of customers; in a bitemporal world, you would store changes (versions) of the customer’s data, over time, as well as any corrections, as new rows in the same table. Customer data changes include attributes like the customer’s name, address or buying preferences. Corrections (some people like to call them adjustments) represent restatements of data that people or systems make to record the right value. Human typing errors or software errors create data that may get corrected.
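The customer example can be sketched with a minimal SQLite table (the schema and all names are illustrative assumptions): one time dimension (valid_from) tracks business versions, while the other (recorded_at) tracks when each row, including corrections, was written. Versions and corrections then live side by side as ordinary rows.

```python
import sqlite3

# Minimal bitemporal customer table: valid_from is the business time
# dimension (versions), recorded_at is the correction time dimension.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer (
        id INTEGER,
        name TEXT,
        city TEXT,
        valid_from TEXT,   -- when the fact became true in the business
        recorded_at TEXT   -- when the row was written (or corrected)
    )
""")
rows = [
    (1, "Ann Lee", "Boston", "2015-01-01", "2015-01-01"),  # original version
    (1, "Ann Lee", "Bostn",  "2015-06-01", "2015-06-01"),  # a move, recorded with a typo
    (1, "Ann Lee", "Boston", "2015-06-01", "2015-06-15"),  # later correction of the typo
]
conn.executemany("INSERT INTO customer VALUES (?, ?, ?, ?, ?)", rows)

# "What do we currently believe was true on 2015-06-01?"
# The latest business version applies, and the latest correction of it wins.
city = conn.execute("""
    SELECT city FROM customer
    WHERE id = 1 AND valid_from <= '2015-06-01'
    ORDER BY valid_from DESC, recorded_at DESC
    LIMIT 1
""").fetchone()[0]
print(city)  # Boston
```

Because corrections are appended rather than overwritten, the same table can also answer "what did we believe on 2015-06-10?" by filtering on recorded_at, which is the point of keeping both dimensions.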


DDoS: 4 Attack Trends to Watch in 2016

Most businesses are ill-prepared for DDoS attacks, which is why it costs them so much to recover, Meyerrose says. The cost of recovering from a DDoS attack can be more than $50,000 for small businesses, he notes, quoting data from security firm Kaspersky Lab. That cost includes business lost to downtime and technology expenses and investments associated with site recovery. So what can be done to defend against the growing DDoS threat? "My main strategy for defense would be making sure I could quickly detect and block all types of DDoS attacks, e.g. application or network layer, and be able to quickly redirect my users to a backup duplicate, albeit streamlined, site to keep my business running without interruption," Litan says.



Quote for the day:


"Once we rid ourselves of traditional thinking we can get on with creating the future." -- James Bertrand


January 10, 2016

Open Source as a Driver of Internet of Things

The zero entry barrier provided by the use of open source, with several toolkits, libraries, and open source hardware like Arduino and Raspberry Pi, is the foundation for it turning up in small devices sprinkled all over the globe, from home security to energy management systems, from automobile telematics to health monitors. Because open source helps lower the cost of the device itself, companies can now experiment and stitch together solutions that would otherwise have been ignored because they would have required upfront purchasing of expensive licenses for development tools and environments, specific libraries and software components. Open source is a very effective way to ride the IoT wave at high speed while keeping the risks and costs to do so under control.


Cisco's global cloud projections may blow your mind

Annual global cloud IP traffic is expected to reach 8.6 ZB by the end of 2019, up from 2.1 ZB per year in 2014. In an interesting glimpse into how new technologies are helping drive efficiencies in spite of this massive increase in traffic, networking technologies such as SDN and NFV are expected to streamline data center traffic flows such that the traffic volumes reaching the highest tier (core) of the data center may fall below 10.4 ZB per year, and lower data center tiers could carry over 40 ZB of traffic per year. In terms of how this traffic looks on a regional basis, perhaps unsurprisingly North America will have the highest cloud traffic volume (3.6 ZB) by 2019, followed by Asia Pacific (2.3 ZB) and Western Europe (1.5 ZB). North America will also have the highest data center traffic volume (4.5 ZB) by 2019, followed by Asia Pacific (2.7 ZB) and Western Europe (1.8 ZB).


What makes a great company? Let’s talk information flow

The flow of information between employees is important across all levels and titles. Too often, executive teams hold intelligence close to the chest for fear of having competitive knowledge or financial earnings exposed outside the company. We want to lead by example, and transparency and trust are huge components of that. With that goal in mind, we host a Datameer Radio session each month so everyone can get an update on the company and participate in a candid Q&A with the executive team. We’ve found not only that our employees respect the confidentiality of the information that is shared, but also that knowing what is going on strengthens their commitment to helping us grow. It’s clear that the workplace is in need of disruption, with new models of motivation to drive inspiration and enhance well-being.


Internet Of Things Extends Business' Ability To Sense And Respond

Call it fallout from the Google effect, he explains. “One of the things that Google and search has done for us is it has infinitely expanded the capacity of the human memory,” says Hoover. “I don’t have to memorize all the facts in the world. I can go out and look them up if I want to learn about reinforcement learning. It’s expanded my brain, my memory, to nearly infinite capacity.” By analogy, the billions of sensors across the planet are expanding our awareness of our surroundings. The Internet of Things is “about Googling reality,” Hoover explains. “I see things, I hear things, I sense the world around me. To sense something at the time it occurs, it no longer has to be near my body. I want to understand the state of pollution in Beijing; I go and find it on the internet.”


Dutch government says no to 'encryption backdoors'

The Netherlands began reviewing its policies after the recent Paris terrorist attacks, but this week it said "restrictive" measures would put citizens at risk. Encryption is a way of protecting communications or data so that they are incomprehensible without the correct passcode or key. Advocates say it protects users by preventing criminals and spies from prying into private conversations. But security agencies say they have struggled to bypass encrypted messaging platforms used by groups such as the so-called Islamic State to plan attacks. "We are not some kind of maniacs who are ideologues against encryption," FBI director James Comey said in November. "But we have a problem that encryption is crashing into public safety and we have to figure out, as people who care about both, how to resolve it."
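The definition above, data that is incomprehensible without the correct key, can be illustrated with the simplest scheme there is, a one-time pad: XOR the message with a random key of the same length. This is a toy illustration only; real systems use vetted ciphers such as AES, never hand-rolled code:

```python
import secrets

def otp_xor(data, key):
    # XOR each byte with the corresponding key byte; applying the
    # same key a second time restores the original bytes.
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # random key, same length as message
ciphertext = otp_xor(message, key)       # unreadable without the key
recovered = otp_xor(ciphertext, key)
assert recovered == message
```

Without `key`, the ciphertext is indistinguishable from random noise, which is exactly the property that frustrates eavesdroppers, criminal or governmental alike.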


Big Data Security and Compliance Issues in the Cloud

It’s a quandary. Businesses want to be able to conduct deep, flexible analytics on complete data sets. That’s the essence of big data. You don’t want to omit any data that might contribute to finding business-facing insights. You want the cloud for flexibility and economics. But, you also don’t want to run afoul of compliance regimens or increase your exposure to security risks. What can you do? Don’t worry. As I said, it can be worked out. Getting on top of public cloud big data security and compliance challenges takes effort on two fronts. First, there has to be a coherent, disciplined set of data governance policies at work in the cloud. Platforms also matter. The two work together, with the platform enabling the definition and enforcement of governance policies.


Symantec Adds Deep Learning to Anti-Malware Tools to Detect Zero-Days

Until recently, deep learning was locked away in software development labs. A few companies have realized that they can spot malware by its components and its behavior, ferreting out most zero-day attacks before they have a chance to cause damage. Because of this, deep learning is now being deployed on the cyber-security battleground. ... Symantec has set its sights on bigger goals in the enterprise. The next target will be enterprise email, especially cloud-based email. "We process a lot of the world's email," Gardner said. "A lot of attacks enter the enterprise through email. They're insidious." He said that by attacking company email systems, cyber-criminals are able to seize critical information and, in addition, steal a lot of money through phishing schemes that install malware on company networks.
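The idea of spotting malware "by its components and its behavior" can be sketched as scoring a sample against known-suspicious behaviors. Everything below, the behavior names, weights, and threshold, is invented purely for illustration; the article makes clear that Symantec's actual approach uses deep learning, not a hand-tuned score like this:

```python
# Hypothetical behavioral indicators and weights -- illustration only;
# a real deep-learning detector learns these signals from data.
SUSPICION_WEIGHTS = {
    "writes_to_startup_folder": 0.4,
    "disables_security_tools": 0.5,
    "contacts_unknown_domains": 0.3,
    "encrypts_user_files": 0.6,
}

def suspicion_score(observed_behaviors):
    """Sum the weights of the observed behaviors, capped at 1.0."""
    total = sum(SUSPICION_WEIGHTS.get(b, 0.0) for b in observed_behaviors)
    return min(total, 1.0)

def looks_malicious(observed_behaviors, threshold=0.7):
    return suspicion_score(observed_behaviors) >= threshold
```

The advantage of behavior-based detection over signature matching is precisely what the excerpt describes: a zero-day sample has no known signature, but it still has to act in order to do damage.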


The DIFA Framework for evaluating data science projects

A situation that many CIOs or data science departments face today is that the list of possible analytics projects is seemingly endless. The range of project candidates usually begins with analyzing customers and ends somewhere around utilizing social media data. Needless to say, not all project candidates will make sense from a business and especially an ROI perspective. Also, some projects will run into dead ends because some fundamental pieces turn out to be missing. Even though experimentation and some vagueness about eventual monetary success are normal in data science projects, there are some hard facts that heavily influence whether such a project succeeds. These facts are structured in the DIFA framework, which is explained below.


Tracking Cloud Services: An Essential Security Step

Not knowing who's responsible within an enterprise for managing cloud services contracts could result in the inventorying of cloud services falling through the cracks. "The decentralized procurement model of cloud creates situations where individuals and business units may use a cloud service outside of the purview of the central IT organization," says Jim Reavis, CEO of the Cloud Security Alliance, a not-for-profit that promotes use of cloud security best practices. "Organizations fail to inventory their cloud services and other cloud-accessible devices because they fail to appreciate that cloud computing is not a technology decision," says Kevin Jackson, founder of the cloud computing consultancy GovCloud Network.


How Digital Disruptors use Data Science

Digital disruptors are fast and relentless. They are constantly releasing new functionality. They try things; they experiment. To do this, digital disruptors need a feedback loop. They use the data from their customers’ use of their products to get fast, accurate feedback on how those products are being used. Let’s look at examples of how they do this using data science. On race days, NASCAR analyzes all the tweets and fan-site activity relating to the race. It uses data science to bucket this human interaction data into topics, and for each topic it automatically determines sentiment. It then addresses any concerns through information on its fan site, information at the event, or feeds to its broadcast partner, Fox Broadcasting.
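The NASCAR example, bucketing fan chatter into topics and tagging each topic with sentiment, can be sketched with simple keyword matching. The topics, word lists, and sample tweets below are invented for illustration; a production pipeline would use trained statistical models rather than lookup tables:

```python
# Toy topic bucketing and sentiment tagging via keyword matching.
TOPIC_KEYWORDS = {
    "crash": {"crash", "wreck", "caution"},
    "pit_stops": {"pit", "crew", "tires"},
    "broadcast": {"coverage", "commercials", "announcers"},
}
POSITIVE = {"great", "love", "amazing"}
NEGATIVE = {"terrible", "hate", "boring"}

def bucket_tweet(text):
    """Assign a tweet to the first topic whose keywords it mentions,
    then tag it with a crude sentiment based on opinion words."""
    words = set(text.lower().split())
    topic = next((t for t, kws in TOPIC_KEYWORDS.items() if words & kws), "other")
    if words & POSITIVE:
        sentiment = "positive"
    elif words & NEGATIVE:
        sentiment = "negative"
    else:
        sentiment = "neutral"
    return topic, sentiment
```

Aggregating these (topic, sentiment) pairs over a race day is what lets an organization spot, say, a spike of negative "broadcast" chatter and feed that concern to its broadcast partner.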



Quote for the day:



"Technological change is not additive; it's ecological. A new technology does not merely add something; it changes everything." -- Neil Postman