May 09, 2016

India wants everyone who shares location data to get a license

While the bill states that the security vetting agency will concern itself only with information pertaining to sensitive areas like military bases, it’s clear that this is an over-reaching and poorly thought-out plan to police mapping apps. It’s also likely to do little to stop terrorist attacks. Since the rules in the bill apply only within India and to Indians outside the country, it won’t restrict foreign military forces and terrorists beyond India’s borders from sourcing map data elsewhere. Although Kiren Rijiju, India’s Union Minister of State for Home Affairs, promised that, “We won’t create hurdles for business and technological development,” it’s hard to imagine the bill doing anything but creating logjams for apps and digital projects that require map data in the country, forcing developers and individuals to wade through red tape if it comes into effect.


Mobile backend as a service: Features and deployment options

Organizations that aren't comfortable adopting public cloud services or have a large and growing portfolio of mobile apps can opt to deploy many of these MBaaS products as a private cloud on internally managed systems. We don't think security concerns are a valid excuse for shying away from cloud services, but a traditional on-premises software deployment may be more cost-effective for organizations with a large app development pipeline that are also looking for a fully integrated suite encompassing app design, development, testing, project management and runtime back end. We would still be cautious about this route, since the mobile backend as a service market is extremely dynamic, with new features constantly being added and niche vendors -- like FeedHenry -- being acquired and incorporated into larger cloud portfolios.


3 women who radically changed the course of technology

When we think of innovators in the technology space we largely think of blokes like Mark Zuckerberg, Bill Gates, Steve Jobs, Elon Musk – the zeitgeist is largely male. But that hasn’t always been the case. As a matter of fact, for the first decade or so programming was a ‘pink-collar’ industry: the vast majority of early coders were women. What’s more, it’s an industry created by women. Two centuries ago (when computers were made of flywheels and cogs), Ada Lovelace was the matriarch behind programmable computers. The world’s first developer, who laid down the foundations for the future we’re currently living in, was a woman. Almost a hundred years after she published her seminal documents on programmable computers, Alan Turing used them as inspiration for the modern, electric machines we still use today.


Riverbed SD-WAN design goes beyond optimization

The SteelConnect gateway features a built-in next-generation firewall and unified threat protection. And in future versions available later this year, SteelConnect will support a variety of routing protocols, including Open Shortest Path First and the Border Gateway Protocol. Later this year, Riverbed will add its network visibility and application performance apps to SteelConnect Manager, and it will also add support for third-party applications, giving users access to a diverse array of services. The new switches and APs, meantime, will give Riverbed some additional ammunition as it courts customers ready to upgrade their branch sites with hardware that consolidates functions now spread across myriad devices, IDC's Casemore said.


Protecting the rainforests with IoT and recycled phones

Topher White and David Grenell, the founders of Rainforest Connection, developed a solution that uses the sound of loggers’ tools, such as chainsaws and the trucks that haul away the logs, against them. They install sensors in rainforests, each monitoring the sounds in a square-mile area. Villagers and local authorities are alerted when the sound of chainsaws or trucks is detected. Designing a rugged solution that's also affordable and easy to install is hard. It has to work in extreme humidity, operate in heavy rainfall and be self-powered. Applying some good old-fashioned ingenuity, White and Grenell decided to repurpose some of the more than 150 million phones discarded every year in the U.S. as the sensors for this project. Rainforest Connection adapts old phones, making them waterproof and powered by solar panels.


The Rise of Knowledge Workers Is Accelerating Despite the Threat of Automation

There is no doubt that machines are getting smarter, faster, more powerful and more dexterous—and potentially capable of doing more and more of the tasks that humans do. It’s easy to find warnings of the imminent risk of a jobless future. Most dramatically, a group of researchers at the University of Oxford warned three years ago that technology was on the cusp of destroying nearly 47% of U.S. jobs in coming years. It’s only been three years since that prediction, but so far new knowledge jobs are easily eclipsing the jobs that are disappearing. Even as machines get smarter, many jobs have critical components that are social, emotional, creative or relational. These are overwhelmingly likely to be classified as non-routine types of jobs. The prospect of robots or automation replacing all of them remains remote. In other words, there’s good reason to think knowledge work will continue to grow.


Ransomware Should Haunt You All The Time

Businesses also need to find out where the attackers went within the network to discover where they might have buried malware for use at a later time, he says. Often a ransomware attack is used as a distraction so network security pros don’t notice other types of attacks. One of the best protections against ransomware attacks is effective backup, but it’s not foolproof. For example, if the ransomware is inserted in machines and lies dormant, it can itself be backed up, so machines restored from the backup will still be infected. That’s why forensics are important to determine when and where the malware was placed. And it’s important to reimage machines, not just restore data. “You have to ask: did your backups back up everything? Did they do so recently enough? Do they have integrity?” he says.
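The integrity question at the end can be made concrete. One common approach is a hash manifest recorded at backup time and re-checked before restore; here is a minimal sketch in stdlib Python (the file layout and function names are illustrative, not from any particular backup product):

```python
import hashlib
from pathlib import Path

def manifest(root: Path) -> dict:
    """Map each file under root to the SHA-256 digest of its contents."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def verify(root: Path, expected: dict) -> list:
    """Return the files whose content no longer matches the recorded manifest
    (tampered, corrupted, or missing files all fail the comparison)."""
    current = manifest(root)
    return [name for name, digest in expected.items()
            if current.get(name) != digest]
```

A manifest like this answers "do my backups have integrity?", but note it cannot tell dormant malware from legitimate data: if the malware was present at backup time, its hash simply becomes part of the baseline, which is exactly why the article stresses forensics and reimaging.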


What's Really Happening Inside the Dell, EMC Merger

How will this relationship, the largest-ever union of two IT companies, have to work to be successful? Huge companies with multiple thousands of employees tend to be good at doing what they've always done, but are less successful at recognizing changes in the markets and then turning quickly to satisfy those new developments. This is what small, agile startups are particularly good at doing. New-gen IT buyers in their 20s and 30s, at least those contacted by eWEEK, are shrugging off this deal. They're never going to buy what this new company produces on a wide scale, unless the new Dell Technologies can convince them otherwise—and that will be a tall order. This is a large company that's going to sell to other large companies, and, if it can do that on a global basis, maybe that will be enough to be successful. But the midrange is where the growth in IT will be for the next two decades.


4 Benefits of Switching your Contact Center Agent Software to WebRTC

Contact centers are rapidly changing, moving towards becoming omnichannel machines where customers can skip across channels while the context of their interactions is maintained. That is not always the case in practice; I have had my own share of broken interactions across channels - getting it right isn’t easy. Most enterprises cannot invest in a full-blown transformation of their contact center. That costs too much and comes with great risks (as any IT project does). A different approach is to take baby steps towards a full solution - one in which certain areas of the contact center are modernized and replaced. One such area that is popular for modernization with small and medium contact centers is the agent VoIP client. In a contact center, each agent is assigned a phone. This is how they receive calls.


Securing SSDs with AES Disk Encryption

Encrypted SSDs not only operate at full speed without impacting system performance, but offer a number of advantages over software-based disk encryption. Security-wise, just like any other disk-encryption solution, encrypted SSDs perform transparent, complete encryption of all files including hidden and temporary files that may store sensitive information. However, the cryptographic hardware and encryption key is isolated from the host system, making the encryption process robust against attacks or viruses on the host system. Authentication with encrypted SSDs happens pre-boot. All user space data, including the operating system, is completely inaccessible until the user is authenticated. Sanitizing encrypted SSDs is fast and secure. On the other hand, sanitizing a conventional hard drive or SSD requires overwrite procedures that can take hours or days, or physical destruction that could still leave data on the drive.
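The speed of sanitizing an encrypted SSD comes from cryptographic erase: the drive never overwrites the NAND, it simply destroys or regenerates the media encryption key, instantly rendering all ciphertext unreadable. Here is a toy stdlib illustration of that idea (the keyed stream below is a stand-in for the drive's AES engine, not real AES, and the data is made up):

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy keyed stream cipher: XOR the data with a keystream derived
    from the key. Stands in for the SSD's hardware AES engine."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

key = secrets.token_bytes(32)                    # the media encryption key
stored = keystream_xor(key, b"payroll records")  # only ciphertext hits the NAND
assert keystream_xor(key, stored) == b"payroll records"

# Crypto-erase: replace the key. The old ciphertext is now unrecoverable,
# with no need to overwrite a single block of flash.
key = secrets.token_bytes(32)
assert keystream_xor(key, stored) != b"payroll records"
```

This is why sanitization takes seconds rather than the hours or days of overwrite passes a conventional drive needs: deleting one 32-byte key invalidates terabytes of data at once.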



Quote for the day:


"All our final decisions are made in a state of mind that is not going to last." -- Marcel Proust


May 08, 2016

3 Ways to Use Big Data in FinTech

This type of Application Programming Interface (API) is a way of communicating with an online banking system, where a third party can use the information about a customer stored in the banking system. A client simply logs in to the bank account and the banking API does the rest of the work; it checks the balance of the account or extracts a summary of his transactions over a certain timeframe. This data is then passed on to a third party, a company that is interested in getting this type of data. The best thing about banking APIs is that all of the data is passed on only with the user's consent. If you have not heard about this technology, you should definitely take a closer look at it. In the modern dynamic era, data sharing is essential for progress.
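As a sketch of the flow described above, here is how a third party might construct a consented, token-authorized request for a customer's transaction summary. The endpoint, account ID and token are entirely hypothetical; real banking APIs each define their own URLs, scopes and consent mechanics:

```python
from urllib.request import Request

# Hypothetical base URL, for illustration only.
BASE = "https://api.examplebank.com/v1"

def build_transactions_request(token: str, account_id: str,
                               start: str, end: str) -> Request:
    """Build the authorized request a third party would send to pull a
    customer's transactions for a timeframe, after the customer has
    logged in and consented (yielding the bearer token)."""
    url = (f"{BASE}/accounts/{account_id}/transactions"
           f"?from={start}&to={end}")
    return Request(url, headers={"Authorization": f"Bearer {token}"})

req = build_transactions_request("tok123", "acct-42",
                                 "2016-01-01", "2016-03-31")
```

The key point the article makes is encoded in the bearer token: the third party never sees the customer's banking credentials, only a scoped token issued after the customer consents.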


With Tech Layoffs Projected to Be Deep, Professionals Should Get Ready to Switch Gears

So the question that looms over these layoffs is, what is going to happen to these highly skilled, intellectually capable professionals? A percentage of them will likely find employment in other technology companies. But the numbers are really high, and the industry is probably not going to be able to absorb the entire set. Instead of feeling depressed about the whole thing, I would like to look at the positive side of this phenomenon. These are tech savvy people, many have worked for years and developed experience and savings, and can, or at least a percentage of them can, start new ventures. They don't all have to build billion-dollar companies. But they can focus on a $5 million, $10 million, or $20 million idea and build a $1 million, $2 million, $5 million, or $10 million company over the next phase of their careers.


Taking Agile to Marketing: Process, Teams, and Success Stories

We are well aware that software projects with an agile approach require a cross-functional team to ensure the best output in our deliveries. Similarly, when we adopt the agile paradigm in marketing, it requires us to bring together a team that is strong in all elements, right from strategizing to creating, designing, developing and executing the strategy. While all the conventional organizational silos are still in place, working as they always did, to successfully take up Agile Marketing it is vital that you have a special team in place dedicated entirely to it. A team that spans institutional silos, breaking down the old-school hierarchy model, can create a cross-functional working model that is committed to agile marketing, empowered for efficient decision-making and swift responses.


Cisco patch stops attackers from taking over TelePresence systems

The vulnerability is present in the TelePresence Codec (TC) and Collaboration Endpoint (CE) software. The affected devices are: TelePresence EX Series, TelePresence Integrator C Series, TelePresence MX Series, TelePresence Profile Series, TelePresence SX Series, TelePresence SX Quick Set Series, TelePresence VX Clinical Assistant and TelePresence VX Tactical. Users who can't immediately install the software updates can disable the XML API to mitigate this flaw, but doing so will make it impossible to manage the systems through the Cisco TelePresence Management Suite. Otherwise, a high-severity denial-of-service vulnerability was patched in the Cisco FirePOWER System Software. By exploiting the flaw, attackers could cause an affected system to stop inspecting and processing packets.


The power of platform

Today’s businesses need to bring order to their fragmented operations in a holistic model. Running a digital business requires a platform with a unified console and shared tools, libraries and services, which supports a loosely coupled and dynamic operation without compromising control, performance and scalability. The objective of this shift toward platforms is to allow organizations to concentrate on their business, not on underlying and ever-changing technology. ... Future State presents a universal interface, which abstracts all of the technical complexity of working with NFV and SDN-based services. It provides a declarative, or intent-based, model for rapidly designing and testing new network services, offering them as products in a catalog, and managing the lifecycle service orchestration of the services, including service assurance and disaster recovery.


How Salesforce Does Enterprise Architecture

Business-value led means that the ultimate goal of EA is to enable your business goals and objectives. Whether that is enabling a new business model, a flexible mobility strategy or an omni-channel strategy, EA's number one goal is actually not about technology but about business results. Pragmatic means that EA provides an executable roadmap that takes months, not years. No business has the luxury of waiting for a year or more for systems enablement. In many cases, businesses are already behind the curve in terms of having the systems they need to meet their business requirements. ... MVP or Minimum Viable Product approach means that EA should NOT be filling binders with amazing-looking artifacts. The creation of complex technical documents and diagrams is not a business value add.


Getting started with cloud solutions: What goes where?

Make sure you fully understand company-wide storage and access demands, and factor in how they will change or grow in the future. You should also describe every workflow in your business to identify the candidates for migration. PC Magazine recommends your best bet is to first migrate easily transportable, low risk, and high return workflows, such as email and backup/restore, and then apply any takeaways from this initial migration to the tougher jobs coming later. Don't assume that your current view of each department's operation is up to date. Meet and directly speak with team leaders, drilling down into details about how each team handles its workloads, the sensitivity of data they handle, and who's involved in their collaborative processes. Do assume that connectivity to the public cloud could break, on either your end or the provider's end.


Controversial digital currency in news on creator revelation, host of Indian start-ups

The latest development notwithstanding, there has been a lot of buzz around bitcoins lately, especially in the Indian context. In December last year—exactly two years after it issued a caution against the use of bitcoin—the Reserve Bank of India (RBI) came around to appreciate the strengths of bitcoin’s underlying ‘blockchain’ technology. Blockchain is a digital platform that records all bitcoin transactions ever made in a way that can’t be altered. Around two months before that, in October 2015, Ratan Tata, chairman emeritus of Tata Sons, joined a group of investors, including American Express, to invest in a US-based start-up, Abra. The US-based company works on bitcoin and similar technologies for interchange of currencies. Indian bitcoin start-ups, too, are not very far behind and are seeing considerable funding activity.


Microservice Threading Models and Their Tradeoffs

Paying attention to the threading model is an effective way to focus the architect on considering the trade-offs between efficiency and code complexity. As a service is decomposed into parallel operations with shared resources, the application will become more efficient and its responses will exhibit less latency (within limits, see Amdahl’s Law). However, parallelizing operations and safely sharing resources introduces more complexity into the code. The more complex the code is, the harder it is for engineers to fully comprehend, which means developers are more likely to introduce new bugs with every change. One of the most important responsibilities of the architect is to find a good balance between efficiency and code complexity.
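The Amdahl's Law limit mentioned above is easy to make concrete: if only a fraction of a request handler can be parallelized, overall speedup plateaus no matter how many threads you add. A quick illustration:

```python
def amdahl_speedup(parallel_fraction: float, workers: int) -> float:
    """Maximum speedup when `parallel_fraction` of the work runs
    across `workers` threads and the rest stays serial (Amdahl's Law):
    S = 1 / ((1 - p) + p / n)."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / workers)

# Even with 90% of a handler parallelized, the serial 10% caps the
# achievable speedup at 10x, and returns diminish quickly:
for n in (2, 4, 16, 64):
    print(f"{n:>3} workers: {amdahl_speedup(0.9, n):.2f}x")
```

This is why the article cautions that decomposition pays off only "within limits": past a point, extra threads buy almost no latency while the code-complexity cost keeps growing.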


Introducing the Architecture of the Future: Mapping the Way to IT Success

The question is what “software defined enterprise” (SDE) really means for IT, and how we get there. By way of an answer, our Office of Architecture and Innovation team recently came up with a game plan—appropriately called EMC IT Architecture of the Future. ... While the priorities highlighted in the plan are not new, this is the first time IT has brought them together in a single vision defining how each fits to deliver SDE. The intent is to help clearly communicate IT’s role in making the notion of SDE a reality. The resulting blueprint—which we are promoting throughout our organization with posters, blogs and videos—diagrams how IT will approach the architecture of the future. It is a holistic vision of a software defined enterprise that will deliver value to our customers.



Quote for the day:


"Leaders keep their eyes on the horizon, not just on the bottom line." -- Warren G. Bennis


May 07, 2016

Why Microsoft won't extend the Windows 10 free upgrade offer

If Microsoft hasn't paid a price for instituting the free upgrade, its ecosystem has: Analysts from all corners have attributed some of the continued contraction of PC sales to the free deal. In other words, Microsoft has reasons, one in particular, to end the offer: Doing so will placate its OEM (original equipment manufacturers) partners, the Lenovos, Dells and HPs of the computer industry, who will have a better chance of selling new hardware with the offer abandoned. ... It has, for instance, made a radical change to what hardware it will support in the future, saying in January that only Windows 10, or more specifically, "the latest Windows platform at that time," will support future systems. The policy is a strong signal to OEMs that Microsoft will push new hardware as it markets 10. It also establishes a precedent if, down the line, Microsoft decides to declare Windows 10 unfit for, and thus unsupported on, older hardware.


The Central Bank Heist and COBIT 5® for security

COBIT 5 for security addresses ignorance. It sets out everything you have to do to achieve the relevant security way beyond technical solutions. It provides both high-level and detailed approaches on assessing the business environment, ways to discover your firm’s risk appetite and tolerances, and what to look at to define the security you need. Only then does the guide move on to security implementation and monitoring. Looking at security in context brings out the reason for security investment and makes sense of the effort and expenditure required. ... Getting back to our central banks in Bangladesh and the Philippines, and SWIFT, what lessons are there that need to be applied? For Bangladesh: COBIT 5 cannot cure apathy but can go a long way in identifying the cultural issues the bank needs to overcome. The security guide’s approach exposes vulnerabilities and helps identify what can be done, however small, to begin addressing them.


Banks tie up with startups for novel payment solutions

By opening up their APIs, banks have reduced the time taken for fintech companies to process transactions. Yes Bank opened its API recently to Snapdeal and allowed the ecommerce company to route their 'Refunds' through them. Hence the time required to refund is reduced to an hour from a week.  "Previously we had to make an Excel sheet containing all the data regarding the money that got transferred through us and had to upload it onto the bank's corporate interface, which then would get processed. This would require a lot of manual file sharing. Now with API integration, this process has become automated and fast," said Anish Williams, CEO of Transerve which is a business correspondent to RBL Bank.  While opening up the bank's interface makes processes very easy, it also exposes the bank's platform to malware which could harm the bank's database.


Australian Treasury department calls for public comments to help classify digital currency

The first solution is an “Input taxed treatment,” which removes the taxable event from the acquisition of digital currencies. A consumer who buys a few dollars’ worth of an unspecified digital currency won’t be taxed on the transaction, but when he or she goes to buy a cup of coffee with that digital money, the coffee store charges normal sales tax. A second solution is to remove all special designation and taxation from digital currencies, and to simply re-label them “money,” alongside items in the current definition of money, “including Australian or foreign currency, promissory notes, bills of exchange and money orders.” This option would give bitcoin more credibility than before, but instantly subject bitcoin to all of the existing regulations that come with money, including taxation on the foreign exchange transactions required to buy them.


Will Digitization Eliminate Jobs or Redefine Them?

The answer depends on how companies are digitized. If you look at technology simply as a way to reduce costs at the expense of your most important asset—people—job losses will mount. If, however, you approach digitization correctly by first reimagining work—understanding which new roles and skills will be needed in a digital world—the future of jobs is bright. ... The research shows how empowering people with “digital accelerators” that combine business process change and technology creates the three capabilities of digital business agility—hyperawareness, informed decision-making, and fast execution. Digitization is not just about deploying technology to cut costs. The real goal is to drive business growth through innovation. That means reimagining work and empowering people in a digital context and creating a more efficient and fulfilling work experience along the way.


Who will manage IoT in the enterprise?

Most IT departments throughout enterprises are about to relinquish control of the Internet of Things, a research firm says. IoT will not generally be managed by IT, reckons Bob O’Donnell. His company, TECHnalysis Research, recently completed an online study about which department will be running IoT within organizations. Surprisingly, operations, facilities and manufacturing was the principal selection, the researcher found (with 42 percent). It will be the “most common department to be responsible for IoT projects,” O’Donnell says. IT came in second, with 33 percent, and line of business and business strategy groups followed in third position at 24 percent. Line of business can mean a few different things in corporate-ese. They include computer applications used in an enterprise, general products offered or a general corporate division. In any case, it isn’t IT.


Digital strategies must be able to adapt to changes in cloud services

Venters said that enterprises can respond by understanding the direction and business models in which cloud vendors are heading and creating a business strategy in response. This can evolve with them and if necessary pivot in the opposite direction if the objectives of both do not align to the benefit of each other. “When we buy a service [from a vendor] we need to understand their business model and how it might adapt in the future, and then adapt our business model to align with it,” he explained. Such a strategy means that all IT procurement should be seen as acquiring sustainable services, not products, that operate in tandem with a company’s business strategy. Effectively, each purchase needs to work in the context of the business and the wider cloud ecosystem to which they connect.


Why We Should Not Jailbreak Our Devices

Jailbreaking in iOS is the process of gaining unauthorized access or elevated privileges on a system. It basically modifies the iOS kernel and allows file system read and write access to an application. Most of the jailbreaking tools apply some kernel patches to the iOS kernel and make some unauthorized changes to the kernel to remove the limitation and security features built by the manufacturer. And, this allows the users to install additional third party applications, extensions and patches from outside Apple's App Store. ...  Attackers can easily insert malicious files into or extract sensitive file from a jailbroken device. In fact, this vulnerability is widely used by a number of commonly known malware programs. Attackers can use keyloggers or other malware programs to steal sensitive data from a jailbroken device.


Qualcomm Flaw Puts Millions of Android Devices at Risk

Devices running Android KitKat (4.4) and later are affected less than older devices because they come with the Security Enhancements for Android (SEAndroid) mechanism enabled in enforcing mode by default. This makes stealing other apps' data through this flaw impossible. On these newer Android versions, "the 'netd' context that the '/system/bin/radish' executable runs as does not have the ability to interact with other 'radio' user application data, has limited filesystem write capabilities and is typically limited in terms of application interactions," Valletta said. However, a malicious application could still use the flaw to modify system properties, he said. "The impact here depends entirely on how the OEM is using the system property subsystem."


We're still living in the dark ages of cyber security

The big problem we face today is that most of the software that runs our massive IT ecosystem is vulnerable to cyberattacks. And there’s nowhere to hide: If you’re not a hermit, it is virtually impossible today not to be exposed to information technologies. Digital equipment, devices and gadgets are all around us. An average household in the modern world already has several networked devices, and there are predictions that soon it will own hundreds of them. And there’s probably not a single factory today - no matter what industry - that’s not using some sort of computerised industrial control systems. The big problem is that we’re using computers and various devices that were never designed to withstand an attack by a highly qualified threat actor. However, our infrastructure is becoming increasingly ‘cyber-physical’, while being run by the same vulnerable software.



Quote for the day:


“Great leaders don't need to act tough. Their confidence and humility serve to underscore...” -- Simon Sinek


May 05, 2016

8 Challenges Affecting Software Project Management

The software industry is highly complex, requiring workers with both industry-specific skills as well as the requisite software development expertise. ... The software industry is also one of the fastest moving and evolving industries, creating an environment where companies can go under as fast as they started, due to domestic and international competition. Business owners, executives, middle management and all other employees working in this field are continually pressured to keep up, and project management professionals are under even more pressure to ensure the successful execution of projects. It’s not enough to only know about project management. Project managers must also keep pace with this fast-moving industry in order to anticipate possible risk, quality, integration, financial and other factors that may hinder the chances of a successful project.


The quest to find Satoshi Nakamoto continues

Another Satoshi has bitten the dust. On May 2nd Craig Wright, an Australian entrepreneur, published on his blog what he claimed was proof that he is Satoshi Nakamoto, the mysterious creator of bitcoin, a cryptocurrency. Within 90 minutes the post had been debunked on Reddit, an online forum. He then said that he would present “extraordinary proof” that he is indeed Mr Nakamoto by moving some bitcoin from accounts thought to be under control of the currency’s creator. But on May 5th he wrote on his blog that he did not have the strength to continue trying to prove his identity, prompting most to add his name to the long list of false leads in the hunt for Mr Nakamoto.


How to use advanced analytics to mitigate EHR data risks

"One of the analytics that we recommend is an analytic on all your expectations once you do go-live," Hoover says. "You want to look at what was happening three months before and in the same months a year before. You're dealing with materiality issues. It's the sum of the small parts that all of a sudden equal huge dollars." For instance, Hoover notes that when physicians make their rounds, there's a certain charge that should be associated with those rounds. Hundreds of physicians may be going through that process on a daily basis. If the physicians don't record those charges properly, or the system wasn't designed properly, that can add up to millions of dollars of revenue leakage in a few months.
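The before-and-after analytic Hoover recommends can be sketched simply: compare each post-go-live month's charge totals against baseline months (the months before go-live and the same months a year earlier) and flag material drops. A minimal illustration, where the threshold and dollar figures are hypothetical:

```python
def flag_leakage(current: float, baselines: list,
                 threshold: float = 0.10) -> bool:
    """Flag a post-go-live month whose charge total has fallen more than
    `threshold` below the average of the baseline months, e.g. the three
    months before go-live and the same month a year before."""
    expected = sum(baselines) / len(baselines)
    return current < expected * (1.0 - threshold)

# Rounding charges averaging ~$120k/month that quietly fall to $95k
# after go-live get flagged; normal variation does not.
print(flag_leakage(95_000, [118_000, 121_000, 122_000]))    # material drop
print(flag_leakage(117_000, [118_000, 121_000, 122_000]))   # within tolerance
```

Run per charge category rather than in aggregate, this is how "the sum of the small parts" becomes visible before it equals huge dollars.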


India emerging as fastest-growing market for fintech software products

"While core banking, insurance, risk management, and point of sales solutions were first-generation products, the industry has undergone rapid evolution in terms of product offerings, with added focus on customer experience, driven by the advent of mobile and analytics technology," he said. In spite of these innovations, banks were facing challenges including complying with certain regulations such as the Foreign Account Tax Compliance Act (FATCA) and Anti-Money Laundering (AML), as well as the lack of automation and integration across systems -- hence these institutions have to resort to technology to deal with the complexities in compliances and procedures, the report said. Obsolete regulatory information and the lack of automation and integration across systems pose further challenges for banks.


Fintech: The New World Order

Fintech is not new, but it has been given a facelift. Most would say that financial technology (or FinTech) has been around for a long time, and they’re correct. It isn’t new per se, but it is evolving faster now than ever, and changing how business is done. What makes Fintech so disruptive is that it’s affecting all the institutional pillars in one strike. These pillars (Banking, Capital Markets, Private Equity, Insurance, Legal, Regulatory) are long-standing institutions in our business society that had been static, and stagnant, for too long. Like many new sectors, in order to make sense of it and what it’s doing, you need to break it down into all the component parts to see how each affects what we’re doing or working on. Fintech is a financial revolution, or as many call it, an EVOLUTION.


Siri Creators To Unveil AI Personal Assistant Viv

When Viv, a new form of artificial intelligence, launches on May 9, the term "BattleBots" will take on an entirely new meaning. Viv comes from the people who created Siri, but Viv is far more powerful and better able to hold a conversation, they claim. Viv, along with Siri, Amazon's Alexa, Microsoft's Cortana, Google Now, and others, will be vying to fulfill your spoken commands. Viv is a start-up that's ready to come out of stealth mode, according to The Washington Post. Viv was developed by Dag Kittlaus and Adam Cheyer, who also developed Siri. Apple bought Siri in 2010, but Kittlaus and Cheyer were disappointed with how Siri was implemented on the iPhone and decided to strike out on their own.


Cloud consequences: JP Morgan calls time on IT-as-usual

Another obvious implication for enterprise IT groups is that demand for infrastructure-oriented skills is going to shrink, with a concomitant increase in demand for application-oriented skills. It’s not going to be a great time to have a skill set focused on data center storage or compute. Employees will need to retool their job focus toward applications. Employers will struggle to obtain enough talent. This will be exacerbated by the pressure on IT from peer business units and senior executives to deliver digital applications more quickly. And both sides will soon come to recognize that building cloud-native applications requires skills quite distinct from those needed for traditional three-tier applications.


How a tiny fishing village became the gadget factory of the world

Shenzhen is best known for being home to giant manufacturers like Foxconn, but it also houses hundreds of smaller factories that create everything from individual gadget components to finished devices. While the city has been anonymously churning out electronics for decades, it is now changing again, becoming notable both as a hub for hardware startups and as home to some of the largest Chinese tech companies that yearn to be bigger players on the world stage. As such, more than any other Chinese city, it embodies many of the issues that surface when discussing the country's increasing role in the technology industry, both as a gigantic market viewed hungrily by western tech companies and as an increasingly confident creator of its own hardware and software.


Locating Common Micro Service Performance Anti-Patterns

Most teams don’t implement their own frameworks, but rely on existing popular frameworks, such as MVC and REST. This is a good thing, as we should not reinvent the wheel every time we build a new app. But a common syndrome occurs when a new project starts with a small prototype based on a sample downloaded from a public code repository like GitHub; the prototype evolves and morphs into a full application, and the team neglects to invest the requisite retrospection to evaluate whether the chosen frameworks were the best for the job and were properly tuned. My tip is to spend time understanding the internals of your selected frameworks and learning how best to configure and optimize them for throughput and performance. If you don’t, be prepared to end up in the situation described above; I see it every day!


To Go From IT to IoT, Build on Your Skills

“There will certainly be some new job titles and new roles, but we will probably also see a lot of expansion of existing roles and responsibilities,” said Tim Herbert, senior VP for research and market intelligence at the Computing Technology Industry Association (CompTIA), an IT industry group. For example, if an oil company decides to deploy sensors on a pipeline, the network administrator may find herself learning how to connect new endpoints that have a bare-bones embedded OS, only battery power, and no human user. That’s likely to call for new skills. ... A good place to start is to get onto an internal project team or “center of excellence” where a core group of employees shape the organization’s use of IoT, Geschickter said. These teams will be looking for well-rounded technical people, so it may be time to study up a little.



Quote for the day:


"You can't have satisfied customers if you have dissatisfied employees." -- Shawn Murphy


May 04, 2016

An Introduction to Bitcoin and Blockchain Technology

Unlike traditional computer networks and payment systems, Bitcoin is not administered by any centralized authority or controlled by any rights holder. Instead, it was introduced to the world as an open source project. It may be utilized by any person, without fee, by downloading Bitcoin software and accessing the peer-to-peer network. These users collectively provide the infrastructure and computing power that processes and verifies transactions and information posted through that network and recorded on its decentralized ledger. A group of computer scientists and programmers volunteer their time toward upgrading and improving the Bitcoin software code, primarily through an open repository on the GitHub website.


Proactive DDoS Mitigation in Today's Threat Landscape

Cybersecurity threats have become more persistent and damaging than ever. Despite increased awareness of the network threat landscape, many organizations struggle to protect themselves against DDoS attacks. Download the eBook to learn about the latest cybersecurity threats and DDoS mitigation best practices: the current threat environment and what it means for your organization; how moving toward a more holistic, proactive approach can help ensure the availability and security of your business; and why over-provisioning bandwidth to absorb large attacks is an ineffective strategy.


U.K. Considering Government Applications of Blockchain Technology

“Blockchains ‒ distributed ledgers, shared ledgers ‒ are digital tools for building trust in data,” explained Hancock. “Rather than a single central authority demanding trust and declaring: ‘I say this data is correct,’ you have the distributed consensus of everyone in the chain, saying in unison: ‘we agree that this data is correct.’” Hancock added that data held in the blockchain comes with its own history, and that history is a fundamental part of proving its integrity. “This fact is enormously powerful,” he said. While cautioning against considering the emerging blockchain technology as a generally applicable solution for all problems, Hancock emphasized that the technology offers efficient solutions for important use cases.


Why Fast Flux Networks

The basic idea behind a Fast Flux Network is to associate multiple IP addresses with a malicious domain name. These IP addresses are swapped in and out with extremely high frequency, perhaps every three minutes, by changing DNS records. As a result, a browser connecting to the same malicious website every three minutes will see a different IP address each time and connect to the actual malicious website via different infected computers every time. ... Fast Flux motherships are the main controlling elements behind the front-end servers. They are similar to Command & Control (C&C) servers, though they have many more features than C&C servers. The mothership node is hidden by the front-end servers, which makes it extremely difficult to track down.
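The rotation described above can be sketched in a few lines of Python. This is a simplified illustration only: the pool size, record count, and TTL are invented for the example, and the addresses come from the RFC 5737 documentation range.

```python
import random

# Hypothetical fast-flux behaviour: the authoritative DNS server for the
# malicious domain hands out a different, short-lived set of A records
# (infected front-end hosts) on every query.
BOT_POOL = [f"203.0.113.{i}" for i in range(1, 51)]  # RFC 5737 test addresses

def flux_answer(pool, n_records=5, ttl_seconds=180):
    """Return a fresh DNS 'answer': a few random proxy IPs with a tiny TTL."""
    return {"A": random.sample(pool, n_records), "TTL": ttl_seconds}

# Two consecutive lookups rarely see the same front-end hosts, so blocking
# any single IP barely dents the network; only the TTL expiry (~3 minutes)
# forces the victim's resolver to fetch the next rotation.
first = flux_answer(BOT_POOL)
second = flux_answer(BOT_POOL)
print(first["TTL"])     # 180 -> the answer expires in about three minutes
print(len(first["A"]))  # 5 records per answer
```

Because the churn happens purely in DNS data, taking down the network requires going after the domain registration or the hidden mothership, not the disposable front ends.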


Educating Regulators a Priority, Say Blockchain Policy Experts

To help them better understand blockchain, Edge said, governmental organizations and companies should create sandbox environments in which they can safely experiment, then invite younger members to take leadership positions exploring the technology. "You’ll end up with an army of young people," he said, adding: "Before we can do those conversations, we have to do what Jamie’s talking about." Jonathan Levin, founder and CEO of Chainalysis, had his own strategy for doing exactly that – teaching policymakers at any organization, including large corporations, about blockchain. Instead of pursuing an “army of young people,” Levin advocated that the organization identify a single person with at least some bitcoin or blockchain knowledge and offer them a position as a specialist.


How secure are cloud storage services?

Truth be told, cloud services are not as insecure as the occasionally screaming headlines make out. In fact, there's much to be said for the argument that Dropbox, iCloud, Google Drive and OneDrive have both the money and the motivation to make their data stores far more secure than you could hope to achieve yourself on your meagre budget. So let's take a look at these four services, but first we need to get a few things out of the way. This isn't a review of these services; if you want to know the ins and outs of the cloud services, take a look at our article ... Because Google Drive uses the same Google account for login as Gmail, the danger was that everything was compromised as a result. It turned out, however, that the dump was of old phished passwords, and at most 2% may have worked - but all were reset by Google anyway.


Businesses must address digital transformation security risks, says analyst

“When organisations look at digital transformation, they need to restructure so that information security is responsible for the security of everything, including the internet of things (IoT), the organisation’s operational technology (OT) and the business,” said Kuppinger. “The execution of information security needs to move to where organisations use IT, meaning IT departments will have to become more decentralised and services-based, while information security is independently responsible for security governance across everything,” he said. ... “Identity and access management is increasingly about … managing identities of everyone and everything in a connected world, and supporting organisations in their governance, risk and compliance [GRC] initiatives,” he said.


5 secure habits of the paranoid PC user

It makes much more sense to access the Internet through a virtual private network in which all outgoing and incoming traffic is funneled through an encrypted channel to a trusted Internet gateway. Another advantage of this strategy is how it masks your current IP address, which should further reduce opportunities for phishing. Fortunately, commercial VPN offerings such as VyprVPN and PureVPN abound for individuals and small businesses, and are typically priced at between $5 and $10 per month. Almost all of these services provide their own VPN client to log in to the correct servers with minimum configuration required. Affordability aside, some considerations when choosing a suitable VPN service include its performance in the region where you live or travel to, the number of simultaneous client devices it supports, the platforms it supports and its reliability.


Analytics-driven SDN and NFVi, CORD and M-CORD

The advent of new virtualized frameworks (cloud, NFV, and SDN) has changed telco hardware strategy to embrace open control interfaces and programmable data path elements. The cloud has given rise to commoditization of infrastructure using merchant silicon and commercial off-the-shelf hardware that are managed and orchestrated with software. NFV and SDN push the scope of software even further toward the abstraction, programming, manipulation and disaggregation of network functions. “Whitebox switching” has emerged in networking as a bare-metal switch that runs a hardware-agnostic network operating system and is managed by a decoupled centralized controller. The cloud, NFV and SDN have shifted the industry’s attention from high-performance “closed” purpose-built systems to “open” programmable hardware and flexible software architecture as the differentiator.


Unfreezing an Organization

Scaling a piece of software without forethought, or with an unsuitable architecture, resulted in many of the enterprise systems that most of our companies run on: expensive to maintain, fragile, hard to replace and loaded with decades of technical debt. Scaling an organization without forethought, or with an unsuitable assembly-line process, resulted in the organizations that run our companies: expensive, fragile, slow, hard to replace, and loaded with decades of organizational debt. Rewriting a 20-year-old antiquated system that processes a bank's payments and transforming an organization share the same characteristics: it takes much longer than originally promised, it’s incredibly difficult, and most senior leaders will shy away from it. At the same time, in a world of Uber, Wealthfront, and fintech incubators, most organizations have no choice. The brave few are doing something about it.



Quote for the day:


"Designing your product for monetization first & people second will probably leave you with neither." -- Tara Hunt


May 03, 2016

Now or Never: The Ultimate Strategy for Handling Defects

Let’s stop using the backlog as a trash can. Having a longer queue of issues will increase the average lead time of our system. We could treat the backlog as more than a “first in/first out” queue and manage it accordingly, but managing a bug log demands time and energy. In my experience, the benefits of these activities with long bug logs are overrated. Just stop doing it. If a bug is critical enough and we haven’t fixed it, it will remind us about itself — don’t worry about that. Just recently, one of my teams had such a case. They knew about the problem, which appeared rarely and in unpredictable situations. After a quick analysis, the team decided it was not important enough (below the line) because of its infrequency and closed the issue. However, the bug reappeared within several weeks under different conditions.


Will Fintech Destroy The Banks?

Whether the conversation starts with a vague reference to bitcoin, blockchain or crowd-funding, we're increasingly hearing from clients who are curious or worried about the implications of "Fintech" on their business or investment portfolio, particularly as it pertains to the banks.  Although wrapped in jargon and buzz words, "fintech" or financial technology, is simply the application of technology to improve the efficiency or delivery of financial services, at scale. Put this way, the concept goes from complex and obscure to obvious and unavoidable. There are, however, different themes to the innovation that could be very disruptive for both financial institutions and their customers:


I'm Calling it: Social Networking is Over

Confusion about the difference between social networking and social media is why most people haven't noticed the decline of social networking. People don't stop to think about the difference. Social networking is personal content. Social media is professional content. The sharing of social media -- professionally produced videos, articles, podcasts and photos -- is gradually replacing the sharing of personal content about one's life. For example, as you read my column, this article is being shared on Facebook, Twitter, Google+ and other so-called "social networking" sites. But that isn't social networking; it's social media. Micro-blogging, micro-schmogging. No matter what you call it, Twitter is included in every roundup, comparison or article about social networking. It's universally included in the "social network" category.


Strengthening authentication through big data

The idea is to unobtrusively gather information from several sources, including user behavior and device usage, to create a profile that is unique to the account owner and cannot be stolen or replicated by fraudulent users. The next step would be to use the profile to detect activities that hint at malicious behavior, and only then initiate extra authentication steps to make sure the account hasn’t been hijacked or compromised. This model has many strengths. It’s not something you can lose, as physical tokens are; it doesn’t require extra memorization effort; it can’t be stolen or replicated, as passcodes, and even fingerprint and retina scans, can; and, above all, it’s not cumbersome and doesn’t introduce extra complexity to the user experience.
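As a rough illustration of that model, the sketch below combines a few profile deviations into a risk score and only triggers step-up authentication when the score crosses a threshold. All signal names, weights, and thresholds here are invented for the example; a real system would use many more signals and learned weights.

```python
# Toy behavioural-risk model: score a login session against the account
# owner's profile, and only demand extra authentication when it deviates.
def risk_score(session, profile):
    score = 0.0
    if session["device_id"] not in profile["known_devices"]:
        score += 0.4  # unfamiliar hardware
    if session["country"] != profile["usual_country"]:
        score += 0.3  # unusual geolocation
    # Typing cadence: relative distance from the owner's mean inter-key delay
    delta = abs(session["keystroke_ms"] - profile["keystroke_ms"])
    score += min(delta / profile["keystroke_ms"], 1.0) * 0.3
    return score

def needs_step_up(session, profile, threshold=0.5):
    return risk_score(session, profile) >= threshold

profile = {"known_devices": {"dev-1"}, "usual_country": "US", "keystroke_ms": 120}
ok = {"device_id": "dev-1", "country": "US", "keystroke_ms": 125}
odd = {"device_id": "dev-9", "country": "RO", "keystroke_ms": 60}
print(needs_step_up(ok, profile))   # False -> frictionless login
print(needs_step_up(odd, profile))  # True  -> extra authentication required
```

The point of the design is visible in the two calls: the matching session sails through with no extra friction, while the anomalous one, and only that one, pays the cost of additional verification.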


Forensics expert fights crime with digital weapons

As the volume of electronic data grows exponentially and the number and type of devices "owned" by people — such as smartphones and tablets — increases, the need to identify, collect, consolidate, filter and analyse relevant data, compounded by "peripheral data" such as CCTV footage, physical access control logs, satnav or computer log files, becomes even more important. Historically, during an investigation, numerous techniques and tools would be used to attempt to piece together the various pieces of the puzzle, especially around chronology: for example, trying to link a call on a mobile phone with a person having just entered a secure office, an unauthorised log-on to a computer, and the copying of files to a remote device.


The Rise of Threat Intelligence Gateways

Why is threat intelligence gaining momentum? Security professionals know that since they can’t block every conceivable cyber-attack, they need to collect, process, and analyze all types of internal and external security data to improve their incident detection and response capabilities. Many also want to use threat intelligence more proactively for threat prevention. In fact, 36% of enterprise cybersecurity professionals say that their organizations intend to use threat intelligence feeds to automate remediation actions over the next 24 months. ... When threat intelligence points to a bad IP address, URL, or DNS lookup, why not simply block it from the get-go? Unfortunately, this hasn’t always been easy in the past, as it involved normalizing disparate threat intelligence feeds, building custom dashboards and rule sets, integrating various network security devices, and so on.
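The normalization chore mentioned above can be pictured with a toy example: two hypothetical feeds in different shapes (a plain-text IP list and a JSON-style structured feed, both invented here) merged into one indicator set that a gateway could consult before allowing traffic.

```python
# Two made-up threat feeds in different formats, with one overlapping entry.
feed_a = "198.51.100.7\n198.51.100.9\n"                 # plain-text IP list
feed_b = [{"type": "domain", "value": "evil.example"},  # structured feed
          {"type": "ip", "value": "198.51.100.9"}]      # overlaps feed_a

def normalize(plain_text_feed, structured_feed):
    """Flatten both feed formats into a single set of indicators."""
    indicators = {line.strip() for line in plain_text_feed.splitlines()
                  if line.strip()}
    indicators |= {entry["value"] for entry in structured_feed}
    return indicators

BLOCKLIST = normalize(feed_a, feed_b)

def allow(indicator):
    """Gateway decision: block any IP or domain seen in any feed."""
    return indicator not in BLOCKLIST

print(allow("198.51.100.9"))  # False: present in both feeds, so blocked
print(allow("good.example"))  # True: not on any feed
```

A real threat intelligence gateway does this at line rate across millions of indicators and dozens of feed schemas, which is exactly why it was historically painful to build in-house.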


Is There Really Such a Thing as a “Hybrid Agile” method?

Are there projects that don’t require business cases and annual budget planning? Probably. But not many in larger organisations. So finding a way to make the existing waterfall processes leaner will enable us to shift from “Hybrid Agile” to “real Agile”. The go-live preparation is different. I think there are many technologies for which we already have good answers that allow us to go live as required using Continuous Delivery practices. For other technologies – COTS comes to mind – we will likely continue to see some waterfall validation and testing practices used before we can go live, but as the technologies and tools evolve, this final phase will become shorter and shorter until it disappears.


IT leaders pick productivity over security

Security is top of mind for IT leaders, especially as hacks become more frequent, sophisticated and malicious, but the report also uncovered some shocking truths about cybersecurity in the enterprise. The report showed major flaws in how businesses and IT leaders approach security, and it boils down to a lack of communication between the C-suite and IT leaders, as well as a general frustration with how security slows down overall productivity in the company. But just because security might bog down productivity, or IT leaders and executives suffer from a lack of communication, businesses need to remain vigilant regarding security. Jack Danahy, CTO and co-founder of Barkly, says efficiency should be redefined. "Good security does not bog down efficiency.


The race to create smart homes is on

Two things are needed to make homes truly “smart.” First are sensors, actuators and appliances that obey commands and provide status information. There are already hundreds if not thousands of smart home products on the market. These have evolved in recent years beyond simple door sensors and light switches to smart thermostats such as Nest and voice command devices such as the Amazon Echo. Second are protocols and tools that enable all of these devices, regardless of vendor, to communicate with each other. However, this is a major undertaking and it won’t happen overnight. In the meantime, smartphone apps, communication hubs and cloud-based services are enabling practical solutions that can be implemented right now.


Enterprise UX: Past, present and future

Looking ahead, there are multiple technology developments underway that will affect how user experiences -- for both consumers and business users -- are created. Wearables, IoT devices, virtual and augmented reality, and increasingly sophisticated artificial intelligence (AI) will all profoundly change the way humans and computers interact with one another, and with the world around them. Gesture and voice control, for example, are set to play increasingly important roles. The emerging umbrella term for where all this is heading is the 'post-app' world of pervasive computing, where desktop WIMP and mobile touch-driven interfaces are augmented or superseded by more 'natural' methods of user interaction.



Quote for the day:


"I never look at the glass as half empty or half full. I look to see who is pouring the water and deal with them." -- Mark Cuban


May 02, 2016

The expanding landscape of exploit kits

“If you have systems and files being encrypted, or a file share becomes encrypted, that’s a huge impact. Dozens of hospitals have been attacked recently, and for some it has taken days to recover. That means massive downtime, rescheduling major surgeries. It’s literally putting lives at risk,” Williams said. Through their networks on the dark web, nefarious actors are informed when new exploits are seen in the wild, making them aware of even zero-day vulnerabilities before the general public. Leonard said, “Under responsible disclosure, a researcher will identify the use of a brand new exploit script to a vendor. The vendor then releases a patch that can be applied to the business.” Businesses, though, struggle to apply those patches expeditiously. The level of sophistication and the relative ease with which criminals can access exploit kits compromises business operations and has security teams on overdrive trying to expedite the patching process.


Ways to craft a better enterprise IT security roadmap

You need to be able to detect those threats and attacks. And detecting a threat, a vulnerability and an attack are three separate things, and that's important to understand. Lots of companies sell you vulnerability detection. Vulnerability detection is basically like telling you which doors you have unlocked. Attack detection is telling you when the burglar is coming through your door. And threat detection is, "Hey, the burglar has been seen on your street with a big bag of loot and he's heading for your house." So those are three separate things and, ideally, you want to know all three things. And that distinction is important because sometimes people say, "Well, I do vulnerability scanning so I'm covered." No, that just tells you which doors are unlocked. Maybe the burglars are getting smart enough to come in through the chimney.


Unified Storage That Can Sync and Share

Many siloed storage, data management, file sync and share, and security solutions exist to provide for these individual requirements, but they are typically cobbled together in costly, inefficient and unreliable ways. Nexsan UNITY addresses all of these requirements in a single unified solution which delivers high performance and multi-site collaboration at LAN speed to support business continuity and disaster recovery processes as well as mobile access to primary storage data. UNITY's patented technology is designed to support all devices – from mobile devices and tablets to laptops and desktops running Android, iOS, Mac and Windows – and provides a secure connection to data stored and managed within the enterprise, totally eliminating the drudgery of using unpopular and aging VPN technologies.


Mobile Banking Trends Will Lead Change in Banking in 2016

Mobile banking offers many advantages: Users can authenticate their identity and open new accounts, sign up for direct deposit, pay bills, take out loans, and deposit checks by photographing them, all from their mobile devices. Mobile apps such as Venmo let users make and share payments instantly, and Quicken Loans’ Rocket Mortgage even offers a mortgage approval in eight minutes through a process that automatically collects pay and credit information and requires minimal typing by the user, letting them sign their name right from their mobile device. In the U.S., Simple and Moven are the leaders in developing banking apps that allow people to pay by mobile, track their expenditures, and save for future goals in electronic envelopes — whether it’s for large expenses such as vacations or a down payment on a house, or for smaller things like a tattoo or a bike tune up.


What's Wrong with Open Data Sites--and How We Can Fix Them

The second non-obvious design problem, which is probably the most important, is that most open data sites bury data in what is known as the deep web. The deep web is the fraction of the Internet that is not accessible to search engines, or that cannot be indexed properly. The surface of the web is made of text, pictures, and video, which search engines know how to index. But search engines are not good at knowing that the number that you are searching for is hidden in row 17,354 of a comma separated file that is inside a zip file linked in a poorly described page of an open data site. In some cases, pressing a radio button and selecting options from a number of dropdown menus can get you the desired number, but this does not help search engines either, because crawlers cannot explore dropdown menus.


Here's why analytics is eating the supply chain

This is not to say that supply-chain professionals are newcomers to the world of analytics. On the contrary: Demand forecasting, for example, has "been around forever" and relied heavily on data, said Paul Myerson, an author and professor of practice in supply chain management at Lehigh University. What's new today are the tools. "Today we have very visual tools that are much quicker to run," Myerson explained. "What used to take overnight can now be done in minutes." New software is also enabling more collaboration among partners, including key customers and suppliers. Point-of-sale data provides better insight for everyone involved, leading to better forecasting decisions. "It's about agreeing on forecasts and collaborating on inventory throughout the supply chain," Myerson said. "It really improves efficiency, cost and quality, and not just for manufacturers."


Embracing Agile

When we ask executives what they know about agile, the response is usually an uneasy smile and a quip such as “Just enough to be dangerous.” They may throw around agile-related terms (“sprints,” “time boxes”) and claim that their companies are becoming more and more nimble. But because they haven’t gone through training, they don’t really understand the approach. Consequently, they unwittingly continue to manage in ways that run counter to agile principles and practices, undermining the effectiveness of agile teams in units that report to them. These executives launch countless initiatives with urgent deadlines rather than assign the highest priority to two or three. They spread themselves and their best people across too many projects.


Cloud Economics – Are You Getting the Bigger Picture?

Most enterprises have hardware utilization rates significantly below 20% because of the excess capacity required to handle peak demand. As such, many companies carry up to 5 times the required hardware, networking, and data center space during steady state business cycles. If their computing demand is spiky, utilization rates outside of peak cycles are commonly below 10%. As a result, enterprises are spending much more on compute and storage than is required. Figure 1 depicts the traditional model where cloud shifts fixed CapEx expenses to variable OpEx expenses. To understand the full value of cloud for your enterprise, you must look beyond the CapEx vs. OpEx benefits and assess the other value drivers at play.
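A back-of-the-envelope calculation makes the utilization argument concrete. All figures below are invented for illustration (normalized unit costs, a 10% steady-state utilization per the article's range, and an assumed 50% cloud price premium), not taken from the article.

```python
# An enterprise sized for peak demand pays for all provisioned capacity;
# pay-per-use cloud bills only the capacity actually consumed.
peak_capacity_units = 100     # hardware sized for peak demand
avg_utilization = 0.10        # ~10% outside peak cycles, per the article
cost_per_unit_owned = 1.0     # normalized annual cost, on-premises
cost_per_unit_cloud = 1.5     # assumed cloud unit price (50% premium)

on_prem_cost = peak_capacity_units * cost_per_unit_owned
cloud_cost = round(peak_capacity_units * avg_utilization * cost_per_unit_cloud, 2)

print(on_prem_cost)  # 100.0 -> paying for every provisioned unit, idle or not
print(cloud_cost)    # 15.0  -> paying only for consumed capacity
```

Even with a hefty per-unit premium, the pay-per-use bill comes out far lower here because utilization is so poor; the sensitivity flips as steady-state utilization rises, which is the CapEx-versus-OpEx trade-off the article asks readers to look beyond.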


Zen and the art of big data digital IoT transformation

Despite the obvious levels of machine automation in the Internet of Things — and the machine-learning capabilities that some of these machines will benefit from — we must also keep the human factor top of mind. The first beneficiaries of efficient data management and effective data analysis will often be the worker-stakeholders within the business. When we empower employees with intelligence to be able to perform their jobs better (with better machines and processes around them), we ultimately derive greater business value at the end of the day. For want of more tangible examples here, if big data digital IoT transformation is focused on plant machinery, then we could see turbine sensors reporting performance statistics to enable more efficient predictive maintenance. Our business model states: less downtime + better serviced machines = greater business value.


PCI's new rules focus on the chiefs

Troy Leach, the chief technology officer for the PCI Security Standards Council, said in an interview that he finds this lack of involvement problematic and that he fought for the new rule. The rule itself sounds innocuous and possibly even obvious, but there's a lot more to it. The rule, within Requirement 12, mandates that "executive management establish responsibilities for the protection of cardholder data and a PCI DSS compliance program." To Leach's mind, that means that they have to dig in, assume responsibility for payments security, and stop simply delegating it away. "The intent is that we at least push the visibility to the executive level," Leach said, referring to the full text of the new guidelines. "We need for there to be different C-levels aware of compliance responsibilities."



Quote for the day:


"An organization's ability to learn, and translate that learning into action rapidly, is the ultimate competitive advantage." -- Jack Welch


May 01, 2016

Blockchain Platform Emercoin is Moving Beyond Cryptocurrencies

Now, Emer is a platform that offers two primary product umbrellas: a blockchain-based platform for a variety of services including security, advertising, and legal, and a payment services unit, Emercoin, which also runs through the Emer platform. While the name Emercoin might be recognizable as a cryptocurrency used to send and receive payments, and is ultimately tradeable, it is just one piece of what makes the Emer platform so valuable in the services it offers and the problems it can solve. For developers seeking to build products or solutions on top of the Emer platform, they’ve provided a quick-start guide for deploying an Emercoin wallet on an Ubuntu instance within Microsoft Azure.


IT leaders inundated with bimodal IT meme

At first glance, it certainly seems to make sense. From my perspective, bimodal is intellectually useful as a concept: it draws a dividing line between legacy technology efforts, which change much more slowly and must be managed more carefully, on one side, and high-velocity new digital projects, which must more faithfully match the exponential rate of change of today's market conditions while also applying the latest technologies and techniques, on the other. Where the concept breaks down, as I explored in my original critique of bimodal, is in actual execution. The real world of technology and the activities that make it bear fruit cannot be neatly compartmentalized into a dual structure. Not only do the actual needs and demands of individual IT initiatives vary widely, but the team skills and processes on the ground are unique to nearly every project as well.


Quantifying Benefits of Network Virtualization in the Data Center

Network virtualization allows IT organizations to deploy their network resources whenever and wherever they need them. IT can rapidly add the capacity to make sure the network delivers the performance and reliability demanded by evolving data center environments. NV provides improved, centralized management and offers microsegmentation to improve data center security and increase compliance. For organizations facing network upgrades, the option to deploy network software on white-box switches can result in significant capex savings. NV deployment is becoming mainstream in leading data center deployments. Organizations are likely to see strong ROI benefits from operational efficiencies – although this ROI is challenging to quantify.


Data Tower: a Data Center for Saruman

As our society increasingly relies on digital services, the unique problems in data center design attract attention from designers outside of the data center world, who propose unorthodox design ideas in attempts to envision new, better ways to build data centers in the future. Another example of this trend is an Estonian startup called Project Rhizome, which is thinking of ways to better integrate data centers into densely populated urban areas. eVolo, an architecture and design journal, has held its futuristic skyscraper design competition since 2006. The other two winners in this year’s contest were a design that proposes a continuous horizontal skyscraper around New York City’s Central Park, which is sunken to create more space for housing with unobstructed views, and a vertical control terminal for drones that would provide services to New York residents.


Data Storytelling: The Essential Data Science Skill Everyone Needs

It’s important to understand how these different elements combine and work together in data storytelling. When narrative is coupled with data, it helps explain to your audience what’s happening in the data and why a particular insight is important. Ample context and commentary are often needed to fully appreciate an insight. When visuals are applied to data, they can reveal insights the audience wouldn’t see without charts or graphs. Many interesting patterns and outliers would remain hidden in the rows and columns of data tables without the help of data visualizations.
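A classic illustration of that last point is Anscombe’s quartet: several small datasets whose summary statistics are nearly identical, yet whose scatter plots look completely different (one a linear trend, another a curved arc). A table of numbers hides the difference; a chart makes it obvious. A quick check with two of the quartet’s datasets:

```python
# Two datasets from Anscombe's quartet: near-identical summary statistics,
# yet a scatter plot of y1 shows a linear trend and y2 a smooth curve.
from statistics import mean, variance

x  = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]

print(round(mean(y1), 2), round(mean(y2), 2))          # identical means
print(round(variance(y1), 2), round(variance(y2), 2))  # near-identical variances
```

The statistics agree to two decimal places, so nothing in a summary table would prompt a second look; only plotting the points reveals that the two datasets tell entirely different stories.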


A Google executive creative director explains what he does for a living 

While Vranakis says many agencies out there have adopted a more modern way of working, he thinks – without wanting to make sweeping statements about the industry – there are a couple of things that separate the Lab from a traditional agency setup. One is the way in which the creative directors and other experienced executives at agencies often take credit for a team effort. He points to the annual Gunn Report, which showcases the year's most successful campaigns. "I think there are still structures that incentivize the contributor as opposed to the group," Vranakis said. "If you look at Gunn Reports and things like that, they all name individuals. I know that it's just business. If you win an award and your name is attached to the award, you'll get a pay-rise as they'll need to keep you as you'll be headhunted."


Identity Management: Where Cloud Security Falls Short

It's obvious that thinking outside the traditional security perimeter is necessary. Less obvious is how much "controlling the access to data" will contribute to firms being able to adopt cloud services and technologies more safely, Yeoh and Baron continued. The survey identified seven types of perimeter-based security products, and asked respondents how many of them were in use in their organizations. As the table below shows, antivirus, anti-spam, and Virtual Private Networks were the top three solutions in use by respondents. ... When the question turns to which access measures are in place for partners, outsourced IT, and other third parties, the picture changes quite a bit. Only 62% of respondents said they had privileged access management in place for such users, 25% had application to application password management, and 32% had secure password storage.


Why image recognition is about to transform business

Not every company has the resources, or wants to invest in the resources, to build out a computer vision engineering team. Even if you’ve assembled the right team, it can be a lot of work to get it just right, which is where hosted API services come in. Carried out in the cloud, these solutions offer menus of out-of-the-box image recognition services that can be easily integrated with an existing app or used to build out a specific feature or an entire business. Say the Travel Channel needs “landmark detection” to show relevant photos on landing pages for specific landmarks, or eHarmony wants to filter out “unsafe” profile images uploaded by their users. Neither of these companies needs or wants to get into the deep learning image recognition development business, but both can still benefit from its capabilities.
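A typical integration is only a few lines of client code: POST the image to the vendor’s endpoint, get back a list of labels with confidence scores, and act on them. The sketch below models the eHarmony-style “unsafe image” filter; the response shape and the `is_safe` helper are hypothetical, not any particular vendor’s API.

```python
import json

def is_safe(api_response: str, threshold: float = 0.8) -> bool:
    """Reject an image if any 'unsafe' label meets the confidence threshold.

    Assumes a hypothetical response shape: {"labels": [{"name": ..., "score": ...}]}.
    Real hosted services differ in field names, but most return scored labels.
    """
    labels = json.loads(api_response)["labels"]
    return not any(l["name"] == "unsafe" and l["score"] >= threshold for l in labels)

# In a real app, api_response would come from an HTTP POST of the image
# bytes to the vendor's endpoint; here we use a canned example response.
example = json.dumps({"labels": [{"name": "person", "score": 0.97},
                                 {"name": "unsafe", "score": 0.42}]})
print(is_safe(example))  # low-confidence 'unsafe' label, so the image passes
```

The business logic stays this thin precisely because the deep learning happens on the provider’s side; the integrating app only tunes a threshold.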


2016: The year of application layer security in public clouds

In 2016, private datacentres will reflect public cloud security realities and secure internal network traffic as well. Encrypted layers of security within a datacentre or public cloud network will help organisations control access and apply encryption to limit malicious east/west movement. This ‘application segmentation’ at the application layer will add security within the network to strengthen existing datacentre hardware and virtualisation layer security. Enterprise application owners will realise the value of true virtual networks in concept and in practice. No more will network operators believe a VLAN is actually virtual! The limitations of physical network architectures will be magnified once enterprises see the difference between an underlay for bulk transport and an overlay for application-specific use-case tuning. The glaring security holes in physical networks, once obfuscated, will reveal themselves.


Controlling Hybrid Cloud Complexity with Containers: CoreOS, rkt, and Image Standards

CoreOS Linux is an interesting example of system architecture decisions informed by the reality of container clusters. If the container makes applications self-contained, portable, standard units, the operating system should adapt to empower this dynamic use case. In CoreOS, the operating system and basic userland utilities are stripped to their bare minimum and shipped as an integral unit, automatically updated across a cluster of machines. CoreOS may be thought of as a “hypervisor” for containers that is itself packaged in a discrete and standard way. With this single-image distribution, CoreOS foregoes individual package management in favor of frequent and coordinated updates of the entire system, protected by a dual-partition scheme providing instant and simple rollbacks in the event of trouble.



Quote for the day:


"When data disproves a theory, a good scientist discards the theory and a poor one discards the data." -- Will Spencer