Daily Tech Digest - March 11, 2017

Demystifying Advanced Data Visualization

Advanced data visualization gives new meaning to how pictures can simplify the information needed to answer complex questions. Angela Hausman states that Big Data does not mean much if the people who control change can’t understand it, or have to spend too much time deciphering the data presented. In addition, Big Data speeds across the Internet, captured from people and from the Internet of Things (IoT), including items such as appliances, GPS devices, and building maintenance systems. This Big Data updates constantly, second by second, providing not a static picture but a dynamic movie. Organizations need to find ways to keep up with this Big Data in order to understand their customers better and to move more quickly, smoothly, and efficiently.


Four perspectives on data lakes

Governance is a practice that you apply to “something.” Just like James Watt’s fly-ball governor for the steam engine, a governance program seeks to keep an engine in balance so it works effectively. This engine may be a process, an organization, or a flow of information. The important point is that the target of what you are governing is clearly defined. Approaches to governance, particularly around a data lake, vary widely because organizations make different choices in their definition of the engine being managed. For example, the IT department may see the data lake engine as a collection of technology working together. The business may see the data lake as part of an innovation engine helping them to create new value from data. So which is the right engine to govern? It depends on the objective for the data lake.


AWS Outage and High Availability

Your HA strategy should be tied not only to your monitoring, alerting and remediation, but also to your customer support strategy. Monitoring and alerting are clear – you want to know if your site or parts of it are down and take the appropriate actions as described in your remediation plan. But why your customer support strategy? Well, if you haven’t noticed – the AWS Service Dashboard was also down yesterday. The question comes up: how do you notify your customers of issues with your service if your standard channel is also down? A lot of IT people don’t think of it, but Twitter turns out to be a pretty good communication tool – maybe you should think of it next time your site is down. Developing a solid HA strategy doesn’t need to be a big bang approach.


Quantum technology is beginning to come into its own

Everything in the natural world can be described by quantum mechanics. Born a century ago, this theory is the rule book for what happens at atomic scales, providing explanations for everything from the layout of the periodic table to the zoo of particles spraying out of atom-smashers. It has guided the development of everyday technologies from lasers to MRI machines and put a solid foundation under astrophysicists’ musings about unknowables such as the interiors of black holes and the dawn of the universe. Revealed by a few surprising discoveries, such as that atoms absorb and emit energy only in packets of discrete sizes (quanta), and that light and matter can act as both waves and particles, it is modern physics’ greatest triumph.


Protecting the enterprise against mobile threats

As with securing the traditional network, mobile security is also about building policies. "Security resources are scarce," said Simkin, "so organizations need to think about how they safely enable those mobile devices to access corporate resources. They need to take the time now to consider what technology they are going to put into place to keep the company safe." Even the White House is changing the paradigm a little bit. The President's now-infamous use of an Android phone has helped bring to light the need for better mobile security, said Paul Innella, CEO at TDI. "If organizations don't start treating mobile devices, which includes IoT, as corporate assets, they are going to see this wide-scale disruption and infiltration. So, they have to be thinking about how they evaluate the risk of one of these mobile devices coming into their environment," Innella said.


Google offers new 'Always Free' cloud tier to attract users

The free offerings are meant to help attract users to Google Cloud Platform at a time when the company is competing against Amazon Web Services, Microsoft Azure and other public cloud providers for developers’ time and attention. Google’s Always Free tier is somewhat similar to what AWS offers its customers. For example, both platforms allow users to run workloads using their respective event-driven compute services, AWS Lambda and Google Cloud Functions. One thing that sets Google apart is its willingness to hand out a free virtual machine. Google previously offered a 60-day free trial with $300 in credits. An extended trial was one of the cloud provider’s most-requested features, since the short time limit often wasn’t enough for a full proof-of-concept test.


Pablo Brenner talks reverse psychology in IT collaboration

The aim is to use automation to help create an environment similar to Stack Overflow inside a company. This could be as simple as offering pop-ups on a library telling individuals to avoid a particular site (“programmers spend a lot of time using the wrong library”), and also helping to attach skills to a developer. “We’re building life CVs on people,” says Brenner. This may seem a little worryingly intrusive, but Brenner doesn’t think so. He stresses that the system is only looking at technology skills, not what people are generally reading online at work. “Like any tool, it could be used in a bad way,” he concedes, but he does not feel there should be any concern that employees will be categorised within an organisation based on the number of skills they have, because this is too hard to define. Some people have broader knowledge; some people have deeper knowledge.


Banking Industry Still Taking Small Steps with Big Data

Financial organizations also must use data and advanced analytics for fraud and risk mitigation and achieving regulatory and compliance objectives. With cybersecurity more important than ever, falling behind in the use of data for security purposes is not an option. While the majority of institutions might have much of the infrastructure in place to manage the increasing flow of data, significantly fewer have their data integrated across silos. This continues to be a challenge as customers expect their financial organization to understand their entire relationship when working with their bank or credit union. This challenge is obviously exacerbated for smaller organizations, which may not even have a CRM system in place.


Facebook rolls out Bryce Canyon, its next-gen storage platform

Facebook on Wednesday unveiled a new storage platform, Bryce Canyon, that offers the efficiency and performance necessary to support the social media company's "video first" strategy. The design specification for the platform is available via the Open Compute Project. It'll be used primarily for high-density storage, including videos and photos. Bryce Canyon supports 72 hard disk drives (HDDs) in four Open Rack units. That's a 20-percent higher density than Open Vault, the first storage enclosure that Facebook contributed to the Open Compute Project in 2013. Bryce Canyon is also Facebook's first major storage chassis designed from the ground up since Open Vault. Meanwhile, Bryce Canyon also offers a 4x increase in compute capability over the Honey Badger storage server designed in 2015.
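The density claim can be checked with quick arithmetic (a sketch in Python; the Open Vault per-unit figure is inferred from the 20 percent number, not stated in the article):

```python
# 72 drives across 4 Open Rack units gives the Bryce Canyon density per unit.
bryce_canyon_per_ou = 72 / 4                     # 18.0 drives per OU

# A 20 percent improvement over Open Vault implies the older baseline.
open_vault_per_ou = bryce_canyon_per_ou / 1.2    # roughly 15 drives per OU
```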


Troubleshooting Memory Issues in Java Applications

For a Java process, there are several memory pools or spaces – the Java heap, Metaspace, PermGen (in versions prior to Java 8) and the native heap. Each of these memory pools might encounter its own set of memory problems – for example, abnormal memory growth, slowness in the application, or memory leaks – all of which can eventually manifest in the form of an OutOfMemoryError for these spaces. In this article we will try to understand what these OutOfMemoryError messages mean, what diagnostic data we should collect to diagnose and troubleshoot these issues, and we will investigate some tooling to collect that data and analyze it to resolve these memory problems. This article focuses on how these memory issues can be handled and prevented in production environments. The OutOfMemoryError message reported by the Java HotSpot VM gives a clear indication as to which memory space is depleting.
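That last point can be sketched concretely: the well-known HotSpot message suffixes map directly onto the memory spaces listed above. The lookup helper below (Python, purely for illustration; the message strings are the standard HotSpot ones, the helper itself is hypothetical) shows the idea.

```python
def depleted_space(message):
    """Map a HotSpot OutOfMemoryError message to the memory space it points at."""
    hints = {
        "Java heap space": "Java heap",
        "GC overhead limit exceeded": "Java heap",
        "Metaspace": "Metaspace",
        "PermGen space": "PermGen (pre-Java 8)",
        "unable to create new native thread": "native memory",
        "Direct buffer memory": "native memory",
    }
    for hint, space in hints.items():
        if hint in message:
            return space
    return "unknown"
```

For example, `depleted_space("java.lang.OutOfMemoryError: Metaspace")` points straight at class-metadata pressure rather than the Java heap.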



Quote for the day:


"It is a leader's job to challenge the status quo. And when you do, you make enemies." -- @CarlyFiorina


Daily Tech Digest - March 10, 2017

Application support and maintenance add up to operational ALM

Approach operational maintenance and support lifecycles with a concept of application states. Every application exists in a specific number of states, each representing a set of components and workflow relationships. One state is usually considered the normal or base state, and all the others are responses to special conditions. In this multi-state dynamic, application maintenance and support has two goals throughout the application's lifecycle. It must define each possible operating state precisely, in terms of component hosting and workflow connection through the network. It also must manage the application's dynamic movement from one valid operating state to another, exhibiting stable, secure and compliant behavior.
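The multi-state idea can be sketched in a few lines: define every valid operating state and the legal transitions between them, and let tooling refuse anything else. The state and transition names below are illustrative, not from the article.

```python
# One base state plus named special-condition states; only listed moves are legal.
VALID_TRANSITIONS = {
    "base": {"degraded", "maintenance"},
    "degraded": {"base", "failover"},
    "failover": {"base"},
    "maintenance": {"base"},
}

def move(current, target):
    """Allow a state change only if it is an explicitly defined transition."""
    if target not in VALID_TRANSITIONS.get(current, set()):
        raise ValueError(f"invalid transition: {current} -> {target}")
    return target
```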


Approaching Cybersecurity Risk Management At Any Organization

First, get the company leadership on board. A cyber risk management strategy is unlikely to succeed if it is not a priority across the entire organization. Second, outline and implement a strategy for securely adding new technologies – whether it is a new finance application or connecting something to the network. Review the new solution versus the rest of the network and determine if it adds or eliminates any risk, and assess if its level of impact is acceptable. Finally, educate your employees on their role in the overall corporate cyber risk strategy. Employees could be viewed as an easy target for criminals, so consistently educating them on the threats facing the organization will help prevent some attacks.


Bots: Biggest Player On The Cybercrime Block

Joe St. Sauver, a scientist at Farsight Security, said bot makers, using compromised devices, spread the “traffic” among multiple IP addresses, “so that some clicks come from Oregon, others come from Ohio, others from Oklahoma, etc.” “That software may also include routines designed to mimic natural pauses, while pages are ‘being read,’ or subsequent clicks – perhaps drilling down on optional features, looking for local dealers or other things that look like what a normal human visitor would do,” he said. But Tiffany said too many security professionals still “falsely assume that bot traffic looks robotic.” Instead, it comes from residential IP addresses, uses real browsers and does unrobotic things like “run JavaScript, run Flash, use the victim's cookies to look like real humans, and interact with pages like real people, often by emulating the real people who own the computers they've infected.”
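Tiffany's warning is easy to see in code. The naive check below flags only metronomic click timing, which is exactly what human-emulating bots no longer exhibit, so defenses built on it will miss them. The 0.5-second jitter threshold is an illustrative choice, not a standard.

```python
import statistics

def looks_scripted(click_intervals_s, min_jitter=0.5):
    """Flag traffic whose inter-click intervals are suspiciously uniform.

    Humans pause irregularly; a naive bot clicks on a fixed timer. Bots that
    mimic natural pauses, as described above, will sail past this check.
    """
    if len(click_intervals_s) < 3:
        return False
    return statistics.stdev(click_intervals_s) < min_jitter
```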


China mulls national cryptocurrency in race to digital money

It’s not surprising that countries have found it difficult to tackle cryptocurrencies. People exchanging things on peer to peer (P2P) networks used to be the music and video industry’s problem. Now, suddenly, people were exchanging money with them. When used properly, P2P money offers true anonymity, which creates problems for authorities trying to track the flow of cash to terrorists and organized criminals. Left unchecked, it’s also a great tax evasion tool. Where governments are regulating, they’re typically making sure that anyone trading bitcoins registers their identities so that authorities can follow the money. It’s a tricky line for policymakers to walk. Governments need to control cryptocurrencies, but if they squash them altogether, they risk missing some of its best innovations.


Deep packet inspection: The smart person's guide

Although DPI has a number of uses, the practice is rooted in enterprise network security. Sniffing traffic in and out of a network is understandably useful for preventing and detecting intrusions. Detecting and blocking the IP of malicious traffic is particularly effective at fending off buffer overflow and DDoS attacks. DPI is also used by internet service providers. If packets are mail, ISPs are the postal service and have access to unencrypted web traffic as well as packet metadata like headers. This provides ISPs with an abundance of useful information, and the companies leverage access to user data in a number of ways. Most ISPs in the United States are allowed to turn user data over to law enforcement agencies. Additionally, many ISPs use consumer data to target advertising, analyze file sharing habits, and tier access service and speeds.
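At its core, DPI means matching bytes in the payload rather than just reading headers. A toy sketch of that difference (the signature list is illustrative, not a real IDS ruleset):

```python
# Known-bad byte patterns a deep inspector might look for inside payloads,
# e.g. a NOP sled or a suspicious command string. Purely illustrative.
SIGNATURES = [b"\x90\x90\x90\x90", b"cmd.exe"]

def inspect_packet(payload: bytes) -> bool:
    """Return True if the payload body matches a known-bad signature."""
    return any(sig in payload for sig in SIGNATURES)
```

Header-only filtering would pass both packets below; only payload inspection separates them.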


State of Cyber Security 2017

State of Cyber Security 2017 reports the results of the annual ISACA global cyber security survey, conducted in October 2016. The survey results bolster the belief that the field of cyber security remains dynamic and turbulent during its formative years. Weekly news headlines confirm that cyberattacks are not a seasonal threat or dependent on specific industry environmental attributes, but are constant and should remain forefront in every enterprise executive’s thought process. To equip you with a comprehensive understanding of the cyber security industry through the lens of those who define it—the managers and practitioners—ISACA is presenting the survey results in a series of reports that focus on individual topics. This report is the first in the ISACA State of Cyber Security 2017 white paper series and presents timely information about cyber security workforce development and its current trends.


Big Growth in Data Security Provides Consultant Opportunities

Consultants need superior application and network penetration skills. This means that they should be able to break down, and analyze the way that software works within any environment. This includes input and output channels. Networks need to be understood in the same way. The purpose of this knowledge, is to identify where risks exist, or where existing security breaches are occurring. Software algorithms are known to provide false positives, so a consultant needs to be able to identify these, and should have skill in determining viable threats. This will help the consultant to allocate resources where they are most necessary, which can benefit their employer, financially. Consultants should build an understanding of the technologies used by their employer. Whenever working on a contract, a consultant will deal with systems that they are unfamiliar with.


Data Security: Don’t Call an Ambulance for a Sore Throat

It’s a constant struggle, one that today’s businesses fight with infrastructure- and device-based approaches, and (vital but often neglected) employee training against social engineering attacks. The challenges continue as technologies evolve from “strange new risk” to “vital to business success.” Five or six years ago, security concerns led many businesses to declare they’d never use cloud services. You’d be hard-pressed to find a CIO or CEO who’d say that today. Just as businesses have evolved toward the cloud, they’re also evolving toward enterprise-wide data access. We recognize the valuable insights and innovations to be gleaned from trading siloed departmental data warehouses for the comprehensive enterprise data lake. Tearing down those silos can cost us a layer of security around specific data sets, but curling up in an information panic room is not the way forward.


Application layer security puts up another obstacle for hackers

Businesses are baking security into applications during the development process. "Identifying a security flaw in development is much less expensive than doing it once the application is running," stated Nathan Wenzler, chief security strategist at AsTech Consulting, a cyber-risk management firm in San Francisco. ... In static analysis, security software examines code without running it. It analyzes source code, identifies locations where vulnerabilities may exist and outlines potential fixes. Dynamic analysis is another option, wherein the IT team tests and evaluates application security while running the software. Dynamic analysis tools pepper the application with attack scenarios to detect vulnerabilities.
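A static check in miniature: examine the source text without executing it and report where a risky construct appears. Real analyzers build syntax trees and data-flow graphs; the single regex pattern here is only illustrative of the approach.

```python
import re

# One example of a risky construct a static scanner might flag.
RISKY = re.compile(r"\beval\s*\(")

def scan_source(source: str):
    """Return 1-based line numbers where the risky call appears."""
    return [i for i, line in enumerate(source.splitlines(), 1)
            if RISKY.search(line)]
```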


CIA-Made Malware? Now Antivirus Vendors Can Find Out

Among those techniques are ways to bypass antivirus software from vendors including Avira, Bitdefender and Comodo, according to some of the leaked documents. The documents even include some snippets of code that antivirus vendors can use to detect whether a hacking attempt may have come from the CIA, said Jake Williams, founder of security company Rendition InfoSec. “In the documents, they (the CIA) mention specific code snippets used in operational tools,” Williams said. Antivirus vendors can use this to look at their customers’ networks for any traces of past intrusions. That might be a big blow to the CIA’s surveillance operations. Now anyone, including foreign governments, can use the WikiLeaks dump to figure out if the CIA ever targeted them, according to Williams.



Quote for the day:


"If people follow you, you have an obligation not to abuse that trust." -- Gordon Tredgold


Daily Tech Digest - March 09, 2017

Google: Democratisation of AI tech to ‘greatly improve’ quality of life

The technologies stand to have a transformational impact on the way processes are carried out in the financial services, education, manufacturing, healthcare, retail and agriculture industries, to name a few – if organisations in these sectors can access them. “As technology reaches more people, its impact becomes more profound. This is why the next step for AI must be democratisation, by lowering the barriers to entry and making it available to the largest possible community of developers, users and enterprises,” she said. “It requires rare expertise and resources few companies can afford on their own. This is why cloud is the ideal platform for AI.” Particularly, said Fei-Fei Li, when it comes to drawing on the global reach of the Google Cloud Platform to put AI technologies in the hands of everyday users all over the world.


Say hello to the Robo-bankers: how AI is affecting banking and finance

“The development in the basic technologies, from computer processing and data storage to communication, is allowing more sophisticated technology to advance,” says Marcos Monteiro, CEO of Veezoo and participant in the inaugural Kickstart Accelerator based in Zurich. “So we have AI now able to process all this data and come up with better predictions – giving companies more data and more information.” “Companies have a lot of data but they still find it very difficult to get the information that they need. Our goal is to democratise data inside a company and make it easier for everybody to get the information they need to work.” ... When speaking at the recent RegTech Futures summit in Amsterdam, Sybenetix’s R&D president, Paul Young, advised companies to treat AI as a specialist team member: “A supervised AI approach combined with expert domain knowledge is the key to supporting people, not replacing them.”


GE Favors SaaS For Non-Differentiated Apps, Has Big Plans For IoT 

The more SaaS we can buy the better off we are, especially for non-differentiated applications like HR, scheduling, administrative, bill paying, taxes, compliance, customs, etc. The world can’t get to SaaS fast enough for us. The core applications that make GE different -- how we do field services better, how we sell better, how we do inventory, planning and predictive analysis better -- that stuff we don’t want as SaaS because there is differentiation there for us. Our software and our analytics allow us to do better than our competitors. That’s where we invest. Our feedback to the vendors that want to come in and sell us infrastructure as a service … skip that. We can already run stuff pretty cheap. We’ve got a great cloud strategy and we’ll move when we need to. Give me SaaS, that’s what I really want.


The Disconnected Digital World

Ironically, the continuous stream of digital information itself can create a dissociative effect. Digital feeds such as social media, email, enterprise messaging and collaborative communities inundate individuals to the point where they become info-blind. People are unable to recognize the important slivers of information within the digital landscape before them. How many helpful informational messages are sent in your organization each day, week and month? Are personnel now in the habit of simply filing these away or deleting them before absorbing what may be an important security item? In the same way that startups and DevOps talk about the minimum viable product (MVP), as described in “The Lean Startup: How Today’s Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses,” by Eric Ries, perhaps we need something akin to a minimum viable digital insight for security.


Securing DNS against threats from the Internet of Things

The simplicity with which DDoS attacks can be generated using DNS infrastructure is what makes them so concerning. After taking control of a system, hackers will use a spoofed IP address of their target to send queries to name servers across the internet which, in turn, will send back responses. The attacker is able to amplify the query to return the largest possible response, often by employing a botnet of thousands of computers or, in the examples above, connected devices, to incapacitate the target. However, the responsibility for these attacks needn’t always lie with the owners of the connected devices. It isn’t always clear whether a particular device is vulnerable. The name on the label isn’t always the name of the manufacturer, for example, and these manufacturers tend not to make it easy – or in some cases, possible – to change the passwords on these devices.
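The amplification at the heart of these attacks is just a size ratio: attackers pick query types that draw the largest possible responses to a tiny spoofed query. The byte counts below are illustrative orders of magnitude, not measurements.

```python
def amplification_factor(query_bytes, response_bytes):
    """Response bytes delivered to the victim per byte the attacker sends."""
    return response_bytes / query_bytes

# A ~60-byte spoofed query eliciting a ~3000-byte response amplifies ~50x;
# a botnet of thousands of devices multiplies that again.
```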


Manage SELinux policies for better troubleshooting, access controls

Security-Enhanced Linux is an advanced access control mechanism built into most modern Linux distributions. With Security-Enhanced Linux in place, administrators use policies to better manage security. But these policies are key not only to the security of a system, but also to its functionality. For example, Security-Enhanced Linux (SELinux) allows applications to query a policy; admins to control process initialization, inheritance and program execution; and admins to manage files, file systems, directories, sockets, open file descriptors, messaging interfaces and network interfaces. It also allows for in-place policy changes -- the ability to alter SELinux policies without rebooting the system. SELinux works by implementing mandatory access control (MAC) on top of discretionary access control (DAC) to protect systems from intrusion.
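The MAC-on-top-of-DAC layering can be sketched in a few lines: an access succeeds only if both checks pass, and the mandatory rule cannot be overridden by the file's owner. The labels, domains and rules below are illustrative, not real SELinux policy syntax.

```python
# Discretionary permissions: what the file's owner has granted.
DAC = {("alice", "/var/www/index.html"): {"read", "write"}}

# Mandatory policy: label-based rules users cannot change.
MAC_POLICY = {("httpd_t", "httpd_sys_content_t"): {"read"}}

def allowed(user, domain, path, label, action):
    """Grant access only when both the DAC and the MAC layers permit it."""
    dac_ok = action in DAC.get((user, path), set())
    mac_ok = action in MAC_POLICY.get((domain, label), set())
    return dac_ok and mac_ok
```

Here a write is refused even though DAC permits it, which is exactly why misconfigured policies break functionality as well as security.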


Open Rights Group calls for control of spies’ use of zero-days

“While targeted surveillance is a legitimate aim, we need to know that government regulation of this area is sufficient,” said Open Rights Group campaigner Ed Johnson-Williams. “From what we learnt during the passage of the Investigatory Powers Act, it appears that the ‘creation’ of techniques is not really regulated at all,” he wrote in a blog post. The leaked CIA documents indicate that US intelligence agencies are working with the UK to stockpile vulnerabilities that can be used on Microsoft Windows, Mac and Linux computers, as well as iOS and Android smartphones and smart TVs. In light of the fact that many of the vulnerabilities disclosed came from UK intelligence agencies, Johnson-Williams said the UK government has serious questions to answer.


A pragmatic approach to master data management

Some organizations are drawing upon their existing resources to handle master data management, often calling upon employees to manually clean and migrate data. This method tends to be prone to human error, causing further complications, and does not scale well as business needs change. Many organizations have implemented specific data management tools to aid with integration and cleansing. Integration tools, however, do not always support large amounts of data and are limited in the types of files and data sources they can manipulate. Another strategy implemented by organizations, despite common understanding that it is a poor solution, is point-to-point integration. Point-to-point integration, commonly referred to as custom code, is a method in which skilled developers write custom code and implement it within each specific endpoint in order to create connectivity.
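Why point-to-point integration scales so poorly is simple combinatorics: fully meshing n endpoints needs n(n-1)/2 custom connections, versus one link per endpoint through a shared integration hub.

```python
def point_to_point_links(n):
    """Custom connections needed to fully mesh n endpoints: n*(n-1)/2."""
    return n * (n - 1) // 2

def hub_links(n):
    """A hub-and-spoke integration layer needs only one link per endpoint."""
    return n
```

Ten systems already mean 45 hand-written integrations to build and maintain, against 10 through a hub; the gap widens quadratically from there.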


Hackers Exploit Apache Struts Vulnerability To Compromise Corporate Web Servers

On Monday, the Apache Struts developers fixed a high-impact vulnerability in the framework's Jakarta Multipart parser. Hours later, an exploit for the flaw appeared on Chinese-language websites and this was almost immediately followed by real-world attacks, according to researchers from Cisco Systems. The vulnerability is very easy to exploit and allows attackers to execute system commands with the privileges of the user running the web server process. If the web server is configured to run as root, the system is completely compromised, but executing code as a lower-privileged user is also a serious security threat. What's even worse is that the Java web application doesn't even need to implement file upload functionality via the Jakarta Multipart parser in order to be vulnerable.


How to start building your next-generation operating model

Technology is a core element of any next-generation operating model, and it needs to support a much faster and more flexible deployment of products and services. However, companies often have trouble understanding how to implement these new technologies alongside legacy systems or are hampered by outdated systems that move far too slowly. To address these issues, leaders are building modular architecture that supports flexible and reusable technologies. Business-process management (BPM) tools and externally facing channels, for example, can be shared across many if not all customer journeys. Leading technology teams collaborate with business leaders to assess which systems need to move faster. This understanding helps institutions decide how to architect their technology.



Quote for the day:


“Let no feeling of discouragement prey upon you, and in the end you are sure to succeed.” -- Abraham Lincoln


Daily Tech Digest - March 08, 2017

Machine learning is marketing’s future

On any given day, most marketers are up to their ears in data — data from the programs they run, the buyers they court and track. Marketing automation enables them to separate signals from noise, wheat from chaff, so that they can orchestrate specific actions based on the stories data tells them. In fact, some marketing automation platforms have evolved to the point of predicting the best times to engage specific buyers, using past behaviors and actions to identify an optimal time for sends and engagements (a window when opens and click-throughs are likely to be highest). The technology has also gotten smarter in the lead scoring and weighting department and can enable marketers to go beyond conditional scoring rules, prescribing scoring values for behaviors and actions across different segments, industries and buyers.
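Going “beyond conditional scoring rules” might look like the sketch below, where the same buyer action carries a different weight per segment instead of one flat rule. The actions, segments and weights are all hypothetical.

```python
# Hypothetical per-segment weights: the same behavior scores differently
# for an enterprise buyer than for an SMB buyer.
WEIGHTS = {
    ("whitepaper_download", "enterprise"): 20,
    ("whitepaper_download", "smb"): 10,
    ("pricing_page_visit", "enterprise"): 30,
    ("pricing_page_visit", "smb"): 25,
}

def score_lead(actions, segment):
    """Sum the segment-specific weights for each observed action."""
    return sum(WEIGHTS.get((action, segment), 0) for action in actions)
```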


From disrupted to disruptor: Reinventing your business by transforming the core

Talent priorities should be based on a clear understanding of the skills needed at all levels of the business. This requires investing in building relevant digital capabilities that fit with the strategy and keep pace with customers as they change the way they consider and make purchases. At the same time, targeted hiring should be tied to those capabilities that actually drive financial performance. Enabling that talent to thrive requires a digital culture, i.e., one that is customer centric and project based, with a bias for speed and continuous learning. In fact, cultural and organizational issues can lead to the squandering of up to 85 percent of the value at stake. Making sure the new culture sticks requires rebuilding programs that reward and encourage new behaviors, such as performance management, promotion criteria, and incentive systems.


Unpatched Western Digital Bugs Leave NAS Boxes Open To Attack

The vulnerabilities were discovered on Western Digital’s My Cloud PR4100 NAS device. However, the flaws are also present across WD’s portfolio of MyCloud NAS devices such as: DL4100, EX4, EX2 Ultra and PR2100. A full list of impacted products is available online. Researchers say the group of vulnerabilities, when used in tandem, creates conditions that could allow an attacker to fully compromise the hardware. “In the worst case, one could steal sensitive data stored on the device or use it as a jump host for further internal attacks,” according to SCVL in an advisory. The vulnerabilities include command injection vulnerabilities, a stack-based buffer overflow bug and a cross-site request forgery flaw.


Want to do your own analytics? Google's free Data Studio takes on Microsoft's Power BI

It is Google's answer to more established analytics platforms such as Tableau, QlikView, and of course Microsoft's Power BI, which got a relaunch in 2015 untethered from Office 365. However, Google's Data Studio visualization toolset for the moment focuses primarily on connecting up data from Google sources, such as Google Analytics, Google AdWords, Google Sheets, and BigQuery. But it will soon roll out connectors for SQL databases. For Google, one of its main selling points is collaboration and to this end it's using Google Docs technology to offer real-time group editing so data can be brought in from different teams. "One of the fundamental ideas behind Data Studio is that data should be easily accessible to anyone in an organization. We believe that, as more people have access to data, better decisions will be made," Google said in a blogpost.


Millennials Are Most Risk Prone To Cyber Security Threats

Fearlessness is what makes the Millennials unique and gives them an ability to innovate and invent; at the same time, their need for instant gratification makes them vulnerable, more so during their sojourn in the cyber world. The millennials (those born after 1980) constitute one-third of the total workforce globally, and by 2020 they will account for nearly half of it. The ever-growing population also indicates that, as an alarming number of breaches come to light, it becomes important for the stakeholders to formulate policies that allow them to derisk their security concerns. ... What is more alarming, according to the survey, is the fact that more than half of them admit they would be “very” or “moderately likely” to evade restrictive workplace controls. This is compounded by their reluctance to receive security training.


What's The Value In Attack Attribution?

"If you are an enterprise, you want to fix vulnerabilities. If you know who is attacking you, it makes prioritizing a little bit easier," O'Leary said. In the grand scheme of things attribution doesn't really matter because if an enterprise has one vulnerability, then an attacker has an entry point. Attribution does, however, help in that, "If they know that someone is targeting them for a DDoS, they probably want to go harden their server. It helps them to prioritize what is on their plate," O'Leary said. Given that some cybercriminals are lazy, they are going to go after known vulnerabilities that are easy. Attribution provides a security team with the information they need to identify the vulnerabilities they have in order to fix them. "They can spend money on fixing them rather than doing analysis on trends and figuring out who is trying to attack them. It's about reducing the attack surface," O'Leary said.


The best response to some cyberattacks may be to ignore them

To figure out why pointing the finger in cyberattacks is not always the right move, political scientist Robert Axelrod of the University of Michigan in Ann Arbor and postdoctoral researcher Benjamin Edwards of IBM Research in Yorktown Heights, New York, turned to game theory—the mathematical modeling of competition and cooperation among people, organizations, or governments. They and other researchers have used game theory to study how to carry out and defend against cyberattacks, but the new research takes a broader approach by also factoring in the attacker’s and victim’s political strengths and weaknesses and how much they know about each other. “We’re trying to incorporate that uncertainty and that political climate into the game as well,” Edwards says.
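The core game-theoretic idea, that whether to publicly blame an attacker depends on attribution confidence and the political cost of getting it wrong, can be sketched as a toy expected-utility comparison. All payoffs and probabilities below are invented for illustration; the researchers' actual model is considerably richer.

```python
# Toy expected-utility model: should a victim publicly blame an attacker?
# Payoff numbers are illustrative placeholders, not from the paper.

def expected_utility_blame(p_correct, gain_if_right, loss_if_wrong):
    """Expected payoff of publicly attributing the attack."""
    return p_correct * gain_if_right + (1 - p_correct) * loss_if_wrong

def best_response(p_correct, gain_if_right=10, loss_if_wrong=-25, utility_ignore=0):
    """Return 'blame' or 'ignore', whichever has the higher expected payoff."""
    blame = expected_utility_blame(p_correct, gain_if_right, loss_if_wrong)
    return "blame" if blame > utility_ignore else "ignore"

# With shaky attribution (40% confidence), staying quiet wins:
print(best_response(0.4))   # -> ignore
# With near-certain attribution, public blame pays off:
print(best_response(0.95))  # -> blame
```

The asymmetry between `gain_if_right` and `loss_if_wrong` is doing the work here: when a false accusation is politically costly, even moderately confident attribution can make silence the rational move.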


How A.I. could affect the world of corporate training

No training program is going to be perfect immediately, but with most training programs having a digital element, it's time-consuming and expensive to make changes on any kind of frequent basis. Depending on the nature of your industry and the size of your business, your training programs should be updated at least quarterly, if not monthly. An A.I. program could feasibly handle this for you -- at least in some ways. It could gather information about employee engagement or failure points within the program, and automatically test new variations to try to solve the problem on its own. ... An A.I. program could more effectively measure each employee's engagement with the program, and intelligently compare their results to a control population to figure out whether the program is doing its job. It would then, of course, be able to make adjustments to the program to improve it.
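The "compare results to a control population" step amounts to a simple lift test. Here is a hypothetical sketch of the decision such a tool might automate; the engagement scores and the 5% lift threshold are invented:

```python
# Keep a new training variation only if its mean engagement clearly beats
# the control group's. Scores and threshold are made up for illustration.
from statistics import mean

def keep_variant(control_scores, variant_scores, min_lift=0.05):
    """True if the variant's mean engagement improves on control by min_lift."""
    return mean(variant_scores) - mean(control_scores) >= min_lift

control = [0.61, 0.58, 0.64, 0.60]
variant = [0.70, 0.68, 0.73, 0.69]
print(keep_variant(control, variant))  # -> True
```

A production system would add a significance test rather than a raw threshold, but the loop the article describes (measure, compare, adjust) is exactly this shape.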


The Future Is in Fog Computing

A fog computing network has two planes: the data plane, sometimes referred to as the forwarding plane, and the control plane. The data plane determines what happens to the data packets. It allows computing resources to be placed anywhere in the network, since they don't have to be centralized on a server and can instead be distributed at the edge of the network. The control plane provides an overview of the network, and it functions with the routing protocols that run in the architectural control element. Fog computing allows IoT data to be processed in a data hub or smart device closer to the sensor that's generating it. With cloud computing, by contrast, you always depend on the cloud repository, and accessing data requires bandwidth allocation and connectivity.


CDN Security is NOT Enough for Today

Attackers have learned that a significant blind spot in CDN services is the treatment of dynamic content requests. Since dynamic content is not stored on CDN servers, all requests for dynamic content are sent to the origin's servers. Attackers take advantage of this behavior by generating attack traffic that contains random parameters in HTTP GET requests. CDN servers immediately forward this attack traffic to the origin, expecting the origin's servers to handle the requests. But in many cases the origin's servers do not have the capacity to handle all those attack requests, and they fail to provide online services to legitimate users, creating a denial-of-service situation. Many CDNs can limit the number of dynamic requests sent to a server under attack, but they cannot distinguish attackers from legitimate users, so the rate limit results in legitimate users being blocked as well.
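One telltale signature of this random-parameter (cache-busting) technique is a single path suddenly receiving a huge number of distinct query strings. The sketch below shows how an origin or WAF might flag that; the threshold and URL format are invented for illustration:

```python
# Flag paths that receive suspiciously many distinct query strings,
# a rough signature of random-parameter cache-busting floods.
from collections import defaultdict
from urllib.parse import urlsplit

def flag_cache_busting(request_urls, max_distinct_queries=100):
    queries_per_path = defaultdict(set)
    for url in request_urls:
        parts = urlsplit(url)
        queries_per_path[parts.path].add(parts.query)
    return [path for path, queries in queries_per_path.items()
            if len(queries) > max_distinct_queries]

# 500 requests to /search, each with a unique random-looking parameter:
attack = [f"/search?x={i}" for i in range(500)]
normal = ["/home?lang=en"] * 500
print(flag_cache_busting(attack + normal))  # -> ['/search']
```

Unlike a blanket rate limit, this keys on the randomness itself, so steady legitimate traffic to `/home` is left alone.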



Quote for the day:


“What seems to us as bitter trials are often blessings in disguise” -- Oscar Wilde


Daily Tech Digest - March 07, 2017

From backup to data management

There are some industries, such as financial services and healthcare, that are very global in nature. They face similar challenges, such as being heavily regulated and needing to manage large-scale data. But I do see differences on the maturity scale. ... It's important because as you're creating data, you need to have a good understanding of that data to manage it. In the US and other more developed economies, there are already massive amounts of data collected, but classifying all that data takes a large amount of effort, which means it's never going to get done. Organisations in countries that are spearheading digitisation efforts will also have to take into account data protection laws not only at home, but also in the countries where they operate. Organisations are custodians of customer and employee data that has to be managed from both compliance and cost standpoints.


Function as a service, or serverless computing: Cloud's next big act?

When using serverless computing, coders upload code snippets packaged as a function that carries out a specific task. The code only runs when triggered by an event. While the coder is responsible for the code itself, the service provider manages the compute stack that runs it, automatically provisioning the compute and storage resources needed for that function. Users (generally enterprise IT departments) are then billed on a pay-per-use basis, determined by the number of requests served and the compute time needed to run the code, metered in increments of 100 milliseconds. If the code is never triggered, the user is never billed. Serverless computing differs from other cloud services, such as infrastructure as a service and platform as a service, in that with those models users must spin up virtual machines for their applications and deploy the codebase as an entire application.
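The billing model described above can be made concrete with a back-of-the-envelope calculation: each invocation pays a per-request fee plus compute time rounded up to 100 ms units. The prices below are placeholders, not any provider's actual rates:

```python
# Toy serverless cost model: per-request fee plus compute time billed
# in 100 ms increments. Prices are illustrative placeholders.
import math

def invocation_cost(duration_ms, price_per_request=2e-7, price_per_100ms=2.08e-6):
    billed_units = math.ceil(duration_ms / 100)  # metered in 100 ms increments
    return price_per_request + billed_units * price_per_100ms

# A 130 ms run is billed as two 100 ms units; zero invocations cost nothing.
print(round(invocation_cost(130) * 1_000_000, 3), "micro-dollars")
```

Note the rounding: a 130 ms function pays for 200 ms, which is why shaving a function just under a 100 ms boundary can meaningfully cut the bill at scale.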


‘Artificial Intelligence’ Has Become Meaningless

Griping about AI’s deflated aspirations might seem unimportant. If sensor-driven, data-backed machine learning systems are poised to grow, perhaps people would do well to track the evolution of those technologies. But previous experience suggests that computation’s ascendancy demands scrutiny. I’ve previously argued that the word “algorithm” has become a cultural fetish, the secular, technical equivalent of invoking God. To use the term indiscriminately exalts ordinary—and flawed—software services as false idols. AI is no different. As the bot author Allison Parrish puts it, “whenever someone says ‘AI’ what they're really talking about is ‘a computer program someone wrote.’”


SMBs Are Never Too Small To Be At Risk Of Cyber Security Attacks

Businesses today run on IT. This makes cyber security a business necessity as well as a technology requirement. A strong security program can not only protect a business’s assets, it can also give it a competitive advantage. Although SMBs face the same cyber security challenges as large businesses, they often have fewer resources and little in-house expertise to address these challenges. This makes it important that they get the best return on their security investments by prioritising the right things in their security programs. Cloud computing and hosted services can make advanced technology affordable, and SMBs often find it cost-effective to outsource many IT functions, including security. But at the end of the day, each business is still responsible for its own security. Owners and executives need to understand the basics of cyber security, know what their service providers are doing and what questions to ask of them.


Consumers Are Wary Of Smart Homes That Know Too Much

If several products can be orchestrated together, they can build up complex sets of actions like dimming the lights, drawing the blinds, and pausing the dishwasher when the TV comes on -- at least in theory. But for all this to succeed in the long term, consumers will have to want smart homes and be willing to pay for them, probably through subscriptions, Gartner analyst Amanda Sabia said. Some of the results revealed Monday aren’t promising. Three-quarters of respondents said they’d just as soon set their lights and thermostats by hand as have IoT do it, while only a quarter were attracted to the idea of devices anticipating their needs and making changes automatically, Gartner said. The results were similar for doing things manually versus through voice commands to IoT devices.


How to become a master cyber-sleuth

First, Bandos said, determine threat vectors and points of access. Gather data about your system, potential vulnerabilities, and previous hacks. "The first weapon any cyber threat hunter needs is data. A centralized Security Information & Event Management (SIEM) system is preferred, but simple access to proxy logs and antivirus logs is also highly beneficial. If there are hundreds or even billions of events, the hunting process whittles away the noise like a digital wood carver chipping away to reveal his masterpiece." The data aggregation and culling process should reveal a short list of suspicious activities. Proxy logs are a great place to start hunting, he said, because warning signs like slow connections and automated behavior are easy to spot.
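One concrete way to "whittle away the noise" in those proxy logs: automated malware check-ins (beaconing) tend to occur at near-constant intervals, unlike bursty human browsing. The sketch below flags hosts with suspiciously regular request timing; the jitter threshold and timestamps are invented for illustration:

```python
# Flag hosts whose proxy-request intervals are suspiciously regular,
# a common sign of automated beaconing. Timestamps are in seconds.
from statistics import pstdev

def looks_automated(timestamps, max_jitter=2.0):
    """True if the gaps between requests are nearly constant."""
    if len(timestamps) < 4:
        return False  # too few events to judge
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return pstdev(gaps) <= max_jitter

beacon = [0, 60, 120, 181, 240, 300]   # check-in roughly every 60 s
human  = [0, 12, 300, 310, 900, 1400]  # bursty, irregular browsing
print(looks_automated(beacon), looks_automated(human))  # -> True False
```

In practice a hunter would run this per source host and destination domain, then pivot into the flagged connections by hand.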


Data Preprocessing vs. Data Wrangling in Machine Learning Projects

A key task when you want to build an appropriate analytic model using machine learning or deep learning techniques is the integration and preparation of data sets from various sources such as files, databases, big data storage, sensors or social networks. This step can take up to 80 percent of the whole analytics project. This article compares different alternative techniques to prepare data, including extract-transform-load (ETL) batch processing, streaming ingestion and data wrangling. Various options and their trade-offs are discussed using different advanced analytics technologies and open source frameworks such as R, Apache Spark, KNIME or RapidMiner. The article also discusses how this is related to visual analytics, and best practices for how different user roles such as the Data Scientist or Business Analyst should work together to build analytic models.
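To make the preparation step concrete, here is a miniature extract-transform-load pass using only the Python standard library. The field names and cleaning rules (trimming whitespace, dropping records with missing values, normalising country codes) are invented, but they are typical of the work that eats up that 80 percent:

```python
# A miniature ETL pass: extract CSV records, transform/clean them,
# and load them into a list of typed rows. Rules are illustrative.
import csv
import io

raw = """name,age,country
Alice, 34 ,us
Bob,,DE
 Carol,29,de
"""

def etl(text):
    rows = []
    for rec in csv.DictReader(io.StringIO(text)):
        age = rec["age"].strip()
        if not age:            # drop records with a missing age
            continue
        rows.append({
            "name": rec["name"].strip(),
            "age": int(age),                             # cast to integer
            "country": rec["country"].strip().upper(),   # normalise codes
        })
    return rows

print(etl(raw))
```

The same transform logic could run as a scheduled batch job (classic ETL), per event on a stream, or interactively in a wrangling tool; what differs between the approaches the article compares is mainly *when* and *by whom* it is executed.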


CIO interview: Sarah Wilkinson, Home Office

Prioritising what to focus on is hard in any organisation, but Wilkinson says “everybody understands that one of the really critical success factors for us now in this post-Brexit world is to be far more brutal about what we actually need to get done versus what we would like to get done in a slightly simpler world”. While the exact implications Brexit will have on the Home Office’s IT projects aren’t yet set in stone due to ongoing negotiations in government, the department is trying to “home in on the stuff that really matters”. “But we need to get that really clear to focus on it, because it’s important we ensure the critical matters are delivered. We’re going to have to let go of, or postpone, some of the stuff we wanted to do in a pre-Brexit world,” she says.


Ransomware: An executive guide to one of the biggest menaces on the web

While some ransomware developers -- like those behind Locky or Cryptowall -- closely guard their product, keeping it solely for their own use, others happily distribute ransomware to any wannabe hacker keen to cash in on cyber extortion. One of the most common forms of ransomware distributed in this way is Cerber, which has been known to infect hundreds of thousands of users in just a single month. The original creators of Cerber are selling it on the dark web, allowing other criminals to use the code in return for receiving 40 percent of each ransom paid. In exchange for giving up some of the profits, wannabe cyber fraudsters are provided with everything they need in order to successfully make money through extortion of victims.


Ransomware Picks Off Border Targets With Greater Security

The business should ensure that its business continuity/disaster recovery plan and backup and recovery tools are entirely separate from the data and systems that could fall under attack by ransomware. “There are many automated on-site and cloud-based backup solutions that will leave you with options even if ransomware hits network drives,” says Moffitt. There are also measures to address ransomware that starts with phishing emails containing macros (prerecorded commands that run automatically), which in this case unleash malware and, ultimately, ransomware attacks. You can disable macro functionality in the Trust Center in Microsoft Office.



Quote for the day:


"When a man assumes a public trust he should consider himself a public property." -- Thomas Jefferson


Daily Tech Digest - March 06, 2017

Cobalt's robot is like a superhuman security guard

As far as design is concerned, the robot is nothing like the RoboCop we envisioned when we first heard about robotic security guards. Instead, it is a large cloth-covered gadget that is meant to blend in with minimalistic office décor. The Cobalt robot was designed by Yves Béhar, the industrial designer whose body of work includes iconic designs for Jawbone, Herman Miller, and Puma, to name just a few. According to Béhar, "The Cobalt robot's semi-cylindrical self-driving mechanism, sensors and cameras are covered by a tensile fabric skirt. This helps maximize the access and usability of the internal technologies, creates airflow to prevent overheating, and conveys a soft and friendly persona." The sensors include 360-degree day-night cameras, thermal cameras, point cloud cameras, laser scanners, a directional microphone array, long-range RFID, a badge reader, and environmental sensors including carbon monoxide and smoke detectors.


Mozilla Partnership Provides No-Cost Firefox Mobile Device Testing

Desktop testing is also being offered free for one month, reflecting the growing emphasis on the mobile Web, which recently surpassed desktop browsing for the first time, according to StatCounter. Under the program, developers can also accrue up to 30 minutes of testing across all browser/OS/device combinations available in the BrowserStack device cloud, a Mozilla spokesperson told ADTmag. Mozilla said the free testing will simplify the complicated device testing process, which adds even more complexity to mobile Web development where it's notoriously difficult to even create equivalent cross-browser functionality. Running Web sites on the multitude of mobile devices introduces many more variables, such as different screen sizes, display densities and more.


FinTech unleashed: Has FinTech outlived its usefulness?

The term FinTech may have outlived its usefulness, or at least be weakened by overuse since the industry is using the term to describe not only new start-up firms but also the entire concept of financial services innovation. In addition, the connection between ‘FinTech’ and ‘disruption’ is often misused since most new services are more of an evolution of what has been done in the past as opposed to a revolution in banking. Both terms are great descriptors but using them together is often done in error. ... Good FinTech start-ups should be disruptive if they want to succeed because why else would someone become a customer if the start-ups don’t offer something better than what’s out there? So, to answer your question, great FinTech companies never stop trying to be disruptive and providing value to customers.


Making sense of machine learning

Deep learning is the hottest area of machine learning. In most cases, deep learning refers to many layers of neural networks working together. Deep learning has benefited from abundant GPU processing services in the cloud, which greatly enhance performance (and of course eliminate the chore of setting up GPU clusters on prem). All the major clouds — AWS, Microsoft Azure, and Google Cloud Platform — now offer deep learning frameworks, although Google’s TensorFlow is considered the most advanced. If you want a full explanation from someone who actually understands this stuff, read Martin Heller’s “What deep learning really means.” Also check out his comparative review of the six most popular machine/deep learning frameworks.
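The phrase "many layers of neural networks working together" can be shown in miniature: below is a forward pass through two fully connected layers in pure Python. The weights are arbitrary placeholders; real frameworks such as TensorFlow learn them from data (and run them on GPUs):

```python
# Two stacked fully connected layers with tanh activations, the smallest
# possible illustration of "layers working together". Weights are arbitrary.
import math

def dense(inputs, weights, biases):
    """One fully connected layer: tanh(W·x + b), computed by hand."""
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

x = [0.5, -1.0]                                                      # input
h = dense(x, weights=[[0.1, 0.4], [-0.3, 0.2]], biases=[0.0, 0.1])   # hidden layer
y = dense(h, weights=[[0.7, -0.5]], biases=[0.2])                    # output layer
print(len(h), len(y))  # -> 2 1
```

"Deep" simply means stacking many such layers; the depth is what lets the network build up increasingly abstract features, and what makes GPU acceleration so valuable.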


Driving Innovation in Your Cloud Adoption Program

Within the cloud, lean and agile methodologies fully realize their potential. The central tenet of lean philosophy is to maximize value while minimizing waste. The agile approach seeks to shift from large-scale releases to smaller work increments, including frequent releases and iterations, prototyping and increased collaboration with stakeholders and users. A primary benefit of agile methodology is to reduce risk and increase the success rate. Cloud PaaS development platforms, containerization, microservices and the increased adoption of serverless computing in the cloud fully complement an agile approach. Any organization that is serious about innovation as part of a cloud adoption strategy will need its people to adopt an agile approach to software development.


Lessons learned from data center outages, but still a long trip ahead

Change control and tests are the keys to keep any environment healthy, Mansfield said. Robust change control is needed to recognize and review changes, and there should be a plan to back out of them. When IT pros get ready to make a change, they need to rigorously test in an environment representative of the one to be changed. Users are most often the cause of a mistake, and automation helps avoid this, he said. Despite the progress that airlines are making in the eyes of some experts, a six- to eight-hour outage is substantial, and airlines must address the severity and duration of data center outages, said Ahmed Abdelghany, a professor of airline operations at Embry-Riddle Aeronautical University in Daytona Beach, Fla., and a former analyst in United's information services division.


Does Your Association Need Cyber-Liability Insurance?

“Depending on your organization’s exposure to cyber liability, you may feel the cost of purchasing a cyber-liability policy is not cost-effective for your organization,” said Pam Townley, VP of cyber at AXIS Capital, during a recent ASAE webinar “Ask the Insurance Nerds: Cyber Security.” So, how do you go about determining your exposure to a cyber breach, figuring out if you need a cyber insurance policy, and determining which kind of cyber coverage is required? ... Organizations tend to not realize that the cost of responding to a cyber data breach can be very expensive, Townley said on the webinar. As of November 2016, Townley said that the average estimated cost per compromised record was $214. (Warning: Multiplying that cost by the number of member records your association has could keep you up at night).
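The webinar's warning is easy to turn into arithmetic: multiply the quoted average cost per compromised record ($214 as of November 2016) by the number of records held. The membership figure below is made up for illustration:

```python
# Back-of-the-envelope breach exposure: records held times the quoted
# average cost per compromised record ($214). Record count is invented.
def breach_exposure(records, cost_per_record=214):
    return records * cost_per_record

print(f"${breach_exposure(50_000):,}")  # -> $10,700,000
```

Even a mid-sized association holding 50,000 member records is, by this crude measure, carrying an eight-figure exposure, which is the number to weigh against the premium.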


7 Tips For Managing An IT Outsourcing Contract

Those professionals managing the engagement often don’t understand how their conduct or communication can impact their company’s legal rights, which can cause a number of problems should disputes arise. “The result is that the benefits for which you negotiated hard and are paying great amounts may be lost,” says Peterson. What’s more, disputes may be more difficult to resolve, and those that aren’t becoming costly to litigate, requiring interviewing dozens of witnesses and sorting through thousands of emails to figure out what has happened and who is responsible. The real value of IT outsourcing is achieved through active governance—not only of the projects in play, but of the communication and interaction between customer and provider. “Protecting the value of the contract after the ink is dry is about motivating suppliers to deliver on their promises,” says Peterson, “and preserving remedies for failure.”


Big data disruption gets real for car insurers

The insurance industry has long feared the entry of companies such as Google, Amazon.com Inc. or Facebook Inc., which have a closer relationship to customers -- and above all better data on them. The potential disruption to the market is adding to pressure on providers already seeing their investments hurt by record-low interest rates. Insurers are reacting to the challenge. Allianz SE Chief Executive Officer Oliver Baete has pledged to make Europe’s biggest insurer “digital by default” to help boost productivity and retain clients. Thomas Buberl, CEO of Axa SA, told investors last year that “customers are now used to buying things at Amazon and to interacting with Google and Facebook; they are demanding the same from us and, as you can imagine, buying an insurance policy at Axa is not yet quite the same as buying a book at Amazon.”


Cyber security readiness study finds widespread shortcomings

Overall, 40 per cent of firms say they have taken out cyber insurance, a higher figure than generally quoted elsewhere. The figure is highest in the US, at 55 per cent, while nearly two-thirds of the ‘expert’ companies say they are insured for cyber risks. These higher than expected take-up figures may also reflect confusion over what exactly constitutes cyber insurance cover, with some companies believing they are protected under their existing insurance coverage. Steve Langan, chief executive, Hiscox Insurance, comments, ‘With fewer than a third of businesses qualified as ‘expert’, our study reveals a worrying absence of cyber security readiness among business consumers. By surveying those directly involved in the business battle against cyber crime, this study provides new perspective on the challenges they face and the steps they are taking to protect themselves.’



Quote for the day:


"Any sufficiently advanced technology is indistinguishable from magic." -- Arthur C. Clarke


Daily Tech Digest - March 05, 2017

CTO: Our quest for agility led us to the OpenStack framework

Once we were confident that we were not about to make a tragic mistake, we plunged in and started the sometimes challenging process of redoing what we had in order to fit our lives into the OpenStack framework. We started with one of the available implementations of OpenStack/Cloud Foundry. (As with Linux, you can go entirely open source or you can choose a supported version from a number of providers). But, as our knowledge and experience of the OpenStack framework grew, we identified some gaps that created issues around segregation of duties (which is critical for SOX, SOC 2 and other compliance standards). We began modifying our way into our own version, which includes some technologies we created to better handle application-level security and data access controls.


Metadata Management and Data Governance: The Essentials of Enterprise Architecture

Bremeau illustrated what a successfully integrated – but simplified – big picture model would look like, using a classic Enterprise Architecture for Data Warehousing. “If there is one thing to learn and master in any Metadata Management and Data Governance solution, it’s known as ‘the big picture.’” “There are a couple of data stores on one side, could be files, and then I have some ETL tools that are bringing everything into the Data Warehouse,” with the BI tools on the other side. “Now this is very simplified because most of the customers that we deal with do a lot of staging areas, before. And they’re not using one ETL but three or four different types of ETLs, and some hand-written SQL scripts, and you have to deal with all that. That’s the reality, if you truly want to know the lineage of what’s going on in the enterprise.”


The worst enterprise architecture anti-pattern of them all

What many have tried is to give more power to the IT department, or to have stricter controls, more principles and guidelines, reviews, gates, and so forth. All are ways in which the agility of the enterprise suffers, with nothing much to show for it. Because in the end, the immediate business goals almost always outweigh the long-term architecture goals. So, these ways never last. What we need is more agility in architecture (especially now that agile change methods have become popular), not less. My answer is that boards of enterprises should not give these IT architecture goals to the IT department; they must explicitly give them to the business units instead. And they must have the strength of conviction to actually hold those business units accountable for the IT goals, in the same way that they hold those units accountable for compliance with external demands, from owner/shareholder to regulator.


Your Data Is Your Strategic Firewall Against Competition

Per VoC research conducted by our firm, today’s personalization is broken. It relies on implicit data, i.e., web browsing behavior, data mined from social media, data modeling, and purchase-based behaviors. These are not providing the necessary depth of information to drive relevant communications and offers. As a result, most attempts at personalization simply do not drive the expected increases in response. Marketers must now make a profound shift and move to human data, which is based on explicit, self-profiled, opt-in preference data. Human data personalization is unique in that it lends itself to segmentation based on self-described personality types, attitudes, and life stages. Human data-based personalization is consistently driving double-digit response rates.


Executive Guide to Artificial Intelligence

The most crucial part of an AI system, the one that brings machines slightly closer to humans, is continuous learning. How does a recommendation engine on e-commerce sites work? It uses humongous historical data on the browsing and purchase behavior of website visitors, learns patterns in the data for product preferences and makes relevant product recommendations. This learning is not one-time, but happens continuously whenever a chunk of new data is available. In fact, learning is a key component that differentiates the current generation of AI systems from earlier generations. This advancement is made possible mainly because of the prevalence of machine learning algorithms. For example, Google recently announced that its language translation tool has dramatically improved thanks to a versatile machine learning technique called deep learning.
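A toy version of the recommendation idea: suggest products that co-occur most often with what a shopper has already bought. The purchase histories below are invented; a real engine would use far richer models and, as the text notes, retrain continuously as new data arrives:

```python
# Item co-occurrence recommender in miniature: credit items that appear
# in baskets overlapping the shopper's own purchases. Data is invented.
from collections import Counter

purchases = [
    {"laptop", "mouse"},
    {"laptop", "mouse", "keyboard"},
    {"laptop", "webcam"},
    {"phone", "case"},
]

def recommend(owned, histories, top_n=2):
    scores = Counter()
    for basket in histories:
        if owned & basket:                 # shopper overlaps this basket
            scores.update(basket - owned)  # credit the items they lack
    return [item for item, _ in scores.most_common(top_n)]

print(recommend({"laptop"}, purchases))  # 'mouse' ranks first
```

Continuous learning, in this miniature, is just re-running the counting whenever new baskets arrive; production systems do the equivalent incrementally.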


FinTech and Blockchain: Financial Services In Transition

From a macroeconomic perspective, there is significant pressure on banks, especially European banks, because growth opportunities are limited. You cannot grow because interest rates are low, and transaction volume, in a market of uncertainty, is also low. So there is revenue pressure, and the pressure on the cost side is going up significantly. Cost side meaning your cost-income ratio is under pressure compared to US banks: US banks are at around 55% cost versus revenue, while most European banks are at over 70%. So there is significant pressure on those banks to be very careful and reduce operating expenses, which also has an impact on potential investments going forward. So it is a constrained and stressed environment, and the new technology is, from my perspective, triggering an even bigger, significant change.


Pain in the bot? Artificial intelligence in banking

Interactions via natural language processing must be quick and simple - in a word, functional. From a bot perspective, one key differentiator is the capacity for banks to allow richer “mini-apps” as part of their messaging experience, in which each message has the potential to become an atomic application. That means functionality must be broken down into manageable chunks supported by services or better said, micro-services, in the integration layers of core systems. Sadly, if you are a banker, these micro-services are unlikely to exist in your organization. ... If we limit the choice of what users can do in a chat, we will need to somehow train the users or offer “menu” choices, much more obvious in a traditional interface, which reduces usability and defeats the purpose of a “conversation” in the first place. Check out this example from BI Intelligence:


IT Service Management In Disruption, Moving Toward Automation

The incoming description of a problem can be analyzed for its underlying patterns. Much time is wasted in IT services as one service desk listener responds to a problem one way and another in a different way. "Incident management needs better categorization. There's sometimes misinterpretation of what the incident is about and what skill set is needed to resolve it," said Hough. ServiceNow's existing configuration management database and change management products will be tied into machine learning to get a more accurate incident management process. The information available through them will also make it possible for a machine-learning system to look at pending changes and "assess the risk as changes come through, based on its learnings from what's happened in the past," she said.
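A machine-learning categorizer would be trained on historical tickets; the deliberately simple stand-in below scores each incident description against per-category keyword lists and routes it to the best match. The categories and keywords are invented for illustration:

```python
# Keyword-scoring stand-in for ML-based incident categorization:
# route each ticket to the category whose vocabulary it matches best.
CATEGORIES = {
    "network":  {"vpn", "latency", "dns", "outage", "packet"},
    "identity": {"password", "login", "mfa", "locked"},
    "hardware": {"laptop", "printer", "monitor", "battery"},
}

def categorize(description):
    words = set(description.lower().split())
    scores = {cat: len(words & keywords) for cat, keywords in CATEGORIES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "uncategorized"

print(categorize("User locked out after password reset"))  # -> identity
print(categorize("Office printer shows low battery"))      # -> hardware
```

A real system replaces the hand-written keyword sets with weights learned from resolved tickets, which is precisely what removes the "one listener responds one way, another differently" inconsistency.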


Harnessing the value of big data with MDM

At first glance, MDM and big data appear to be two mutually exclusive systems with a degree of mismatch. An enterprise MDM initiative is all about solving business issues and improving data trustworthiness through the effective and seamless integration of master information with business processes. Its intent is to create a central trusted repository of structured master information accessible by enterprise applications. The big data system deals with large volumes of data coming in unstructured or semi-structured format from heterogeneous sources such as social media, field devices, log files and machine-generated data. The big data initiative is intended to support specific analytics tasks within a given span of time, after which it is taken down. In Figure 1 we see the characteristics of MDM and big data.


How data governance is now a strategic boardroom consideration in a data-driven world

Data is without a doubt a boardroom responsibility in a digital economy. Organizations have to think of their business from customer and data perspectives if they want to thrive amidst the rapid progress of data-enabled technologies and increasingly competitive environments. The recent Microsoft Asia Data Culture Study 2016, which polled 940 business leaders from medium to large-sized companies in 13 markets in Asia, found that 87% of respondents felt a data culture should be driven from the top down, and that there should be a formalized role in the leadership team to drive successful adoption of their data strategy. ... A data strategy is needed to define what data is to be used by the organization – and how that will add long-term value. As part of an overall data governance framework, this requires an understanding of the value, risk and constraints inherent in all data.



Quote for the day:


"If someone's criticism is completely unfounded on data, then I don't want to hear it. It doesn't hold up to scrutiny." -- Tim Ferriss



Daily Tech Digest - March 04, 2017

A (Short) Guide to Blockchain Consensus Protocols

A consensus algorithm, like bitcoin's proof of work, does two things: it ensures that the next block in a blockchain is the one and only version of the truth, and it keeps powerful adversaries from derailing the system and successfully forking the chain. In proof of work, miners compete to add the next block (a set of transactions) in the chain by racing to solve an extremely difficult cryptographic puzzle. The first to solve the puzzle wins the lottery. As a reward for his or her efforts, the miner receives 12.5 newly minted bitcoins – and a small transaction fee. Yet, although a masterpiece in its own right, bitcoin's proof of work isn't quite perfect. Common criticisms include that it requires enormous amounts of computational energy, that it does not scale well and that the majority of mining is centralized in areas of the world where electricity is cheap.
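The puzzle-racing loop can be sketched in a few lines: find a nonce such that the hash of the block data plus the nonce starts with a given number of zero digits. Bitcoin actually uses double SHA-256 against a numeric target and a vastly harder difficulty; the toy difficulty below is kept tiny so the demo runs instantly:

```python
# Minimal proof-of-work sketch: search for a nonce whose SHA-256 digest
# of (block data + nonce) starts with `difficulty` zero hex digits.
# Real bitcoin mining uses double SHA-256 and a far harder target.
import hashlib

def mine(block_data, difficulty=4):
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest  # the winning "lottery ticket"
        nonce += 1

nonce, digest = mine("tx1;tx2;tx3", difficulty=4)
print(digest.startswith("0000"))  # -> True
```

The asymmetry is the point: finding the nonce takes many hash attempts (roughly 16^4 here), but any node can verify the winner with a single hash, and each extra zero of difficulty multiplies the expected work by 16.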


How to install the OpenVAS vulnerability scanner on Ubuntu 16.04

The Open Vulnerability Assessment System (OpenVAS) is a set of tools for vulnerability scanning and management. OpenVAS can scan systems for thousands of known vulnerabilities. It's incredibly powerful and should be considered a must-have for anyone who is serious about their network and system security. I'll walk you through the process of installing this powerhouse security admin tool on Ubuntu 16.04. The process is a bit time consuming, but what you gain in the end is worth every second. OpenVAS is an outstanding way to test machines you own/service/administer for vulnerabilities. Do not use this tool on systems outside of your purview.


The potential of blockchain as a future financial services infrastructure

The value of DLT to industry players is multifaceted. The top value driver noted by the report is operational simplification, whereby DLT reduces or eliminates the manual effort required to perform reconciliation and resolve disputes. The second key driver stems from improved regulatory efficiency, as DLT gives regulators real-time monitoring access to financial activity between entities across borders. The technology also reduces counterparty risk, shortens clearing and settlement times, lowers locked-in capital requirements and boosts liquidity, as well as minimising fraud by creating a fully transparent and practically immutable transaction history.


Peugeot concept learns from your IoT gear to improve the ride

It's not just pulling in the information, either. The car also gives you access to all of those connected devices from the interior of the Instinct Concept. We're talking about temperature information from your Nest, what you like to watch from your smart TV or details from your virtual assistant on a gadget like Amazon Echo. Speaking of Amazon, while a number of other automakers have already enlisted Alexa to power AI inside their vehicles, Peugeot decided instead to go with Samsung's cloud platform to collect all of the info, and data science company Sentience analyzes the details for what's relevant to the system. The car has its own AI that passengers can interact with via spoken cues. The Instinct Concept also features four modes that tailor the ride to you. There are two driving modes -- Drive Boost and Drive Relax -- for performance or more everyday driving scenarios.


The Trends, Companies, And Categories The Top VC Firms Are Betting On

So where is smart money going? We crunched the data to identify where smart money VCs have been investing in early-stage companies in recent years and how that investment focus has shifted. Using CB Insights' natural language processing, we identified the most common words used in company descriptions among those companies that received early-stage investment from a smart money VC between 2010 and 2016. We then looked at which words have trended up and down among this cohort over the years. In addition, we identified the categories and industries that are seeing the most smart money early-stage investment. Through this lens, we can see where top investors see the most potential.
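The word-trend analysis described above can be approximated with a plain term count over company descriptions. This is a toy sketch with made-up descriptions standing in for the CB Insights dataset, not their actual NLP pipeline:

```python
from collections import Counter
import re

# Hypothetical company descriptions standing in for the real dataset
descriptions = [
    "Machine learning platform for healthcare analytics",
    "Mobile payments and analytics for retailers",
    "Machine learning tools for mobile developers",
]

# Minimal stopword list; a real pipeline would use a much larger one
STOPWORDS = {"for", "and", "the", "of", "a"}

def top_terms(texts, n=3):
    """Count word frequency across all descriptions, skipping stopwords."""
    words = (w for text in texts for w in re.findall(r"[a-z]+", text.lower()))
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(n)

print(top_terms(descriptions))
```

Running the same count over each vintage year of descriptions and comparing the rankings is one simple way to surface which terms are trending up or down in a cohort.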


Blockchain and Cloud kissing cousins

Whilst cloud removes old legacy systems, blockchain removes the middleman within such systems. Why then would you want to deploy your shiny new blockchain project on an old, restrictive, expensive and possibly less-than-safe on-premise system? Cloud also opens up the bank to immense scale, as we are now seeing with Black Friday, Cyber Monday and Singles Day, where $17 billion in sales occurred in a single day. Imagine the supply chain finance activities needed to support that single day's activities. The traditional legacy approach was to build more capacity by buying more computers and more software and hiring more IT people. The cloud provides cyber security and pay-as-you-go pricing, so you can scale in safety. A second generation of banking is coming. We're already on the cusp of it, and banks are running out of time before they become completely marginalised.


New technology, same bugs: the rise and fall of the robot revolution

The trouble is that right now it's almost impossible for regular users to tell if a robot has been hacked or not, so it's a good target for APT attacks. So just how 'real world' is the robot hacking threat, according to other security industry experts? Mike Pittenger, vice president of security strategy at Black Duck Software, is in no doubt that we will have already seen the consequences. "Drones (unmanned aerial vehicles) are a form of robot," he explains, "and an attractive target for our adversaries. Taking control of a drone would certainly disrupt a military mission, and could possibly turn a military's weapons on itself."  ... Deral Heiland, research lead at Rapid7, agrees that the problem is both real and current. "On the personal level, the boom in IoT technology that we are now seeing has led to robots in various forms becoming part of our daily life," Heiland says.


The Best Machine Learning Tools? Here Are 7 Everyone Should Look At

Artificial Intelligence is the hottest buzzword in computing and business at the moment, and Machine Learning is its cutting edge. If you're looking to expand your horizons as an IT professional or harness technology to move your business forward, an understanding of how it works will be a huge advantage in the next few years. I've written a basic introduction to the terms AI and ML here, and this article is for those who want to look into the subject a little more deeply. There are already a large number of well-supported frameworks available which allow anyone to jump in at the deep end and, by trial and error, learn how to use machine learning to solve real-world problems. The platforms highlighted below vary in complexity and beginner-friendliness. Some of them are fully fledged "as a service" cloud offerings from big players, while others are extensions of existing toolkits like Spark and Python.
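To give a sense of what these frameworks automate, here is the kind of problem they solve reduced to a toy nearest-neighbour classifier in plain Python. The data points and labels are invented for illustration; real frameworks provide scalable, optimized versions of many such algorithms plus the tooling around them:

```python
import math

# Toy labelled data: (feature vector, label)
training = [
    ((1.0, 1.0), "small"),
    ((1.2, 0.8), "small"),
    ((8.0, 9.0), "large"),
    ((9.5, 8.5), "large"),
]

def classify(point):
    """1-nearest-neighbour: predict the label of the closest training point."""
    _, label = min(
        (math.dist(point, features), label) for features, label in training
    )
    return label

print(classify((1.1, 0.9)))  # falls closest to the "small" cluster
```

Everything hard about production machine learning (feature engineering, model selection, evaluation, serving at scale) sits on top of simple cores like this, which is exactly what the platforms below package up.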


Why YOU Should Lead Digital Transformation

CIOs have always been told to get closer to the business -- but now their very survival may depend upon it. New executive titles such as "Chief Data Officer" are proliferating, and Gartner says there are two different types of CIOs emerging: the "Chief Innovation Officers" who spearhead the technology-led business models of the future -- and the "Chief Infrastructure Officers" who are relegated to looking after the IT plumbing. IDC's research shows that digital business has thus far relied on a culture of experimentation and innovation driven primarily by the business and shadow IT -- and this is set to continue. For example, Gartner says that in 2017 -- for the first time ever -- the average Chief Marketing Officer will spend more on technology than the average CIO. These funds are being used to create "islands of innovation" outside the realm of core IT.



Quote for the day:


"Don’t look for your dreams to become true; look to become true to your dreams." -- Michael Bernard Beckwith