Daily Tech Digest - May 26, 2017

How quantum computing increases cybersecurity risks

We already see rapidly increasing numbers of data breaches as more connected devices expose more attack surfaces. As companies and governments work continually to protect against cybersecurity attacks through advances in technology, the advent of quantum computing could create a free-for-all for cybercriminals. But there is a solution in the form of quantum-safe cryptography. The key will be updating quantum-vulnerable solutions in time, and that means understanding now which systems will be affected by quantum risk and planning a migration to potential quantum-safe security solutions that includes appropriate testing and piloting. The transition can begin with hybrid solutions that allow for agile cryptography implementations designed to augment the classical cryptography we use today.
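The hybrid approach mentioned above can be sketched in a few lines: negotiate two independent secrets (one classical, one from a quantum-safe scheme) and derive the session key from both, so breaking either algorithm alone is not enough. This is only a conceptual sketch; the random byte strings and the simple SHA-256 derivation below are stand-ins, not a production key exchange or KDF.

```python
import hashlib
import secrets

def derive_hybrid_key(classical_secret: bytes, pq_secret: bytes,
                      context: bytes = b"hybrid-demo") -> bytes:
    """Combine two independently negotiated secrets into one session key.

    The session key stays safe as long as EITHER input secret remains
    unbroken, which is the point of a hybrid scheme.
    """
    return hashlib.sha256(context + classical_secret + pq_secret).digest()

# Stand-ins for secrets produced by a classical key exchange (e.g. ECDH)
# and by a post-quantum KEM -- both hypothetical here.
classical = secrets.token_bytes(32)
post_quantum = secrets.token_bytes(32)

session_key = derive_hybrid_key(classical, post_quantum)
print(len(session_key))  # 32-byte session key
```

Migrating later then means swapping out one input to the derivation rather than replacing the whole protocol, which is the "agile cryptography" idea in practice.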


HTML5: Where The Core Web Technology Is Headed

So will there ever be an HTML6? Jaffe suggests that web payments might justify such a whole-number revision, to provide a consistent way of doing payments on the web. “If we were going to linearly call something HTML6, this might be it.” Although buying through the web is not new, the increased dominance of mobile web usage is causing people to abandon shopping carts because of the complexity—and may require a different approach baked into HTML itself. The W3C has a working group to explore this very issue. W3C also is working on Web Components, a framework to identify reusable website components, and Service Workers, to make it easier to run multiple functions inside a browser, featuring offline capabilities. Maybe they’ll justify a name change to HTML6.


The WannaCry scramble

WannaCry could have been much more devastating than it was — and it was very disruptive, affecting hospitals and other health services in disproportionate numbers — if not for a “kill switch” that the malware author included in the code. There are various schools of thought as to why this kill switch existed, but the consensus is that the author wanted a way to stop the malware from propagating. The method was to register an obscure web domain. As long as the domain didn’t resolve to anything, the malware would continue to propagate and infect vulnerable devices. But a security researcher discovered the kill switch and registered the domain, which stopped the malware. In the end, something like 200,000 devices (that we know of) were impacted.
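The kill-switch mechanism described above is simple enough to sketch. In this hypothetical reconstruction, the malware keeps spreading only while a DNS lookup of the kill-switch domain fails; the resolver is injectable here so the behavior can be shown without real lookups, and the domain name is made up:

```python
import socket

def should_propagate(kill_switch_domain: str, resolve=socket.gethostbyname) -> bool:
    """Return True while the kill-switch domain does NOT resolve.

    Registering the domain makes the lookup succeed, which halts
    further spreading -- the behavior the researcher triggered.
    """
    try:
        resolve(kill_switch_domain)
    except OSError:
        return True   # domain unregistered -> keep spreading
    return False      # domain resolves -> kill switch engaged

# Simulated resolvers (no real DNS traffic):
def unregistered(domain):
    raise OSError("NXDOMAIN")

def registered(domain):
    return "203.0.113.7"  # documentation-range address

print(should_propagate("example-killswitch.invalid", unregistered))  # True
print(should_propagate("example-killswitch.invalid", registered))    # False
```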


Ignoring software updates? You’re making one of five basic security mistakes

Forget technology for a second: culture is arguably the biggest issue with security right now, and this has been the case for 20 years. CEOs think they won't be targeted, and citizens think much the same (i.e. it won't happen to me). This complacency is misguided, as everyone is a target and a potential victim. This attitude often results in poor security habits, with individuals and organizations not treating password and Wi-Fi security, for example, as seriously as they should. This is despite the fact that good cybersecurity can be achieved relatively easily, through good password hygiene, regular software updates, anti-virus software, and even password managers, VPNs and secure encrypted messaging apps.


The Business Of Apps, Security, And Consumer Expectations

There is a need for more automation around application security. One approach is embedding security capabilities into the application code itself, referred to as Runtime Application Self-Protection (RASP). While a promising area of security technology, RASP solutions are still emerging: their effectiveness and their impact on application performance are not yet fully understood. The Web Application Firewall (WAF), on the other hand, remains a purpose-built application security tool. The more advanced WAFs leverage automation capabilities to improve security and streamline operations. WAFs are preferable because they offer automated policy generation, a feature that analyzes the protected application, generates granular protection rules and applies security policies.


New cyberattack rule looms over federal contractors

“We are finding that a lot of companies are not aware of this requirement and face losing their government contracts,” said Tamara Wamsley, a strategist with Fastlane. “This issue could impact the success of many local companies, could result in lost jobs. This is a big deal.” “It’s not just for R&D (research and development firms),” Gillen said. “It’s for janitors, it’s for accountants.” “Anyone who has information classified by the government that needs to be protected,” said Shawn Walker, co-founder and vice president of Miamisburg-based Secure Cyber Defense LLC. Today, the rule affects only Department of Defense contractors. But Gillen said it will “almost certainly” expand to cover every federal contractor and subcontractor. The rule is essentially a list of 110 requirements with which contractors must comply.


Secure IoT networks, not the devices

To protect IoT deployments, Cisco recommends that customers isolate the devices on network segments. Traditional segmentation using VLANs can become complicated at IoT-deployment scale, though, Cisco says. Cisco’s TrustSec platform includes network segmentation capabilities. “The logical move is to segment these devices to put them out of attackers’ reach,” Cisco says. “If devices are compromised, organizations can prevent them from being used as pivot points to move through the network, and can activate incident response processes to protect the business.” IoT Threat Defense can detect anomalies in network traffic, block certain traffic and identify infected hosts. Cisco is targeting initial use cases in the medical, power utility and automated manufacturing industries.
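The segmentation idea reduces to a default-deny policy between segments. A toy model (segment names and rules are hypothetical, not Cisco's) shows why a compromised device cannot pivot:

```python
# Segment -> set of segments it may initiate connections to.
# Anything not listed is denied by default.
ALLOWED_FLOWS = {
    "medical-iot": {"medical-servers"},
    "corp-users": {"corp-servers", "internet"},
    "medical-servers": set(),
}

def flow_permitted(src_segment: str, dst_segment: str) -> bool:
    """Deny by default; permit only explicitly allowed segment pairs."""
    return dst_segment in ALLOWED_FLOWS.get(src_segment, set())

# A compromised infusion pump cannot reach corporate servers...
print(flow_permitted("medical-iot", "corp-servers"))     # False
# ...but its legitimate traffic still works:
print(flow_permitted("medical-iot", "medical-servers"))  # True
```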


King Chrome: Microsoft's Browsers Sidelined On Its Own OS

IE retains a sizable share -- Smith called it "a significant presence" -- largely because it's still required in most companies. "There are a lot of [enterprise] applications that only work in IE, because [those apps] use plug-ins," Smith said, ticking off examples like Adobe Flash, Java and Microsoft's own Silverlight. "Anything that requires an ActiveX control needs IE." Many businesses have adopted the two-pronged strategy that Gartner and others began recommending years ago: Keep a "legacy" browser to handle older sites, services and web apps, but offer another for everything else. That approach lets employees access the old, but does not punish them with a creaky, sub-standard browser for general-purpose surfing. Under such a model, Internet Explorer has played, and continues to play, the legacy role.


How to Build a Better IoT Framework

A starting point is to understand that business and IT leaders must work in new, more collaborative ways to identify where value exists. IT must support the endeavor with an agile, flexible IT infrastructure that, among other things, taps clouds, mobility, APIs, artificial intelligence (AI), real-time connectivity and advanced analytics. Accenture's McNeil says that it's important to identify potential use cases before diving into an initiative. These often revolve around financial impact and cost drivers, but they may also touch on business opportunities and remapping processes, workflows and customer interactions to unlock untapped and previously hidden value. New and different thinking is paramount. "Oftentimes, it's really about experimenting with sensors and data inputs to see what makes sense for the business," McNeil explains.


Are Unit Tests Part of Your Team’s Performance Reviews?

Unit testing achieves several important business objectives: it improves quality, makes legacy code testable, keeps developers up to date with current methodologies, and yes, good unit testing even increases developer motivation. Writing good unit tests that won't break on every code change is not difficult and can be achieved by following a few simple practices. A unit test should not depend on environmental settings, on other tests or on the order of execution. Running the same unit test 1,000 times should return the same result. Using global state such as static variables, external data (e.g. the registry or a database) or environment settings may cause "leaks" between tests. The order of the test run should not affect the test result, so make sure to properly initialize and clean up each piece of global state between test runs, or avoid using global state completely.



Quote for the day:


"Success. It's got enemies. You can be successful and have enemies or you can be unsuccessful and have friends." -- Dominic, American Gangster


Daily Tech Digest - May 25, 2017

Split Tunnel SMTP Exploit Bypasses Email Security Gateways

The so-called Split Tunnel SMTP Exploit works against pretty much any email encryption device—virtual, hosted or in-house—that accepts inbound SMTP and there's very little anyone can do to stop it, according to the company. Attackers can use the exploit to inject any payload that supports MIME encoding including ransomware, macro viruses and password protected ZIP files. The exploit, says Vikas Singla, CEO of Securolytics, takes advantage of the fact that an email encryption appliance has a publicly accessible IP address and is able to receive and transfer emails. Such devices are typically deployed beyond the enterprise firewall and are often used in conjunction with an email security gateway. Singla says that during an engagement at a healthcare customer site Securolytics discovered an attacker could completely bypass the email security gateway by connecting directly to the encryption appliance.
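The bypass can be modeled abstractly: mail that follows the MX record passes through the gateway's scanning, while mail pushed straight at the appliance's public IP never does. The sketch below is a toy model of that routing difference, not an actual SMTP client, and all names and rules are hypothetical:

```python
def gateway_scan(message: dict) -> dict:
    """The email security gateway strips known-risky attachments."""
    if message.get("attachment", "").endswith((".exe", ".zip")):
        return {**message, "attachment": "", "quarantined": True}
    return message

def deliver(message: dict, via_gateway: bool) -> dict:
    # Normal flow: the MX record points at the gateway, which scans
    # first. Split-tunnel flow: the attacker connects directly to the
    # encryption appliance's public IP, so the gateway never sees it.
    return gateway_scan(message) if via_gateway else message

payload = {"to": "user@example.com", "attachment": "invoice.exe"}

print(deliver(payload, via_gateway=True)["attachment"])   # stripped: ''
print(deliver(payload, via_gateway=False)["attachment"])  # 'invoice.exe'
```

The fix implied by the article is architectural: the appliance should only accept SMTP connections from the gateway, so the unscanned path does not exist.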


Big Data versus money laundering: Machine learning, applications and regulation in finance

ML is efficient, but opaque: "It works, and it works well, but we do not exactly understand why or how." Although that has been said of deep learning, it applies more broadly to ML as well, and coming from experts in the field it is not something to be dismissed lightly. This may raise some philosophical questions, mostly having to do with the increasing feeling of being sidelined and not being able to keep up with technology, but there are also some very practical implications. As Mathew notes, whatever anti-money-laundering approach is taken, getting results is not enough. It must also comply with a number of guidelines, ensuring for example that there is no discrimination against certain groups of the population. The issue of algorithmic transparency is becoming increasingly understood and widely discussed, and there are many examples in which opaque algorithms embody all sorts of bias.


10 ways to protect your Windows computers against ransomware

With the recent WannaCry ransomware infection affecting users on an international scale, the stakes are extremely high for those who rely on technology to protect their data at all costs. This is especially true of critical systems, such as those that provide life-saving care in hospitals, infrastructure used to manage utilities, and information systems used in government services. ... Consideration must also be given to complying with any regulations that may exist specific to your industry. That said, safeguards are merely that: the risk associated with malware infections is always present, as risk can't be eliminated. But applying multiple security applications as a layered solution provides comprehensive protection on several fronts, minimizing the threat of a potential outbreak in accordance with best practices.


The Importance of Teaching Students About Cyber Security

As for teens and pre-teens, it may be a good idea to show them some real examples of Internet sharing gone wrong. Teens like to share what they are doing, who they are with, what they're wearing and many more aspects of their lives on social media. But they usually don't realize that what they post can be viewed by anyone – their teachers, their principal, their families. Even if their accounts are private, kids talk, and word gets around to the adults. And there are ways for strangers to hack right into their accounts and see those posts they thought were private. Even deleting a post is not as sound as it may seem, because once something goes up on the Internet, there are ways to dig it back up even if it's been deleted. Teens need to know that this can greatly affect their reputation now, and in the long run when it comes to getting into college and applying for jobs.


IT still needs the tried-and-true on-premises data center

As regions such as Africa and Asia focus on new data centers, more established data center regions such as the United States are seeing stagnation in new data center builds and more interest in colocation facilities. Over half of enterprise respondents to IDC's annual survey use colocation services. "At this point, a lot of the nonearly adopters are testing the waters of the colocation market," Quinn said. Among the other half of respondents, there's still a lot of uncertainty around colocation adoption. IT leaders still ask a variety of questions, said Quinn: How do we engage colocation providers? Which workloads should we migrate? And how secure is this going to be? Colocation providers must help customers answer these questions prior to the initial engagement, which can be a bit of a handholding process.


WannaCry Ransomware Cyberattack Raises Legal Issues

The firm can consider what specific steps can be taken to avoid or mitigate potential civil actions, including private rights of action or class actions regarding a cyber incident. Many states allow for a private right of action to be filed in order to recover damages. On cybersecurity matters, there has been substantial activity involving class actions. Engaging experienced counsel early after the cyber incident may help the firm recognize potential litigation, and counsel can recommend steps to anticipate and mitigate costly legal actions. Another important question involves whether and when to contact law enforcement. Federal authorities recommend that law enforcement be contacted when ransomware occurs.[6] The facts of each case must be carefully considered by the firm. Law enforcement will likely want to obtain relevant data about the cyber incident that is properly authenticated under chain of custody protocols.


Here's How Windows 10's Rapid Release Works & Looks

Pilot, Microsoft says, is the state of an upgrade's first four months, when enterprises should install it only in small-sized pilot programs. (Consumers running Windows 10 Home are always fed from the Pilot channel, and so are roped into testing the earliest versions.) After about four months, Microsoft -- in discussion, it claims, with software developers, hardware partners and customers -- declares an upgrade as fit for wider business deployment and thus flags it as Broad. In one example, Microsoft suggested multiple deployment "rings," or groups in an enterprise, with across-the-board upgrades beginning as soon as the Broad release was available. For 1709, that would be about four months after the September 2017 launch, or in January 2018. A second group, recommended Microsoft, would begin the upgrade process two weeks later. Your company might postpone that further or break the business into more than two Broad groups.
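The staggered-ring schedule described above is easy to compute. A small sketch (the Broad date is illustrative, not an official Microsoft schedule):

```python
from datetime import date, timedelta

def ring_start_dates(broad_date: date, rings: int, stagger_weeks: int = 2):
    """First upgrade day for each Broad deployment ring."""
    return [broad_date + timedelta(weeks=stagger_weeks * i) for i in range(rings)]

# 1709 launched September 2017; Broad arrives roughly four months later.
broad = date(2018, 1, 15)  # hypothetical Broad date
for i, start in enumerate(ring_start_dates(broad, rings=3), 1):
    print(f"Ring {i}: {start.isoformat()}")
```

A company could widen the stagger or add rings to slow the rollout, which is the "postpone that further or break the business into more than two Broad groups" option the article mentions.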


Understanding the benefits and threats when building an IoT strategy

Because it encompasses critical infrastructure components, IoT is a potential target for national and industrial espionage, as well as denial-of-service and other types of attacks. Another major area of concern is privacy: the personal information that will potentially reside within networks, Big Data stores and the cloud is also a potential target for cyber attacks. IoT is still a technology in development, and that must be taken into consideration when evaluating its security needs and requirements. Many devices are connected to the Internet and sending data and information to the cloud, and that will definitely increase. With the advent of contextual data sharing and autonomous machine actions, IoT will become the allocation of a virtual presence to a physical object, and these virtual presences will begin to interact and exchange contextual information.


82% of Databases Left Unencrypted in Public Cloud

The problem isn't that cloud providers are failing to secure data centers, but that organizations are failing to secure the applications, content, systems, networks, and users that use the cloud infrastructure. "That is where people are not aware, or not investing the right resources," he continues. Researchers found that, of the 82% of databases left unencrypted in the public cloud, 31% were accepting inbound connection requests from the internet. More than half (51%) of network traffic in the public cloud is still on the default web port (port 80) for receiving unencrypted traffic. Nearly all (93%) public cloud resources have no outbound firewall rule, says Badhwar. "You need to have control at the network, configuration, and user layers so it's hard for someone to get in, and harder for them to take your data out," Badhwar emphasizes.
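An audit like the one behind these numbers boils down to filtering an inventory for risky combinations. A sketch over hypothetical inventory records (the shape of real cloud-provider API output will differ):

```python
# Hypothetical database inventory, e.g. pulled from a provider's API.
resources = [
    {"name": "orders-db",  "encrypted": False, "inbound_internet": True},
    {"name": "users-db",   "encrypted": False, "inbound_internet": False},
    {"name": "billing-db", "encrypted": True,  "inbound_internet": False},
]

def audit(dbs):
    """Flag the riskiest combination: unencrypted AND internet-reachable."""
    unencrypted = [d for d in dbs if not d["encrypted"]]
    exposed = [d["name"] for d in unencrypted if d["inbound_internet"]]
    return len(unencrypted) / len(dbs), exposed

ratio, exposed = audit(resources)
print(f"{ratio:.0%} unencrypted; internet-exposed: {exposed}")
```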


Identity management the new 'perimeter' for hospital cybersecurity

“Ten or 12 years ago, we looked at what it would have taken to buy an identity platform, and it would have taken six or seven different commercial software packages to cobble together a sufficient platform,” Houston said. “Had we done that, we would have replaced all of them by today, either because they no longer would be on the market or because they would be out of date.” Houston added that the most important capabilities of an identity management platform, whether proprietary like UPMC’s or purchased from a vendor, include the ability to understand who your users are and ultimately run analytics on their activities. “We link into our human resources system, our physician credentialing system, we know when people come into our employment, when they change positions, when they leave,” Houston said. “Who they are, where they report to, where they are in the organization, we have a lot of understanding of who these people are.”



Quote for the day:


"Simplicity is a great virtue but it requires hard work to achieve it and education to appreciate it." -- Edsger W. Dijkstra


Daily Tech Digest - May 24, 2017

Identity, authentication and authorisation becoming risk-led

A risk-led approach means organisations can automatically adjust the number of authentication factors required between one and seven, depending on the context and in line with best-practice guidelines. In addition to continuous identity assurance and user authentication, RSA is also addressing organisations’ need to secure legacy applications. “By putting SecurID on a unified platform, businesses can access cloud applications and on-premises applications in the same place with the convenience of a single sign-on, but without using [the dangerous approach of] synchronising identities in the cloud,” said Darisi. “We provide a 360-degree view of an individual’s identity through our portal – of the person’s role, location and devices – but we do not merge the identity stores,” he said.
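Adjusting factor count by risk can be sketched as a mapping from a contextual risk score onto the one-to-seven range described above; the 0-to-1 scoring scale and the linear mapping here are illustrative, not RSA's actual algorithm:

```python
def required_factors(risk_score: float, min_factors: int = 1,
                     max_factors: int = 7) -> int:
    """Map a 0.0-1.0 contextual risk score to 1-7 required auth factors."""
    if not 0.0 <= risk_score <= 1.0:
        raise ValueError("risk score must be between 0 and 1")
    span = max_factors - min_factors
    return min_factors + round(risk_score * span)

# Context raises the score: unknown device, unusual location,
# sensitive application, and so on.
print(required_factors(0.0))  # 1 factor  (known device, usual location)
print(required_factors(0.5))  # 4 factors (some anomalies)
print(required_factors(1.0))  # 7 factors (maximum step-up)
```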


How to fix the broken coding interview

Mimicking the real job of a coder in an interview is enormously difficult, however, particularly when it comes to achieving context. Even in a relatively new codebase it can take months for a programmer to gain sufficient context to be able to contribute at a decent pace. In the past, employers have tried to cater for this in their own interview process by having candidates work with a much smaller sample codebase. But companies like Hired have learned that even a few hundred lines of code across a handful of files can be overwhelming for a one-hour session. As a result, it is necessary to keep the tasks quite simple, meaning good candidates don’t get the chance to properly shine. One way to overcome this is to take that same concept of a smaller codebase but send it to candidates ahead of time.


New Tool Tallies Your Big Data Debt

"As the big data era emerged, new technologies were created to support modern data structures and to deliver to the always-connected user," he added. "These newer systems have an important role in building modern applications; however, they produce data that is fundamentally incompatible with existing analytical infrastructures, including data warehouses, ETL, BI and data science systems like R and Python. As a result, many organizations are collecting significant data debt." Dremio's new Big Data Debt Calculator is intended to help organizations get their arms around this unplanned debt. Dremio says it gives recommendations for minimizing debt, strategies for paying it down, and ways of ensuring it remains within acceptable bounds.


How banks can learn the lessons of BlackBerry

With regulation forcing banks to open up their technology to FinTech firms, startups and other financial service providers, consumers are becoming less reliant on providers of traditional financial services. The idea that one institution will manage all financial services for one customer will no longer be the most viable or the expected model. Open banking creates a new relationship between banks and customers. It requires banks to adopt a customer centric and data centric view on how they do business. In this emerging model, data and services become valuable commodities. To extract value, banks must have the appropriate core-technology and infrastructure in place. It requires a good API governance structure that must include: standards, management policies, data access and statistics, and development processes.


Multi-cloud is a messy reality, but there's hope

While each of the clouds is a solid choice for machine learning, for example, Google generally gets the nod as the frontrunner. Many enterprises will turn to Google for machine learning, AWS for Lambda, Microsoft Azure to modernize their legacy applications, and so on. Such cloud differentiation makes the likelihood of multi-cloud management ever harder. As cloud luminary Bernard Golden told me, "While it appears attractive to use a management tool that encapsulates the individual cloud providers and provides a single management framework, since it promises to reduce costs by amortizing training and employee costs across a greater breadth of applications, in practice it typically means using a lowest-common denominator application management approach, which often forfeits use of functionality that resides within a provider's IaaS/PaaS offerings."


In Search of an Rx for Enterprise Security Fatigue

The security fatigue phenomenon affects consumers and enterprises alike. According to the National Institute of Standards and Technology (NIST), security fatigue is also causing consumers to make poor security decisions, such as reusing the same password across all online accounts. But what enterprises can glean from this report is NIST's suggestions to combat security fatigue, including limiting the number of security decisions that users need to make; making it simple for users to choose the right security action; and designing for consistent decision making whenever possible. But up until a few years ago, many enterprise networks in Fortune 500 companies didn't have the ability to identify a compromised network or subnet in a timely manner. Now, the sheer amount of security measures used to detect a network compromise can create this fatigue. Without knowing what to pay attention to, identifying an inside threat is like trying to find a needle in a haystack.


Why Ansible Has Become The DevOps Darling For Software Automation

One reason it has gained momentum since being acquired by Red Hat may have been the acquisition itself, according to Paul Delory, a research director at Gartner. "We definitely have seen a bump in interest since the acquisition by Red Hat, because it now has more credibility in the enterprise," he says. Part of the reason for this is that there was a perception in the software development and devops community that Ansible's support offering was not as good as that of Puppet or Chef. But under Red Hat's ownership this support gap has been closed, he says. "Support is important to enterprises, and the quality of support available is of critical importance," he adds. But there's more to Ansible's popularity than the availability of decent support options, vital though those options are to enterprise customers.


The Rise Of Toxic Data

Data growth raced ahead while information security fell behind, and the collateral damage is making headlines. Data breaches like those that happened to Sony, Mossack Fonseca, the U.S. Office of Personnel Management (OPM) and the Democratic National Committee (DNC) are practically daily occurrences. Instead of increasing revenues or furthering goals, stolen files and emails disrupt and subvert plans. If an organization stores valuable data (and most store more than they realize), someone will try to steal it. Your next breach may be perpetrated by someone who has never heard of you; ransomware, a form of file extortion, is now a $1 billion business. Worldwide cybersecurity spending is expected to exceed $1 trillion over the next five years, according to Cybersecurity Ventures. So why do organizations still have so many breaches?


Google raises heat on Microsoft with new Chrome bundle for enterprises

"Every couple of years, Google makes noise about Chrome in the enterprise," said Gary Schare, president of Browsium, a maker of browser management tools. Schare was formerly the head of project management for Microsoft's Internet Explorer (IE). "This looks like Google is trying to make Chrome a better citizen in the enterprise." Schare applauded the group policy templates, noting that because Microsoft's own browsers, IE as well as Windows 10's Edge, have traditionally been the best equipped for enterprise management, any help from Google on Chrome would be welcome. The LSB add-on has long been available from the Chrome Web Store, Google's authorized mart for browser extensions. Once configured by company IT administrators, LSB will automatically open IE11 when links clicked within Chrome lead to websites, web services or web apps requiring, for example, an ActiveX control or Java, neither of which Google's browser supports.


With Billions Spent on Cybersecurity, Why Are Problems Getting Worse?

Despite what people generally think, there are surprisingly few regulations that force companies to take reasonable steps to protect their data. Even in areas such as healthcare, regulations like the Health Insurance Portability and Accountability Act lack clarity and are insufficiently enforced. In 2016, more than 27.3 million patient records were breached, but despite this, the Office for Civil Rights (the healthcare security and privacy regulator) settled alleged HIPAA violations with only 12 healthcare organizations. Outside of areas like healthcare, finance, government, etc., most federal security enforcement has defaulted to the Federal Trade Commission, which uses an arcane statute of the Federal Trade Commission Act that prohibits unfair or deceptive practices in the marketplace. This means that only the most egregious violations are penalized, leaving implementation of effective cybersecurity to the discretion of most business leaders.



Quote for the day:


“The electric light bulb did not come from the continuous improvement of candles.” -- Oren Harari


Daily Tech Digest - May 22, 2017

Don't worry CIOs: You still control tech spending

"The growing tech-savviness of business leaders and the wider availability of cloud solutions does mean that business leaders are playing a bigger role in the front end of this process," Bartels says. But the persistence of licensed software, the growing adoption of cloud as a replacement for licensed software, and challenges of implementing and optimizing solutions mean that CIOs and tech management teams still play a dominant role in overall tech purchases. Moreover, CIOs are procuring software-as-a-service (SaaS) solutions — the top shadow IT target for business leaders — more than ever as it allows them to meet business requirements. “That becomes a mechanism to manage demand,” Bartels says.  CompTIA just announced similar findings from a survey of 675 U.S. businesses in its report, “Considering the New IT Buyer."


Digital India must protect itself with well-crafted cloud security strategy

As the government's cloud strategy takes shape, this is the time to move over to security in the cloud paradigm. It will require a radical reassessment and revamping of existing security provisions, because a move to the cloud changes the technology landscape quite drastically. The new paradigm must incorporate the spirit of the legacy security provisions, but requires much more sophistication to secure a hybrid cloud setup. Furthermore, it is critical that this broad-gauged security policy be executed uniformly across all components of the hybrid cloud. This is where cloud security gateways and security brokers come into action. Rather than leave it to individual systems in the cloud to take care of their own security (as is the case in the legacy setup), a cloud security broker can monitor and defend the entire cloud and all the systems within. Security across all the different systems is addressed holistically, regardless of where the systems may reside.


Digital transformation: MIT's Westerman shares new lessons

Successful digital transformation is like a caterpillar turning into a butterfly. It’s still the same organism, but it now has superpowers. Unfortunately, when it comes to digital transformation, many senior execs aren’t thinking about butterflies. They’re just thinking about fast caterpillars. And it’s hard to keep up with your competitors if you’re crawling ahead while they can fly. ... It would be nice to think that companies and governments would be responsible for finding new roles for the people who are displaced by technology. But more realistically, people need to be responsible for their own professional development. Stay abreast of what is happening with industries and technologies. If you’re in a job that will die soon, do what you can to shift to another. Hopefully, our society will find ways to help people who lose in the race against machines.


HP'S Spectre X2 May Be The Surface Pro Killer We've Been Waiting For

Going outside the box, HP has seriously revamped the Spectre x2's screen, now a 12.3-inch diagonal panel with 3000x2000 resolution. Compared to the first-gen Spectre x2's 16:9, 1920x1080 panel, this new model has the same aspect ratio as Microsoft's Surface Pro 4. The newer screen is IPS and brighter, too, with a maximum of 450 nits, slightly better than the Surface Pro 4 panel's spec. HP has also cloned the Surface Pro 4's pen tech. The original Spectre x2 used a Wacom-based pen, while the 2nd gen replaces Wacom with N-trig, which Microsoft uses for its own Surface Pen. Microsoft even bought N-trig. The N-trig pen's main appeal to PC makers is its ability to use the capacitive touch layer to sense the pen. Wacom-based devices require a separate digitizer, which adds thickness to the screen.


How A Common Language For Cyber Threats Boosts Security

Orchestration and automation may be the most significant advantages governments obtain when they adopt standard threat information formats. It’s no secret there is a cybersecurity talent shortage. To manage a growing volume of increasingly sophisticated threats, it is critical to have infrastructure and security tools that enable quick, automated and synchronized responses without human intervention. The goal of OpenC2 and other groups’ work is to expand the development of orchestration software and standardized command and control languages. Central to the OpenC2 movement’s platform is the idea that standardizing language between machines enables rapid response to shared threat intelligence. As the OpenC2 forum states, “Future defenses will require the sharing of indicators, the coordination of responses between domains, synchronization of cyber defense mechanisms and automated actions at machine speed against current and pending attacks.”
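A standardized command language means any conforming defense tool can act on the same machine-readable message. The snippet below shows an OpenC2-style command serialized as JSON; the field names loosely follow the draft language specification and should be treated as illustrative rather than schema-exact:

```python
import json

# An illustrative OpenC2-style command: one machine tells another to
# block traffic from a suspect network, at machine speed, with no
# human in the loop.
command = {
    "action": "deny",
    "target": {"ipv4_net": "198.51.100.0/24"},
    "args": {"response_requested": "ack"},
}

wire_format = json.dumps(command)
print(wire_format)

# Any conforming device can parse and act on the same message:
parsed = json.loads(wire_format)
print(parsed["action"], parsed["target"]["ipv4_net"])
```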


How the CISO moved from the basement to the boardroom

If the CISO is overwhelmed with projects, it can be helpful to determine which departments you are serving, who the stakeholders are, and what is critical to them, Hayslip said. That will help you create a narrower list of issues to tackle. It's often wise to start with cyber hygiene, he added: If you have basic security policies and patch management, antivirus, and firewalls in place, updated, and managed, it builds a strong foundation for your organization's cyber health. CISOs also have an opportunity to redefine their role as a business strategist during the digital transformation, Pollard said. To prove their value, they should spend time mapping the firm's technology touchpoints, foster security champions across the company, and get involved with customer-facing activities like product design and development, he added.


How Cybersecurity Benefits from Hackers

To put it simply, many of those who identify themselves as “hackers” are very talented programmers. The creators of some of the most well-known software are self-proclaimed hackers. Among these hackers-turned-programmers are Mark Zuckerberg, founder of Facebook; Linus Torvalds, the creator of Linux; and Tim Berners-Lee, one of the driving forces behind the creation of the World Wide Web. Often, these programmers will seek a solution that doesn’t involve working with one of the entrenched proprietary software companies. Instead, they will create open-source projects, where the source code is made publicly available. The programming community, including many who identify as hackers, works together to produce software solutions that are available to everyone. Even people who never use open-source software benefit from these projects, as the public community will often create new innovations that the proprietary companies either use for inspiration or simply copy outright.


Google, A.I. and the rise of the super-sensor

Here's a simplified version of how such a sensor might work in a warehouse setting. You plug in one or a few super-sensors. Then somebody uses a forklift. The resulting vibration, sound, heat and movement detected by the super-sensor generate patterns of data that are fed into the system. You can identify this as "forklift in operation." (Further tweaking might determine not only when a forklift is in use, but where it is, how fast it's moving, how much weight it's carrying and other data.) You can then program next-level applications that turn on a warning light when the forklifts are moving, calculate wear-and-tear on forklift equipment or detect unauthorized operation of forklifts. The output from these "synthetic sensors" can be used by developers to create any kind of application necessary, and applied to semantic systems for monitoring just about anything.
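The pattern-matching step described above can be caricatured with a tiny nearest-centroid classifier. Everything here is invented for illustration (the feature values, the activity labels, the averaging window); a real synthetic-sensor system would learn from far richer signal data, but the shape of the idea is the same:

```python
import math

# Toy training data: feature vectors of (vibration, sound_db, heat_c, motion)
# averaged over a time window. All values and labels are invented.
TRAINING = {
    "forklift in operation": [(0.8, 85.0, 31.0, 0.9), (0.7, 80.0, 30.0, 0.8)],
    "idle floor":            [(0.1, 40.0, 22.0, 0.0), (0.0, 35.0, 21.0, 0.1)],
}

def centroid(vectors):
    """Mean feature vector for one activity label."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

CENTROIDS = {label: centroid(vs) for label, vs in TRAINING.items()}

def classify(reading):
    """Label a new sensor reading by its nearest centroid (Euclidean distance)."""
    return min(CENTROIDS, key=lambda label: math.dist(reading, CENTROIDS[label]))

print(classify((0.75, 82.0, 30.5, 0.85)))  # -> forklift in operation
print(classify((0.05, 38.0, 21.5, 0.0)))   # -> idle floor
```

The "further tweaking" the article mentions would amount to richer features and labels (location, speed, load) rather than a different loop.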


Lessons From Women IT Leaders On How To Transcend The Middle Order

“Women must make themselves visible for the right skills and projects, and to the right people, if they want to advance into senior leadership,” says Ghosh. Women, especially in enterprise IT, says Ghosh, must be vocal about their goals and ambitions, and this is one way they can deal with implicit bias. “The other thing is that even organizations must acknowledge the bias,” says Dar. “Consciously or unconsciously, we are discriminated against and when these things come to light they have to be dealt with the same importance that any other labour matter or such would be dealt with. Just pushing it under the rug does not work anymore.” Her company, Godfrey Phillips, runs awareness sessions for employees, “and that is something all companies should run on a timely basis,” she says. At the organizational level, Vijay says, it is very important to introduce programs aimed at increasing the representation of women in IT leadership positions.


Companies Ramp Up Recruiting Veterans As Cybersecurity Urgency Grows

Like most new hires, veterans in the private sector must navigate a culture that’s vastly different from military life. When Navy veteran Dana Hawkins took his first private sector job as a contractor, “just getting used to the lack of process” at some smaller companies compared to the stringent cybersecurity processes of the military was a big challenge. Hawkins is now director of security services at Proficio. Other veterans find it challenging to work with a virtual team after years of direct contact with leaders. “It takes a while for our veterans to get used to it,” Stoner says. To smooth the transition, PwC assigns veteran mentors to help new hires assimilate into the firm. Stoner, an Army veteran and reservist himself, finds that military “athletes” – those veterans with leadership, self-discipline and a goal-oriented approach – make the best transition to private sector cybersecurity careers, and there’s plenty of room for more.



Quote for the day:


"Careers, like rockets, don't always take off on schedule. The key is to keep working the engines." -- Gary Sinise


Daily Tech Digest - May 21, 2017

Using ‘Faked’ Data is Key to Allaying Big Data Privacy Concerns

The MIT researchers, led by Kalyan Veeramachaneni, proposed a concept they call the Synthetic Data Vault (SDV). This describes a machine learning system that creates artificial data from an original data set. The goal is to be able to use the data to test algorithms and analytical models without any association to the organisation involved. He succinctly states, “In a way, we are using machine learning to enable machine learning.” The SDV achieves this using a machine learning algorithm called “recursive conditional parameter aggregation,” which exploits the hierarchical organisation of the data and captures the correlations between multiple fields to produce a multivariate model of the data. The system learns the model and subsequently produces an entire database of synthetic data. To test the SDV, the researchers generated synthetic data for five different public datasets.
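The core idea, fit a statistical model of the real data and then sample fresh rows from the model rather than releasing the originals, can be sketched in a few lines. To be clear, this is not the SDV's recursive conditional parameter aggregation: it models each column independently with a Gaussian, while the real algorithm also captures cross-column correlations and table hierarchy. The dataset below is invented:

```python
import random
import statistics

# A tiny "original" dataset: one numeric column per field (values invented).
original = {
    "age":    [34, 45, 29, 52, 41, 38],
    "income": [42000, 58000, 39000, 61000, 52000, 47000],
}

def fit(table):
    """Learn a per-column Gaussian model (mean, stdev) of the real data."""
    return {col: (statistics.mean(vals), statistics.stdev(vals))
            for col, vals in table.items()}

def sample(model, n, rng):
    """Draw n synthetic rows from the learned model; no real row is copied."""
    return [{col: rng.gauss(mu, sigma) for col, (mu, sigma) in model.items()}
            for _ in range(n)]

rng = random.Random(0)          # seeded for repeatability
synthetic = sample(fit(original), 3, rng)
for row in synthetic:
    print({k: round(v, 1) for k, v in row.items()})
```

The synthetic rows preserve the broad statistical shape of the source columns, which is what lets analysts test algorithms against them without touching the underlying records.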


AI Platforms: How to Make the Smart Choice

Companies will in all likelihood build an infrastructure that uses more than one platform. The company-specific business technology platform can easily become a platform of technology platforms. If these platforms, and the applications on top of them, do not play nicely together, the company will have a hard time providing customers with positive engagements that leave a good experience. Instead, there will be inconsistent and duplicated data and broken processes, hence frustrated employees and customers; in summary, poor experiences for everybody involved. And not concentrating on good experiences leaves companies with missed opportunities, as studies like this one show. ... If there already are platform-based business applications available, then it is a good option to look at what the underlying AI platform of the vendor is offering, always keeping the answer to the question “What experiences do you want to deliver?” in mind.


How to Handle the Data Deluge and Drive Business Success

The challenge for organizations is pulling all the critical data together into a coherent view of the business that allows them to confidently act on well-founded plans. Increasingly, CFOs are being tasked with not only understanding and communicating financial results but also with helping the organization understand the operational drivers behind them, requiring a more detailed analysis of business KPIs, many of which are non-financial. In fact, a recent (Adaptive Insights CFO Indicator Q3 2016) report found that 76 percent of CFOs are tracking non-financial KPIs, which involves greater collaboration across the organization and data integration to create a holistic view of the business. While it is positive that so many CFOs are taking this step to become a more strategic advisor, this influx of new data can pose a problem when it comes to consolidation.


Harnessing the Secret Structure of Innovation

Reassuringly, many innovators already have the tools to do so: Companies routinely re-engineer competitors’ products, analyze the patent landscape, and conduct interviews with technology experts to guide their operational decisions. We believe innovators can and should also use these same tools and information to guide their strategy by methodically measuring the evolution of product complexity in their space. This requires developing a taxonomy of components by sampling competitors’ products and dissecting not only physical components but also intangible ones like process innovations or business model choices. While we are not aware of any company that is yet explicitly doing this, we do see that many startups implicitly follow this logic by shifting from an impatient minimum viable product (MVP) logic to more patient innovation centered on more complex designs once cash flow and funding have been secured and the space begins to mature.


Data protection is not just an IT issue

Independent Cybersecurity Expert, Dr Jessica Barker said: “With so many data breaches hitting the headlines, there can be a sense of defeatism among some organisations. Breaches are seen as inevitable so some organisations question the value of spending on security when it won’t make them 100% secure. However, this research has found that investing in security helps protect the organisation when even the worst happens, as companies with a strong security posture experience much quicker stock price recovery than those with a poor security posture following a data breach.” “In this past year alone we’ve seen high-profile data breaches, such as Yahoo and TalkTalk, experience the significant consequences that a breach can have on shareholder value and brand reputation,” said Bill Mann, senior vice president of products and chief product officer, Centrify.


A More Practical Approach to Encrypting Data in Motion

RSA and Elliptic Curve Cryptography are two classes of algorithms that achieve these characteristics. The keys associated with these algorithms can be serialized and encoded in special files called X.509 certificates. Certificates contain a ton of other information like names, network addresses, and dates. When people refer to certificates, they usually are referring to X.509 certificates that contain public keys. Data (like certificates) can be digitally signed. Digital signatures are encrypted hashes. A good hash function is one that’s hard to reverse and not prone to collisions. If you apply a good hash function to a message, the result of the function does not reveal anything about the original message. Further, a good hash function is unlikely to produce the same resultant value (also called a collision) for any two different messages.
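The hash properties described above can be demonstrated with Python's standard-library hashlib. Signing itself requires a cryptography library and a key pair, so this sketch covers only the hashing half:

```python
import hashlib

def digest(message: bytes) -> str:
    """SHA-256: a fixed-length fingerprint that reveals nothing about the input."""
    return hashlib.sha256(message).hexdigest()

a = digest(b"transfer $100 to Alice")
b = digest(b"transfer $101 to Alice")   # one character changed

print(a)
print(b)
assert a != b                  # distinct messages -> distinct digests
assert len(a) == len(b) == 64  # output length is fixed regardless of input size
```

In a full signature scheme, the sender would then encrypt this digest with a private key, and the receiver would recompute the hash and compare it against the value recovered using the public key from the sender's X.509 certificate.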


Graph databases and RDF: It's a family affair

Should we just dismiss RDF as impractical and RDF stores as inferior software delivered by academics and move on? Maybe not so fast. There are RDF vendors who are entirely professional about what they do, and RDF does have certain things to offer that are not there in other graph data models. We had a discussion with Vassil Momtchev, GraphDB product owner at Ontotext, about the benefits and use cases of RDF. Ontotext's legacy is in text mining algorithms, however these days it's mostly known for GraphDB, its RDF Graph database engine. "Our text mining is backed by a complex semantic network to represent background and extracted knowledge. Back in 2006, we found that none of the existing RDF databases were able to match our requirements for a highly scalable database. This was how GraphDB started," says Momtchev.


Digital Currencies Like Bitcoin and Ethereum are Booming

The ethereum dream. Just by some code that no one can stop, we’ll bring back money to the people. Not because we dislike banks or governments, but because they too will see it is better for the regulator of the free market, money, itself to be regulated by the free market. That statement isn’t a hypothesis, but the insight of Hayek, a Nobel prize winner who spent his life studying money. So the foundations of this space are built on strong and mainstream grounds, not fringe thoughts as many were led to believe in the previous years. Its walls are futuristic in architecture. We are digitizing money, making it dynamic, turning it into code, while giving it a very primitive level of intelligence in that we can tell it to do things and it does do so. Only imagination can constrain the applications, and the benefits, for poor or rich, banks and governments or the people, will probably be very considerable. To the point where we might actually get those flying cars in our own lifetime.


IoT and Blockchain Technology Collide in the Payments Industry

The collision between the IoT and blockchain worlds portends some important payments industry developments around the efficient tracking of device payment history, all supported by a ledger of secure data exchanges among devices, web systems and users. Further, this technological convergence also shows promise in terms of the use of smart devices that are programmed to conduct a variety of transactions such as the automatic issuance of invoices and payments.  Dan Loomis, vice president and director of mobile product management at the business and financial software firm Intuit, is firmly entrenched in this evolving IoT/blockchain conversation through his work in creating payment experiences for businesses that operate on a global scale, and brought this expertise to the TRANSACT panel discussion.
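The "ledger of secure data exchanges among devices" can be illustrated with a minimal, single-node sketch. This is not a real blockchain (no consensus, no distribution, no signatures); it only shows how hash-linking makes a device's payment history tamper-evident. The device names and amounts are invented:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block's contents (sorted keys for stability)."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

ledger = []

def append_event(device_id, amount):
    """Append a payment event, linking it to the previous block's hash."""
    prev = block_hash(ledger[-1]) if ledger else "0" * 64
    ledger.append({"device": device_id, "amount": amount, "prev": prev})

def verify(chain):
    """Recompute the chain; any edited block breaks every later link."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

append_event("meter-7", 12.50)   # hypothetical smart-meter invoice
append_event("meter-7", 13.10)
assert verify(ledger)

ledger[0]["amount"] = 999.99     # tamper with history...
assert not verify(ledger)        # ...and verification fails
```

That tamper-evidence is what underpins the "efficient tracking of device payment history" the panel describes: a device, a web system, or an auditor can cheaply check that no past invoice has been rewritten.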


Under The Hood With the JVM's Automatic Resource Management

The finalize() mechanism is an attempt to provide automatic resource management, in a similar way to the RAII (Resource Acquisition Is Initialisation) pattern from C++ and similar languages. In that pattern, a destructor method (known as finalize() in Java) is provided, to enable automatic cleanup, and release of resources when the object is destroyed. The basic use case for this is fairly simple - when an object is created, it takes ownership of some resource, and the object’s ownership of that resource persists for the lifetime of the object. Then, when the object dies, the ownership of the resource is automatically relinquished. Let’s look at a quick simple C++ example that shows how to put an RAII wrapper around C-style file I/O. The core of this technique is that the object destructor method (denoted with a ~ at the start of a method named the same as the class) is used for cleanup:


10 Free or Low-Cost Security Tools

While free tools sound great, their usefulness varies from business to business. For some organizations, they are helpful means of solving small problems. For others, they are too "siloed" to be effective. "It depends on the environment," says Travis Farral, director of security strategy at Anomali, which is behind the Staxx free threat intelligence tool. "Some are against major deployment of anything open-source that doesn’t have a company behind it, for support or liability issues." Because many free and low-cost tools are designed for specific purposes, they often require advanced technical expertise. Several major businesses use a combination of major enterprise tools and FOSS utilities because they have the staff to support them. For organizations with less staff, siloed tools require security practitioners to become systems integrators because they need to have the solutions work together, says Lee Weiner.



Quote for the day:


"Power concedes nothing without a demand. It never did and it never will." -- Frederick Douglass


Daily Tech Digest - May 20, 2017

Ransomware Rocks Endpoint Security Concerns

"The larger companies are strapped with so many layers of protection that it will take more time to figure out where AI fits in the stack, but mid-market to smaller enterprises with 150,000 to 200,000 nodes and below can adopt cutting-edge technology more quickly," McClure says. IDC's Westervelt notes that although many endpoint startups have launched out of the gate with signature-free detection technology that was based more on users' behavior, many are now adding signature-based detection engines to their products. "The vast majority of threats are known threats, so why put extra pressure on your sandbox" to test potentially malicious software based on its behavior, Westervelt says. And in the meantime, traditional signature-based anti-virus vendors have added signature-free security software to their offerings. As a result, Westervelt says, there is less and less differentiation between new and shiny startups and the old guard.


Google Researchers Are Teaching Their AI to Build Its Own, More Powerful AI

"The way it works is we take a set of candidate neural nets, think of these as little baby neural nets, and we actually use a neural net to iterate through them until we arrive at the best neural net," explains Pichai. That process is called reinforcement learning, where computers can link trial and error with some kind of reward, just like teaching a dog new tricks. It takes a massive amount of computational power to do, but Google's hardware is now getting to the stage where one neural net can analyse another. Neural nets usually take an expert team of scientists and engineers a significant amount of time to put together, but thanks to AutoML, almost anyone will be able to build AI systems to tackle whatever tasks they like. "We hope AutoML will take an ability that a few PhDs have today and will make it possible in three to five years for hundreds of thousands of developers to design new neural nets for their particular needs," Pichai writes in a blog post.
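The generate-evaluate-keep-the-best loop Pichai describes can be caricatured in a few lines. This toy uses plain random search with a made-up scoring function, whereas AutoML uses a reinforcement-learning controller that actually trains each candidate network; every field name and score below is invented for illustration:

```python
import random

rng = random.Random(42)

def random_architecture(rng):
    """Sample one candidate 'baby' network description (purely illustrative)."""
    return {
        "layers": rng.randint(1, 8),
        "width": rng.choice([32, 64, 128, 256]),
        "activation": rng.choice(["relu", "tanh"]),
    }

def evaluate(arch):
    """Stand-in for training plus validation accuracy; a real controller
    would train the candidate network and measure its performance."""
    score = 0.5 + 0.04 * arch["layers"] + 0.0005 * arch["width"]
    return min(score, 0.99)

best, best_score = None, -1.0
for _ in range(20):                      # the controller's search loop
    candidate = random_architecture(rng)
    score = evaluate(candidate)
    if score > best_score:               # keep the best candidate so far
        best, best_score = candidate, score

print(best, round(best_score, 3))
```

Reinforcement learning replaces the blind `random_architecture` sampler with a model that learns, from the scores it observes, to propose better candidates over time, which is exactly the "neural net iterating through baby neural nets" in the quote.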


Indian IT's planned layoffs give a glimpse of the bloodbath ahead

It may not be the most original declaration but sometimes detective work of this kind does have its advantages in exposing the rot that actually lurks beneath the sheets. Lakshmikanth's reading of the McKinsey report mirrors the alarms many in the industry have been raising this year. The India CEO of French IT services firm Capgemini said earlier this year, "I am not very pessimistic, but it is a challenging task and I tend to believe that 60-65 percent [of Indian IT workers] are just not trainable," he said. "Probably, India will witness the largest unemployment in the middle level to senior level," he added. A more fundamental problem is the lack of real skills amongst most of India's engineering graduates. According to employment solutions company Aspiring Minds, a well-known institution that regularly tracks the worth of college graduates, a staggering 80 percent of engineers in India don't possess skills that can make them employable.


Currency is Under Attack. Diversifying Your Funds is Smarter Than Ever

You just never know what is going to happen next, so diversifying your currency can help manage your risk and increase your chances of survival should something go wrong. Plastic can be very secure for the consumer since there are protections against fraud, but it is costly for merchants in transaction fees and fraudulent charges. The downside is that all your money is in a bank where you can't get to it in a crisis and if the FDIC collapses you may have no recourse to get that money back. Cryptocurrency, like Bitcoin and Ethereum, doesn't rely on central banking, so there's a lot less chance they will open you up to identity theft. The problem is that if it is stolen, you can never get it back. Also because of its untraceable nature, it is often used in crimes.


Uber Freight Is the First Step to Automating Away Truckers

Much like the taxi service before it, Uber’s promise here is to remove friction from the current system. In a blog post announcing the new service, the firm bleats about how drivers will be able to pick up jobs with a simple search and some button presses, rather than spending “several hours and multiple phone calls” trying to achieve the same end in the past. ... But there is a larger narrative at play here. Uber’s move into shipping came after it acquired the autonomous truck company Otto last summer. And that sector is maturing quickly: while the trucks make use of similar technology to that being used by the autonomous cars being developed by Uber and Waymo for robotic taxi fleets, they also only have to contend with highways. That's far easier than inner-city driving.


Global banking technology overview

If banks don’t rise to the challenge, there are many FinTechs waiting to nibble away at their business. For instance, Caxton is a company that uses hybrid blockchain and core banking-type technology for foreign exchange (FX) end uses. Its CTO, Russell Stather, is convinced that smart contracts and DLT will disrupt the commercial banking environment serving corporate treasurers. “Smart contracts allow you to move an asset with multiparty involvement in a single transaction, which makes it cheaper, quicker and traceable end-to-end,” he says. The key benefit for a payment or trade finance end use is that not everyone in the chain is taking a transaction fee anymore, as blockchains – unlike correspondent banking ones – are fast, irrefutable and traceable. Settlement mechanisms and structures for investment banks could also easily be disrupted – or improved – by DLT.


Protect your enterprise data using Windows Information Protection (WIP)

Unfortunately, data loss prevention systems have their own problems. For example, the more detailed the rule set, the more false positives are created, leading employees to believe that the rules slow down their work and need to be bypassed in order to remain productive, potentially leading to data being incorrectly blocked or improperly released. Another major problem is that data loss prevention systems must be widely implemented to be effective. For example, if your company uses a data loss prevention system for email, but not for file shares or document storage, you might find that your data leaks through the unprotected channels. But perhaps the biggest problem with data loss prevention systems is that they provide a jarring experience that interrupts the employees’ natural workflow by stopping some operations while allowing others.


What will the ‘mega security breach’ of the future look like?

The mega breach of the future could take on a variety of different shapes and guises. Douglas Crawford, Cyber Security Expert at BestVPN.com, imagines what it might be like if a criminal got hold of a lot of banking passwords. “The economic chaos caused could, in addition to bankrupting potentially millions of individuals, destroy banks and banking systems, create global economic depression, and even bring down governments,” he says. The interesting thing about this situation is it could arise from a variety of different motivations. Yet Chad Schamberger, Director of Engineering at VirtualArmour, believes that the mega breach of the future “will be driven to affect a decision, a political election, a financial outcome, or the intent to cause mass chaos across a population. Not necessarily to gather sellable assets but intended to expose the attack surface that has developed by introducing more and more poorly developed connected devices (IoT).”


Supercomputing as a Service comes to the cloud with new Cray partnership

The first solution to come from the partnership will focus on the life sciences industry, and will feature the Cray Urika-GX system, a complete, pre-integrated hardware-software solution. It also includes the Cray Graph Engine, which includes pattern-matching that takes advantage of the scalable parallelization and performance of the platform, according to the release. "The Cray Urika-GX system is the first agile analytics platform that fuses supercomputing abilities with open enterprise standards to provide an unprecedented combination of versatility and speed for high-frequency insights, tailor-made for life sciences research and discovery," the press release stated. Cray and Markley plan to quickly expand their offerings to include Cray's full line of infrastructure solutions.


The Enterprise Architecture Problem

In IT (Information Technology) most of it is commodity. Buying these commodities may, at most, get you what everyone else already has. Forget analyzing performance improvement here; there is little of it. Instead, emphasize cost reduction, enterprise-wide purchase agreements, and minimizing security vulnerability or TCO by restricting brands to those tested and approved. The real key lies in producing some enterprise IT that produces marked improvement in your business processes. If those are the same business processes everyone else uses, benchmarked against best practices elsewhere, you can buy it as a COTS (commercial off-the-shelf) product and tweak it a bit. However, the best of all is the automation of processes that only you have, completely custom. You will have these special processes either because you do business differently (differentiation) or because only you are in this business.



Quote for the day:


"Mistakes should be examined, learned from, and discarded; not dwelled upon and stored." -- Tim Fargo


Daily Tech Digest - May 19, 2017

3 critical steps to recover from a ransomware attack

One thing that IT security experts agree on this week is that the WannaCry attacks have raised awareness around data security and systems vulnerabilities. “There have been more than 4,000 daily ransomware attacks since early 2016 – a 300 percent increase over 2015,” according to Scott Kinka, chief technology officer at Evolve IP. “Victims paid a total of more than $24 million to regain access to their data in 2015 alone.” But Kinka believes the WannaCry epidemic raises the stakes, and organizations that haven’t placed a top priority on data security in the past need to now. The WannaCry attack “represents a massive ransomware explosion, even by these standards,” Kinka says. “Truthfully, it is impossible to stop the ransomware epidemic. However, taking the right proactive and reactive measures can help mitigate the damage.”


Red Hat's Cormier dishes on OpenShift.io and Container Health Index

In the past, people were playing with containers but not really betting their business on it. Now that they're starting to deploy them into production, security, manageability [and] lifecycle are more applicable now and you want a commercial-grade system to do commercial-grade containers in Linux. What we've done is containerize all of our products into a [Red Hat Enterprise Linux] container. With the Container Health Index, we scan the pieces of the OS that they've included … and tell them what shape it's in, if there's any known security vulnerabilities or any bugs and offer a newer version if available. We've done that for our own products and we're now exposing those tools to our partners so they can run containers they've built with our container kits. We're going to publish Container Health Index results on our portal.


Google weaves AI and machine learning into core products

One of the big announcements at I/O was Google Lens, a set of vision-based computing capabilities that seeks to understand what a user is looking at with their smartphone's camera, and help them take action based on that information. For example, a user can take a picture of a flower, and Lens will tell the user the kind of flower it is, Pichai said. Users will also be able to point their phone at a router, and it will connect them based on the given password. Google Lens will initially be rolled out to Google Assistant and Google Photos in the coming weeks. At last year's I/O, Pichai spoke about how computing was moving from mobile-first to AI-first, and that theme continued in 2017. Pichai said that Google is rethinking its computational architecture to build "AI-first data centers."


5 Steps To Ending Generational Stereotypes

Some of the most pervasive stereotypes surround millennials: those who are roughly 20 to 35 years old, says William A. Schiemann, CEO of Metrus Institute. Schiemann says he's continually faced with clients' confusion and misunderstanding about generational differences, and the stereotypes that arise from this confusion. "What's amazing is how often organizational leaders that I regularly interview at the Metrus Institute try to label these younger employees as needy, coddled, technology snobs, unprepared for organizational life or scores of other attributes. But digging deeper, I'll ask if there are differences between their 20-to-25-year-olds and their 30-to-35-year-old millennials. 'Oh, yes! The older millennials have clearer goals, understand corporate organizations better, they're more educated' and on and on. Dial this back for a moment -- Of course! They're more mature and experienced by about ten years!" Schiemann says.


How to defend against DDoS attacks like the one on the Dyn DNS service

“DDoS scrubbing services primarily use commercial cloud-based solutions,” says Andrew Howard, CTO, Kudelski Security. These are the kinds of solutions that have sufficient resources to make such an approach work, given the size of the traffic and the scrubbing task. You can buy services that run all the time or services that you turn up when you see DDoS attacks coming on. “With as-needed services, you’ll see a delay between the time when you come under attack and the time the mitigation starts—but we’re talking about a delay of minutes, not hours,” says Rachel Kartch, Researcher, CERT Division, Software Engineering Institute, Carnegie-Mellon University. ... “Hybrid services include an always-on, on-premise scrubbing device and rerouting for traffic to scrubbing centers when you come under heavy attack,” says Kartch.


The Best Cybersecurity Investment You Can Make Is Better Training

Firms must recognize and react to three uncomfortable truths. First, cyber risk evolves according to Moore’s Law. That’s a major reason that technology solutions alone can never keep pace with dynamic cyber threats. Second, as with all threat management, defense is a much harder role to play than offense. The offensive players only need to win once to wreak incalculable havoc on an enterprise. Third, and worst yet, attackers have patience and latency on their side. Firms can be lulled into a dangerous state of complacency by their defensive technologies, firewalls, and assurances of perfect cyber hygiene. The danger is in thinking that these risks can be perfectly “managed” through some sort of comprehensive defense system. It’s better to assume your defenses will be breached and to train your people in what to do when that happens.


Echo who? Google just turned Home into a productivity powerhouse

Calls aside, the newly announced upgrade will bring Home a handful of other interesting flourishes. First, Home devices will soon harness your existing smartphone and TV screens to provide visual accompaniments to responses, as appropriate -- sending directions to your phone when you ask about the location of a business, for instance, or showing your calendar on a Chromecast-connected TV when you ask to see your agenda. It's a clever and very Googley way to accomplish what Amazon could do only by creating an entirely new product. Home will also soon gain what Google calls a "proactive assistance" feature. In short, the device will flash its lights when a timely and important message awaits -- like a pending reminder, a traffic delay relevant to your day at any given moment, or a status change for a flight you've booked. When you see the lights flashing, you simply say "Hey Google, what's up?" to get the info.


Here's why IoT testing is really going to matter to QA pros

The stories are engaging. They raise legitimate, even alarming, concerns. But frankly, IoT testing always seemed like a specialized discipline to me. If you weren't building software for cars, big box home appliances or tiny wearable devices, its immediate relevance to quality assurance (QA) pros escaped me. For years, I heard and read a lot about it, but it wasn't clear to me why IoT testing mattered outside its own arena. ... "It's always been about functional testing," she said. "Let's make sure the software works and get it out the door." But concerns about how IoT works in the real world place new emphasis on nonfunctional types of testing, including performance and security, she said. That presents a career path for software testers who spot the opportunity. "QA pros should build up their resumes for nonfunctional testing skills."


Consensual Software: How to Prioritize User Safety

In the age of smart devices, IoT, and increasingly invasive advertising practices, it becomes imperative that we build explicit consent into every feature we build. What will happen if the next television ad asks Alexa to purchase twenty rolls of a certain brand of toilet paper every time the ad plays? What if Google Home plays ads for medication to a user who hasn’t told the rest of the family about their condition? What if Alexa outs an LGBTQ user to their family and puts them in danger? Or endangers a person trying to leave their abusive spouse by serving ads for self-defense classes and survival supplies based on their browser history? ... The easiest way to protect user privacy is to give users the information they need to make informed, consensual decisions to use our products and to not assume passive, implicit consent.


10 bad habits IT helpdesk professionals must break

"The biggest mistake helpdesk professionals make is communicating with customers in ways that feel impersonal," said Jamie Domenici, VP of SMB marketing at Salesforce. "While it may be easier to use a script, or send a templated email when you are trying to respond to a customer quickly, the more work service teams put into building exceptional customer service experiences, the more they'll get out." While being polite is a must for a helpdesk professional, very formal language may alienate your client, said Eirini Kafourou, communications specialist at Megaventory. "We have noticed that the people who contact our customer support usually feel bad that they had to ask for help and try to ask as few questions as possible," she said. "Replying in a playful tone helps them relax and continue asking more, as if they had a friend helping them."



Quote for the day:


"It was character that got us out of bed, commitment that moved us into action, and discipline that enabled us to follow through." -- Zig Ziglar


Daily Tech Digest - May 18, 2017

The Promises and Limitations of Big Data

Although many people claim we have entered the era of big data, research firms tell us that most collected information is never used. It sits uncleaned, unanalyzed, unused in databases.  But when data analytics is used successfully, organizations reap the benefits. Financial services firms are using digital information about their customers to offer them a whole new range of customized products under the category of fintech. Cities are using data from Google Street View to guide economic development. And companies are finding that in some cases machines can make better hiring decisions than humans. In these stories from our recent archives, Harvard Business School researchers outline the promises of big data—and the limitations of trying to harness data from a firehose.


Financial Institutions in the U.S. Are Falling Behind In The Innovation Arms Race

In contrast to the other regions, the main barrier to innovation for the Americas seems to be culture rather than IT systems or lack of funding. While regulation and compliance issues are more of a challenge in the U.S. than in most other regions of the world, this challenge is less of an issue today than it was just a couple of years ago. “Innovation is simply not in the DNA of most bankers,” explains Nicols. “They’ve been trained throughout their whole career to identify and avoid risks, and innovation is about taking small risks and failing fast and cheaply and learning from those mistakes to get to the right answer quickly.” Nicols continues, “Another challenge is analysis paralysis. Most banks have too many silos with conflicting agendas, and that makes it hard to actually put new ideas into action. ...”


8 “Simple” Guidelines For Data Projects

Countless data posts out there will tell you to do things like “harness the cloud” or “run experiments.” The vagueness of these posts is not helpful. You can’t “tip and trick” your way to a successful data product. You have to have the right mindset. I got frustrated reading these posts and decided to write my own, but one that’s not presented as a collection of tips, tricks, or rules, but as guidelines. Following all of these doesn’t guarantee success, but they might be useful for you… What follows is a collection of things I have recently observed at client meetings and also during project work. This post is inspired by an excellent article by Martin Goodson, “Ten Ways Your Data Project is Going to Fail,” and includes my personal views on many things I currently see in data projects.


73% of enterprises will run almost entirely on SaaS by 2020

Enterprises are rapidly shifting to Software as a Service (SaaS), with the industry poised to generate more than $112.8 billion in revenue by 2019, according to IDC. Enterprises now use 16 SaaS apps on average—up 33% from last year, according to a new report from BetterCloud. And 73% of organizations said nearly all of their apps (more than 80%) will be SaaS by 2020. BetterCloud—which, it should be noted, provides SaaS management software—surveyed more than 1,800 IT professionals for their report. Some 38% of tech workers said their company is already running almost entirely on SaaS, and that they run 2.1x more SaaS apps than the average organization, the survey found. Tech professionals from these SaaS-focused companies are 52% more likely to say that the delivery model helps them attract better talent than the average workplace, BetterCloud found.


Promise and pitfalls in the application of big data to occupational and environmental health

Big data and data sharing have the potential to inform occupational and environmental health by exploiting innovations related to non-traditional data sources or providers and novel partnerships. Promising applications include real time analysis and forecasting, and innovative analyses of clinical trial or observational data originally collected for other purposes. However, in order to support these innovations, advances are also required in data curation, protection of privacy and security, as well as data analysis methods. Challenges related to messy and unrepresentative data and spurious findings, as well as epistemological issues and equity considerations must also be addressed.


Financial services firms advised to ditch private datacentres and invest in cloud

“Cloud computing has reached the tipping point as the capabilities, resiliency and security of services provided by cloud suppliers now exceed those of many on-premise datacentres,” the whitepaper stated. “The combination of technology commoditisation with the scale and competition from public cloud suppliers is driving the unit prices of computing, storage and network services towards zero. “This gap will continue to grow at an accelerated rate, leaving laggards in cloud adoption at increased risk from a resiliency and cost perspective,” it added. This shift has altered the way organisations talk about cloud over the course of the past decade, with conversations about the safety and security of using it giving way to discussions about how shunning the technology could negatively affect an organisation.


The Promise of Blockchain Is a World Without Middlemen

In a world without middlemen, things get more efficient in unexpected ways. A 1% transaction fee may not seem like much, but across a 15-step supply chain, it adds up. These little frictions add just enough drag on the global economy that we’re forced to stick with short supply chains and deals done by the container load, because it’s simply too inefficient to have more links in the supply chain and to work with smaller transactions. The decentralization that blockchain provides would change that, which could have huge impacts on economies in the developing world. Any transformation that helps small businesses compete with giants will have major global effects. Blockchains support the formation of more complex value networks than can otherwise be supported.
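The compounding effect is easy to check with a quick sketch (illustrative arithmetic only; the step count and fee rate here are the article's hypothetical, not measured figures): a 1% fee taken at each of 15 links multiplies the end cost by roughly 1.16, i.e. about 16% of pure intermediary drag.

```python
# Illustrative: how a per-step transaction fee compounds along a supply chain.
steps = 15       # links in the chain (from the article's example)
fee_rate = 0.01  # 1% fee taken at each link

# Each intermediary charges its fee on the already-inflated amount,
# so the total cost multiplier is (1 + fee)^steps, not 1 + fee*steps.
multiplier = (1 + fee_rate) ** steps
overhead = multiplier - 1

print(f"Cost multiplier after {steps} steps: {multiplier:.4f}")
print(f"Total fee overhead: {overhead:.1%}")
```

Note the gap between the naive estimate (15 × 1% = 15%) and the compounded result (~16.1%); the longer the chain, the faster that gap grows, which is the "drag" the article describes.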


Brainstation's Fintech Panel Says Retraining Key To Preparing For AI Disruption

What big FIs should really be thinking of is, “what core competencies can we not afford to outsource?” says Gene. Once you’ve answered this, the options actually extend far beyond ‘buy versus build’ into buying, building, buying-then-customizing, investing, partnering, and incubating. EQ Bank, for instance, is a small bank with limited budget, meaning that both “buy” and purely “build” are not really on the table, says Dickinson. However, what EQ does instead is build infrastructure that enables integrations with FinTech companies. Then they invest in companies building the technologies their customers need. “We focus on an ‘execute what the customer wants’ mentality,” says Dickinson. “which means that investments and partnerships are good strategies for us.” BMO feels the same way, says Gene, explaining that BMO has three FinTech incubators, including one in Toronto.


Interview: Mark Potter, CTO, HPE

“The Machine’s architecture lends itself to the intelligent edge,” he says. One of the trends in computing is that high-end technology eventually ends up in commodity products. A smartphone probably has more computational power than a vintage supercomputer. So Potter believes it is entirely feasible for HPC-level computing, as is the case in a modern supercomputer, to be used in IoT to process data generated by sensors locally. Consider machine learning and real-time processing in safety-critical applications. “As we get into machine learning, we will need to build core datacentre systems that can be pushed out to the edge [of the IoT network].” It would be dangerous and unacceptable to experience any kind of delay when computing safety-critical decisions in real time, such as for processing sensor data from an autonomous vehicle. “Today’s supercomputer-level systems will run autonomous vehicles,” says Potter.


If Software Eats Everything, Are Network Engineers On The Menu?

In-house networking teams will need to match the speed of public cloud providers in tasks like spinning up new virtual machines, Stanford professor David Cheriton said, echoing a concern users expressed in private interviews at the conference. "At some point, the CIO is going to ask, 'Why is it that it costs so much more and it takes so much longer to do this (ourselves)?,'" Cheriton said. SDN takes over configuration tasks that some network engineers have spent their careers doing manually, which has raised concerns about job security and what these technicians should do next. There are big changes afoot, panelists and participants said. Freed from configuring ports and routes, some network engineers are taking on higher-level tasks like designing better systems.



Quote for the day:


"Never let the fear of what other people think stop you from being yourself." -- Joubert Botha