Daily Tech Digest - January 31, 2018

Separating Science Fact From Science Hype: How Far off Is the Singularity?

First, on a hardware level, we are hitting the ceiling of Moore’s law as transistors can’t get any smaller. At the same time, we have yet to prove in practice that new computing architectures, such as quantum computing, can be used to continue the growth of computing power at the same rate as we had previously. Second, on a software level, we still have a long way to go. Most of the best-performing AI algorithms require thousands, if not millions, of examples to train themselves successfully. We humans are able to learn new tasks much more efficiently by only seeing a few examples. The applications of AI [and] deep learning nowadays are very narrow. AI systems focus on solving very specific problems, such as recognizing pictures of cats and dogs, driving cars, or composing music, but we haven’t yet managed to train a system to do all these tasks at once like a human is capable of doing.

What is TOGAF? An enterprise architecture methodology for business
TOGAF helps organizations implement software technology in a structured and organized way, with a focus on governance and meeting business objectives. Software development relies on collaboration between multiple departments and business units both inside and outside of IT, and TOGAF helps address any issues around getting key stakeholders on the same page. TOGAF is intended to help create a systematic approach to streamline the development process so that it can be replicated, with as few errors or problems as possible as each phase of development changes hands. By creating a common language that bridges gaps between IT and the business side, it helps bring clarity to everyone involved. It’s an extensive document — but you don’t have to adopt every part of TOGAF. Businesses are better off evaluating their needs to determine which parts of the framework to focus on.

Cyber security salaries will rise 7% in 2018, says research

The increasing investment in cyber security professionals across all industries has often been attributed to the growing number of high-profile cyber attacks over the past year, which have brought attention to the importance of being prepared for these occurrences. In London, a professional in a head of information security role with 10 or more years of experience can expect an annual salary of between £105,000 and £170,000 – up from a range of £95,000 to £155,000 in 2017. Though salaries for these roles in other parts of the UK, such as the Midlands and the North of England, are lower on average, there has still been a significant increase in total yearly remuneration for cyber-based roles from 2017 to 2018.
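As a rough sanity check on the London figures quoted above (a back-of-the-envelope calculation, not from the research itself), the year-on-year change at each end of the range can be computed directly:

```python
def pct_increase(old, new):
    """Percentage increase from old to new, rounded to one decimal place."""
    return round((new - old) / old * 100, 1)

# London head-of-information-security salary ranges quoted above
low_2017, high_2017 = 95_000, 155_000
low_2018, high_2018 = 105_000, 170_000

print(pct_increase(low_2017, low_2018))    # rise at the bottom of the range
print(pct_increase(high_2017, high_2018))  # rise at the top of the range
```

Both ends of the London range rose faster than the 7% average in the headline (about 10.5% and 9.7%), which is consistent with the observation that London salaries outpace the rest of the UK.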

Big Data Isn’t a Thing; Big Data is a State of Mind

Big Data is about exploiting the unique characteristics of data and analytics as digital assets to create new sources of economic value for the organization. Most assets exhibit a one-to-one transactional relationship. For example, the quantifiable value of a dollar as an asset is finite – it can only be used to buy one item or service at a time. Same with human assets, as a person can only do one job at a time. But measuring the value of data as an asset is not constrained by those transactional limitations. In fact, data is an unusual asset as it exhibits an Economic Multiplier Effect, whereby it never depletes or wears out and can be used simultaneously across multiple use cases at near zero margin cost. This makes data a powerful asset in which to invest ... Leading organizations that embrace digital transformation see data and analytics as a business discipline, not just another IT task. And tomorrow’s business leaders must become experts at leveraging data and analytics to power their business models.

A day in the data science life: Salesforce's Dr. Shrestha Basu Mallick

Pricing is an incredibly complex science and art that takes years to master -- there are hundreds of variables to take into consideration and today's sales process moves faster than ever before. The best price does not always mean the highest price -- it means finding the right price point at which the customer feels they are getting good value while capturing that value for the vendor as well. Sales reps are always juggling at least three priorities on any given deal: the customer's satisfaction with the value they're getting; the speed with which they close the deal; and the best price for the company. No human can effectively balance these priorities on their own while keeping all the variables in mind -- and it's especially difficult for newer sales reps who are just starting their sales careers. Very few tools drive these pricing conversations effectively. It's equally difficult for sales managers, who have very little data-driven insight into the pricing stages of deals.
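The three competing priorities described above can be sketched as a simple weighted score. This is a hypothetical illustration of the trade-off, not Salesforce's actual pricing model, and the weights are invented:

```python
def deal_score(customer_value, close_speed, margin,
               weights=(0.4, 0.2, 0.4)):
    """Combine the three priorities a rep juggles into one score.
    Each input is normalized to 0..1; the weights express the trade-off
    between customer satisfaction, deal speed, and company margin."""
    w_value, w_speed, w_margin = weights
    return round(w_value * customer_value
                 + w_speed * close_speed
                 + w_margin * margin, 3)

# A heavily discounted deal: happy customer, fast close, thin margin
print(deal_score(customer_value=0.9, close_speed=0.8, margin=0.3))
# A full-price deal: slower close, better margin
print(deal_score(customer_value=0.6, close_speed=0.4, margin=0.9))
```

Even this toy version shows why the balancing act is hard: neither deal dominates the other, and the "right" answer depends entirely on how the three priorities are weighted.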

These firms have historically maintained a rich history of customer transactions and activity for what can be very high-value customer relationships. Leading firms in sectors such as healthcare, life sciences, media, and manufacturing are also represented. 2018 survey participants included bellwether firms American Express, Bank of America, Capital One, Charles Schwab, CitiGroup, Fidelity Investments, Ford Motors, Goldman Sachs, JP Morgan, IBM, Wells Fargo, and VISA, among nearly 60 industry leaders. In spite of the common recognition of the need for a Chief Data Officer, there appears to be a profound lack of consensus on the nature of the role and responsibilities, mandate, and background that qualifies an executive to operate as a successful CDO. Further, because few organizations -- 13.5% -- have assigned revenue responsibility to their Chief Data Officers, for most firms the CDO role functions primarily as an influencer, not a revenue generator.

Artificial Intelligence will open up new avenues: Experts

Demystifying artificial intelligence and addressing concerns about the future of human work and employment, Mr. Binay Rath, Director, APAC Alliances and Channels, Oracle Health Sciences GBU, said that the benefits of AI are clear, so instead of worrying about its threats we first need to understand what AI is and what its potential will be. After that, ethical policy frameworks need to be put in place for the operation of machines and AI-automated systems. Mr. Nishith Pathak, Vice President at Accenture Labs India and author of several books on AI, echoed similar sentiments. Citing the example of postal services, he said that although we rarely write letters nowadays, postmen and post offices have not become redundant; they have a different role to play, with a different set of responsibilities. Likewise, some low-level programming jobs may go, and data scientists will instead be in huge demand.

Why You Need To Keep Learning In Data Science

It is by no means a static profession: you can apply your skills to an array of different industries and domains, so knowledge of the area you work in is key to deriving the most valuable insights from your data. With so many businesses hiring for and building out data science teams, it's important to understand the business you're working in, or, if you're looking to move jobs, the one you want to move into. The role is much more than numbers and big data sets; it's about contributing to the bigger picture and solving problems for businesses and individuals alike. This real-world application of your work requires constantly learning new skills, technologies, and approaches, and continuing your education in data science and all related facets of the field throughout your career will benefit your working life greatly.

Government surveillance regime unlawful, court rules in Tom Watson case

“The government must now bring forward changes to the Investigatory Powers Act to ensure that hundreds of thousands of people, many of whom are innocent victims or witnesses to crime, are protected by a system of independent approval for access to communications data,” said Watson after the verdict. In a 10-page judgment, the Court of Appeal ruled that it is unlawful for the government to access private communications data retained by phone companies and internet service providers, for reasons that fall short of fighting serious crime, or without a prior review from a court or an independent watchdog. ... The proposals limit the ability of senior police officers, and officials at the Department for Work and Pensions and HM Revenue and Customs, to authorise their own access to communications data, requiring permission from a new authorising body, the Office of Communications Data Authorisation.

Disasters: Tech to Map Human Behavior in Crises

According to new research from Embry-Riddle Aeronautical University, during a disaster, escapees are more likely to look for an exit they saw earlier, even if it's farther away, or follow a crowd, even if it means they'll be moving slowly. As evidenced by the unfortunate number of disasters that happened last year, this "panic behavior," if unassisted, can result in fatalities and many injuries.  The rise in incidents that require emergency evacuations, such as airport and concert shootings, hurricanes, and terrorist attacks, has a team of students and professors at Embry-Riddle Aeronautical University’s Daytona Beach Campus employing parallel computing to tackle a topic with very little existing research. Their goal is to analyze human behavior in emergency evacuations from places like airports and other enclosed areas to help create more efficient evacuation policies and save lives.

Quote for the day:

"A good leader inspires others with confidence in him or her. A great leader inspires them with confidence in themselves." -- Raymond Dreyfack

Daily Tech Digest - January 30, 2018

Today, most task work is driven by logic, which typically comes from our left brain. Logic is all about numbers, critical thinking, black and white. Those of us who live in our left brains have great difficulty perceiving right-brain work. But the right brain is still there waiting for us! Right-brain work includes creativity, design, communications, influencing others, building relationships, engaging, consultative sales, and strategy, among others, a lot of stuff our parents warned us not to do. In my new book, The Workplace Engagement Solution, I talk about how a global engagement figure of 13% isn’t just a business problem; disengagement is a tragedy infecting our lives, families, customer satisfaction and day-to-day living. The great disengagement of the modern worker is leading directly to the scourge of our modern economy: underemployment. How many task workers are now part-time task workers? “Consultants?”

How To Manage Your KPIs And Expectations During Digital Transformation

Digital transformation should set you up for the long term by giving you a scalable, streamlined approach to business growth. As such, it starts on the inside, building the internal systems that support even the outward-facing initiatives. With that in mind, having a baseline of what's working and what's not in your organization will give you a good jumping-off point. Are processes clear, well documented and accessible to everyone? Are all those processes easily repeatable and scalable, and supported by the right technology? Is there a system in place for hiring the right people, documenting and sharing customer data across different departments, making suggestions for process improvement and acting on them, and getting leadership's approval of major initiatives in an efficient manner? If any of these elements are off, your path to transformation may lead to expensive dead-ends.

Make the case to follow a container strategy at your company

Adding even a simple orchestration tool to the primary container software facilitates container deployment on public and hybrid cloud, as well as across multiple data centers. It can also help organize multicomponent applications, particularly those that share components among applications. Large enterprises should look into this path to achieve container success. The second trail off of the main container highway is for businesses with stringent security and compliance requirements. There are many ways to separate containers within a given server, and not all of them offer the same level of security. The most basic container software is the most popular, but not the most secure. If you have applications that demand an exceptionally high level of isolation and control, such as financial applications and even cloud applications involved in storefront missions, you may want to use something designed for that purpose.

The Anatomy of a Data Story

The function of data in a data story is to tell what happened, and figuring out what happened typically starts with a question. Typical business-oriented questions usually revolve around sales or other KPIs, but they could be about anything—the results of a customer satisfaction survey, the efficacy of a policy change, the user behavior on a website. Once the overall question is established (e.g., How are our sales doing this quarter compared to last quarter?), you can begin to partition the data into smaller, more manageable pieces. Maybe you can track sales by product or by a sales representative or by company branch. Maybe you can look for correlations between revenue and other variables. If you don’t have a question to answer or artificial intelligence to point you to an interesting trend, you’ll likely have to do some data discovery and exploration to find a story worth telling. This is the process Ben Wellington employs when researching his blog posts.
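Partitioning a question like "how are sales doing this quarter versus last?" into smaller pieces can be as simple as grouping the raw records. A minimal sketch with invented data (a real analysis would typically use a tool like pandas):

```python
from collections import defaultdict

# Invented example records: (quarter, product, revenue)
sales = [
    ("Q3", "widgets", 120), ("Q3", "gadgets", 80),
    ("Q4", "widgets", 150), ("Q4", "gadgets", 70),
]

# Partition the overall question: total revenue per (quarter, product)
by_quarter_product = defaultdict(float)
for quarter, product, revenue in sales:
    by_quarter_product[(quarter, product)] += revenue

# Compare each product's Q4 revenue against Q3
for product in ("widgets", "gadgets"):
    q3 = by_quarter_product[("Q3", product)]
    q4 = by_quarter_product[("Q4", product)]
    print(product, q4 - q3)
```

The same pattern extends to the other partitions mentioned above: swap `product` for a sales representative or a company branch, and the "story" emerges from which slices moved and which didn't.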

What Will Artificial Intelligence Be Doing in 2018?

AI increasingly will require knowledge and skill sets that data scientists and AI specialists usually lack, according to the paper. Consider a team of computer scientists creating an AI application to support asset management systems. “The AI specialists probably aren’t experts on markets,” says PwC. “They’ll need economists, analysts, and traders working at their side to identify where AI can best support the human asset manager.” And, since the financial world is in constant flux, once the AI is up and running it will need continual customizing and tweaking. For that, too, functional specialists, not programmers, will have to lead the way. AI has already shown superiority over humans when it comes to hacking. For example, machine learning, often considered a subset of AI, can enable a malicious actor to follow a person’s behavior on social media, then customize phishing tweets or emails just for them, PwC says.

Designing Effective AI Public Policies

Governments lacking in-house expertise is problematic from a policy development perspective. As AI increasingly intersects with safety-critical areas of society, governments hold responsibilities to act in the interests of their citizens. But if they don’t have the ability to formulate measured policies in accordance with these interests, then unintended consequences could arise, placing their citizens at risk. Without belabouring scenarios of misguided policies, governments should prioritise building their own expertise. Whether they’re prepared or not, governments are key stakeholders. They hold Social Contracts with their citizens to act on their behalf. So, as AI is applied to safety-critical industries, like healthcare, energy, and transportation, understanding the opportunities and implications is essential. Ultimately, knowledge and expertise are central to effective policy decisions. And independence helps align policies to the public interest. While the spectrum of potential policy actions for safety-critical AI is broad, all with their own effects, inaction is also a policy position.

AI cracks ancient game of Go
Personalization, 1-to-1 marketing, people-based marketing and other trending terms describe how marketers in every industry are working to stop aggressive, irrelevant advertising campaigns and create highly targeted interactions. Online gaming is no different. Through big data, gaming companies can create meaningful marketing messages. Given how much data is being collected, playing a mobile game may feel intrusive, but gaming companies are mining such metrics to better appeal to their users with content they’re likely to appreciate, not despise. “Segmentation isn’t enough anymore. 76 percent of digital nomads expect to see a personalized website screen on just about every brand site they visit — and your inability to give any kind of individual regard means they think a lot less of your brand, and makes them significantly more likely to bounce,” according to VentureBeat.

No one would dispute that we’re in an age of considerable AI hype, but the progress of AI is littered with booms and busts: growth spurts that alternate with AI winters. So the AI Index attempts to track the progress of algorithms against a series of tasks. How well does computer vision perform at the Large Scale Visual Recognition challenge? (Superhuman at annotating images since 2015, but systems still can’t answer questions about images very well, a task that combines natural language processing and image recognition.) Speech recognition on phone calls is almost at parity. In other narrow fields, AIs are still catching up to humans. Translation might be good enough that you can usually get the gist of what’s being said, but it still scores poorly on the BLEU metric for translation accuracy. Measuring the performance of state-of-the-art AI systems on narrow tasks is useful and fairly easy to do.

How to build a Successful Advanced Analytics Department

Using data is proven to work, and our clients’ examples showcase the possibilities. We’ve built and implemented a dynamic pricing model that handles over 2 million quarterly pricing decisions, increased fraud detection from 50% to over 90%, and predicted e-commerce sales a year in advance with greater accuracy. Our portfolio includes AI and AA projects for a large range of industries, often including industry leaders. If you discover more fraud than your competition, you get an advantage. Moreover, you build from there. You get new ideas every time you work with data. The power and value of using data are spreading within organizations as managers start noticing results. ... The very first thing is to understand where you stand today. It might be that you already gather a lot of data, but your organization is mostly driven by spreadsheets. This is how a vast number of organizations are managed.

The general sentiment of global media coverage of AI in 2017 frequently painted a worst-case-scenario picture of what the future holds for the technology. Job losses, a lack of human control and even killer robots continued to dominate the headlines in the year just gone. This year, however, we can all expect to see the technology applied more widely, and more practically, than ever before. Digital transformation will then be experienced through the mainstream application of AI to business operations, coupled with greater adoption of cloud and growth in the scale and degree of AI implementations. Not only will the technology revolutionise areas like the supply chain, in-store operations and merchandise execution, but dependence on AI will also become a more prominent means of pursuing new business avenues across the retail sector.

Quote for the day:

"Eventually relationships determine the size and the length of leadership." -- John C. Maxwell

Daily Tech Digest - January 25, 2018

How policymakers should approach AI
AI is already super-human in many domains and in the next 5-20 years it is quite likely that we will be able to capture and express all of extant culturally-communicated human knowledge with it. Already we are far better at predicting individuals' behaviour than individuals are happy to know, and therefore than companies are happy to publicly reveal. Individuals and parties exploiting this are very likely compromising democracy globally, notably in the UK. There is an incredibly large project here for the social sciences and the humanities as we urgently address the political, economic, and existential (in the philosophical sense) challenges of massive improvements in communication, computation, and prediction. Again, natural laws of biology tell us to anticipate an accelerated pace of change given the increased plasticity of increased intelligence. Therefore we need to ensure our societies are robust to this increase, with sufficient resilience built into the system to allow individuals to have periods out of work finding a new place in the economy.

Implement OAuth in 15 minutes with Firebase

This article provides a 15 minute, step-by-step guide to adding OAuth support to a CLI-generated Angular application using Firebase. We will implement OAuth with a Google account, but other platforms supported by Firebase include Facebook, Twitter, and GitHub. But first, what is Firebase? Firebase got its start as a realtime cloud-hosted NoSQL database supporting multi-user synchronization. Since being acquired by Google in October of 2015 it has become an entire publishing platform for web and mobile applications. Many major companies, including Lyft, Shazam, The New York Times, and NPR, use Firebase to support their apps. Some of these applications see over 100 million monthly users and update the database more than 3,000 times per second, providing strong evidence that the platform can scale.
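The article's walkthrough targets Angular, but it helps to see the first step of any OAuth flow in language-neutral terms: redirecting the user to the provider's consent screen via an authorization URL. A sketch with hypothetical client ID and redirect URI (Firebase's client SDK wraps this entire dance behind a single sign-in call):

```python
from urllib.parse import urlencode

# Hypothetical values -- in a real app these come from your
# OAuth provider's console / Firebase project configuration.
params = {
    "client_id": "YOUR_CLIENT_ID.apps.googleusercontent.com",
    "redirect_uri": "https://example.com/auth/callback",
    "response_type": "code",          # authorization-code flow
    "scope": "openid email profile",  # what we ask the user to grant
}
auth_url = "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)
print(auth_url)
```

After the user consents, the provider redirects back with a short-lived code that the app exchanges for tokens. With Firebase, you never build this URL by hand: a call like `signInWithPopup` with a `GoogleAuthProvider` handles the redirect, the code exchange, and the resulting user session.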

Why NoSQL Needs Schema-Free ETL Tools

Even developers don't like writing boring "plumbing code" — code that just links data from one place to another. It's dull and repetitive. Customers don't like it, either — as anywhere that code is needed inevitably means a maintenance headache, not to mention a long time to write and test it in the first place. This means increased costs to initially deploy a new technology like NoSQL. Equally, on the output side, if you can't rapidly visualize the insights you can glean from your data, then you cannot fully realize the benefits of your investment in NoSQL database technology. Trying to code around the problem leads to longer project times, and the aforementioned increase in costs associated with custom coding. Many NoSQL companies have tried to shoe-horn SQL support into their products in an effort to bridge the gap between traditional BI vendors and their products. This has only been partially successful. 
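The "plumbing code" complaint above is concrete: with a fixed schema, every new field means more hand-written mapping code. A sketch of what a generic, schema-free transform step looks like (field names are invented for illustration):

```python
def transform(record, renames=None, drop=None):
    """Schema-free transform: operates on whatever fields a record
    happens to have, rather than assuming a fixed table layout."""
    renames = renames or {}
    drop = set(drop or ())
    return {renames.get(k, k): v
            for k, v in record.items()
            if k not in drop}

# Two documents with different shapes pass through the same step
a = transform({"cust_id": 1, "name": "Ada", "_tmp": "x"},
              renames={"cust_id": "customer_id"}, drop=["_tmp"])
b = transform({"cust_id": 2, "email": "b@example.com"},
              renames={"cust_id": "customer_id"})
print(a)
print(b)
```

Because the step never enumerates the full field list, adding a new attribute to the source documents requires no code change, which is exactly the maintenance burden schema-free ETL tools aim to remove.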

NHS Wales IT outage: What went wrong with its datacentres?

Guillaume Ayme, IT operations evangelist at big data analytics software supplier Splunk, raised concerns about the datacentres’ setup, given that running dual sites usually means that in the event of an outage, one will failover to the other. “For the issue to be impacting two datacentres suggests it is severe, as one would normally be the backup for the other,” he said. “This may suggest there has been a problem in the failover procedure. “Once the service is restored, it will be essential to find the root cause to avoid a potential repeat. This can be complex for organisations that do not have full visibility into the data generated by their IT environment.” ... “While systems are now back up and running, the chaos it created shows why we need to move from hours to minutes to resolve problems like this,” said Anderson. “Ultimately, it comes down to our reliance on software and the need for it to work perfectly – and that’s difficult in IT environments that are getting more complex by the day.”
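The failover arrangement Ayme describes, where one site takes over when the other fails, reduces at its core to a health-check-and-promote decision. A toy sketch of that logic (purely illustrative; it says nothing about how the NHS Wales datacentres are actually configured):

```python
def pick_active(sites):
    """Return the first healthy site in priority order, or None if
    every site is down -- the dual-failure scenario described above."""
    for name, healthy in sites:
        if healthy:
            return name
    return None

print(pick_active([("primary", False), ("secondary", True)]))   # failover works
print(pick_active([("primary", False), ("secondary", False)]))  # both sites down
```

The second call is the point of the excerpt: a failover design only helps if the backup is actually healthy and the promotion procedure actually runs, which is why finding the root cause of a dual-site outage matters so much.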

Exploring Modern Data Platforms (Podcast)

The DevOps thing is something that everyone is trying to get their head around right now. When you have a whole staff of people who know SQL and know relational databases and now we say, ‘Okay, but all of your data is going to go to an object data store.’ Like, what does that look like or how should the data be organized? How do you query it? How do you use it? That type of training, but, to be honest, that really is not as much of a leap as it was even a year ago. The evolution is happening very, very, very rapidly. A year or two ago we’d say, ‘You need to use an object data store’, and we were speaking some foreign language. Now they get it, and they say, ‘Okay, let’s do it,’ because they think what’s happened is over the years people started dipping their toes and they’re realizing the economics of it. It’s like Hadoop was the gateway drug for this type of platform where you could start experiencing drastic cost reduction with enhanced capabilities.

How CIOs Can Ensure a Seat at the Strategy Table

Digital disruption has placed technology at the heart of most business discussions, yet many CIOs are still fighting for a seat at the strategy table. Monika Sinha, research director at Gartner, says information and technology are considered too late in the strategy process in many enterprises. “IT is fundamental to the new business challenges,” Sinha told CIOs at Gartner Symposium/ITxpo in Goa, India this week. “It underpins new business models, products and services that are disrupting existing industries and creating new markets. As strategists, CIOs are flexible, agile and opportunistic in their approach.” “Once your ambition is clear, appropriately position IT at the heart of your business strategy.” The traditional “wait and respond” approach to enterprise strategy – the business strategy is finalized, the CIO reviews the strategy for IT’s contribution and an IT strategy is developed in response – is no longer viable.

The Benefits Of Open Standards For Process Automation

Once we see overall total cost of ownership of these process automations systems being reduced in the long run, we’ll be able to take advantage of the built-in, intrinsic cybersecurity features that are being designed into these open process automation systems. The rapid insertion of new technologies, new capabilities, and innovations will be inserted into the formerly closed systems in a much faster and cheaper way. Ultimately, that translates in manufacturing to increased equipment reliability, faster time to market, increased quality of production, and other benefits. ... It’s important to also remember that the intellectual property (IP) of those vendors is still preserved. There are points where we’re breaking up existing hardware and software systems into modules. With the modules, there will still be the intellectual property of the vendors—but the interfaces in between are what’s standard. In the future, there still will be the IP of the vendors in the hardware and software and in the application layer.

Cozy is building a personal cloud service that respects your privacy

Instead of creating yet another ecosystem of hosted services financed by ads, Cozy wants to change the balance and create a platform where the user is in charge. As you can read in the terms of services, you remain the owner of your data and your data won’t be shared with anyone unless you give your consent. And for the most privacy-concerned users, you can also install a Cozy instance on your own server. The main GitHub repositories have been updated today. The company just unveiled the first services of this new platform today. First, it starts with a good old file-syncing service. With Cozy Drive, you can install an app on all your computers, synchronize files with Cozy’s servers and find them everywhere — on your other computer, on your phone or on the web. Second, Cozy Photos lets you backup your photos. This works like Google Photos, Microsoft OneDrive’s Camera Upload and similar features.

IIoT and the Open Process Automation Forum

OPAF envisions a future open control system that will take information and data from any device and optimize it for better decision making. It will empower the workforce to be more actively involved and responsible for good business outcomes. For example, secondary measures will be key, such as differential pressure or sensor temperatures. We will be able to collect and communicate data about the overall health status of the instrument or sensor, which will drive new levels of reliability and overall operational integrity and profitability. This new level of control and new control functions will drive incredible value. Fitzgerald: Much depends on the scale and relevant policies of a given client. While DHCP might be “easier” for both wired and wireless integrations, discrete IP addresses associated with given subnets provide additional needed security and robustness of operations.

Robots are needed to manage the automation robots

Dube says the combination of physical robotic machine bodies and AI software brains will eventually make it hard to tell humans and robots apart. “We are carbon-based organisms and robots are silicon-based, but I think the boundaries around them are going to get progressively diffused to the extent that you will not be able to distinguish between a human and an android in the next nine years,” he says. “Robots are becoming fairly smooth in terms of mechanical motion. They can easily walk through a crowded mall, avoiding people. They can take an escalator, climb down stairs and even run faster than humans. In five years, their dexterity will be as good as humans. “But one component is missing – the brain – and that is the area we specialise in. When we implant the brain into the robot frame, it will be able to be asked a question, analyse what was said, and provide an answer. It will be able to walk and talk to you.”

Quote for the day:

"A leader must have the courage to act against an expert's advice." -- James Callaghan

Daily Tech Digest - January 24, 2018

Can AI predict when that new hire will quit?
“The No. 1 most important data set to look at that predicts turnover more than any other time and again is reference response rate,” Bixler says. “The data conclusively continues to come back in that how the references rate the applicant will in fact help reduce turnover and predict performance one year later.” The average SkillSurvey client sees a 35 percent reduction in turnover. For pattern matching to predict this, though, you have to use good patterns. At DocuSign, Senior Director of Recruiting Susan Ross says references must respond to SkillSurvey within two days or an applicant won’t get the job. Later than that, and internal matching shows they’re not the best hire. Never mind that there are lots of reasons references might not respond that quickly that have nothing to do with the candidate. This timeline is DocuSign specific, though, and Ross says it and other matching works
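Both signals mentioned above, the overall reference response rate and DocuSign's two-day rule, are easy to express as features. A hypothetical reconstruction for illustration, not SkillSurvey's or DocuSign's actual code:

```python
from datetime import date

def response_rate(references):
    """Fraction of references who responded at all -- the signal
    SkillSurvey reports as most predictive of turnover."""
    responded = [r for r in references if r["responded_on"] is not None]
    return len(responded) / len(references)

def meets_two_day_rule(sent_on, references):
    """DocuSign-style rule: every reference must reply within
    two days of the request being sent."""
    return all(r["responded_on"] is not None
               and (r["responded_on"] - sent_on).days <= 2
               for r in references)

refs = [
    {"responded_on": date(2018, 1, 23)},  # replied in 1 day
    {"responded_on": date(2018, 1, 25)},  # replied in 3 days
    {"responded_on": None},               # never replied
]
print(response_rate(refs))                        # 2 of 3 responded
print(meets_two_day_rule(date(2018, 1, 22), refs))
```

The sketch also makes the article's caveat visible: a hard two-day cutoff penalizes candidates for delays entirely outside their control, which is exactly the fairness concern raised above.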

The best technology deployments are streamlined and fully integrated with other solutions and devices, and have the ability to empower a diverse employee base to work strategically and collaboratively. Strategy plays an instrumental role in creating the right atmosphere for innovation. Adam Uzialko, in Business News Daily, writes: “The responsive workplace offers an opportunity to extend the office far beyond its walls and enable more complete remote collaboration. The potential for the responsive workplace to dissolve the traditional boundaries of the office could have an immense impact on employees’ attitudes and the quality of the work they produce.” Of course, the onus of making this work falls on business leaders. To improve engagement and promote innovation, companies must take a hard look at how employees interact with the technology and tools at their disposal. Although today’s IT solutions offer significant potential, implementing ineffective or overly complicated technology can translate to a loss of productivity and further worker disengagement.

The old world typically consisted of operational databases providing online transaction processing (OLTP) and relational data warehouses providing online analytical processing (OLAP). Data from a variety of operational databases was typically batch-loaded into a master schema within the data warehouse once or twice a day. This data integration process is commonly referred to as extract-transform-load (ETL). Several recent data trends are driving a dramatic change in the old-world ETL architecture: Single-server databases are being replaced by a myriad of distributed data platforms that operate at company-wide scale; There are many more types of data sources beyond transactional data: e.g., logs, sensors, metrics, etc; and Stream data is increasingly ubiquitous, and there is a business need for faster processing than daily batches. The result of these trends is that traditional approaches to data integration often end up looking like a mess, with a combination of custom transformation scripts, enterprise middleware such as enterprise service buses (ESBs)
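The classic batch ETL pattern described above can be sketched in a few lines of Python, using sqlite3 to stand in for both the operational (OLTP) store and the warehouse (OLAP) store; the `orders` table, its fields, and the cents-to-dollars transform are all hypothetical:

```python
import sqlite3

def run_batch_etl(source_db: str, warehouse_db: str) -> int:
    """Minimal batch ETL: extract rows from an operational store,
    transform them, and load them into a warehouse table."""
    src = sqlite3.connect(source_db)
    dst = sqlite3.connect(warehouse_db)
    dst.execute("CREATE TABLE IF NOT EXISTS fact_orders (id INTEGER, amount_usd REAL)")
    # Extract: pull the day's rows from the OLTP database.
    rows = src.execute("SELECT id, amount_cents FROM orders").fetchall()
    # Transform: reshape the data to fit the warehouse (OLAP) schema.
    transformed = [(order_id, cents / 100.0) for order_id, cents in rows]
    # Load: batch-insert into the master schema, once or twice a day.
    dst.executemany("INSERT INTO fact_orders VALUES (?, ?)", transformed)
    dst.commit()
    src.close()
    dst.close()
    return len(transformed)
```

The trends listed above — many data sources, streaming data, company-wide scale — are exactly what this single-source, once-a-day shape struggles to handle.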

5 Things to Keep in Mind When Using Data for Artificial Intelligence
Many companies do not realize that they are sitting on a pile of bad or dirty data. This data contains a lot of missing fields, has wrong formatting, numerous duplicates, or is simply irrelevant information. IBM research estimated that the annual cost of bad data to the U.S. economy is a whopping $3.6 trillion. Still, many managers are certain they are sitting on a goldmine of data when in reality they have nothing valuable. I interviewed Sergey Zelvenskiy, an experienced machine learning engineer at ServiceChannel, where he automates facilities management processes using artificial intelligence. We talked about common misconceptions when it comes to the good/bad data dichotomy and what companies should be focusing on when building AI products. As Zelvenskiy says, "The data that companies have may not necessarily be bad; it is just likely incomplete for solving the problem. There is a chicken-and-egg problem here."
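A first-pass audit of the problems described above (missing fields, duplicate rows) takes only a few lines of Python; the record layout here is purely illustrative:

```python
def audit_records(records):
    """Count empty/missing fields and exact duplicate rows in a list of dicts."""
    seen = set()
    missing_fields = 0
    duplicate_rows = 0
    for rec in records:
        # Treat None and empty strings as missing values.
        missing_fields += sum(1 for v in rec.values() if v in (None, ""))
        # An exact duplicate is a record whose sorted key/value pairs repeat.
        key = tuple(sorted(rec.items()))
        if key in seen:
            duplicate_rows += 1
        seen.add(key)
    return {"missing_fields": missing_fields, "duplicate_rows": duplicate_rows}
```

Running a report like this before any model work is one way to test whether the "goldmine" is real.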

The 6 Biggest Challenges Facing DevOps

Like any process or strategic approach, the field of DevOps is constantly changing. First emerging as a collective term nearly a decade ago, the DevOps field now embraces millions of software developers and entrepreneurs who have adjusted their teams and core philosophies to fall in line with the DevOps vision. However, these guiding principles are still evolving, and if you want to remain relevant and agile in 2018, you’ll need to evolve with them. There are critics who have argued that DevOps is a fad, or is more of a buzzword-driven rebranding campaign than a truly significant change in the industry, but there’s significant evidence to the contrary. The rise and continued success of SaaS platforms, increasing customer demands, and even the new perspectives of young developers are all pushing for DevOps to remain strongly relevant in business. The biggest transformation has been in how the term is used; rather than referring to specific roles, like “DevOps developers,” DevOps refers to a work culture that all individuals within it follow.

Orchestrating Flows for Cyber

An innovative solution uses the northbound interface on the SDN controller to monitor a customer’s network for volumetric increases, and then dynamically tasks redirection of flows on the southbound interface to handle the attack. Vendors like Radware, with experience in load balancing, WAF, and anti-DDoS, provide the perfect backdrop for building such an elegant approach to the problem. Visualize this: external traffic is being managed inbound through your software-defined data center with the usual assortment of internal destinations: websites, apps, endpoints, and so on. A DDoS attack suddenly builds up toward one of your targets, probably the website, and the SDN controller immediately flow-orchestrates the increased volume to a collection of sinks or scrubbers, while maintaining proper traffic flow to non-targeted entities. It’s important to provide detection and orchestration of security at scale, because we know that, with the speed of attacks we’re seeing on the Internet, enterprise teams will need to rely on proper automation to keep their applications and systems up and running.
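The detect-and-redirect loop described above can be sketched as a toy controller policy. The threshold, port numbers, and function names are all hypothetical; a real deployment would read flow statistics from the controller's northbound API and push forwarding rules via the southbound (e.g., OpenFlow) interface:

```python
SCRUBBER_PORT = 99          # hypothetical egress port toward the scrubbing farm
VOLUME_THRESHOLD = 10_000   # packets/sec treated as a volumetric spike

def orchestrate_flows(flow_stats, normal_routes):
    """Return an output port per destination: spiking flows are redirected
    to the scrubber while everything else keeps its normal route."""
    routes = {}
    for dest, pps in flow_stats.items():
        if pps > VOLUME_THRESHOLD:
            routes[dest] = SCRUBBER_PORT        # send attack volume to sinks/scrubbers
        else:
            routes[dest] = normal_routes[dest]  # maintain proper flow for non-targets
    return routes
```

The point of the sketch is the asymmetry: only the targeted destination is rerouted, so legitimate traffic to everything else is untouched.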

Do website design platforms pose too big a security risk?

“Generally, if a platform allows you to insert HTML or third-party scripts and iframes, it can be abused to serve malicious code. However, we didn’t see serious massive attacks on those sites recently, but we have seen attacks on multinationals and their systems,” says Sinegubko. “In our experience, the main areas of abuse for small to medium-sized businesses are spam via custom templates and using ad-backed widgets. Otherwise, the hacks are not massive. However, if hackers find a vulnerability in the platform itself that allows them to modify any site hosted there, we’ll definitely see massive attacks," he says. “On the other hand, such attacks can be quickly mitigated, as most likely they won’t require action from thousands of individual webmasters, but rather just a coordinated effort from the platform staff.” Ultimately, the consensus seems to be that security should take priority over customisability, and more should be done to educate SME owners about the exploits hackers can take advantage of.

The importance of taking a break

Scientists are discovering that naps are Zambonis for our brains. They smooth out the nicks, scuffs and scratches a typical day leaves on our mental ice. For example, a University of California, Berkeley, study found that an afternoon nap expands the brain's capacity to learn and retain information. Other research has found that naps boost short-term memory, lift mood and increase feelings of "flow," that powerful source of engagement and creativity. Naps can also reduce our risk of heart disease and strengthen our immune systems. But these naps need not be as lengthy as a full-fledged siesta. The ideal naps – those that give us a boost without enveloping us in a haze of "sleep inertia" – are quite short, usually between 10 and 20 minutes. One Australian study published in the journal Sleep found that 10-minute naps had positive effects that lasted nearly three hours.

Secure your SDN controller

A significant issue regarding SDN security is that virtualizing every aspect of the network infrastructure increases your attack footprint. The SDN controller is typically the primary target for attackers because it is the central point for decisions in a network and a central point of failure. Attackers can try to get control of the network by breaking into a controller or pretending to be one. Once a central controller is compromised, an attacker can gain complete control over your network. This would be considered an extreme scenario, but it could be possible as SDN usage continues to grow. There are new types of denial-of-service attacks that try to exploit potential scaling limits of an SDN infrastructure by locating specific automatic processes that use a significant amount of CPU cycles. An SDN could be very vulnerable to attacks because of the separation of control and data planes. A disruption in the communication path between the two planes could potentially result in a major hole that attackers can compromise.

TalentSumerization – The Employee Experience in Agile Enterprises

The movement towards personalization in the workforce has been coined the “Consumerization of HR” – or “TalentSumerization”. It describes the idea of creating social, mobile, and consumer-style experiences for employees inside the company. So, just as companies must ensure service excellence for their customers, HR must strive for service excellence for their employees. In fact, consumerization of human resources was identified as one of the most defining and disruptive trends of the industry. It not only changes the way we interact with employees, but also how companies market themselves. It is shifting companies away from a “seat filling” mentality to a “work experience” attitude. The new objective is to create one employer brand which provides a seamless employee experience – from the first interaction with potential candidates to the way we stay connected with former employees.

Quote for the day:

"Let us never negotiate out of fear. But let us never fear to negotiate." -- John F. Kennedy

Daily Tech Digest - January 23, 2018

Meltdown and Spectre: How much are ARM and AMD exposed?

AMD issued a statement saying it is potentially vulnerable to only one of the three Meltdown and Spectre variants, but no one has demonstrated an AMD vulnerability as yet. This applies to both the new Epyc server processor and older Opteron server chips for the half dozen customers still using them. With ARM, it gets complicated. The company has published a list of cores at risk. ARM has three types of cores — Cortex-A, Cortex-M and Cortex-R. Cortex-M is a 32-bit embedded microcontroller used in Internet of Things (IoT) devices, so it has no exposure. Cortex-R is also an embedded controller used in real-time applications, such as cars. Those are used in closed systems and are not prone to attack, although ARM’s list shows some are at risk of exposure. Only the Cortex-A line has exposure, and not all of those chips are at risk. For example, the Cortex-A53, the most widely used processor in smartphones and tablets, is not at risk.

Blockchain and cryptocurrency may soon underpin cloud storage

The emerging blockchain-based distributed storage market could challenge traditional cloud storage services, such as Amazon AWS and Dropbox, for a cut of the cloud storage market. "Distributed compute and storage models are still in their infancy, but I do believe that there is an enormous market for this technology," said Paul Brody, Ernst & Young's (EY) Global Innovation Leader for Blockchain Technology. The idea of using P2P networks to aggregate computer resources is not new. In the early 2000s, BitTorrent opened as a distributed file-sharing service and grew to handle more than half of the internet's file-sharing bandwidth. Because blockchains come with a built-in mechanism for payments – cryptocurrencies, which were missing from the last go-around at P2P services – they are more likely to succeed, according to Brody.

Bitcoin: A cheat sheet for professionals

Bitcoin is the first decentralized form of cryptocurrency, but it's certainly not the only one. A large number of blockchain-based cryptocurrencies have emerged since 2009, which raises the obvious question: How is Bitcoin different? Aside from its much greater value, there are several things that make Bitcoin different from cryptocurrencies such as Ethereum, Dogecoin, Litecoin, and others. All of these cryptocurrencies use blockchain technology, but the method and purpose of each one is different. Ethereum, one of the most talked-about Bitcoin alternatives, isn't actually a value transfer platform; instead, it is used for distributed application programming. Ethereum does have a monetary value in the form of its fuel, called Ether, but that's just one part of its overall model. Other cryptocurrencies, like Litecoin, Dogecoin, and PotCoin, use blockchains but don't rely on SHA-256 hashing like Bitcoin does; they use scrypt, a password-based key derivation function, to build coin hashes instead.
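The difference between the two hashing schemes can be seen with Python's standard library; the input and the scrypt parameters below are toy values, far smaller than real mining settings:

```python
import hashlib

header = b"example block header"

# Bitcoin-style proof of work hashes the block header with double SHA-256.
sha_digest = hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()

# Litecoin-style proof of work uses scrypt, a memory-hard key derivation
# function originally designed for password hashing.
scrypt_digest = hashlib.scrypt(header, salt=b"", n=1024, r=1, p=1, dklen=32).hex()

print(sha_digest)
print(scrypt_digest)
```

scrypt's memory hardness was intended to blunt the advantage of specialized hashing hardware, which is why Litecoin's designers chose it over SHA-256.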

What you need to know about Azure Notebooks

The underlying technologies are familiar: You can add content around executable code playgrounds using Markdown to format text. Azure Notebooks automatically adds UI to your code snippets, and you can use any of a selection of visualization tools for charting results. Data can be uploaded to and downloaded from local PCs, so you can take files you’ve been using with Excel’s analytics and use them in Azure Notebooks, letting you compare results and use business intelligence tools to prepare data before it’s used. You import online data with Curl or Wget, using Python code in a notebook or from a notebook’s built-in terminal window. There’s also integration with Dropbox, so you can share files with colleagues or use it to ensure you’re always working with the latest version of a file. Although Microsoft provides most of the tools you’ll need, it can only really support general-purpose analytical operations with tools like Python’s Anaconda data science extensions.
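The curl/wget step mentioned above can equally be done from notebook Python with the standard library; the URL and filenames below are placeholders:

```python
import urllib.request

def fetch_dataset(url: str, dest: str) -> str:
    """Download a remote data file into the notebook's working directory."""
    urllib.request.urlretrieve(url, dest)
    return dest
```

After fetching, the file sits alongside the notebook and can be opened with whatever analysis tools the notebook already uses.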

The InfoQ eMag: APM & Observability

The topic of “observability” has been getting much attention recently, particularly in relation to building and operating “cloud native” systems. Several thought leaders within this space, like Cindy Sridharan, have mused that observability could simply be a re-packaging of the age-old topic of monitoring (and argued that no amount of “observability” or “monitoring” tooling can ever be a substitute for good engineering intuition and instincts). Others, like Charity Majors, have looked back at the roots of the term, which was taken from control theory and corresponds to a measure of how well the internal states of a system can be inferred from knowledge of its external outputs. Both Sridharan and Majors argue that an observable system should enable engineers to ask ad hoc (or, following an incident, post hoc) questions about how the software works during execution. This eMag explores the topic of observability in depth, covering the role of the “three pillars of observability” -- monitoring, logging, and distributed tracing.

Are Advisors’ Cyberdefenses Strong Enough?

The speed at which cybercriminals launch attacks means the industry has no choice but to be more vigilant in protecting the precious information it keeps for its investors, so it can give more peace of mind to advisors and their clients. The public already sees cybercrime as a major threat. Research by Bitdefender, a cybersecurity technology provider based in Bucharest, Romania, finds U.S. citizens are more concerned about stolen identities (79%) than email hacking (70%) or home break-ins (63%). One major problem for the financial-services industry is that authentication methods are “severely outdated,” according to Harvey. “Many institutions have not yet recognized that cyberfelons already have the data to beat these practices. Millions of clients’ assets are at risk.” ... Today’s authentication practices largely rely on the use of private data, such as passwords, PINs and Social Security numbers — information that cyberfelons already possess.

Do data scientists have the right stuff for the C-suite?

For a data scientist or analyst to evolve into an effective leader, three personal qualities are needed: curiosity, imagination, and creativity. The three are sequentially linked. Curious people constantly ask “Why are things the way they are?” and “Is there a better way of doing things?” Without these qualities, innovation will be stifled. The emergence of analytics is creating opportunities for analysts as leaders. Weak leaders are prone to a diagnostic bias. They can be blind to evidence and somehow believe their intuition, instincts, and gut feel are acceptable masquerades for having fact-based information. In contrast, a curious person always asks questions. They typically love what they do. If they are also a good leader, they infect others with enthusiasm. Their curiosity leads to imagination. Imagination considers alternative possibilities and solutions. Imagination in turn sparks creativity.

6 ways hackers will use machine learning to launch attacks

“We must recognize that although technologies such as machine learning, deep learning, and AI will be cornerstones of tomorrow’s cyber defenses, our adversaries are working just as furiously to implement and innovate around them,” said Steve Grobman, chief technology officer at McAfee, in recent comments to the media. “As is so often the case in cybersecurity, human intelligence amplified by technology will be the winning factor in the arms race between attackers and defenders.” This has naturally led to fears that this is AI vs AI, Terminator style. Nick Savvides, CTO at Symantec, says this is “the first year where we will see AI versus AI in a cybersecurity context,” with attackers more able to effectively explore compromised networks, and this clearly puts the onus on security vendors to build more automated and intelligent solutions.

Why the Cloud is more secure than On Prem

It is obvious that this discussion is heading in the direction of classical security hygiene (risk management, identity management, patch management, etc.) to the extent needed by the customer, which is basically risk management. This needs to be done in every infrastructure, and it needs to be done professionally. However, as most companies do not have IT as their core competence, they try to run security with a 0.5 FTE who then has to cover all the tasks needed – and who will be on a mission impossible. And even the big, global companies have difficulties with their inventory, with patch management (as a consequence), with their identities, etc. I am deeply convinced that the cloud can help here! But first we need to understand the different responsibilities, knowing that this discussion is far from new

SD-Branch: What it is and why you'll need it

The branch network is a critical piece of the IT infrastructure for most distributed organizations. The branch network is responsible for providing reliable, high-quality communications to and from remote locations. It must be secure, easy to deploy, able to be managed centrally and cost-effective. Requirements for branch networks continue to evolve with needs for increased bandwidth, quality of service, security and support for IoT. SDN and network virtualization technologies have matured to the point where they can deliver significant benefits for branch networks. For example, SD-WAN technology is rapidly being deployed to improve the quality of application delivery and reduce operational complexity. SD-WAN suppliers are rapidly consolidating branch network functions and have reduced (or eliminated) the need for branch routers and WAN optimization. The broader concept of SD-Branch is still in its early stages. During 2018, we will see a number of suppliers introduce their SD-Branch solutions.

Quote for the day:

"No obstacle is so big that one person with determination can't make a difference." -- Jay Samit

Daily Tech Digest - January 22, 2018

Buildings should behave like humans

“When we look at buildings as living structures, we can understand how various systems are connected and operate together,” says Dr. Filip Ponulak, principal data scientist at Site 1001, in a press release. Ponulak says all buildings should now be listening for issues. He says it’s an innovative way of managing new buildings. Site 1001 believes its system would also work in older buildings. Chief Innovation Officer Eric Hall told me one could draw an analogy with an aging car, except that unlike cars, buildings don’t have odometers to help identify failing parts. In other words, by collecting data on failings, for example, predicting upkeep becomes possible — you know when things are likely to fail and can pre-empt them, like a flexible car service schedule. That lets facilities management “move to an entirely conditional and proactive maintenance schedule,” says Hall on the company’s website. Data centers fit into this platform, too, the company says. Indeed, I’ve written before about folks who think AI will ultimately self-manage the data center

The future of AI and endpoint security

The key to machine learning success currently lies in the cloud. Traditional servers are not large or fast enough to process the data and create the models needed to detect and combat attacks, but by using cloud servers the process is quicker, easier and much more affordable than ever before, bringing it into the reach of more enterprises. Hackers are already using automated systems, machine learning and AI to create new cyber threats. Security experts think the next 12 months will see an acceleration in the adoption of machine learning by hackers as they try to carry out increasingly sophisticated phishing attacks. However, AI antivirus solutions are still relatively thin on the ground. Although a small number of companies do offer machine learning and AI cyber threat solutions for endpoints, such as Cylance, Darktrace and Symantec, this really should become the industry standard. Microsoft at least seems to have learned from its experience of WannaCry and is apparently turning to AI to create the next generation of anti-virus software.

Infosec expert viewpoint: Google Play malware

Another issue facing Google Play security is the complex and fragmentary nature of the Android device ecosystem, which has given rise to a patching problem, as unpatched devices are attractive targets. Google has been striving to improve on this issue, but a lack of direct control (multiple wireless carriers and manufacturers are responsible for pushing patches to a multitude of devices) will continue to hamper its efforts. Users should be discerning and skeptical when downloading anything and have passive protection along with regular backups. Watch out for malicious apps mimicking popular, reputable apps and check an app’s permissions to make sure it does not have access beyond its stated functionality. Although they cannot make up for preventative measures such as checking permissions, anti-malware products provide some protection from malicious code and can partially make up for failures to avoid malicious apps.

Collect the Dots: The New Possible for Digital Evidence

At the heart of this transformation is the power of computers. On all fronts, computational capability has soared in the last few years. Processor speeds are much faster than before, and multi-core technology takes that power to new levels. Network speeds have increased greatly, enabling much more data to be sent where it's needed, quickly and efficiently. And at the foundation of computing, the cost of data storage has never been lower. Cloud computing is another major force of change. The on-ramp for enterprise cloud computing has been long by some accounts, as many analysts predicted a faster migration from traditional data centers. But in 2017, enterprise cloud really took off. That's partly due to the ongoing juggernaut of Amazon Web Services, which arguably managed to pull off a 10-year head start on its competitors. But now, all the major software vendors are involved, including Microsoft with Azure, IBM with Bluemix, Oracle Cloud, Google Cloud and the SAP HANA Cloud Platform.

Wide-area networks: What WANs are and where they’re headed

Many believe that SD-WAN is poised to take off in 2018, moving from an early-adopter technology to mainstream implementation. Research firm IDC has predicted that SD-WAN revenues will hit $2.3 billion in 2018, with a potential revenue target of $8 billion by 2021. The first phase of SD-WAN aimed at creating hybrid WANs and aggregating MPLS and Internet connections to lower costs; the next phase will improve management and monitoring and provide better security, according to Lee Doyle of Doyle Research. A subset of SD-WAN called SD-Branch will help reduce the need for hardware within branch offices, replacing many physical devices with software running on off-the-shelf servers. Mobile backup across an SD-WAN can provide a failover for broadband connections as wireless WAN technology (4G, LTE, etc.) costs decrease. ... Asynchronous Transfer Mode (ATM) is similar to frame relay, with one big difference: data is broken into standard-sized packets called cells.
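ATM's fixed-size cells are easy to illustrate: each 53-byte cell carries a 5-byte header and 48 bytes of payload. The sketch below segments a byte stream into cell payloads, with naive zero padding standing in for the real adaptation-layer trailer:

```python
CELL_PAYLOAD = 48  # payload bytes per 53-byte ATM cell (the other 5 are header)

def segment_into_cells(data: bytes) -> list:
    """Split a byte stream into fixed 48-byte cell payloads, padding the last."""
    cells = []
    for i in range(0, len(data), CELL_PAYLOAD):
        chunk = data[i:i + CELL_PAYLOAD]
        cells.append(chunk.ljust(CELL_PAYLOAD, b"\x00"))
    return cells
```

The fixed cell size is what distinguishes ATM from frame relay's variable-length frames: it makes switching latency predictable at the cost of padding overhead.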

OnePlus Attackers Steal Credit Card Data From 40,000 Customers

The admission that there was a data breach comes three days after OnePlus announced that it was temporarily disabling credit card payments on its website. OnePlus disabled the credit card payments on Jan. 16, after receiving reports from customers that they were seeing unknown credit card charges after buying something online from OnePlus. "One of our systems was attacked, and a malicious script was injected into the payment page code to sniff out credit card info while it was being entered," OnePlus stated in an advisory on the breach. The attack appears to have been ongoing from mid-November 2017 until Jan. 11, 2018, OnePlus said. According to the company, credit card information (card numbers, expiration dates and security codes) that was entered on the Oneplus.net site may have been compromised. Users who saved their credit card information on the site, as well as those who use PayPal, do not appear to be impacted by the breach, however.

Take your online security more seriously this year

“People, me included, are lazy,” says web developer Joe Tortuga, “and ease of use is inversely related to security. If it’s too difficult, people just won’t do it.” Well, if you don’t work towards a safer internet for yourself and others, then who will? One of the most important online security measures we can adopt is using strong, difficult-to-guess passwords. Alas, in most cases, our passwords are not strong enough to stop a hacker looking for a back door. A strong password should contain a mix of different characters that makes it difficult to guess. In some cases, when signing up for a particular service on the internet, users enter their real details, an approach that many internet security experts advise against. “What happens is that you build up an online profile of yourself across several sites that hackers can use to guess your weak passwords,” says Shaun Murphy, CEO of the online security company PrivateGiant.
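What a "strong, difficult-to-guess password" means in practice can be shown with Python's `secrets` module, which draws from a cryptographically secure random source rather than from personal details:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password mixing letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

At 16 characters over a 94-symbol alphabet, the search space is roughly 94^16 possibilities, which is exactly the property a profile-based guessing attack cannot overcome.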

Understanding Supply Chain Cyber Attacks

Host organizations now face having to adapt security procedures to include not just internal infrastructures, but also vendors, customers, and even partners. While internal IT and security departments might have strong security practices for thwarting a wide range of direct attacks, third-party collaborators might not adhere to the same culture. Consequently, programs for vetting vendors need to be in place before fully integrating them into internal infrastructures. Building a vendor management program is ideal and should start with defining an organization's most important vendors. Building the program around a risk-based approach ensures that vendors are constantly evaluated and assessed, and their policies are consistent with the host organization. Besides requiring vendors to provide timely notification of any internal security incident, periodic security reports should be included in the collaboration guidelines to regularly ascertain their security status.

What are the key areas that need to be transformed for a smart city concept?

True innovation comes from collaboration. This belief sits at the core of the Open Innovation challenge, which was launched today by Cisco and Manchester Science Partnerships (MSP), who are on the search to work with some of the UK’s best small and medium-sized enterprises (SMEs) with a vision to transform Manchester through smart technologies. Convened by CityVerve, the UK’s smart city demonstrator, the challenge will see eight SMEs selected to participate in an eight-week initiative in Manchester to combine technology, data and creativity to tackle some of the city’s biggest problems in healthcare, transport and energy. Commencing in March 2018, the initiative gives SMEs the opportunity to work with partners from the public sector, corporate and academic worlds who are part of the CityVerve Internet of Things (IoT) test bed. The eight selected SMEs from across the UK will have the opportunity to put their innovative solutions to the test in a real-life situation.

Salted Hash Ep 15: The state of security now and the not too distant future

The adage of ‘it’s not if you’ll be hacked, but when’ is still realistic, but maybe now it’s wiser to consider what your organization can do to get in front of any potential situation and prevent as much damage as possible. “My prediction is that you’re going to start to see executives stop treating IT security as a product they can implement, and start treating it as an operational concern equally as important as managing their finances,” Lee remarked. In addition to looking ahead, we also take a look back at some interesting moments in 2017. One of the standouts is the Justice Department naming foreign actors and indicting them for their acts. This leads to an interesting conversation about the crossover between law and government operations, and down the path of once it’s on the internet, it’s there forever.

Quote for the day:

"There are three secrets to managing. The first secret is have patience. The second is be patient. And the third most important secret is patience." -- Chuck Tanner