Daily Tech Digest - October 15, 2018

We Need to be Examining the Ethics and Governance of Artificial Intelligence


Recently, the role that pre-crime and artificial intelligence can play in our world has been explored in episodes of the popular Netflix TV show Black Mirror, focusing on the debate between free will and determinism. Working in counter-terrorism, I know that the use of artificial intelligence in the security space is fast becoming a reality. After all, decisions and choices previously made by humans are being increasingly delegated to algorithms, which can advise, and decide, how data is interpreted and what actions should result. Take the example of new technology that can recognize not just our faces but also determine our mood and map our body language. Such systems can even tell a real smile from a fake one. Being able to utilize this in predicting the risk of a security threat in a crowded airport or train station, and prevent it from occurring, for example, would be useful. Some conversations I have had with individuals working in cyber-security indicate that it is already being done.



UK gov launches 'world's first' Code of Practice for IoT security
The Code defines 13 guidelines for manufacturers, service providers, developers and retailers to implement in order to ensure that IoT products are safe to use. They are: no default passwords; implement a vulnerability disclosure policy; keep software updated; securely store credentials and security-sensitive data; communicate securely; minimise exposed attack surface; ensure software integrity; ensure that personal data is protected; make systems resilient to outages; monitor system telemetry data; make it easy for consumers to delete personal data; make installation and maintenance of devices easy; and validate input data. HP Inc. and Centrica Hive are the first companies to sign up to the new Code. Minister for Digital Margot James said that these pledges are "a welcome first step," but "it is vital other manufacturers follow their lead to ensure strong security measures are built into everyday technology from the moment it is designed."




So-called password-less authentication, if implemented literally, would lead us to a world where we are deprived of the chance and the means to have our volition confirmed when our identity is authenticated. It would be a 1984-like world, incompatible with the values of democratic societies. Some people allege that passwords can and will be eliminated by biometrics or PINs, but logic tells us this can never happen: the former requires a password or PIN as a fallback, and the latter is no more than the weakest form of numbers-only password. The various debates over ‘password-less’ or ‘beyond-password’ authentication only make it clear that the solution to the password predicament can be found only inside the family of broadly defined passwords. ... If a PIN or PINCODE, the weakest form of numbers-only password, had the power to kill the password, then a small sedan would be able to kill the automobile. Advocates of this idea seem to claim that a PIN is stronger than a password because the PIN is linked to a device while the password is not.
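
The entropy arithmetic behind calling a PIN "the weakest form of numbers-only password" is easy to check. A rough sketch, assuming uniformly random choices and ignoring device binding and rate limiting:

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy in a uniformly random string of the given length."""
    return length * math.log2(alphabet_size)

# A 4-digit PIN draws from 10 symbols; an 8-character password
# from letters, digits and a handful of specials (~70 symbols).
pin_bits = entropy_bits(10, 4)
password_bits = entropy_bits(70, 8)
```

Even a modest 8-character password carries several times the entropy of a 4-digit PIN; real-world strength additionally depends on lockout policies and on whether the secret ever leaves the device.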



Juniper advances network automation community, skillsets

“Since a critical part of automated operations is the individual engineers and processes they follow, Juniper has put deliberate investment into these areas by introducing many formal and informal training programs, cloud-based lab services, testing as a service, free trials, live throwdowns and [the new] Juniper Engineering Network (EngNet),” Koley wrote. Juniper EngNet is a portal that includes a variety of automation tools, resources and social communities. According to the vendor, the site features API documentation, access to Juniper Labs, virtual resources, a learning portal and an automation exchange of useful network automation tools. “Juniper Engineering Network is aimed at elevating the entire networking community to move beyond incumbent CLI knowledge and toward an automated, abstracted, self-driving technology. The networking community, including Juniper customers and partners, can contribute to the Automation Exchange within the community," Juniper stated.


AI is no silver bullet for cyber security


“AI is not a silver bullet – when you look at the technology, you have to make sure that senior management is aware of its risks, and you don’t invest in it unless you already have good cyber hygiene – starting with people,” said Pereira. User education is crucial, he said, because successful cyber attackers often exploit human weaknesses and emotions through social engineering and spear phishing to penetrate a system. “Those who don’t know how phishing attacks work will fall prey to them,” he said. “The panacea and antidote for phishing attacks is cyber education, which, when tailored for a person or function, is more effective than technology in stopping such attacks in many cases.” In deciding when and how to adopt AI to improve cyber security, Pereira said organisations should start with projects that address human and people risks, followed by processes and technology. “And when you get to the technology part, AI shouldn’t come first, but rather look at it as a way to enhance security processes, such as making it faster to review logs,” he said.
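
As a concrete illustration of "making it faster to review logs," even a hand-rolled heuristic like the one below is a starting point. The log format and threshold are invented for this sketch; real tooling is far more sophisticated:

```python
from collections import Counter

def flag_suspicious(log_lines, threshold=3):
    """Flag sources with more failed logins than `threshold`."""
    failures = Counter()
    for line in log_lines:
        if "FAILED LOGIN" in line:
            # Assume a "<source> FAILED LOGIN" line format for this sketch.
            failures[line.split()[0]] += 1
    return {src for src, n in failures.items() if n > threshold}

logs = ["10.0.0.5 FAILED LOGIN"] * 5 + ["10.0.0.9 OK LOGIN"]
suspects = flag_suspicious(logs)
```

The point of applying machine learning here would be to learn such thresholds and patterns rather than hard-coding them.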


Deloitte says CIOs need to adapt or perish

Deloitte says that in 2018 CIOs need a better grasp on the big picture, and that means looking ‘inward’, ‘across’ and ‘beyond’ the business. “The digital era presents CIOs with the opportunity to look inward and reinvent themselves by breaking out of the trusted operator mould,” says the report. “We note, as in previous surveys, the importance of strong relationships to the CIO’s business success. This year we suggest that developing a technology fluency programme can help create a solid foundation for these relationship-building efforts. A tech fluency programme can provide organisations with knowledge about technology trends, scalability of emerging technologies and complexities of managing legacy core systems – while enabling CIOs to understand internal and external customer perspectives. “CIOs can also look across the IT organisation and transform it, particularly by focusing on the IT operating model, funding priorities and budget allocation, and tech talent and culture at the heart of their digital agendas.”


Why your machine-learning team needs better feature-engineering skills


The skill of feature engineering — crafting data features optimized for machine learning — is as old as data science itself. But it’s a skill I’ve noticed is becoming more and more neglected. The high demand for machine learning has produced a large pool of data scientists who have developed expertise in tools and algorithms but lack the experience and industry-specific domain knowledge that feature engineering requires. And they are trying to compensate for that with better tools and algorithms. However, algorithms are now a commodity and don’t generate corporate IP. Generic data is becoming commoditized and cloud-based Machine Learning Services (MLaaS) like Amazon ML and Google AutoML now make it possible for even less experienced team members to run data models and get predictions within minutes. As a result, power is shifting to companies that develop an organizational competency in collecting or manufacturing proprietary data — enabled by feature engineering. Simple data acquisition and model building are no longer enough.
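
A toy example of what feature engineering means in practice: turning a raw timestamp into features a model can consume. The specific features below are illustrative; as the article argues, the valuable ones come from industry-specific domain knowledge:

```python
from datetime import datetime

def engineer_features(raw_timestamp: str) -> dict:
    """Derive model-ready features from a raw ISO-format timestamp."""
    ts = datetime.fromisoformat(raw_timestamp)
    return {
        "hour": ts.hour,
        "day_of_week": ts.weekday(),  # 0 = Monday
        "is_weekend": ts.weekday() >= 5,
        "is_business_hours": 9 <= ts.hour < 17,
    }

features = engineer_features("2018-10-15T14:30:00")
```

The raw string is useless to most models; the derived columns encode the domain intuition (business hours matter, weekends matter) that generic MLaaS tooling cannot supply on its own.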


How blockchain technology is transforming healthcare cybersecurity

An additional critical feature of blockchain technology is that every member of a blockchain generally can access and audit the entire ledger. This allows all interested parties to confirm and update the information contained in individual blocks. Another significant benefit is that laws and regulations can be programmed into the blockchain as smart contracts. Smart contracts are logical rules programmed into the blockchain. They are self-executing contracts where the built-in agreement is enforced on all members. Smart contracts mimic traditional contracts and laws, and can be used to program in obligations and consequences. In this way, the requirements of specific data privacy and security laws, such as the Health Insurance Portability and Accountability Act of 1996 or the European Union General Data Protection Regulation, can be embedded in the blockchain. Innovators are already experimenting with blockchain use cases in the healthcare context that demonstrate many of the blockchain security benefits.
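
A toy sketch of the smart-contract idea: a consent rule, standing in for a HIPAA- or GDPR-style requirement, is enforced before any record joins a hash-chained ledger. This is a simplified model for illustration, not how any production blockchain platform works:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def consent_given(record: dict) -> bool:
    """A toy 'smart contract': records join the ledger only with patient consent."""
    return record.get("patient_consent") is True

def append_block(chain: list, record: dict) -> bool:
    if not consent_given(record):
        return False  # the programmed rule is enforced on all members
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "record": record})
    return True

chain: list = []
ok = append_block(chain, {"patient_consent": True, "data": "lab result"})
rejected = append_block(chain, {"data": "unconsented record"})
```

Each block stores the hash of its predecessor, which is what lets every member audit the full ledger; the consent check shows how a legal obligation becomes self-executing code.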


How To Integrate AI Into The Enterprise


Overcoming ignorance is a good place to start, and the tutorial given by Hammond was a pleasing break from many technology events that are largely attended by people from whatever discipline the event covers. Data scientists attend data events, roboticists attend robotics events, and so on. At the O'Reilly event, however, techies were in the minority, with most of the attendees from managerial functions. The session began by providing an overview of what AI is, with a whistle-stop tour of machine learning, and specifically the nature of learning itself, which feeds into the supervised, unsupervised and reinforcement learning models used by all machine learning systems today. Machine learning is, of course, just one aspect of AI, with McKinsey recently identifying five distinct forms, including physical AI, computer vision, natural-language processing, natural language and machine learning. Understanding what each of these is, even on a basic level, can help you to make informed choices, and not be suckered in by hype.


Criminals' Cryptocurrency Addiction Continues

"With the increasing, malicious focus on cryptocurrency-related threats, attacks and exploits, it is clear that criminal innovation in this space continues unabated," Ferguson tells Information Security Media Group. "Starting from attacks targeting cryptocurrency wallets on individual users' machines - either directly or as an add-on to some widespread ransomware variants - attackers have rapidly diversified into direct breaches of cryptocurrency exchanges, malware for mining on traditional, mobile and even IoT devices, and developed attack methodologies specifically designed to target the mechanics of blockchain-based transactions, such as the 51 percent attack." The 51 percent attack gives attackers who can control more than 50 percent of a network's hash rate - or computing power - the power to reverse transactions on the blockchain or double-spend coins. The first half of this year saw five successful 51 percent attacks leading to "direct financial losses ranging from $0.55 million to $18 million," Moscow-based cybersecurity firm Group-IB says in a recently released cybercrime trends report.



Quote for the day:


"Leaders should influence others in such a way that it builds people up, encourages and edifies them so they can duplicate this attitude in others." -- Bob Goshen


Daily Tech Digest - October 14, 2018


According to the sources, global fintech companies reportedly sought an extension of the October 15 deadline, but it seems that the RBI is not inclined to relax the norms. Data localisation requires that data about residents be collected, processed and stored inside the country, often before being transferred internationally, and usually transferred only after meeting local privacy or data protection laws. Although domestic companies have welcomed the guidelines, global companies fear an increase in their expenses for the creation of local servers. To avoid this rise in cost, global companies in a recent meeting with the RBI proposed to provide mirror data instead of original data, to which the central bank did not agree, the sources said. Last week, Finance Minister Arun Jaitley met RBI Deputy Governor B P Kanungo to discuss the RBI’s data localisation norms. The meeting was also attended by Economic Affairs Secretary Subhash Chandra Garg, Financial Services Secretary Rajiv Kumar and IT Secretary Ajay Prakash Sawhney.



The Data Quality Tipping Point

It’s clear that data is no longer merely harvested and stored. Data isn’t left to rest any longer. It is the lifeblood that flows through every department in the business. It’s not just the result of a decision: it’s the driving force for your next move. Old, inaccurate and messy data can’t support the marketing department. If the data is old, it cannot be used as a concrete and reliable resource. And if you aren’t continually cleaning new data that comes in, you can’t capitalise on trends, or make decisions on what is and isn’t working. So we’re clear that data quality initiatives must run in parallel to business activities, rather than being carried out sporadically, and there needs to be a constant and attentive process to keep data clean. That means there’s a need for an ongoing investment in data governance, within the parameters of your budget. Few businesses have the budget to put extravagant data management processes in place. It would be wonderful to conduct data reviews every morning, or implement highly elaborate verification and enhancement programs.


Creating a Culture that Works for Data Science and Engineering


While both groups on the team are turning out great code, it’s challenging as a project manager to follow two different streams of work. Sometimes the two groups are working on similar things, but sometimes the data scientists are working on something in the very distant future for the engineers. The most important thing a cross-functional team can do is have everyone come to stand up every day. When we first told the data scientists about our daily “meetings,” they went pale in the face. “Every day?” they asked, with a look of panic in their eyes. I stood firm. It was the right call. Our daily meetings allow the engineers on our team to quickly start working from an informed place when R&D introduces a new project. Furthermore, we are benefiting from the best parts of agile with this approach; I love hearing everyone bounce ideas off each other in stand up. My favorite is when there’s a cross-functional “Ooo did you think about taking this approach?” We work better as a team and we have found a way to leverage everyone’s expertise.



The tech supply chain is more vulnerable than ever


It’s a great business model — especially when you consider that only 38 percent of companies are actively monitoring and managing their software supply chain hygiene. Today, the game has changed. Organizations now must contend with the fact that hackers are intentionally planting vulnerabilities directly into the supply of open source components. In one such example from February 2018, a core contributor to the conventional-changelog ecosystem (a common JavaScript code package) had his commit credentials compromised. A bad actor, using these credentials, published a malicious version of conventional-changelog (version 1.2.0) to npmjs.com. While the intentionally compromised component was only available in the supply chain for 35 hours, estimates are that it was downloaded and installed more than 28,000 times. Some percentage of these vulnerable components were then assembled into applications that were then released into production. The result is that these organizations then unwittingly released a Monero cryptocurrency miner into the wild — and the perpetrators of the supply chain hack profited handsomely.
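
One mitigation against a poisoned release like the malicious conventional-changelog 1.2.0 is verifying downloaded artifacts against pinned checksums before installation. A minimal sketch of the idea (the lockfile mechanics of real package managers such as npm are more involved):

```python
import hashlib

def verify_artifact(data: bytes, pinned_sha256: str) -> bool:
    """Reject any artifact whose hash differs from the pinned value."""
    return hashlib.sha256(data).hexdigest() == pinned_sha256

# At pin time: record the hash of the release you audited.
trusted = b"known-good package tarball bytes"
pinned = hashlib.sha256(trusted).hexdigest()

# At install time: a swapped-in payload fails the check.
ok = verify_artifact(trusted, pinned)
tampered = verify_artifact(b"malicious payload", pinned)
```

Pinning shifts trust from "whatever the registry serves today" to "the exact bytes reviewed when the dependency was adopted," which is precisely what the compromised-credentials attack exploited.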



How to use machine learning to build a predictive algorithm

You also have to make sure you're integrating not only data and platforms, but domain experts who bring invaluable information and skills to the data science team, according to David Ledbetter, a data scientist at Children's Hospital Los Angeles. "The machine learning community often isolates themselves and thinks they can solve all the problems, but domain experts bring value," Ledbetter said during a panel discussion at the AI World Conference & Expo in Boston in December. "Every time we meet with the clinical team, we learn something about what's going on with the data." The project team, with its mix of skills, also needs to identify good vs. bad outcomes based on the business problem you're trying to solve with a predictive algorithm. "It's important to set clear success criteria at the beginning of a project, and [to] pick something that has a reasonable likelihood of success," said William Mark, president of SRI International, a research and development firm that works on AI projects for customers, during the same panel discussion at AI World.


Cloud-agnostic container platforms – it’s all to play for

Container-as-a-service (CaaS) products from the major cloud vendors, notably AWS EKS and Fargate, Azure AKS and Container Instances and Google Cloud Container Engine, present classic trade-offs between convenience and dependence. With their ability to tap into a plethora of cloud data, security and developer services that are unique in implementation if not conception, container products from the big three vendors can trap users in a maze of platform dependencies with no easy exit path. As container use in the enterprise moves from developer sandboxes to production systems, the desire for multi-environment portability presents an opportunity to devise standards, software, and automation systems that facilitate platform-agnostic container platforms. The idea is to ensure easy migration between private and public container environments. Recent announcements from Cisco, Google, and Pivotal Software are important milestones on the road to platform agnostic container infrastructure.


Welcome to Banking-as-a-Service

The underlying theme of this kind of disruption is the unbundling of supply and service. Banking has come late to the unbundling revolution. But now, the sector is ripe for it - for unbundling, or disaggregation - and ripe for its own Software-as-a-Service transformation that will allow customers to pick and choose and pay for applications as they use them. Software-as-a-Service (SaaS) businesses delivered by APIs have a low-touch sales model. These companies don’t sell; buyers help themselves. Low-touch sales combined with recurring revenues and lack of customer concentration are the three hallmarks of a SaaS business. In many cases these businesses are just better in all senses. But combining these three essential ingredients on their own will not be enough. The winners in this field are likely to be nimble specialists capable of creating plug-in-and-play APIs to allow anything to be processed anywhere, rather than the large - slow - generalists of the past. Starling is well-placed in this regard. We have built Starling with a set of public APIs that are freely available for anyone to use through our developer portal. 


5 Tips to Boost Your Company's Digital Transformation With BPM


With tools such as artificial intelligence and machine learning, reams of data can be processed in the blink of an eye, providing insights into how an organization can better meet customer needs. Often, this optimization is a product of changes in business process management, or BPM. Even the most basic organizations function through processes. There might be a process for acquiring leads, a process for vetting them, and a process for making a sale. After you convert a prospect, there's a process for invoicing the customer, one for fulfilling the order, and one for delivering the product. There are also strictly internal processes, such as those triggered when employees ask for time off or request tech support. BPM refers to the management of these procedures, such as ensuring they are effective and determining how to combine them in the most efficient way. When implemented effectively, BPM helps organizations streamline their day-to-day processes, making work more efficient. But implementing BPM or other digital transformations without full buy-in from your team can lead to a lack of teamwork or other disadvantages. 
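
The chain of processes described above can be made explicit in code; at its core, a BPM engine is a state machine over such processes. A toy sketch, with stages and transitions invented for illustration:

```python
# Allowed transitions for a simple lead-to-delivery process.
TRANSITIONS = {
    "lead": {"vetted"},
    "vetted": {"sold"},
    "sold": {"invoiced"},
    "invoiced": {"fulfilled"},
    "fulfilled": {"delivered"},
}

def advance(state: str, next_state: str) -> str:
    """Move the process forward, rejecting out-of-order steps."""
    if next_state not in TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition {state} -> {next_state}")
    return next_state

state = "lead"
for step in ["vetted", "sold", "invoiced", "fulfilled", "delivered"]:
    state = advance(state, step)
```

Making the process explicit like this is what lets an organization measure each stage, spot bottlenecks, and recombine steps, which is the optimization work BPM refers to.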


APIs In Banking: Unlocking Business Value With Banking As A Platform (BaaP)

Banking as a Platform (BaaP), sometimes referred to as Banking as a Service (BaaS), occurs when a bank acts as an infrastructure provider to external third parties. Variations include other banks white-labeling the BaaP platform for faster time to market, fintech firms leveraging the BaaP provider’s banking license to provision bank accounts, and banks and fintechs using the BaaP platform for testing purposes. Banks like CBW, Fidor, JB Financial, solarisBank, and wirecard built their BaaP architecture from scratch, without the constraint of legacy systems, creating modular application stacks broken into discrete services. The modular banking services on a BaaP platform serve as building blocks, accessible to third parties through an API management layer, where they can be mixed and matched to create new products and services tailored to the third party’s business model.


Life Is Dirty. So Is Your Data. Get Used to It.

As Dr. Hammond suggests, it's difficult to determine if data is ever clean. Even scientific constants are only known to a certain degree of accuracy. They are "good enough," but not perfect. Data's ultimate purpose is to drive decisions. Bad data means bad decisions. As data professionals, it is up to us to help keep data "good enough" for use by others. We have to think of ourselves as data janitors. But nobody goes to school to become a data janitor. Let's talk about options for cleaning dirty data. Here's a handful of techniques that you should consider when working with data. Remember, all data is dirty; you won't be able to make it perfect. Your focus should be making it "good enough" to pass along to the next person. The first thing you should do when working with a dataset is to examine the data. Ask yourself, "does this data make sense?" That's what we did in the example above. We looked at the first few rows of data and found that both the city and country were listed inside one column.
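
The city-and-country case is a typical cleanup task, fixable in a few lines. A stdlib-only sketch, assuming a "City, Country" format for illustration:

```python
def split_location(rows):
    """Split a combined 'City, Country' field into two separate columns."""
    cleaned = []
    for row in rows:
        city, _, country = row["location"].partition(",")
        cleaned.append({"city": city.strip(), "country": country.strip()})
    return cleaned

dirty = [{"location": "Paris, France"}, {"location": "Tokyo , Japan"}]
tidy = split_location(dirty)
```

Note the `strip()` calls: stray whitespace is exactly the kind of "good enough, not perfect" dirt the article describes, and it silently breaks joins and group-bys if left in place.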



Quote for the day:


"Courage is more exhilarating than fear and in the long run it is easier." -- Eleanor Roosevelt


Daily Tech Digest - October 13, 2018

Of the survey respondents who report a blockchain project in the pilot stage, 54 percent say the effort sometimes or often hasn’t been justified by the result. This should be a call to more effective action. To help executives answer that call, the report offers four strategies that can be used to build trust.  ... The participants in a blockchain ecosystem need to decide what the operating standards will be and what various users will be able to see and do. The design begins with the strategic business model, which includes making decisions about whether the blockchain will be permissionless, and thus available to everyone, or permissioned (having various levels of permissions). Permissions determine participants’ roles and engagement with the blockchain, which can vary from entering information or transactions to only viewing information. The choice of model isn’t automatic; organizations will decide based on design and use case considerations. They will also need to consider the type of network to establish. Forty percent of survey respondents report that they are using permissioned blockchains, 34 percent are working with permissionless chains, and 26 percent are taking a hybrid approach.


How to put cybersecurity threats into a business context

Focusing on business impact is a different way to think about cybersecurity, and it requires a different mindset than that of tactically responding to cybersecurity threats. Cybersecurity used to be all about preventing attacks, and a breach either occurred or it didn't. "Now, most organizations understand that cybersecurity is not a problem to be solved but a risk to be managed," says Andrew Morrison, US leader of cyberstrategy defense and response at Deloitte & Touche. "Most of the market is acclimated to the fact that it's no longer if an attack will occur but when an attack will occur and how we will manage it. That entails a totally different mindset." "Risks, by nature, can be accepted, mitigated, or transferred," he says. ... A business-focused description of the same problem, however, might be that patching the vulnerability will reduce the probability of a breach to a particular database, which, if exposed, will cost a particular amount of money in lost business, fines and remediation expenses.
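
One common way to put a number on that business-focused framing is annualized loss expectancy (ALE): expected cost of a single incident times its expected annual frequency. The figures below are purely illustrative:

```python
def annualized_loss_expectancy(single_loss: float, annual_rate: float) -> float:
    """ALE = single-loss expectancy x annualized rate of occurrence."""
    return single_loss * annual_rate

# Illustrative numbers: a breach of this database costs $2M in lost
# business, fines and remediation, and patching cuts the expected
# frequency from 0.30 to 0.05 events per year.
ale_unpatched = annualized_loss_expectancy(2_000_000, 0.30)
ale_patched = annualized_loss_expectancy(2_000_000, 0.05)
risk_reduction = ale_unpatched - ale_patched
```

Framed this way, the patch is no longer "a vulnerability fixed" but an expected saving that can be weighed against its cost, or against accepting or transferring the risk.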


Big data processing techniques to streamline analytics


Addressing big data processing techniques requires innovative algorithms and programming, rather than simply adding hardware power. A widely used solution is indexing and partitioning the data to provide better access. GeoSpock's infin8 uses data indexing to process and organize data for subsecond data retrieval by ingesting and processing raw data at any scale, then creating an organized index that preserves every record of the original data set. Making the algorithms smarter has another interesting effect, too: it allows companies to reliably harvest data from images, video and audio, which opens the door to new generations of applications that can "look and hear." These advancements let machines scan footage and tag the objects or people they detect, and can also be used as part of companies' intelligence-gathering arsenal. Artificial intelligence provides big benefits in this realm. Advancements in artificial intelligence require large amounts of data to operate properly, and these AI tools provide a better view of the data, showing which parts of the data set are more useful and which lower-value parts can be deprioritized.
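
Indexing and partitioning in miniature: one preprocessing pass groups records under a key, so later lookups touch only the relevant partition instead of scanning the full set. A toy sketch:

```python
from collections import defaultdict

def build_index(records, key):
    """Partition records into an index keyed on one field, trading one
    pass of preprocessing for constant-time lookups afterwards."""
    index = defaultdict(list)
    for rec in records:
        index[rec[key]].append(rec)
    return index

records = [
    {"region": "eu", "value": 1},
    {"region": "us", "value": 2},
    {"region": "eu", "value": 3},
]
by_region = build_index(records, "region")
```

Production systems apply the same trade-off at vastly larger scale, with indexes sized and laid out for the access patterns the queries actually use.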


Why Business Leaders Shouldn’t Have Blind Faith in AI

Most machine learning algorithms are also bad at thinking about what Athey calls “what-if” scenarios. Like what would happen if a company were to change its prices, or if it hadn’t run a certain ad campaign. And here is where misguided faith in the accuracy of machine learning can become problematic in practice. Consider an algorithm designed to predict hotel-room occupancy based on observed prices, Athey says. It would look at historical occupancy rates and prices and draw the correct conclusion that the hotel is full when prices are high. However, if that predictive model was applied to optimize prices, it would lead to the conclusion that in order to get more people into your hotel, you should raise prices. “Which is of course wrong,” Athey says. “Just because higher prices are correlated with a full hotel doesn’t mean if you change your price you will sell more hotel rooms.”
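
Athey's point can be reproduced in a short simulation: when demand drives both price and occupancy, naively comparing high-price and low-price periods shows higher occupancy at higher prices, even though the true causal effect of a price rise at fixed demand is negative. The functional forms below are invented for illustration:

```python
import random

random.seed(0)

def occupancy(price: float, demand: float) -> float:
    """True causal model: demand fills rooms, higher prices empty them."""
    return max(0.0, min(1.0, 0.2 + 0.8 * demand - 0.3 * price))

# The hotel raises prices when demand is high (demand is the confounder).
data = []
for _ in range(1000):
    demand = random.random()
    price = 0.5 * demand + 0.1 * random.random()
    data.append((price, occupancy(price, demand)))

# Naive comparison: occupancy looks higher when prices are high...
high = [occ for p, occ in data if p > 0.3]
low = [occ for p, occ in data if p <= 0.3]
naive_gap = sum(high) / len(high) - sum(low) / len(low)

# ...but raising the price while holding demand fixed lowers occupancy.
causal_gap = occupancy(0.6, 0.5) - occupancy(0.3, 0.5)
```

The predictive model is "correct" about the correlation and still gives exactly the wrong pricing advice, which is the what-if failure Athey describes.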


Regulators can do more to encourage fintech innovation

A lot of work has been done in this area by the Consumer Financial Protection Bureau’s Office of Innovation and Project Catalyst, its predecessor, and by states like Arizona, which became the first state in the United States to adopt a regulatory sandbox statute. Yet these efforts, while welcome, fall short because they are largely focused on each agency’s policies and procedures and participant eligibility. Fintechs need more than process-oriented frameworks. To be successful, regulatory sandboxes require clearly articulated safeguards, terms of use and expectations on transparency. These matters are too important to be left to one-off negotiations. Regulatory sandboxes sound like a great idea, but what actually is a regulatory sandbox? In order for regulatory sandboxes to succeed, stakeholders need to have a common understanding — and acceptance — of the basic concept. First, regulatory sandboxes need a better name. Terms like “clinical trial,” “experiment,” or “lab” may better convey what is really needed.


This AI can predict your personality just by looking at your eyes

The project used artificial intelligence to track and monitor the eye movements of 42 individuals using tools from SensoMotoric Instruments. Those findings were then cross-checked with well-established questionnaires that define personality traits. Of the five key traits – openness, conscientiousness, extraversion, agreeableness and neuroticism – the technology easily identified four: neuroticism, extraversion, agreeableness and conscientiousness. The 42 people were fitted with an eye tracker and given five Australian dollars and 10 minutes to make a purchase in a university campus shop. When they returned they removed the eye tracker and filled in personality and curiosity questionnaires. The findings were analyzed to show how trait-specific eye movements vary across activities. While the study used a small sample and the authors said the predictions aren’t yet accurate enough for practical applications, it does shed light on the close link between personality and eye movements. Pupil diameter, for example, was important for predicting neuroticism.


Meet Your New Colleague: AI


How potential employees actually speak to AI is a different conversation than how potential employees should speak to AI, he added. That is, it’s unclear whether how a person treats a machine says anything about how that person would treat other people, and it’s unclear whether something like a person being rude to a machine agent should impact their job prospects. “We can certainly agree that we do care if it’s a human recruiting coordinator,” Mortenson said. But machines have no feelings or emotions and cannot be offended, so it would be easy to argue why employers shouldn’t care. Ultimately, “I do think we should care even if it is a machine,” Mortenson said. “I understand why we might care a little bit less, but I don’t think we can just discard that as a signal.” He gave the example of a report which found that this technology could have implications for how kids learn to communicate, teaching them that speaking harshly or impolitely to people has no consequences.


How digital technology is changing the world

Hitachi is working with major manufacturers on their digitisation journey, moving away from the conventional customer/supplier relationship and focusing on digital innovation through co-creation. This approach is already delivering results. Swedish ferry operator Stena Line, for instance, wanted to optimise many aspects of its operations, to reduce costs and inefficiencies such as excess fuel consumption. Hitachi gathered data from its ships’ operations and functions and used it to develop an AI algorithm that calculated an optimal way of steering Stena’s vessels and reducing fuel consumption. Mr Ramachander says: “We couldn’t have done that in isolation, without the shipping company. Co-creation is about working with our clients to solve their problems. In a move away from the traditional customer/supplier relationship, we are aiming to become their digital innovation business partner.”


Managing to the Next Century - The 5 Big Things For Agile Transitions


In the new agile world, it is no longer possible to tell people to do a particular task, nor to plan at the same level of breadth or depth. Work is defined, managed and executed by empowered teams who are focused not on the task but on the outcome they are trying to achieve. Quality, including technical debt, is treated in the same way that value is treated, allowing the team and the business to make explicit, transparent decisions on trade-offs. But moving away from traditionally managed work to a more agile approach requires more than managers stepping away ... At the very heart of the agile organization is a collection of teams, self-organized and empowered to make decisions. They have all the right skills to deliver value and are supported by an organization that fills in any gaps and helps them to get better. At scale that means teams of teams and the adoption of practices to ensure that dependencies are effectively managed.


Banking on artificial intelligence

Automation and handling masses of data is very valuable indeed but front-line services are also receiving attention and it is here, when married with human intervention, that excitement lies around the use of AI. The concept lies in being able to enhance the service provided to customers via virtual assistants, chatbots, robo-advisors and other analytical tools, all of which can be made more effective when machine learning and AI are applied. Providing better customer service is a good use for AI and something that all banks are focused on. Indeed, banks are commonly using chatbots and voicebots to interact with customers and solve basic problems without the need for human backup. Avika says: “Banks are using machine learning to improve customer engagement in order to increase customer satisfaction. For example, applying machine learning to unstructured complaints data can help a bank to group the complaints into categories, allowing them to tackle the areas that will have the biggest customer impact first. ...”



Quote for the day:


"You may be disappointed if you fail, but you are doomed if you don't try." -- Beverly Sills


Daily Tech Digest - October 12, 2018


The first step in reducing TCO is understanding what it is and why current solutions are driving it so high. A data protection TCO analysis should do what its name implies: calculate the TOTAL cost of ownership. For data protection, this means adding up all the hard costs, like data protection storage, the data protection network and data protection software. It should also include periodic costs like hardware and software maintenance (including support) as well as subscription costs like cloud storage or cloud compute. Calculating data protection infrastructure TCO also means adding up the operating costs associated with learning and operating the data protection system. Most data protection solutions are not self-service or designed for IT generalists; they need a well-trained administrator familiar with the infrastructure to interact with it. Operating costs are particularly important because certain complicated data protection tasks – like a full restore – will require a knowledgeable person to complete.
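The cost buckets described above can be sketched as a simple model. All category names and figures below are illustrative assumptions, not numbers from the article:

```python
# Illustrative TCO model for a data protection system (all figures hypothetical).
def data_protection_tco(years: int) -> int:
    hard_costs = {                     # one-time capital costs
        "protection_storage": 120_000,
        "protection_network": 30_000,
        "protection_software": 45_000,
    }
    periodic_costs = {                 # recurring per-year costs
        "hw_sw_maintenance": 25_000,   # includes support contracts
        "cloud_storage": 18_000,       # subscription
    }
    operating_costs = {                # people costs, often underestimated
        "admin_salary_share": 40_000,  # trained administrator's time
        "training": 5_000,
    }
    one_time = sum(hard_costs.values())
    per_year = sum(periodic_costs.values()) + sum(operating_costs.values())
    return one_time + years * per_year

print(f"5-year TCO: ${data_protection_tco(5):,}")  # → 5-year TCO: $635,000
```

The point of such a model is that recurring and people costs dominate over a multi-year horizon, which is exactly why the article stresses operating costs.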



Taking Agile Transformations Beyond the Tipping Point

Not all leaders can make this transition. For example, one Asia-Pacific company undergoing an agile transformation replaced one-quarter of its top 40 leaders with individuals who better embodied agile values, such as collaboration and teamwork. Middle managers will also face challenges. Those who have grown up inside silos will need to learn how to manage cross-functional teams and delegate decision making to employees closer to the field. They may even need to return to doing the daily work rather than only managing other people. The coordination activities that consumed so much of managers’ time are increasingly handled within and between teams. While agile may be a fundamentally different way of working, many of the steps to become an agile organization are familiar to any executive who has gone through a successful corporate transformation. (See Exhibit 2.) The steps of committing, designing, preparing, and refining are variations of any large-scale change.


Detail of Dutch reaction to Russian cyber attack made public deliberately


The attackers used a rental car parked close to the OPCW building in The Hague. The hackers then attempted to use Pineapples to break into the WiFi network of the organisation. Pineapples are devices usually used for intercepting network traffic. The hackers were also caught using antennas and signal amplifiers, and other equipment the MIVD considers “specifically used during hacking operations”. During the operation, the MIVD found laptops with extra batteries (which the MIVD said were purchased in the Netherlands), and mobile phones with 4G connectivity, which the hackers tried to destroy during their arrest. Eichelsheim reiterated that the excuse that the Russians might’ve simply been on holiday won’t fly. “They were caught with very specific equipment, entered on diplomatic visas, and were found carrying €20,000 and $20,000 in cash. That’s not a holiday.”


A Day In The Life Of Ms. Smith: How IoT And IIoT Enhance Our Lives

Ms. Smith walks out of the building. An RFID reader at the door scans her badge as she walks past it. Computer vision sees her approaching the exit and walking into the parking lot. The drive home is much like her drive to work. Computer vision devices on the road monitor and control traffic signals. Her ride home is slow—but again, she misses most of the red lights. Fifteen minutes before she gets home, the thermostat automatically turns on the heat (or cooling) so that the temperature is comfortable when she comes in the door. Finally at home, she walks inside, and the lights turn on. To relax, she turns on the TV, and the lights in the room automatically dim, making it easier for her to watch her favorite show. As she’s ready for bed, she says, “Turn down the lights,” to her digital assistant. “Oh, and wake me up at 5:30,” she says. “No, make it 6.” Lights in the other parts of her house dim, the lights in her bedroom slowly fade, and so does Ms. Smith.


5 CRM trends for 2018

Applying machine learning to CRM data has been a difficult process for most organizations. To do this traditionally you would need machine learning expertise on staff, developers and the drive to build the solution. Alternatively, you would have to build and maintain integration between your CRM system and an external machine learning service. That’s starting to change. “Machine learning is now built directly into CRM products,” explains Julian Poulter, research director for CRM and CX (customer experience) at Gartner. “We have seen about 30 use cases applying machine learning to CRM, but industry adoption is slow so far. The use cases include recommending alternative products, lead scoring and ecommerce recommendations.” That means the kinds of product recommendation features offered by Amazon and other ecommerce providers are within reach of many more organizations. But that’s not the only way machine learning can help.
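As a toy illustration of the complaint-grouping use case Poulter describes, the bucketing below uses simple keyword matching as a stand-in for a trained classifier; the categories and keywords are invented:

```python
from collections import defaultdict

# Hypothetical keyword map; a real CRM would use a trained text classifier.
CATEGORIES = {
    "fees": ["charge", "fee", "overdraft"],
    "card": ["card", "declined", "pin"],
    "app": ["app", "login", "crash"],
}

def categorise(complaints):
    """Group free-text complaints into coarse categories by keyword match."""
    groups = defaultdict(list)
    for text in complaints:
        lowered = text.lower()
        label = next((cat for cat, kws in CATEGORIES.items()
                      if any(kw in lowered for kw in kws)), "other")
        groups[label].append(text)
    return groups

groups = categorise([
    "Unexpected overdraft fee on my account",
    "My card was declined abroad",
    "The app crashes on login",
])
# The largest category points at the area with the biggest customer impact.
```

Once complaints are bucketed, the bank can rank categories by volume and tackle the highest-impact areas first, as the quote suggests.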


Spinnaker is the Kubernetes of Continuous Delivery

Despite its humble and slow start, Spinnaker is enjoying widespread adoption. Today, Spinnaker is backed by industry leaders like Microsoft, Google, Netflix, Oracle and so on. It’s supported by all major cloud providers, including, but not limited to, AWS, Google Cloud Platform, Microsoft Azure and OpenStack. Spinnaker users include big names like Capital One, Adobe, Schibsted, LookOut and more. There is a growing vendor ecosystem around it which includes players like Mirantis, Armory and OpsMx. ... There were roughly 400 people at the event, representing over 125 companies and over 16 countries. During the Summit, the community announced the governance structure for the project. “Initially, there will be a steering committee and a technical oversight committee. At the moment Google and Netflix are steering the governance body, but we would like to see more diversity,” said Steven Kim, Google’s Software Engineering Manager who leads the Google team that works on Spinnaker.


Anomaly detection methods unleash microservices performance

A symptom-manifestation-cause approach involves working back from external signs of poor performance to internal manifestations of a problem to then investigate likely root causes. For example, the symptom of increased response times can be tracked to the internal manifestation of excess latency in message passing between the app's services, which occurred because of a failing network switch. Other potential root causes exist for those same symptoms and manifestation, however. For example, an application design using overly large message requests, or too many small messages, would cause the same issue. These root causes would be found by different tools and resolved by different people. Change-impact analysis creates broad categories that lump together changes in component-level metrics based on their effect on external performance measures. These metric categories might include network link latency, database queue depth and CPU utilization, grouped according to assessments such as excessive resource usage, cost overages or response time.
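The change-impact grouping described above can be sketched minimally as follows; the metric names, category labels and mapping are hypothetical:

```python
# Hypothetical change-impact analysis: component-level metrics are bucketed
# by the external performance effect they drive.
METRIC_EFFECTS = {
    "network_link_latency_ms": "response_time",
    "db_queue_depth": "response_time",
    "cpu_utilization_pct": "excessive_resource_usage",
    "cloud_spend_usd": "cost_overage",
}

def change_impact(deltas):
    """Group observed metric changes by their external performance impact."""
    impact = {}
    for metric, delta in deltas.items():
        effect = METRIC_EFFECTS.get(metric, "unclassified")
        impact.setdefault(effect, []).append((metric, delta))
    return impact

# Two component-level changes, lumped into the external categories they affect.
report = change_impact({"network_link_latency_ms": 40, "cpu_utilization_pct": 25})
```

Grouping this way lets different teams pick up the categories their tools can resolve, which mirrors the article's point that the same symptom can have several root causes owned by different people.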


Unlock distributed analytics with a microservices approach


Combining BI and analytics software with a microservices approach enables average end users to drill down into data with specific types of queries. When it comes time to visualize that data, organizations must decide whether to build customized visualization tools in-house or adopt a third-party option. A vast number of options exist for visualization, which include web-based platforms and stand-alone, open source tools. These tools tend to focus on a range of data interaction, from complex depictions of near-time data to simple renderings. However, big data sources have their limitations. Streaming and unstructured data sources present challenges that mainstream analytical tools struggle to depict. For example, some query connections won't accept data set blending, which limits exploratory analysis. Teams may also encounter system timeouts, out-of-memory exceptions, long query waits and rendering limitations. Even so, distributed analytics approaches can excel with big data.


Digital transformation in 2019: Lessons learned the hard way

Because of the focus on the technology components, the people side of the changes required for digital transformation often goes under-addressed, yet is arguably the key success factor. That's because the people in the organization have to carry out the digital transformation, yet are often inadequately equipped to do so from a skill, culture, mindset, inclination, and talent perspective. Many organizations have had their digital change initiatives crash upon the shoals of insufficient human capability to carry them out or an inadequately enabling environment. Currently, a lack of appropriately skilled personnel ranks in the top five obstacles to digital transformation and is reported by 39 percent of organizations. The good news is that improved organizational focus and improved techniques for upskilling workers to support digital transformation have been arriving. Expect to see more of both in 2019. The smart digital leader will use the resources of HR's L&D department to help drive them.


Multicloud does not eliminate vendor lockin

You might think you can avoid the trade-off by using containers or otherwise writing applications so they are portable. But there is a trade-off there as well. Containers are great, and they do provide cloud-to-cloud portability, but you’ll have to modify most applications to take full advantage of containers. That could be an even bigger cost than going cloud-native. Is it worth the avoided lockin? That’s a question you’ll need to answer for each case. Moreover, writing applications so they are portable typically leads to the least-common-denominator approach to be able to work with all platforms. And that means that they will not work well everywhere, because they are not cloud-native. I suppose you could write portable applications that are cloud-native to multiple clouds, but then you’re really writing the application multiple times in advance and just using one instance at a time. That’s really complex and expensive. Lockin is unavoidable. But lockin is a choice we all must make in several areas: language, tooling, architecture, and, yes, platform.



Quote for the day:


"Leadership cannot just go along to get along. Leadership must meet the moral challenge of the day." -- Jesse Jackson


Daily Tech Digest - October 11, 2018

No company seems to be safe anymore. In 2018 alone, we have seen the social media giant Facebook reporting data breaches twice, affecting millions of users each time. As if this wasn’t enough, a couple of days ago Google reported exposing the data of more than 500K users of its social network Google+ between 2015 & March 2018. The ironic part is that Google reported no misuse of data but in response to this incident has decided to completely shut down the portal. Huh? Apparently, Google didn’t disclose this earlier, citing fear of regulatory scrutiny. Wondering if we should still trust these tech giants with our personal data? European data regulation like the GDPR is a step in the right direction in protecting customers’ data, and these tech companies are now facing multibillion-dollar lawsuits. On a side note, the Crypto industry is facing a similar situation, with more than $927 million worth of digital money stolen to date this year — 3.5 times more than in 2017.


The benefits of IAM processes, strategies for digitized companies


"Companies are using more and more systems than they ever have before. They're collecting more data, [and] the employees' job roles are changing faster," he said, adding that identity access management sits at the nexus of all those dynamics. Consider how an employee may require access to specific data or certain applications to work on a project, but will not need that access on an ongoing basis, he said. IT should be capable of changing access rights of not just that employee, but dozens, hundreds or even thousands of employees, as needed. However, not all organizations are maturing their IAM practices, Maxim said. "There are still a lot of companies that are doing very little with IAM -- they're working on spreadsheets, or they've reached a limit to what they could do with their homegrown systems," he said. However, he noted that many of them are "actively looking to find ways to streamline what they're doing."


Successful data-driven companies must balance human and machine roles

The anticipated redistribution of work between humans and machines may displace 75 million jobs, but it’s likely to create as many as 133 million new ones, too, according to the report. This major shift in jobs may not reassure those of you who believe technology is a threat to your role. But the reality is that smarter technologies provide an amazing opportunity to focus on the ways that we create the most value for our organizations. Creativity and strategic thinking remain distinctly human advantages. When paired with the increased processing capacity of machines, there is plenty of room to be optimistic about the future.

Understanding the strengths of people vs. machines

We are far from a reality where we can trust machines to make business decisions with human-like judgment and contextual understanding. Today, we trust machines to automate tasks and analysis in areas that are heavily parameterized and minimally risky. 


Disaster Recovery: Data Center or Host Infrastructure Reroute


Regardless of which approach you take, even if everything works flawlessly, you still need to address the ‘brownout’ phenomenon, or the time it takes for services to be restored at the primary or to a secondary location. It is even more important to automatically send people to a different location if performance is impaired. Many people have heard of GSLB, and while many use it today, it is not part of their comprehensive DoS approach. But it should be. If your goal with your DDoS mitigation solution is to ensure uninterrupted service in addition to meeting your approved performance SLA, then dynamic GSLB, or infrastructure-based performance load balancing, has to be an integral part of any design. We can deploy this technology purely defensively, as we have traditionally done with all DoS investments, or we can change the paradigm and deploy the technology to help us exceed expectations. This allows us to give each individual user the best experience possible.


Suspected NASA Hacker Busted After Boasting About Exploits

The suspect was identified after a year-long investigation by the Polizia Postale - Italy's postal police - via its CNAIPIC group, which since 2008 has served as the national anti-crime computer center for the protection of critical infrastructure. It regularly investigates cybercrime. After identifying the suspect, police say they executed a search that resulted in the seizure of computing devices, which have tied the suspect to attacks against at least 60 Italian websites. In addition, rather than just being a member of the "Master Italian Hackers," the suspect appears to have been one of its leaders, authorities say. The Italian suspect is the latest in a long list of admitted hackers whose "too much information sharing" habits got them in trouble. To pick just one example: Last year, Russian-born Alexander Konstantinovich Tverdokhlebov, who emigrated to the U.S. in 2007, later becoming a naturalized citizen, pleaded guilty in U.S. federal court to having been "an active member of several highly exclusive Russian-speaking cybercrime forums."


SoftBank has a lot to worry about if it strikes this deal with WeWork


It’s very possible that the talks for SoftBank Vision Fund to invest up to $20 billion into WeWork will fail, Recode was told. Here are some of the hazards that could trip up either side over the next few weeks. The Vision Fund’s single biggest outside investor, the Saudi government, which holds a 45 percent stake, is under increasing political scrutiny after allegations it is behind the disappearance of U.S.-based Washington Post journalist Jamal Khashoggi. Backing from foreign governments has always loomed as a major liability for venture capital investors. The SoftBank-Saudi ties are not new. But the Khashoggi revelations make it particularly bad timing for a deal, as WeWork could face reputational risk for taking money from a government that’s embroiled in such a high-profile human rights case. “If all that’s alleged is true, WeWork will be in bed with a regime that has expressed brazen disregard for virtually any norm of international politics,” said Chris Meserole, a foreign policy fellow at The Brookings Institution.


3 things you should do to prevent cyber attacks

The threat landscape is constantly evolving, with cyber criminals always looking for new exploits and studying one another’s tactics. As soon as a particular exploit proves successful, crooks the world over will adopt and refine it.  The majority of successful attacks come in the immediate aftermath of the popularisation of a particular attack method. That’s because its success is predicated on the fact that many organisations are vulnerable to it. Once the trend becomes common knowledge, organisations learn how it works and address it.  You can greatly minimise your chances of coming under attack by staying informed about growing trends. There are many ISACs (Information Sharing and Analysis Centres) that you can use to gather real-time threat intelligence.   When it comes to addressing new attack methods, processes and policies are relatively resilient and will perhaps only need to be tweaked. You are much more likely to need to update your software and web applications.


Automate everything or get left behind

Discovery and auto-monitoring. Sophisticated monitoring solutions use an increasing range of methods, including direct access to hosts via SSH and indirect access via configuration repositories like ActiveDirectory and services like Windows Discovery, to extract facts from existing infrastructure and speed up monitoring configuration by operators. Leading-edge products are now moving towards automating the process completely: creating comprehensive maps of infrastructure, apps, and complete business services and monitoring these things without the need for any manual intervention or direction. Alert processing, notification, escalation, integration. Alerting is, of course, a powerful form of automation. It entails decision-making, which may be simple or significantly more complex (e.g., several metrics, from separate systems, have entered states predictive of a particular kind of known failure for a critical business service). It involves sophisticated assignment and escalation based on issue, team rotas, time/date and other variables.
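A composite alert rule of the kind described — several metrics from separate systems jointly predicting a known failure, then escalated according to the team rota — might be sketched as follows; the metric names, thresholds and rota are hypothetical:

```python
# Hypothetical composite alert: fire only when several metrics, from separate
# systems, together predict a known failure mode for a business service.
def checkout_failure_predicted(metrics: dict) -> bool:
    return (metrics.get("db_replication_lag_s", 0) > 30
            and metrics.get("queue_depth", 0) > 1000
            and metrics.get("error_rate_pct", 0) > 2.0)

def route_alert(metrics, oncall_rota, hour):
    """Assign the alert to the on-call engineer for the current shift."""
    if checkout_failure_predicted(metrics):
        return oncall_rota["day" if 8 <= hour < 20 else "night"]
    return None  # no composite failure predicted; suppress the alert

assignee = route_alert(
    {"db_replication_lag_s": 45, "queue_depth": 1500, "error_rate_pct": 3.1},
    {"day": "alice", "night": "bob"},
    hour=22,
)
print(assignee)  # → bob
```

Requiring all three signals together is what separates this kind of predictive alerting from single-threshold alarms, which is the "simple versus significantly more complex decision-making" distinction the paragraph draws.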


SD-WAN Adolescence Is About Interoperability and Scalability

Almost everyone (enterprises, CSPs, standards bodies, vendors) I spoke with acknowledges that data-plane interoperability in SD-WAN is unlikely in the near future. Enterprises and CSPs are telling me they don’t need it yet. This brings back memories of the old IPsec interoperability wars. Trying to create interoperable meshes of nodes from disparate vendors today is really putting the cart before the horse because we need to start from the control plane. CSPs that are in the process of building or customizing their orchestration systems to integrate with SD-WAN offerings say that having interoperability at the control and management level allows for coordination between multiple SD-WAN domains. It also makes switching vendors feasible with much less pain. The present efforts focus on interoperability at the northbound API level. They govern the APIs used to provision and control SD-WAN deployments.


Security warning: Attackers are using these five hacking tools to target you

Perhaps the most potentially damaging of the dangers detailed in the report are remote access trojans - malware which is secretly installed onto an infected system, providing a backdoor to observe all activity and enabling the attacker to carry out commands which lead to data being stolen. The particular example given in the report is JBiFrost, a trojan typically employed by low-skilled cyber criminals but with the capability to be exploited by state actors. What makes JBiFrost so potent is that it is cross-platform, with the ability to operate on Windows, Linux, Mac OS X and Android. Often delivered via a phishing email, it allows attackers to move across networks and install additional software. This particular RAT is publicly available, and the cyber security agencies said they have observed it being used in targeted attacks against critical national infrastructure owners and their supply chain operators.



Quote for the day:


"The level of morale is a good barometer of how each of your people is experiencing your leadership." -- Danny Cox


Daily Tech Digest - October 10, 2018

Underlying all this likely nonsense is the obvious fact that almost every computer chip in the world is made outside of the U.S., often in Asian locations. I used to laugh when I was told that I couldn’t bring my Lenovo laptop in, but I could bring in my Dell laptop, which itself was full of nothing but Asian-made chips. If you are worried about supply chain threats, and you should be, it’s not just one little purported spy chip you should be worried about. You can’t find a computerized device in the U.S. that doesn’t have foreign-made chips. There isn’t some secret U.S. government agency that goes around inspecting all those chips for security holes or backdoors before they get put into all our computers. To me it is a hilarious idea that the Chinese would have to insert a specialized, tiny spy chip when it would be far easier to put an intentional weakness or backdoor into any of the hundreds of chips that are used in every computer on the planet. It would be far easier to hide in the weeds than to create a dedicated spy chip that any hardware expert would notice and question.



Overcoming the top obstacles to digital transformation success

You should begin developing a solid digital transformation strategy by first establishing a small, integrated governance team with equal representation and influence from the business and IT, including security. The governance team will enable a clear line of communication between digital and legacy IT teams and ensure initiatives are synchronized so appropriate investments are made to harden core systems while securely exposing functionality that enables digital initiatives. While security was not cited as one of the top three barriers, it remains a concern. With the highly-fragmented state of data across most enterprises today, exposing data sources to new digital systems creates yet another opportunity for attack. IT and security are integral to governance to limit risk exposure as new digital capabilities are introduced. As you launch digital initiatives, especially if you are behind the digital curve, partner with digital leaders who can provide the capabilities you need to get your products to market securely while you continue learning and developing internally.


IT departments struggle to balance innovation with everyday IT operations


“Organisations have become acutely aware of the critical role technology now plays in overall business strategy, from enabling a more productive and connected workforce to increasing market share and customer loyalty,” she said. “The Insight Intelligent Technology Index signifies how competing demands on IT are inhibiting their ability to plan and innovate.” The index, which queried 200 IT professionals, also found 79% of IT decision makers felt there were not enough resources to effectively support the demand for innovation, with another 33% saying innovation was expected of them despite existing processes, practices and business operations not evolving in ways that allowed them to do so. Another 30% cited a lack of clearly defined roles and responsibilities in the organisation as a reason for the lack of innovation.


CEO Fraud: Barriers to Entry Falling, Security Firm Warns

To hide their efforts, attackers may alter the rules for a compromised email account to divert copies of their fraudulent messages - and potentially replies - to other, attacker-controlled accounts, Digital Shadows notes. Such fraud can take the form of submitting false invoices or modifying legitimate ones, swapping in details for accounts controlled by attackers. Because BEC scams typically exploit weak corporate controls, organizations can take many actions to better defend themselves, Digital Shadows says. One of the most basic steps is to ensure that email accounts always have two-step verification enabled. That at least prevents an attacker who has the login credentials from accessing the account. Controls around wire transfers can also be shored up, Digital Shadows says. Fraudsters have had success, for example, by compromising the email account of a CEO and then sending an email to the finance department saying a payment needs to be made.


Discovering Blind Spots in the Data


Usually, there’s a trade-off between precision and recall. Improving precision can drop the recall and vice versa. It’s up to the business stakeholders to tell the data scientists which is more important: identifying more actual escalations at the cost of having more false escalations classified as escalations (high recall, low precision)? Or minimizing false escalations at the cost of missing many actual escalations (low recall, high precision)? If the business stakeholders go for high recall and low precision, they will need to engage more people to deal with a higher number of real escalations and possibly many false escalations. If they choose low recall and high precision, they can engage fewer people to deal with the escalations but will risk having the model miss many real escalations. In our case, the business stakeholders initially preferred high precision at the cost of lower recall, so that they didn’t have to deal with a lot of false escalation alerts. Our dataset had a few features whose value changed with time. This introduced us to a phenomenon called signal leakage.
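The trade-off can be made concrete with the standard definitions; the counts below are illustrative, not from the project:

```python
# Precision/recall on escalation predictions, from a confusion-matrix tally.
def precision_recall(tp: int, fp: int, fn: int):
    precision = tp / (tp + fp)  # of flagged escalations, how many were real
    recall = tp / (tp + fn)     # of real escalations, how many were caught
    return precision, recall

# High-recall tuning: catches 90% of real escalations, but many false alarms.
p1, r1 = precision_recall(tp=90, fp=60, fn=10)
# High-precision tuning: few false alarms, but half the escalations are missed.
p2, r2 = precision_recall(tp=50, fp=5, fn=50)
print(f"high recall:    precision={p1:.2f} recall={r1:.2f}")
print(f"high precision: precision={p2:.2f} recall={r2:.2f}")
```

The staffing consequence follows directly from the numbers: the high-recall setting surfaces 150 alerts (90 real, 60 false) to triage, while the high-precision setting surfaces only 55 but silently misses 50 real escalations.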


Why 60% of IT security pros want to quit their jobs right now

The main reasons cited by the IT pros who wanted to leave were job dissatisfaction and the lack of growth opportunities within their companies, said the release. The survey gathered data from more than 9,000 IT security professionals and decision-makers in the enterprise, said the release. This survey could give businesses better insight into how to retain and support their current tech talent. Other top reasons for employees looking to quit include unhealthy work environments (53%), absence of IT security prioritization from executives or upper management (46%), unclear job expectations (37%), and lack of mentorship (30%), said the release. Buy-in from upper management is crucial for security efforts, since only 38% of CEOs are really engaged in cybersecurity. This low engagement percentage suggests that executives don't prioritize cybersecurity as much as other aspects of the business, which further validates the dissatisfaction IT professionals are feeling.


NASA is using HoloLens AR headsets to build its new spacecraft faster


In the headset, the workers can see holograms displaying models that are created through engineering design software from Scope AR. Models of parts and labels are overlaid on already assembled pieces of spacecraft. Information like torquing instructions—how to twist things—can be displayed right on top of the holes to which they are relevant, and workers can see what the finished product will look like. The virtual models around the workers are even color-coded to the role of the person using the headset. For Jory’s team, which is currently constructing the heat shield skeleton of Orion, the new technology takes the place of a 1,500-page binder full of written work instructions. Lockheed is expanding its use of augmented reality after seeing some dramatic effects during testing. Technicians needed far less time to get familiar with and prepare for a new task or to understand and perform processes like drilling holes and twisting fasteners. These results are prompting the organization to expand its ambitions for the headsets: one day it hopes to use them in space.


Why today's containers and microservices will be tomorrow's legacy sooner than you think

The industry will be stuck with container platforms because these are interesting technologies that give the operators a taste of the power of running massive jobs at scale. Unfortunately, the ROI of maintaining that platform is elusive, since very few companies running these platforms will ever reach a point where they can even optimize job scheduling, and the cost of maintaining the container platform itself competes with the modest improvements in the developer's user experience. A similar phenomenon was seen with OpenStack half a decade ago, when, in the rush to have an in-house cloud, many companies grossly underestimated the short- and long-term associated costs and are now stuck maintaining OpenStack in perpetuity for the sake of the unnamed applications running on top of it. Well, that's a depressing thought, isn't it? But true. And why? Well, because technology change is hard. 


“Given the way the data was captured and displayed, it would not be readily available or searchable, but [the information commissioner] considers that a motivated individual could locate and extract the data in a more permanent way,” the notice said. Although the USB stick contained more than 1,000 files overall, just 1% of this information could be classified as being personal in nature. Also, a subsequent investigation by the ICO revealed less than 2% of the airport’s 6,500-strong workforce had received data protection training.  “Given that Heathrow Airport is Europe’s busiest airport, where high-level security should be inherent, loss or unauthorised disclosure of personal data of staff could have presented a greater risk if found by individuals who had not handled the data responsibly,” the penalty notice said. “Taking into account all of the above, the commissioner has decided that the penalty is £120,000.” According to the report, the USB stick was found in Kilburn, west London, on 16 October 2017, before being handed in to a national newspaper 10 days later.


Behavioral Biometrics: Key Challenges

As more companies move away from passwords toward behavioral biometrics, they face new challenges, says Rajiv Dholakia, vice president of products at Nok Nok Labs, a company focused on next-generation authentication. Behavioral biometrics relies on a behavioral trait of an individual, rather than a physical trait. Examples include speech patterns, signatures and keystrokes. "There are no standards as such in this area on how the information is collected, how it's stored and how it's processed," Dholakia says in an interview with Information Security Media Group. "And therefore, there may be some privacy hazards associated with the technique unless a manufacturer makes it super clear exactly what is being collected, how it's being processed and whether that profile data is anonymized." Other behavioral biometrics issues include accuracy and concerns about passive collection of information from users, he says. "Moreover, when you are using behavioral biometrics, you have to be super certain that the information coming from all sensors is coming from a real device as opposed to a virtual machine," he says.
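As a toy illustration of the keystroke example above, the sketch below compares the inter-key timing of a login attempt against an enrolled typing profile; the enrollment scheme, timings and tolerance are assumptions, far simpler than any production system:

```python
from statistics import mean, stdev

def enrolled_profile(samples):
    """samples: lists of inter-key intervals (seconds) captured at enrollment."""
    flat = [interval for sample in samples for interval in sample]
    return mean(flat), stdev(flat)

def matches_profile(profile, attempt, tolerance=2.0):
    """Accept if the attempt's mean timing is within `tolerance` std devs."""
    mu, sigma = profile
    return abs(mean(attempt) - mu) <= tolerance * sigma

profile = enrolled_profile([[0.12, 0.15, 0.11], [0.13, 0.14, 0.12]])
print(matches_profile(profile, [0.12, 0.13, 0.14]))   # similar rhythm
print(matches_profile(profile, [0.50, 0.60, 0.50]))   # very different rhythm
```

Even this toy shows the issues Dholakia raises: the timing profile is sensitive personal data with no standard for how it is stored or anonymized, and the intervals are only trustworthy if they come from a real device's sensors.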



Quote for the day:


"He who cannot be a good follower cannot be a good leader." -- Aristotle