Daily Tech Digest - January 27, 2019

There seems to be an obvious and somewhat necessary solution here: ensure that employees within an organisation are able to understand their IoT data and apply it to their own area of expertise for maximum business benefit. One way to resolve this skills shortage is to train Millennials to drive IoT projects forward in the future. Millennials are our future workforce and, given they are used to being constantly connected, they are perfectly placed to drive further connectivity. You’ll hear this described as the sharing economy. As we enter this more circular economy, we need to equip employees with the necessary skills in AI, machine learning (ML) and deep learning (DL). By opening up the opportunity for individuals to specialise in these areas, businesses will be able to apply analytics to streaming data for deeper insights. This will enable more predictive decisions to be made and aligns closely with what a data scientist does day to day.


Blockchain Technology: A Global Perspective


The world is innovating without permission. The real promise of blockchain technology lies in government applications. Delaware stores company incorporation records on a blockchain; Sweden now runs real-estate transactions on a blockchain; Singapore issues invoices on a blockchain; and the UK uses blockchain-based monitoring for the distribution of grants. In Estonia, e-citizen records, e-payment keys and medical records are secured on a blockchain, and Ghana records its land registry using blockchain technology. Last but not least is the most promising initiative of all: Smart Dubai’s Dubai Blockchain Strategy. Dubai is on a fast track to implement blockchain in government operations and aims to be the first city in the world to be completely powered by blockchain by 2020. After producing a bunch of POCs, evaluating hundreds of blockchain innovations, and supporting tens of game-changing startups around the globe, here are a few easy-to-implement, high-potential blockchain use cases, for both the public and private sectors, that could change society in a phenomenal way.


Is It Possible To Learn Data Science & Machine Learning Without Mathematics?


For machine learning, the real prerequisite skill is data analysis; for beginners, there is no need to know calculus and linear algebra in order to build a model that makes accurate predictions. Mathematics is particularly significant only for those involved in machine learning research in an academic setting, or for a few subsets of more advanced data scientists. There are people in industry at high levels who use advanced math on a regular basis, people working on bleeding-edge tools who are pushing the boundaries of machine learning. People at companies like Google and Facebook certainly use calculus, linear algebra, and more advanced math routinely in their work. The bottom line is that in industry, most data scientists just don’t do much higher-level math; in reality, they spend a huge amount of their time getting data, cleaning data, and exploring data. The truth is that 80% of what people do is data munging and data visualization.
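The "data munging" that dominates a working data scientist's time rarely involves any math at all. A minimal sketch, using hypothetical records and field names, of the kind of normalization that happens before any modeling:

```python
# Illustrative "data munging": normalizing messy records before modeling.
# The records and field names here are hypothetical.
raw_records = [
    {"name": " Alice ", "age": "34", "city": "NYC"},
    {"name": "Bob", "age": "", "city": "nyc"},
    {"name": "carol", "age": "41", "city": "Boston"},
]

def clean(record):
    """Trim whitespace, title-case names, normalize city spellings, parse ages."""
    city = record["city"].strip()
    return {
        "name": record["name"].strip().title(),
        "age": int(record["age"]) if record["age"] else None,
        "city": city.upper() if city.lower() == "nyc" else city.title(),
    }

cleaned = [clean(r) for r in raw_records]
print(cleaned[0])  # {'name': 'Alice', 'age': 34, 'city': 'NYC'}
```

No calculus or linear algebra appears anywhere; the work is careful handling of missing values and inconsistent formats.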


This Trojan infects Chrome browser extensions, spoofs searches to steal cryptocurrency

Different infection vectors are in place depending on the type of browser found on an infected system. Razy is able to install malicious browser extensions, which is nothing new. However, the Trojan is also able to infect already-installed, legitimate extensions, by disabling integrity checks for extensions and automatic updates for browsers. In the case of Google Chrome, Razy edits the chrome.dll file to disable extension integrity checks and then renames this file to break the standard pathway. Registry keys are then created to disable browser updates. "We have encountered cases where different Chrome extensions were infected," the researchers say. "One extension, in particular, is worth mentioning: Chrome Media Router is a component of the service with the same name in browsers based on Chromium. It is present on all devices where the Chrome browser is installed, although it is not shown in the list of installed extensions."


Machine Learning on Code is a field of research that is just starting to materialize into enterprise products. One of the pioneers of this movement is a company called source{d}, which is building a series of open source projects that turn code into actionable data and train machine learning models to help developers respect technical guidelines. With every company quickly becoming a software company, intangible assets such as code represent a larger share of their market value. Companies should therefore strive to understand their codebase through meaningful analytic reports to inform engineering decisions and develop a competitive advantage for the business. Managers, for example, can use tools like the open source source{d} engine to easily retrieve and analyze all their Git repositories via a friendly SQL API. They can run it from any Unix system, and it will automatically parse their company’s source code in a language-agnostic way to identify trends and measure progress on key digital transformation initiatives.


Information theory holds surprises for machine learning

Information theory provides bounds on just how optimal each layer is, in terms of how well it can balance the competing demands of compression and prediction. "A lot of times when you have a neural network and it learns to map faces to names, or pictures to numerical digits, or amazing things like French text to English text, it has a lot of intermediate hidden layers that information flows through," says Artemy Kolchinsky, an SFI Postdoctoral Fellow and the study's lead author. "So there's this long-standing idea that as raw inputs get transformed to these intermediate representations, the system is trading prediction for compression, and building higher-level concepts through this information bottleneck." However, Kolchinsky and his collaborators Brendan Tracey (SFI, MIT) and Steven Van Kuyk (University of Wellington) uncovered a surprising weakness when they applied this explanation to common classification problems, where each input has one correct output (e.g., in which each picture can either be of a cat or of a dog). In such cases, they found that classifiers with many layers generally do not give up some prediction for improved compression.
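The trade-off Kolchinsky describes is usually formalized as the information bottleneck objective. This is the standard formulation due to Tishby and colleagues, supplied here for context rather than quoted from the study: choose a compressed representation T of the input X that remains informative about the output Y.

```latex
\min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y)
```

Here I(·;·) denotes mutual information and β > 0 controls how much prediction a layer is willing to trade for compression. The study's surprise is that, on deterministic classification tasks (each picture is definitely a cat or a dog), deep classifiers do not appear to make this trade at all.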


The controversies of blockchain governance and rough consensus

On-chain governance describes the manner of proposing changes to a cryptocurrency and its underlying blockchain through a defined set of processes, rather than a simple majority consensus. The core differences between blockchains can be highlighted by examining exactly how these decisions are made, and by whom. To understand the concept, it’s important to identify all the participants in the network and how they work together. Miners are a core component of a decentralized public blockchain network because they help sustain it. They are incentivized by transaction fees and block rewards. Developers create the protocol and maintain the blockchain. They are also responsible for enforcing changes such as hard or soft forks. Like miners, developers are incentivized to keep the network going. When developers propose a change to the network, a core group is tasked with achieving consensus over whether to accept or reject it. Miners, for their part, back changes by contributing their hash power to one of the blockchains borne from a hard fork.


Hacking enterprise architecture and service design


It is not very common for IT architects and service designers to work together. We are usually at different ends of digital projects. We speak a different language, and often even physically sit in different buildings. As a service designer and an enterprise architect, we had the unique opportunity to work together on various projects at D9. We began to identify the strengths and weaknesses of each approach and area of expertise. There are clear commonalities, and in many ways our expertise complements each other’s. Government is transforming in the wider context of global and societal issues, complex systems and technological change. Our view is that the traditional role and structures of government are challenged by new technology, human-centred design and internal and external strategic drivers of change. To us, digital transformation is about humans, strategy and technology; it requires multidisciplinary work across all three elements. Throughout our work, we identified that a core challenge for digital transformation is a lack of active dialogue between strategy and development.


'Bitcoin will go to zero': Davos talks up the future of blockchain tech

Schumacher said the industry is now trying to create "open decentralized systems." These would essentially be next generation protocols or infrastructure that businesses could run on, similar to cloud computing today. The next generation of blockchain technology is currently being developed. Yeung said that she sees blockchain adoption happening quickly in the area of payments, particularly in Asia. "Many developing countries, where just to start with they don't even have credit cards, there's no particular infrastructure, it's almost easier to see sort of blockchain-enabled payments, to see in Asia, you will see more action happening in Asia more than U.S. and Europe," Yeung told CNBC. Ripple CEO Garlinghouse said he expects more widespread adoption of blockchain in about five years, while Schumacher said that it is three years off. However, Hutchins said that ultimately, consumers will not be talking about which blockchain is being used; they will just care how good a product's use case is.


The future of code quality, security and agility lies in machine learning

Source code repository analysis can also reveal information about the developers writing it. Team dynamics can be highlighted by analyzing commit times and content: managers can identify when software engineers are most productive, arranging meetings and encouraging cross-team collaboration accordingly. Looking at programming language and framework trends can inform hiring managers on what type of talent to hire and what upskilling resources they can provide. Adding source code as a new dataset in enterprises’ data warehouses and visualization platforms such as Power BI, Looker or Tableau will provide everyone in the engineering organization with a whole new level of observability into source code and the development process. Yet the most exciting aspect of looking at code as a dataset is that it can be used to train machine learning models that automate many repetitive tasks for developers. We’re already starting to see new machine-learning-based applications for assisted code review or suggestions on GitHub.
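The commit-time analysis described above can be sketched in a few lines. This is an illustrative pattern, not any particular vendor's tooling; the commit timestamps are hypothetical, and in practice they would come from `git log` or a repository-analysis engine:

```python
# Bucket commit timestamps by hour of day to see when a team is most active.
# Timestamps are hypothetical stand-ins for real `git log` output.
from collections import Counter
from datetime import datetime

commits = [
    "2019-01-21T09:15:00", "2019-01-21T10:02:00", "2019-01-21T10:40:00",
    "2019-01-22T10:05:00", "2019-01-22T14:30:00", "2019-01-23T10:55:00",
]

hours = Counter(datetime.fromisoformat(ts).hour for ts in commits)
peak_hour, n = hours.most_common(1)[0]
print(f"Most active hour: {peak_hour}:00 with {n} commits")
```

The same grouping idea extends to files touched, languages used, or authorship, which is where the hiring and upskilling signals mentioned above would come from.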



Quote for the day:


"It is easier to act yourself into a new way of thinking, than it is to think yourself into a new way of acting." -- A.J. Jacobs


Daily Tech Digest - January 26, 2019

AI is sending people to jail—and getting it wrong


Police departments use predictive algorithms to strategize about where to send their ranks. Law enforcement agencies use face recognition systems to help identify suspects. These practices have garnered well-deserved scrutiny for whether they in fact improve safety or simply perpetuate existing inequities. Researchers and civil rights advocates, for example, have repeatedly demonstrated that face recognition systems can fail spectacularly, particularly for dark-skinned individuals—even mistaking members of Congress for convicted criminals. But the most controversial tool by far comes after police have made an arrest. Say hello to criminal risk assessment algorithms. Risk assessment tools are designed to do one thing: take in the details of a defendant’s profile and spit out a recidivism score—a single number estimating the likelihood that he or she will reoffend. A judge then factors that score into a myriad of decisions that can determine what type of rehabilitation services particular defendants should receive, whether they should be held in jail before trial, and how severe their sentences should be. 


Almost 90 percent of the business leaders surveyed as part of the study believed that cognitive diversity in the workplace is extremely important for running a successful organization. Managers in the contemporary workplace want employees to think differently and experiment with their typified ways of problem solving. While expecting such cognitive diversity was a bit difficult in the past, the role AI can play in the workforce means that organizations can expect greater rewards in the future. AI mechanisms will help augment human efforts in the workplace and stimulate cognitive diversification that benefits the organization. The study also revealed that 75 percent of respondents expected AI to create new roles for employees. This is a clear indication that AI is not going to replace human jobs, but will instead increase efficiency and shift humans’ roles and even create new positions for employees that provide meaningful work better suited to humans’ strengths.


Mondelez vs. Zurich: How watertight is cyber insurance coverage?

To put it bluntly, it appears the insurance sector has not been able to keep up with cyber threats. As new threats pop up in cyberspace, new policies typically lag behind in a confused state. Insurers are also challenged by a lack of visibility into their clients’ cyber health. This matters: if somebody wants health insurance, establishing whether or not they smoke, or whether hereditary diseases run in their family, is vital to setting their premium. The visibility issue isn’t just one affecting insurers. Many firms don’t have the tools to adequately assess and respond to the rising levels of cyber risk they’re exposed to. A recent report from the insurer Hiscox claimed that nearly three-quarters (73%) of global firms are “cyber-novices” when it comes to the quality and execution of their security strategy. If it’s the case (and it is) that cyber insurance policies are confusing and have room for improvement, the best thing a company can do is first understand the cyber risks it faces, and then secure a bespoke policy to meet its needs.



Collateral Damage: When Cyberwarfare Targets Civilian Data

Unfortunately, this is par for the course for private-sector businesses and NGOs. Sometimes the breach is to get a critical piece of political or military information to be used later. Sometimes it's to steal intellectual property or research so that the hacking nation can get a competitive boost in economic and/or military might. Sometimes it's to cull some personal information about someone with the right security clearance — which may mean orchestrating a super-breach, compromising several million other accounts along the way. Notably, these breaches aren't about anything so pedestrian as identity theft or credit card fraud. Instead, the goal is to use the information gleaned as a jumping-off point — to allow escalated access to yet more critical information. This is especially the case with healthcare organizations, where the right juicy health-record tidbit about a well-placed employee (or family member thereof) of a government arm can be used to extort some small amount of extra information or escalated access, turning that employee into an inside-attack threat.


How AI and Quantum Computing May Alter Humanity’s Future


König and the AI research team showed that quantum outperforms classical computing and that quantum effects can “enhance information-processing capabilities and speed up the solution of certain computational problems.” In their research, the team demonstrated that parallel quantum algorithms running in constant time outperform classical computers. The scientists showed that quantum computers required only a fixed number of steps for problem solving and were better at “solving certain linear algebra problems associated with binary quadratic forms.” Forward-thinking organizations recognize the synergistic boost that the combination of quantum computing and artificial intelligence may herald. Microsoft CEO Satya Nadella stated in a WSJ Magazine interview, “What’s the next breakthrough that will allow us to keep up this exponential growth in computing power and to solve problems—whether it’s about climate or food production or drug discovery?”


Bringing open-source rhyme and reason to edge computing: LF Edge

This isn't easy. Interoperability and standards simply don't exist in IoT or edge computing, which makes life miserable for anyone working in these areas. It is the LF Edge founders' hope that this pain will bring vendors, OEMs, and developers together to create true open standards. For the broader IoT industry to succeed, the fragmented edge technology players must work together to advance a common, constructive vision. Arpit Joshipura, the Linux Foundation general manager for Edge and IoT, said, "In order for the broader IoT to succeed, the currently fragmented edge market needs to be able to work together to identify and protect against problematic security vulnerabilities and advance a common, constructive vision for the future of the industry." LF Edge is realizing this vision with five projects. These support emerging edge applications in non-traditional video and connected things that require lower latency (up to 20 milliseconds), faster processing, and mobility.


Balancing data privacy with ambitious IT projects for digital transformation

A global organisation that produces medical devices for the healthcare market used IoT technology to monitor and record the usage of every individual device for product development and preventative maintenance. Regardless of the relatively benign purpose, because of the nature of these medical devices and the broad approach to data collection, the usage data that the developers were collecting was inherently sensitive. Healthcare data is classified as “special category” data by GDPR as well as others, which brings with it additional prohibitions over its use and heightened penalties for its mishandling. More concerning was that neither the patients, the healthcare professionals nor the business were aware of the collection and use of the data. No framework was in place to govern its collection, use or storage. No processes were documented. Furthermore, the business had not yet appointed a data protection officer. Once the legal teams began their GDPR preparations, they quickly discovered this data use.


26 Regulatory Initiatives that Will Shape Fintech in Europe and Beyond

In the banking industry’s quest towards open banking, standardisation has now become the name of the game towards global applicability. There is a consistent push and pull between whether these standards should come from regulators or industry players. On one hand, regulators can future-proof standards in that they could design the standards based on principles that ensure safety in the ecosystem. On the other hand, industry players may be better suited to producing standards or platforms that could better encourage innovation and growth of the industry as they are often instrumental in making it happen. Many of the standards listed below only apply to one region or another, but as the interchange fee regulation in the EU being implemented in Australia shows, there is something to be said about the ripple effect of regulations, particularly when regulators attempt to implement what works in other countries. The following is a list of initiatives, regulations and standards that have been listed in the World Payments Report 2018, by Capgemini and BNP Paribas.


With cybersecurity threats looming, the government shutdown is putting America at risk

Employees who are considered “essential” are still on the job, but the loss of supporting staff could prove to be costly, in both the short and long term. More immediately, the shutdown places a greater burden on the employees deemed essential enough to stick around. These employees are tasked with both longer hours and expanded responsibilities, leading to a higher risk of critical oversight and mission failure, as weary agents find themselves increasingly stretched beyond their capabilities. The long-term effects, however, are quite frankly, far more alarming. There’s a serious possibility our brightest minds in cybersecurity will consider moving to the private sector following a shutdown of this magnitude. Even ignoring that the private sector pays better, furloughed staff are likely to reconsider just how valued they are in their current roles. After the 2013 shutdown, a significant segment of the intelligence community left their posts for the relative stability of corporate America. The current shutdown bears those risks as well. A loss of critical personnel could result in institutional failure far beyond the present shutdown, leading to cascading security deterioration.


Three reasons why you need to modernise your legacy enterprise data architecture

Most data was of a similar breed in the past. By and large, it was structured and easy to collate. Not so today. Now, some data lives in on-premises databases while other data resides in cloud applications. A given enterprise might collect data that is structured, unstructured, and semi-structured. The variety keeps widening.  According to one survey, enterprises use around 1,180 cloud services, many of which produce unique data. In another example, we integrated over 400 applications for a major enterprise IT firm. The process of integrating all this wildly disparate data alone is too great a task for legacy systems. Within a legacy data architecture, you often have to hand-code your data pipelines, which then need repairing as soon as an API changes. You might also have to oversee an amalgam of integration solutions, ranging from limited point-to-point tools to bulky platforms that must be nurtured through scripting. These traditional approaches are slow, fraught with complexity, and ill-matched for the growing variety of data nowadays.
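The brittleness of hand-coded pipelines can be sketched in a few lines. This is an illustrative pattern rather than any particular integration product: isolating source-specific field names behind a single mapping means an upstream API rename becomes a one-line config change instead of edits scattered through every pipeline step.

```python
# Hypothetical payloads and field names. The mapping isolates the pipeline
# from upstream API renames: change one entry here, not every downstream step.
FIELD_MAP = {          # source field -> canonical field
    "cust_id": "customer_id",
    "amt": "amount",
}

def normalize(record, field_map=FIELD_MAP):
    """Translate a source record into the canonical schema."""
    return {canonical: record[src] for src, canonical in field_map.items()}

api_payload = {"cust_id": 42, "amt": 19.99}
print(normalize(api_payload))  # {'customer_id': 42, 'amount': 19.99}
```

Modern integration platforms generalize this idea with managed connectors and schema evolution, which is what removes the repair-on-every-API-change burden described above.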



Quote for the day:


"It is easy to lead from the front when there are no obstacles before you, the true colors of a leader are exposed when placed under fire." -- Mark W. Boyer


Daily Tech Digest - January 25, 2019

Within the Microsoft Cyber Defense Operation Center (CDOC), we focus on these dependencies with teams that coordinate threat intelligence, security monitoring and incident response by exploiting both the common, and unique capabilities of each specialization. It is here that we leverage our global workforce of more than 3,500 security professionals across our product development teams, information security groups, and legal teams to protect our cloud infrastructure and services, products and devices, and internal resources. The engineering teams behind our commercial security solutions, like Azure Security Center (ASC), also take advantage of the Cyber Defense Operation Center (CDOC) community to test hypotheses and pre-flight solutions in a real-world environment. This model is based on a closed-loop system of intelligence, defense, and control that streamlines our security capabilities for more than 200 cloud services, over 100 datacenters, millions of devices, and over a billion customers around the globe.


Stealthy New DDoS Attacks Target Internet Service Providers

An analysis of DDoS data during Q3 2018 by Nexusguard showed attackers trying to overwhelm targeted sites, and even entire ISP -- aka communications provider (CSP) -- networks, by spreading attack traffic across a large number of IP prefixes. Unlike a typical volumetric attack on a single IP address, many of the DDoS campaigns that Nexusguard analyzed involved attackers contaminating legitimate traffic across hundreds of IP addresses with small bits of junk. The attack traffic within each IP address was small enough to avoid detection by DDoS mitigation tools but big enough to take down a targeted site once converged, Nexusguard said in a report published this week. For example, the average attacks involved just 33.2 Mbps of traffic per targeted IP, making it hard for service providers to detect and mitigate the traffic. In total, about 159 autonomous systems - most belonging to service providers - were targeted in "bit-and-piece" attacks in Q3 of 2018.
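The arithmetic shows why these attacks evade per-IP thresholds. The 33.2 Mbps figure comes from the report; the /24 prefix size (256 addresses) is an illustrative assumption:

```python
# Rough aggregate behind a "bit-and-piece" attack. The per-IP average is
# from the Nexusguard report; the /24 prefix size is assumed for illustration.
per_ip_mbps = 33.2          # average attack traffic per targeted IP
ips_in_prefix = 256         # addresses in a /24 prefix

aggregate_gbps = per_ip_mbps * ips_in_prefix / 1000
print(f"Aggregate across the prefix: {aggregate_gbps:.1f} Gbps")
```

Each IP sees traffic far below a typical mitigation trigger, yet the converged load across the prefix is in the multi-gigabit range.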


Establish a configuration management strategy to guide transition


No matter the returns on the technology, enterprise IT organizations frequently encounter problems with a configuration management strategy. Various internal teams select different configuration management tools. Team members resist the burden of a steep learning curve with a new tool. Or people stick with their habits because they are simply too busy or distracted with existing work to change. "There are high performers who tend to have their own tastes, [and] there are others who are trying to catch up to that," said Suranjan Chatterjee, global head of the cloud apps, microservices and API unit at Tata Consultancy Services. He cited "a lot of tensions and sensitivities" when diverse teams in a large group must collaborate. To increase automation and win the war on configuration drift, IT organizations should prepare a solid configuration management strategy and evaluate tools specifically based on how easily they onboard and support users.


A human-centred agenda for the future of work

Technology, including artificial intelligence, robotics and sensors, entails countless opportunities to improve work. The extraction of knowledge through data mining can assist labour administrations to identify high-risk sectors and improve labour inspection systems. Blockchain technology could make it easier for companies and the social partners to monitor working conditions and labour-law compliance in supply chains. But digital technology also creates new challenges for decent work. Digital labour platforms provide new sources of income to many workers in different parts of the world, yet the dispersed nature of the work across international jurisdictions makes it difficult to establish workers’ rights. The work on platforms is sometimes poorly paid—even below prevailing minimum wages—and no official mechanisms are in place to address unfair treatment. Thus I introduced into the commission the idea of an international governance system for digital-labour platforms, which would require platforms (and their clients) to respect certain minimum rights and protections.


How can individual employees prepare for the future of work?


Individuals should be aware of traits that will help them prepare for the future and will also make it easier to develop these soft skill areas, one of which is self-awareness. “This self-awareness around purpose is a prime source of energy, resilience and clarity when it comes to dealing with all the choices, challenges and changes around us,” said Empey. “It also helps with a second key point, which is being ‘open’ to change, other points of view, other ways of doing things and so on.” A third strategy he suggests is authentic networking – that is, being quite deliberate in seeking a connection with those who can be of value to you, and you helping them in a generous, mutual and non-favour-expecting way. Individual employees should also remember that it’s not all about the skills we need at work. “The last point I would make is around health and wellbeing,” said Empey.


Cybercriminals Home in on Ultra-High Net Worth Individuals

The conclusions drawn by Glasswall mirror research conducted by UK-based Campden Wealth, which found that 28% of the UHNW families reported having been the victim of one or more cyberattacks. While UHNW families have an estimated net worth of at least $30 million, Campden Wealth recommends that those setting up single-family offices have wealth of $150 million or more. Many of the families that open single-family offices have far in excess of $150 million, with their average net worth standing at $1.2 billion, according to the Campden Wealth/UBS Global Family Office Report. Dr. Rebecca Gooch, Campden Wealth's director of research, says phishing was the most common type of attack, followed by ransomware, malware infections, and social engineering. She says UHNW individuals are targeted in a variety of ways including via their operating businesses, family offices, or through the family members themselves. More than half the attacks were viewed as malicious.


Poor practices expose 24 million financial records


The records were stored in an Elasticsearch cluster which contained 51GB of what appeared to be OCR credit and mortgages reports, Diachenko said in a blog post. “The documents contained highly sensitive data, such as social security numbers, names, phones, addresses, credit history and other details which are usually part of a mortgage or credit report,” he said. “This information would be a goldmine for cyber criminals, who would have everything they need to steal identities, file false tax returns, get loans or credit cards.” The exposed data was eventually traced to a data and analytics company called Ascension in Fort Worth, Texas, with the help of TechCrunch, which first reported Diachenko’s findings. According to parent company Rocktop Partners, Ascension shut down the server in question after learning of a “server configuration error” that “may have led to exposure of some mortgage-related documents”.


Microservices and the Saga Pattern

Microservices are not new in the market; they have been a part of our day-to-day life for a while now. So in this post, I would like to share with you my understanding of what microservices are and what the Saga pattern is, which is widely used with microservices. We will start with (i) what exactly we mean when we say we need a microservice, (ii) what it means to be reactive, and then (iii) dig into the concept of the Saga pattern with respect to a distributed system, along with an easy-to-understand real-life example. ... For example, if we are building a product like a Restaurant application, then we would create several small microservices like Orders, Customers, Reservations, etc., each performing specific tasks around a specific piece of the Restaurant application's functionality. They would interact only when we need the functionalities to come together, and then only through their exposed APIs. For now, we can think of an API as the endpoints of a service that are exposed for use by the outside world.
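The core of the Saga pattern can be sketched for the Restaurant example above: each local step carries a compensating action, and a failure partway through triggers compensation in reverse order. The service names and failure point are illustrative, and a real saga would coordinate remote calls rather than in-process lambdas:

```python
# Minimal Saga sketch: run steps in order; on failure, undo completed
# steps in reverse with their compensating actions.
class SagaAbort(Exception):
    pass

def run_saga(steps):
    """steps: list of (action, compensation) pairs."""
    done = []
    try:
        for action, compensate in steps:
            action()
            done.append(compensate)
    except SagaAbort:
        for compensate in reversed(done):   # undo in reverse order
            compensate()
        return "rolled back"
    return "committed"

def fail_payment():
    raise SagaAbort("payment declined")    # hypothetical mid-saga failure

log = []
steps = [
    (lambda: log.append("order created"),  lambda: log.append("order cancelled")),
    (lambda: log.append("table reserved"), lambda: log.append("reservation released")),
    (fail_payment,                         lambda: None),
]
result = run_saga(steps)
print(result, log)
```

The key property is that there is no distributed lock or two-phase commit: each service commits its local transaction immediately, and consistency is restored after a failure by running the compensations.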


Is customer information safer with a blockchain database?

Spring Labs is spearheading a group of prominent fintech lenders that will use a blockchain-based, peer-to-peer network to share consumer information to help with identification verification on loan applications. Avant, OnDeck Capital and SoFi are among 16 companies currently testing the network, called the Spring Protocol, which is scheduled to go live in the second half of this year. Part of the idea behind Spring’s system is to have a central database lenders can access without replicating critical consumer information on multiple systems, said John Sun, president and chief product officer for Spring Labs. While Spring doesn’t identify as a pure blockchain firm, he said it’s the best way to safeguard and store the information. “We have this solution that does certain things and we asked ourselves what is the best technology to build it that way,” Sun said. “It just so happens that for parts of the protocol and the technology stack, blockchain really is the best way to accomplish what we wanted to do.”


Adding Agile to Lean at Toyota Connected

Thurlow argued that the need to be more flexible, adaptable, and nimble is now a necessity, and no longer an option. Toyota needed to add agility into Lean Product Development. As Thurlow stated: "We took the best of breed agile learning and combined that with decades of lean thinking from the creators at Toyota and established an approach we currently call Scrum The Toyota Way." Every team member has had formal training, followed by continuous coaching in the workplace through a dedicated team of Scrum Masters and Coaches who are independent from the product delivery teams, he said. Coaches are embedded with the teams, but report externally, ensuring there are checks and balances. Toyota Connected is building a pattern library of tools and techniques they have created or identified that work in various contexts. Just as The Toyota Production System never stops improving, Scrum The Toyota Way evolves endlessly, said Thurlow.



Quote for the day:


"Leadership is working with goals and vision; management is working with objectives." -- Russel Honore


Daily Tech Digest - January 24, 2019

The team used a graph convolutional neural network — an algorithm that operates on nodes, edges, properties, and other graph structures — to model the statistical relationship among parking locations, traffic flow, parking demand, road links, and parking blocks. Together with a recurrent neural network with long short-term memory (LSTM) — a type of AI algorithm capable of learning long-term dependencies — and a multi-layer decoder, the system extracted parking information from traffic-related data sources (such as parking meter transactions, traffic speed, and weather conditions) and output occupancy forecasts. The researchers trained it on data sourced from the Pittsburgh downtown area, which they note has 97 on-street parking meters across 39 street blocks. Historical parking stats came from the Pittsburgh Parking Authority, while connected car company Inrix’s Traffic Message Channel and Weather Underground’s API supplied traffic speed data and hourly weather reports, respectively.
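The GCN-plus-LSTM model itself needs a deep learning stack, but the sequence-forecasting framing it relies on can be sketched much more simply: slice historical occupancy into fixed windows and predict the next value. The window size and data below are made up, and the averaging predictor is a deliberately trivial stand-in for the article's learned model:

```python
# Simplified stand-in for the article's LSTM-based forecaster: frame hourly
# occupancy as (input window -> next value) pairs, then predict the next
# value from the most recent window. Synthetic data, not the Pittsburgh set.
def make_windows(series, window):
    """Turn a series into (input window, next value) training pairs."""
    return [(series[i:i + window], series[i + window])
            for i in range(len(series) - window)]

def forecast_last_window_mean(series, window):
    """Predict the next occupancy as the mean of the most recent window."""
    recent = series[-window:]
    return sum(recent) / len(recent)
```

A real forecaster would learn weights over these windows (and over the traffic and weather inputs) instead of averaging, but the data-preparation step is the same.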


Evidence-Based Management Guide - Updated

Most organizations need to start by looking at the value they deliver today, or Current Value. Organizations often use revenue to measure this, and if you can measure it instantaneously, it’s not a bad measure; for example, if you are selling items online, knowing daily sales, or even moment-to-moment sales, can give an organization some sense of the value that customers experience. A better measure is actual customer satisfaction, since sometimes people buy things they never end up using, or buy things only because they have no better alternatives. Measures like Net Promoter Score (NPS), if measured as close to the actual experience as possible, can give a better indicator of value. Even measures that simply show how often a feature is used, and for how long, can give a better picture of what customers value than revenue does. Going deeper, measures that give insight into why the customer is using the product are even better and can serve as true aligning measures of success. For example, we’ve worked with a company that helps organizations process their insurance claims.
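NPS, mentioned above, has a fixed definition: on a 0–10 rating scale, it is the percentage of promoters (ratings 9–10) minus the percentage of detractors (ratings 0–6); ratings of 7–8 are passives and cancel out. A small sketch of the calculation:

```python
def net_promoter_score(ratings):
    """NPS on 0-10 ratings: % promoters (9-10) minus % detractors (0-6)."""
    if not ratings:
        raise ValueError("no ratings")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)
```

The score ranges from -100 (all detractors) to +100 (all promoters), which is why it is usually reported as a bare number rather than a percentage.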


Multi-vector attacks target cloud-hosted technologies

Attackers often break in by exploiting unpatched vulnerabilities or insecure configurations in services like the Redis data structure store, the Apache Hadoop big-data processing toolset or the Apache ActiveMQ messaging middleware. They also launch brute-force password guessing attacks against a large number of services including MySQL, MongoDB, Memcached, CouchDB, PostgreSQL, Oracle Database, ElasticSearch, RDP, VNC, Telnet, RSync, RLogin, FTP, LDAP and more. One of the most commonly used malware tools observed in attacks against cloud-hosted services is the XBash worm, which first appeared in May 2018. This malware is used to infect both Windows and Linux servers and deploys additional payloads depending on which OS is running. XBash is typically associated with a cybercriminal group known in the security industry as Iron. However, another group called Rocke is also using an XBash variant and has recently been in the news after it started disabling cloud security and monitoring agents.


Linux’s Hyperledger to give developers supply chain building blocks

"What attracts many organizations to blockchain technology is the possibility of sharing data across corporate boundaries while maintaining a high degree of rigor and accuracy," said Robert Beideman, a vice president with the GS1 standards organization. Last week, SAP launched a blockchain-based supply chain tracking service that will enable drug wholesalers to authenticate pharmaceutical packaging returned from hospitals and pharmacies. The Linux Foundation described its Hyperledger Grid project as a framework, not a blockchain or an application. "Grid is an ecosystem of technologies...that work together, letting application developers make the choice as to which components are most appropriate for their industry or market model," the Grid project said in a blog post. Grid includes a set of libraries, data models and SDKs to accelerate development for supply chain smart contracts and client interfaces. (Smart contracts are self-executing code based on pre-determined business agreements.)
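The parenthetical definition of smart contracts can be made concrete with a toy example. This is plain Python, not Hyperledger Grid code, and the cold-chain scenario and names are hypothetical; it only illustrates the idea of an agreement that executes itself when pre-determined conditions are checked:

```python
# Toy illustration of a "self-executing" agreement, NOT Hyperledger Grid code:
# payment is released in full only if the agreed delivery condition held.
class ShipmentContract:
    def __init__(self, price, max_temp_c):
        self.price = price
        self.max_temp_c = max_temp_c   # agreed cold-chain condition
        self.readings = []
        self.settled = None

    def record_temperature(self, celsius):
        self.readings.append(celsius)

    def settle(self):
        """Execute the agreement: pay in full only if every reading complied."""
        compliant = all(t <= self.max_temp_c for t in self.readings)
        self.settled = self.price if compliant else 0
        return self.settled
```

On a blockchain, the readings and the settlement logic would live on the shared ledger, so no single party could alter either after the fact.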


Financial Services and Social Value Can Mix

Many of the problems facing our society come from a lack of social cohesion. Social inequality affects us all. In global terms, economic conditions may have improved, but in real terms, when examined at an individual level within a particular country, inequality can be felt more and more. This perceived impact goes some way toward explaining the recent appearance of populist movements, which is one of the biggest threats to economic development. Any form of populism will always work against the stability that we need. Another significant concern, which is in effect also an opportunity, is social and technological disruption. If we don’t tackle this issue properly, it’ll put an end to insurance as we know it. Consumer profiles and society have changed dramatically, and people expect and demand more from companies. Young people expect companies to be much more committed, more socially active, and more transparent.


Business failing to see strategic value of cyber security


Security professionals said boards perceive them as functional but not as a force for competitive advantage, with 56% saying they feel restricted by the board and only 41% reporting that their organisations have a CISO in place on the board.  Although the security team can be instrumental in business transformation, only 44% believe that the C-suite sees them as a positive force for innovation, and just 13% of respondents believe that the board sees them as helping the company to gain a competitive advantage.  The findings suggest that boards may be paying lip service to IT security teams, as there is a disparity between what the board says and how this translates into investment. While 87% of security professionals believe that the board listens to them and values their input, a considerable proportion (62%) believe that the board can’t always see the business case for security investments.


AIOps tools supplement -- not supplant -- DevOps pipelines

AIOps tools enable an IT organization's traditional development, test and operations teams to evolve into internal service providers to meet the current and future digital requirements of their customers -- the organization's employees. AIOps platforms can also help enterprises monitor data across hybrid architectures that span legacy and cloud platforms, Grabner said. These complex IT environments demand new tools and technologies, which both require and generate more data. Organizations need a new approach to capture and manage that data throughout the toolchain -- which, in turn, drives the need for AIOps tools and platforms. AIOps can also be perceived as a layer that runs on top of DevOps tools and processes, said Darren Chait, COO and co-founder of Hugo, a provider of team collaboration tools based in San Francisco. Organizations that want to streamline data-intensive, manual and repetitive tasks -- such as ticketing -- are good candidates for an AIOps platform proof-of-concept project.
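Ticketing is singled out above as the kind of data-intensive, repetitive task worth automating. A deliberately simple sketch of keyword-based ticket routing, with made-up rules and team names, shows the drudgery an AIOps platform takes over; real platforms learn such routing from historical data rather than hard-coding it:

```python
# Hypothetical sketch of repetitive ticket triage: route an incoming alert
# to a team based on simple keyword rules. Rules and team names are made up.
ROUTING_RULES = [
    ("disk", "storage-ops"),
    ("latency", "network-ops"),
    ("login", "identity-ops"),
]

def route_ticket(summary, default="triage-queue"):
    """Return the team for a ticket summary, or a default triage queue."""
    text = summary.lower()
    for keyword, team in ROUTING_RULES:
        if keyword in text:
            return team
    return default
```

A proof-of-concept like the one described would replace the rule table with a model trained on past ticket assignments.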


Desktop-as-a-Service: The new frontier for end user computing?

The workspace strategy has evolved. It was static, but has been transcended through a hardware refresh. It is now in what Hill calls “an adaptive phase, which represents a shift to software modernisation and innovation”. Still, this transition to adaptive is a mixed experience if you try to traverse these platforms. In this adaptive phase, DaaS and VDI are often conflated. VDI is a technology and DaaS is a service — an off-premise workspace delivered on demand via a provider. Here, the responsibility for security can be unclear: does it sit with the customer, the service provider, the tooling provider or the infrastructure platform provider? ... Workspace analytics will drive innovation and transformation by enabling the identification of new devices: monitoring them, assessing them and adapting to them. Machine learning will play a huge role in this more pervasive analytics strategy, which will enable the next stage of continuous improvement across three channels.


What is malware? Viruses, worms, trojans, and beyond

Antivirus software is the most widely known product in the category of malware protection products; despite "virus" being in the name, most offerings take on all forms of malware. While high-end security pros dismiss it as obsolete, it's still the backbone of basic anti-malware defense. Today's best antivirus software is from vendors Kaspersky Lab, Symantec and Trend Micro, according to recent tests by AV-TEST. When it comes to more advanced corporate networks, endpoint security offerings provide defense in depth against malware. They provide not only the signature-based malware detection that you expect from antivirus, but anti-spyware, personal firewall, application control and other styles of host intrusion prevention. Gartner offers a list of its top picks in this space, which include products from Cylance, CrowdStrike, and Carbon Black. It's fully possible—and perhaps even likely—that your system will be infected by malware at some point despite your best efforts.


Why Do We Need Architectural Diagrams?

The main beneficiary should be the team (developers, test engineers, business analysts, devops, etc.) who have direct involvement in the project. In my experience, outside of the team, there are very few stakeholders who really care about documentation. In the best case, they might be interested in one or two high-level diagrams (e.g. context diagram, application or software component diagram) which roughly describe the structure of the system and give a high-level understanding of it. However, most of the time we fail in identifying the real beneficiaries and their real needs and try to create too much documentation. This quickly becomes a burden to maintain and is quite soon outdated. In other cases, we just simply omit to create any kind of diagram because there is no time, no specific interest, or nobody wants to take on this responsibility. Besides this, the Agile Manifesto prescribes that teams should value working software over comprehensive documentation, which discourages cumbersome documentation processes



Quote for the day:



"A true dreamer is one who knows how to navigate in the dark" -- John Paul Warren


Daily Tech Digest - January 16, 2019

The Rise of Automated Machine Learning


AI and machine learning require expert data scientists, engineers, and researchers, who are in short supply worldwide right now. The ability of autoML to automate some of the repetitive tasks of ML compensates for the lack of AI/ML experts while boosting the productivity of existing data scientists. By automating repetitive ML tasks -- such as choosing data sources, data prep, and feature selection -- marketing and business analysts spend more time on essential tasks. Data scientists build more models in less time, improve model quality and accuracy, and fine-tune more new algorithms. More than 40 percent of data science tasks will be automated by 2020, according to Gartner. This automation will result in the increased productivity of professional data scientists and broader use of data and analytics by citizen data scientists. AutoML tools for this user group usually offer a simple point-and-click interface for loading data and building ML models. Most autoML tools focus on model building rather than automating an entire, specific business function such as customer analytics or marketing analytics.
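Model selection, one of the repetitive tasks autoML automates, reduces to fitting candidate models and keeping the one with the lowest validation error. A minimal sketch with deliberately trivial candidates (a mean predictor and a last-value predictor), not a real autoML tool:

```python
# AutoML-style model selection in miniature: fit each candidate on training
# data, score it on held-out validation data, keep the best. The candidate
# "models" here are deliberately trivial constant predictors.
def mean_model(train):
    avg = sum(train) / len(train)
    return lambda: avg

def last_value_model(train):
    last = train[-1]
    return lambda: last

def select_model(train, validation, candidates):
    """Return the name of the candidate with the lowest validation MSE."""
    best_name, best_err = None, float("inf")
    for name, build in candidates:
        predict = build(train)
        err = sum((predict() - y) ** 2 for y in validation) / len(validation)
        if err < best_err:
            best_name, best_err = name, err
    return best_name
```

Real autoML tools search over far richer spaces (algorithms, features, hyperparameters), but the select-by-validation-error loop is the same.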


Model-driven RESTful API for CRUD and More

This article introduces a model-driven RESTful API for CRUD (Create, Read, Update, Delete). With it, you can write simple models (specifying a database table and the set of columns to be exposed) and the REST endpoints for CRUD will become available automatically. No hand-coding of any SQL is necessary. The concept could be implemented on different technology stacks and languages. Here, I used JavaScript (which generates SQL) with Node.js, Express, and PostgreSQL. Most projects need to Create, Read, Update, and Delete objects. When these objects are simple enough (one driving table and a few columns in the database), the code is very similar from one object to the next. In fact, the patterns are the same, and the only differences are the names of the tables and the names and types of the columns. Of course, there will always be complex endpoints which need to be written by hand but by automating the simple ones, we can save a lot of time.
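The article's implementation is JavaScript on Node.js, but the core idea translates directly. A Python sketch of generating SQL from a model so that no query is written by hand (the table and column names are illustrative):

```python
# Model-driven SQL generation: the model (table + exposed columns) drives
# the queries, so CRUD endpoints need no hand-written SQL. Illustrative
# Python sketch of the idea; the article's version is JavaScript/Node.js.
def build_select(model):
    cols = ", ".join(model["columns"])
    return f"SELECT {cols} FROM {model['table']} WHERE id = %s"

def build_insert(model):
    cols = ", ".join(model["columns"])
    placeholders = ", ".join(["%s"] * len(model["columns"]))
    return f"INSERT INTO {model['table']} ({cols}) VALUES ({placeholders})"
```

Wiring these generators to REST endpoints gives every simple model its CRUD API for free, leaving only the genuinely complex endpoints to be written by hand.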


Progressing beyond a pre-digital age: Building the business case for ‘digital HR’

Humans are, well, only human. Mistakes happen, but a mistake can have a huge impact on an organisation’s health and future success. Introducing technology to manage a range of processes can help to reduce and mitigate HR related risk by minimising all manner of issues from poor HR consistency and visibility, to data loss. Manually updating changes in spreadsheets can be a cumbersome and ineffective process, especially when the data is being entered into multiple documents. Research from Salesforce shows that 88% of all spreadsheets have significant errors in them. Applying intelligent automation will not only reduce the risk of human mistakes but also help to flag errors and data problems before they create a negative impact on the business. The huge issue of risk and compliance aside, automation reduces the HR admin mountain and allows a focus on people strategies which are so critical when competing for talent and reducing churn. 


Get ready for edge computing’s rise in 2019

While many of you may see edge as exclusive to IoT, its value is much wider and will prove as critical to driving up customer experience as content delivery networks (CDN) were in the early days of the web ... which explains why you are now seeing edge compute and AI services from all the major cloud vendors and on the road maps of the leading telecom companies. Twenty-seven percent of the global telecom decision-makers who responded to the Forrester Analytics Global Business Technographics® Mobility Survey, 2018, said that their firms are either implementing or expanding edge computing in 2019. Many of these vendors will require new wireless tools and updated skill sets to achieve this digital transformation. This aligns to Verizon's recent employee buyout offer, as a result of which over 10,400 of its staff will be gone next year, driving nearly $10 billion in savings that it can apply to its edge-compute-empowered 5G network. And speaking of CDNs, nearly every one of these vendors is adding edge compute to their core market values.


World's first robot hotel massacres half of its robot staff

The story highlights the shortcomings of purportedly “state of the art” AI automation that are rarely discussed. One is that they’re installed to solve a management problem rather than a customer need, as was the case here - the hotel is in an area with an acute labour shortage. Secondly, they’re just plain annoying. As hotel manager Hideo Sawada explained: “When you actually use robots you realize there are places where they aren’t needed - or just annoy people”. While robotics has advanced steadily in industry, the picture is different in consumer electronics. Trade group the International Federation of Robotics noted that sales of industrial robots had doubled in five years. But it’s largely cyclical, IFR president Junji Tsuda admitted. Adoption doubled even more dramatically between 2009 and 2010, which had nothing to do with AI and a lot to do with the falling cost of sensors and microelectronics. In industries where automation is highly advanced, such as car production, it may not move the dial much: wage rates largely govern the substitution phenomenon


The Key Cybersecurity Takeaways From The Recent SEC Charges

Hackers continue to prefer phishing schemes to almost any other infiltration or social engineering tactic. In part, their effectiveness ties into their mundanity; phishing attacks look like legitimate emails, and employees without proper training will reliably open their emails. Phishing attacks, therefore, provide a low effort, high impact cyber threat. Furthermore, if it can hit the SEC, it can hit your enterprise as well. To prevent a phishing attack from inflicting damage on your databases, make sure your employees can recognize a phishing attack if they receive one; there are tell-tale signs for almost all of them. Incentivize recognizing phishing attacks before they occur, either through a small rewards program or by making cybersecurity a part of your employees’ everyday job duties and performance reviews. Additionally, ensure your cybersecurity platform includes a SIEM solution with strong threat detection capabilities. Your enterprise can also benefit from an email security solution to prevent phishing attacks from reaching your inboxes.
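A few of the "tell-tale signs" mentioned above can be encoded as simple checks. Real phishing detection is far more involved; the indicators and word list below are only illustrative:

```python
# Illustrative only: two common phishing tell-tale signs encoded as checks --
# a sender domain that doesn't match the organization it claims to be, and
# pressure/urgency language in the body. The word list is made up.
URGENCY_WORDS = {"urgent", "immediately", "suspended", "verify now"}

def phishing_indicators(sender_domain, claimed_org_domain, body):
    """Return a list of human-readable red flags found in an email."""
    flags = []
    if sender_domain != claimed_org_domain:
        flags.append("sender domain does not match claimed organization")
    lowered = body.lower()
    if any(word in lowered for word in URGENCY_WORDS):
        flags.append("pressure/urgency language")
    return flags
```

Checks like these are exactly what employee training teaches people to apply by eye, and what email security gateways automate at scale.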


Major Security Breach Discovered Affecting Nearly Half of All Airline Travelers


With the PNR and customer name at our disposal, we were able to log into ELAL’s customer portal and make changes, claim frequent flyer miles to a personal account, assign seats and meals, and update the customer’s email and phone number, which could then be used to cancel or change the flight reservation via customer service. Though the security breach requires knowledge of the PNR code, ELAL sends these codes via unencrypted email, and many people even share them on Facebook or Instagram. But that’s just the tip of the iceberg. After running a small and non-threatening script to check for any brute-force protections, none of which were found, we were able to find PNRs of random customers, which included all of their personal information. We contacted ELAL immediately to point out the threat and prompt them to close the breach before it was discovered by anyone with malicious intentions. We suggested addressing the vulnerability by introducing captchas, passwords, and a bot protection mechanism, in order to prevent a brute-force approach.
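One of the suggested mitigations, a brute-force limiter, can be sketched in a few lines. This is illustrative only; a production system would also need time-based expiry, captchas, and distributed state:

```python
# Minimal per-client attempt limiter of the kind that would have blunted
# the brute-force PNR enumeration described above. Illustrative sketch:
# real deployments add time windows, captchas, and shared state.
class AttemptLimiter:
    def __init__(self, max_attempts):
        self.max_attempts = max_attempts
        self.counts = {}

    def allow(self, client_id):
        """Permit a lookup unless this client has exhausted its attempts."""
        count = self.counts.get(client_id, 0)
        if count >= self.max_attempts:
            return False            # e.g. require a captcha before continuing
        self.counts[client_id] = count + 1
        return True
```

Even a crude cap like this turns enumerating six-character PNRs from minutes of scripting into an impractical exercise.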


What is COBIT? A framework for alignment and governance

New concepts and terminology have been introduced in the COBIT Core Model, which includes 40 governance and management objectives for establishing a governance program. The performance management system now allows more flexibility when using maturity and capability measurements. Overall, the framework is designed to give businesses more flexibility when customizing an IT governance strategy. Like other IT management frameworks, COBIT helps align business goals with IT goals by establishing links between the two and creating a process that can help bridge a gap between IT — or IT silos — and outside departments. One major difference between COBIT and other frameworks is that it focuses specifically on security, risk management and information governance. This is emphasized in COBIT 2019, with better definitions of what COBIT is and what it isn’t. 


The report on the security analysis of radio remote controllers for industrial applications notes that the use of obscure, proprietary protocols instead of standard ones makes controllers vulnerable to command spoofing, so an attacker can selectively alter their behaviour by crafting arbitrary commands, with consequences ranging from theft and extortion to sabotage and injury. “The legacy and widespread RF technology used to control industrial machines is affected by serious security issues that impact several market verticals, applications, products and brands,” the report said. The researchers warned that currently and widely used legacy RF technology for industrial applications can be abused for sabotage of equipment, theft of goods by manipulating equipment and extortion by demanding payment to hold off or cease equipment interference.


Getting Started with PouchDB - Part 1

PouchDB is an open-source JavaScript NoSQL database designed to run offline within a browser. There is also a PouchDB server version that can be used when online. These two databases synchronize from one to another using a simple API call. You may also use CouchDB on the server to synchronize your data. A NoSQL database is storage where there is no fixed table structure as in a relational database. There are a few different methods NoSQL databases use to store data: column, document, graph, and key-value pair. Of these, the most common are column and document. PouchDB supports document-oriented storage, where data in the model is stored as a series of JSON objects with a key value assigned to each document. Each document in PouchDB must contain a property called _id. The value in the _id field must be unique per database. You may use any string value you want for the _id field. In this article, I am going to use a value that is very simple.
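PouchDB itself is JavaScript, but the document model it uses can be mimicked in a few lines of Python to illustrate the unique `_id` requirement described above:

```python
# NOT PouchDB -- a tiny in-memory stand-in that mimics its document model:
# each document is a JSON-like object that must carry a unique _id.
class DocumentStore:
    def __init__(self):
        self.docs = {}

    def put(self, doc):
        """Store a document; reject it without an _id or on a duplicate _id."""
        if "_id" not in doc:
            raise ValueError("document must contain an _id property")
        if doc["_id"] in self.docs:
            raise ValueError("a document with this _id already exists")
        self.docs[doc["_id"]] = doc

    def get(self, doc_id):
        """Fetch a document by its _id."""
        return self.docs[doc_id]
```

In real PouchDB the same put/get shape applies, with the addition of revision tracking (`_rev`) so that offline copies can synchronize.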



Quote for the day:


"Your talent and giftedness as a leader have the potential to take you farther than your character can sustain you. That ought to scare you." -- Andy Stanley