Daily Tech Digest - January 28, 2019

What is Keras? The deep neural network API explained

Keras was created to be user friendly, modular, easy to extend, and to work with Python. The API was “designed for human beings, not machines,” and “follows best practices for reducing cognitive load.” Neural layers, cost functions, optimizers, initialization schemes, activation functions, and regularization schemes are all standalone modules that you can combine to create new models. New modules are simple to add, as new classes and functions. Models are defined in Python code, not separate model configuration files. The biggest reasons to use Keras stem from its guiding principles, primarily the one about being user friendly. Beyond ease of learning and ease of model building, Keras offers the advantages of broad adoption, support for a wide range of production deployment options, integration with at least five back-end engines (TensorFlow, CNTK, Theano, MXNet, and PlaidML), and strong support for multiple GPUs and distributed training. Plus, Keras is backed by Google, Microsoft, Amazon, Apple, Nvidia, Uber, and others.
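
To illustrate those guiding principles, here is a minimal sketch of defining and compiling a model with the standalone Keras package; the layer sizes, optimizer, and loss are arbitrary example choices, not details from the article:

```python
# A minimal sketch of the modular Keras API; layer sizes and the
# loss/optimizer choices are illustrative assumptions.
import keras
from keras.layers import Dense, Dropout

model = keras.models.Sequential([
    Dense(64, activation="relu", input_shape=(784,)),  # a standalone layer module
    Dropout(0.5),                                      # a standalone regularizer module
    Dense(10, activation="softmax"),
])

# The optimizer, cost function, and metrics plug in as modules too.
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Because the model is plain Python code rather than a separate configuration file, swapping a layer, optimizer, or loss is a one-line change.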



How do you best talk to your board about cybersecurity?

Boards are maturing both in their interest in and understanding of cybersecurity. They are now asking much more specific questions, particularly as they wish to increase this understanding. In conducting this research, we had the pleasure of working with board members who have been privy to this security journey. We wanted to understand where the gap is for them and how we can help close it. One of the key problems in communicating security to any stakeholder group (including boards) is that we (security pros) assume that we know what our audience wants and proceed to throw information at them as per our desires. But because our expertise typically lies in the field of technology, not human psychology or communication, our assumptions about what they want are often far removed from reality. We rarely take the time to ask, for fear of appearing stupid. In this research, we did just that: We asked. As a result, we had the opportunity to understand board members’ journeys through the murky and often technical and confusing waters of cybersecurity.


The internet of human things: Implants for everybody and how we get there


Let me be clear on my motivation for wanting to make wearables -- and eventually implants -- our default method of brick-and-mortar payment: I hate physical wallets. I don't like dragging around a thick hunk of cow hide filled with a bunch of credit cards I don't use that often. Then, there are loyalty program cards and various IDs I have, such as my license, various types of permits, and medical insurance and drug plan stuff when I have to pick up prescriptions. Have you ever lost or had your wallet stolen? Or your keys? The amount of work it takes to get your life back in order is ridiculous. How many of you do the paranoid "life check" triple play every day for your wallet, keys, and smartphone? When I am traveling, I might do that three times a day, easy.  So now, let us imagine a future where you don't have to walk around carrying cow hide stuffed with plastic cards and cash. A future without losing wallets and the disruption that ensues. A future where many of us can leave our homes every day with literally nothing on our person except a smartphone and perhaps a wearable device.



Panasonic IoT strategy is all about big data analytics

It's all about pain points. What's the problem that you're trying to solve? Believe it or not, it may sound like an easy question, but the answers are really difficult. Because, A, getting your middle management or middle-ranked individuals to speak to pain points is difficult, because they see that as an admission of some guilt. Getting them out of that mode, and getting them into a comfort zone where they can openly talk about the pain points, is really challenging. You can get a set of pain points from the top-level executives, but you need to get some level of granularity on those pain points. Without the granularity you're unable to pinpoint the specifics and recommend a solution. So what we have done is, for instance, in our industrial manufacturing operations, we have people who walk onto manufacturing floors, we talk to executives, we talk to engineers, and we take a third-party view on what the problems are, and identify these pains, and then try to prioritize what the return would be on those pain points.


Giving algorithms a sense of uncertainty could make them more ethical


The algorithm could handle this uncertainty by computing multiple solutions and then giving humans a menu of options with their associated trade-offs, Eckersley says. Say the AI system was meant to help make medical decisions. Instead of recommending one treatment over another, it could present three possible options: one for maximizing patient life span, another for minimizing patient suffering, and a third for minimizing cost. “Have the system be explicitly unsure,” he says, “and hand the dilemma back to the humans.” Carla Gomes, a professor of computer science at Cornell University, has experimented with similar techniques in her work. In one project, she’s been developing an automated system to evaluate the impact of new hydroelectric dam projects in the Amazon River basin. The dams provide a source of clean energy. But they also profoundly alter sections of river and disrupt wildlife ecosystems. “This is a completely different scenario from autonomous cars or other [commonly referenced ethical dilemmas], but it’s another setting where these problems are real,” she says. “There are two conflicting objectives, so what should you do?”
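
A toy sketch of that "menu of options" idea: instead of returning a single best treatment, keep every option that is not dominated on all objectives and hand the list back to a human. The treatments and numbers below are invented for illustration:

```python
# Hypothetical multi-objective menu: each option scored on
# (expected life-span gain in years, suffering score, cost in $1k).
options = {
    "treatment_A": (4.0, 7.0, 120.0),
    "treatment_B": (2.5, 3.0, 60.0),
    "treatment_C": (1.0, 1.0, 15.0),
}

def dominates(a, b):
    """a dominates b if it is at least as good on every objective
    (higher life-span, lower suffering, lower cost) and strictly better on one."""
    at_least_as_good = a[0] >= b[0] and a[1] <= b[1] and a[2] <= b[2]
    strictly_better = a[0] > b[0] or a[1] < b[1] or a[2] < b[2]
    return at_least_as_good and strictly_better

# The "explicitly unsure" system presents every non-dominated option.
menu = [name for name, v in options.items()
        if not any(dominates(w, v) for w in options.values())]
print(menu)  # all three survive: each is best on at least one objective
```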


The Technical Case for Mixing Cloud Computing and Manufacturing

The movement in manufacturing needs to be around the growth of IaaS usage, including cloud-delivered servers, databases, data integration, and other core components needed to provide the types of services listed earlier in this article. AWS provides all of these components, as do Google and Microsoft. That said, what keeps many manufacturing companies out of the cloud is the lack of skills and knowledge. It takes a specific skill set to properly integrate existing ‘some-time’ systems that provide no real-time visibility or automated responses with new cloud-based systems that provide the ability to operationalize new and existing data points. The objective is to provide a quick ROI, as well as the ability to move operations in more productive and less expensive directions. The fundamentals are well understood and are becoming easier for manufacturing organizations to grasp. What’s missing is a stepwise approach that spells out the cloud conversion with enough detail to provide the company with a path to tactical and strategic success.


Beyond the Dashboard: How AI Changes the Way We Measure Business


Many BI companies see the potential of AI and have jumped on the bandwagon. Most today generate point-and-click automated insights that surface significant trends, anomalies, and clusters in the data, usually for a highly constrained data set, such as a chart or dashboard. The trick is to do this at scale and in real time. Most BI vendors don’t have the processing power to do that, let alone run it continuously in the background for multiple KPIs simultaneously. With automated insights, the dashboard becomes a jumping-off point for obtaining deep insights about business processes. These insights might pop up above or below a KPI, or upon a click; or they might be encoded in text via a natural language generation tool that displays or speaks a deep analysis of the dashboard KPIs. ... FinancialForce applies Salesforce's Einstein AI engine to sales and financial data to generate rich, action-oriented views of customers. Its dashboards display color-coded indicators of customer health and, with a single click, an analysis of under-performing health indicators along with recommendations for improvement.
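
As one minimal illustration of an automated insight, the sketch below flags anomalies in a KPI series with a rolling z-score. Real BI engines run far richer analyses, continuously and at scale; the window and threshold here are assumptions:

```python
# Flag points in a KPI series that sit far outside the recent rolling window.
import statistics

def anomalies(kpi, window=7, z_threshold=3.0):
    flagged = []
    for i in range(window, len(kpi)):
        hist = kpi[i - window:i]
        mean, sd = statistics.mean(hist), statistics.stdev(hist)
        if sd > 0 and abs(kpi[i] - mean) / sd > z_threshold:
            flagged.append(i)  # surface this point on the dashboard
    return flagged

daily_revenue = [100, 102, 99, 101, 103, 98, 100, 100, 180, 101]
print(anomalies(daily_revenue))  # [8] -- the 180 spike
```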


Developing Microservices with Behavior Driven Development & Interface Oriented Design


BDD involves the triad – the three perspectives of the customer, the developer, and the tester. It’s usually applied to the external behavior of an application. Since microservices are internal, the customer perspective is that of the internal consumer, that is, the parts of the implementation which use the service. So the triad collaboration is between the consumer developers, the microservice developers, and the testers. Behavior is often expressed in a Given-When-Then form, e.g. Given a particular state, When an action or event occurs, Then the state changes and/or an output occurs. Stateless behavior, such as business rules and calculations, just shows the transformation from input to output. Interface Oriented Design focuses on the Design Patterns principle “Design to interfaces, not implementations”. A consumer entity should be written against the interface that a producer microservice exposes, not to its internal implementation. These interfaces should be well defined, including how they respond if they are unable to perform their responsibilities. Domain Driven Design (DDD) can help define the terms involved in the behavior and the interface.
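
To make the Given-When-Then form concrete, here is a sketch of such a specification written as a plain Python test (runnable with pytest). The OrderService interface is invented for illustration, not taken from the article:

```python
# A hypothetical Orders microservice, tested against its exposed interface
# rather than its internal implementation.
class OrderService:
    def __init__(self):
        self.orders = {}

    def place_order(self, order_id, item):
        if order_id in self.orders:
            raise ValueError("duplicate order")  # a well-defined failure response
        self.orders[order_id] = {"item": item, "state": "PLACED"}
        return self.orders[order_id]

def test_placing_an_order():
    # Given a service with no orders
    service = OrderService()
    # When a consumer places an order through the exposed interface
    result = service.place_order("o-1", "widget")
    # Then the state changes and an output occurs
    assert result["state"] == "PLACED"
    assert "o-1" in service.orders
```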


Google petitions Supreme Court to reconsider Android Java ruling


Walker said the court initially ruled that the software interfaces were not copyrightable, but that decision was overruled. “A unanimous jury then held that our use of the interfaces was a legal fair use, but that decision was likewise overruled,” he said. “Unless the Supreme Court corrects these twin reversals, this case will end developers’ traditional ability to freely use existing software interfaces to build new generations of computer programs for consumers.” Walker added: “The US constitution authorised copyrights to ‘promote the progress of science and useful arts’, not to impede creativity or promote lock-in of software platforms.” In response to Walker’s post, Oracle executive vice-president and general counsel, Dorian Daley, wrote: “Google's petition for certiorari presents a rehash of arguments that have already been thoughtfully and thoroughly discredited. The fabricated concern about innovation hides Google’s true concern: that it be allowed the unfettered ability to copy the original and valuable work of others as a matter of its own convenience and for substantial financial gain.”



Securing the Internet of Things: Government Action Likely in 2019

The landscape is dotted with a few new laws and regulations, such as a California law requiring manufacturers of any devices that connect to the internet to include “reasonable” security features, including unique, user-set passwords for each device rather than generic default credentials that are easier for an intruder to discern. Some security experts, however, have criticized the law as too weak. Well-known consultant Robert Graham wrote, “it’s based on the misconception of adding security features. It’s like dieting …. The key to dieting is not eating more but eating less. The same is true of cybersecurity, where the point is not to add ‘security features’ but to remove ‘insecure features’.” That reaction shows there’s a lot more to be done. But it will be interesting to see just how aggressively governments push. Will they rely on stronger laws to force the industry to more effectively tackle IoT security? Or gentler approaches, like the United Kingdom’s government website that provides a voluntary code of practice?



Quote for the day:


"Remember teamwork begins by building trust. And the only way to do that is to overcome our need for invulnerability." -- Patrick Lencioni


Daily Tech Digest - January 27, 2019

There seems to be an obvious and somewhat necessary solution here to ensure that employees within an organisation are able to understand their IoT data and apply it to their own sector of expertise for maximum business benefit. One of the ways to resolve this skills shortage is to think about training Millennials to drive IoT projects forward in the future. Millennials are our future workforce and, given they are used to being constantly connected, they are perfectly placed to drive further connectivity. You’ll hear this described as entering the sharing economy. Therefore, as we enter this more circular economy, we need to equip employees with the necessary skills in AI, ML and deep learning (DL). By opening up the opportunity for individuals to specialise in these areas, businesses will be able to apply analytics to streaming data for deeper insights. This will enable more predictive decisions to be made and falls into sync with what a data scientist would be doing day by day. 


Blockchain Technology: A Global Perspective!


The world is innovating without permission. The real essence of Blockchain Technology lies in innovating Government-based applications. Delaware stores company incorporation records on Blockchain, Sweden now runs real-estate transactions on Blockchain, Singapore issues invoicing on the Blockchain, and the UK uses Blockchain-based monitors for the distribution of grants. In Estonia, e-Citizens records, e-Payments keys and medical records are secured on Blockchain, and Ghana records its land registry using Blockchain Technology. Last but not least is the most promising initiative of all: Smart Dubai, the Dubai Blockchain Strategy. Dubai is on the fast track to implementing Blockchain in government operations and aims to be the first city in the world to be completely powered by Blockchain by 2020. After producing a bunch of POCs, evaluating hundreds of Blockchain innovations and supporting tens of game-changing startups around the globe, I am now sharing a few easy-to-implement, high-potential Blockchain use cases that can change society in a phenomenal way, for both public and private sectors.


Is It Possible To Learn Data Science & Machine Learning Without Mathematics?


For machine learning, the real prerequisite skill that one needs to learn is data analysis; for beginners, there is no need to know calculus and linear algebra in order to build a model that makes accurate predictions. The role of mathematics is particularly significant only if one is involved in machine learning research in an academic setting, or for a few subsets of more advanced data scientists. There are people in the industry at high levels who use advanced math on a regular basis, and people working on bleeding-edge tools who are pushing the boundaries of machine learning. People at companies like Google and Facebook are among the ones who certainly use calculus, linear algebra, and more advanced math routinely in their work. The bottom line is that in industry, data scientists just don’t do much higher-level math; in reality, they spend a huge amount of their time getting data, cleaning data, and exploring data. The truth is that 80% of what people do is data munging and data visualization.


This Trojan infects Chrome browser extensions, spoofs searches to steal cryptocurrency

Different infection vectors are in place depending on the type of browser found on an infected system. Razy is able to install malicious browser extensions, which is nothing new. However, the Trojan is also able to infect already-installed, legitimate extensions, by disabling integrity checks for extensions and automatic updates for browsers. In the case of Google Chrome, Razy edits the chrome.dll file to disable extension integrity checks and then renames this file to break the standard pathway. Registry keys are then created to disable browser updates. "We have encountered cases where different Chrome extensions were infected," the researchers say. "One extension, in particular, is worth mentioning: Chrome Media Router is a component of the service with the same name in browsers based on Chromium. It is present on all devices where the Chrome browser is installed, although it is not shown in the list of installed extensions."


Machine Learning on Code
Machine Learning on Code is actually a field of research that is just starting to materialize into enterprise products. One of the pioneers of this movement is a company called source{d}, which is building a series of open source projects turning code into actionable data and training machine learning models to help developers respect technical guidelines. With every company quickly becoming a software company, intangible assets such as code represent a larger share of their market value. Therefore companies should strive to understand their codebase through meaningful analytic reports to inform engineering decisions and develop a competitive advantage for the business. On one hand, managers can use tools like the open source source{d} engine to easily retrieve and analyze all their Git repositories via a friendly SQL API. They can run it from any Unix system, and it will automatically parse their companies’ source code in a language-agnostic way to identify trends and measure progress made on key digital transformation initiatives.


Information theory holds surprises for machine learning

Information theory provides bounds on just how optimal each layer is, in terms of how well it can balance the competing demands of compression and prediction. "A lot of times when you have a neural network and it learns to map faces to names, or pictures to numerical digits, or amazing things like French text to English text, it has a lot of intermediate hidden layers that information flows through," says Artemy Kolchinsky, an SFI Postdoctoral Fellow and the study's lead author. "So there's this long-standing idea that as raw inputs get transformed to these intermediate representations, the system is trading prediction for compression, and building higher-level concepts through this information bottleneck." However, Kolchinsky and his collaborators Brendan Tracey (SFI, MIT) and Steven Van Kuyk (University of Wellington) uncovered a surprising weakness when they applied this explanation to common classification problems, where each input has one correct output (e.g., each picture is either of a cat or of a dog). In such cases, they found that classifiers with many layers generally do not give up some prediction for improved compression.
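
For context, the compression/prediction trade-off being discussed is usually formalized via the standard information-bottleneck objective (supplied here for reference; the paper's exact notation may differ):

```latex
% Information bottleneck: choose a stochastic map p(t|x) from inputs X to an
% intermediate representation T that compresses X (small mutual information
% I(X;T)) while staying predictive of the target Y (large I(T;Y)), with
% \beta controlling the trade-off.
\min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y)
```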


The controversies of blockchain governance and rough consensus

On-chain governance describes the manner of proposing changes to a cryptocurrency and its underlying blockchain by a certain set of processes, rather than a simple majority consensus. The core differences between blockchains can be highlighted by examining exactly how these decisions are made, and by whom. To understand the concept, it’s important to identify all the participants in the network and how they work together. Miners are a core component in a decentralized public blockchain network because they help sustain it. They are incentivized by transaction fees and block rewards. Developers create the protocol and maintain the blockchain. They are also responsible for enforcing changes such as hard or soft forks. Like miners, developers are also incentivized to keep the network going. When developers propose a change to the network, a core group is tasked with achieving consensus over whether to accept or reject it. Miners, for their part, back changes by contributing their hash power to one of the blockchains borne from a hard fork.


Hacking enterprise architecture and service design


It is not very common for IT architects and service designers to work together. We are usually at different ends of digital projects. We speak a different language, and often even physically sit in different buildings. As a service designer and an enterprise architect, we had the unique opportunity to work together in various projects at D9. We began to identify the strengths and weaknesses of each approach and area of expertise. There are clear commonalities, and in many ways our expertise complements each other. Government is transforming in the wider context of global and societal issues, complex systems and technological change. Our view is that the traditional role and structures of government are being challenged by new technology, human-centred design and internal and external strategic drivers of change. To us, digital transformation is about humans, strategy and technology, and we work in transformation that requires multidisciplinary work across all three elements. Throughout our work, we identified that a core challenge for digital transformation is a lack of active dialogue between strategy and development.


'Bitcoin will go to zero': Davos talks up the future of blockchain tech

Schumacher said the industry is now trying to create "open decentralized systems." These would essentially be next generation protocols or infrastructure that businesses could run on, similar to cloud computing today. The next generation of blockchain technology is currently being developed. Yeung said that she sees blockchain adoption happening quickly in the area of payments, particularly in Asia. "Many developing countries, where just to start with they don't even have credit cards, there's no particular infrastructure, it's almost easier to see sort of blockchain-enabled payments, to see in Asia, you will see more action happening in Asia more than U.S. and Europe," Yeung told CNBC. Ripple CEO Garlinghouse said he expects more widespread adoption of blockchain in about five years, while Schumacher said that it is three years off. However, Hutchins said that ultimately, consumers will not be talking about what blockchain is being used, they will just care how good the use case of a product is.


The future of code quality, security and agility lies in machine learning

Source code repository analysis can also reveal information about the developers writing it. Team dynamics can be highlighted by analyzing commit times and content: managers can identify when software engineers are the most productive, arranging meetings and encouraging cross-team collaboration accordingly. Looking at programming language and framework trends can inform hiring managers on what type of talent to hire and what upskilling education resources they can provide. Adding source code as a new dataset in enterprises’ data warehouses and visualization platforms such as Power BI, Looker or Tableau will provide everyone in the engineering organization with a whole new level of source code and development process observability. Yet the most exciting aspect of looking at code as a dataset is that it can be used to train Machine Learning models that can automate many different repetitive tasks for developers. We’re already starting to see new machine learning based applications for assisted code review or suggestions on GitHub.
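
As a sketch of the commit-time analysis described here, the following snippet buckets a repository's commit history by hour of day; it assumes git (2.6+) is on the PATH and is run inside a repository:

```python
# Histogram of commits by hour of day, read straight from `git log`.
import subprocess
from collections import Counter

hours = subprocess.run(
    ["git", "log", "--pretty=format:%ad", "--date=format:%H"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()

by_hour = Counter(hours)  # hour string "00".."23" -> number of commits
for hour in sorted(by_hour):
    print(f"{hour}:00  {'#' * by_hour[hour]}")
```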



Quote for the day:


"It is easier to act yourself into a new way of thinking, than it is to think yourself into a new way of acting." -- A.J. Jacobs


Daily Tech Digest - January 26, 2019

AI is sending people to jail—and getting it wrong


Police departments use predictive algorithms to strategize about where to send their ranks. Law enforcement agencies use face recognition systems to help identify suspects. These practices have garnered well-deserved scrutiny for whether they in fact improve safety or simply perpetuate existing inequities. Researchers and civil rights advocates, for example, have repeatedly demonstrated that face recognition systems can fail spectacularly, particularly for dark-skinned individuals—even mistaking members of Congress for convicted criminals. But the most controversial tool by far comes after police have made an arrest. Say hello to criminal risk assessment algorithms. Risk assessment tools are designed to do one thing: take in the details of a defendant’s profile and spit out a recidivism score—a single number estimating the likelihood that he or she will reoffend. A judge then factors that score into a myriad of decisions that can determine what type of rehabilitation services particular defendants should receive, whether they should be held in jail before trial, and how severe their sentences should be. 


Almost 90 percent of the business leaders surveyed as part of the study believed that cognitive diversity in the workplace is extremely important for running a successful organization. Managers in the contemporary workplace want employees to think differently and experiment with their typified ways of problem solving. While expecting such cognitive diversity was a bit difficult in the past, the role AI can play in the workforce means that organizations can expect greater rewards in the future. AI mechanisms will help augment human efforts in the workplace and stimulate cognitive diversification that benefits the organization. The study also revealed that 75 percent of respondents expected AI to create new roles for employees. This is a clear indication that AI is not going to replace human jobs, but will instead increase efficiency and shift humans’ roles and even create new positions for employees that provide meaningful work better suited to humans’ strengths.


Mondelez vs. Zurich: How watertight is cyber insurance coverage?

To put it bluntly, it appears the insurance sector has not been able to keep up with cyber threats. As new threats pop up in cyberspace, new policies typically lag behind in a confused state. A lack of visibility of their clients’ cyber health also challenges insurers. This is very important for insurers: for example, if somebody wants health insurance, proving whether or not they smoke, or that there are no hereditary diseases that run in their family, is vital in establishing how much their premium should be. The visibility issue isn’t just one affecting insurers. Many firms don’t have the tools to adequately assess and respond to the rising levels of cyber risk they’re exposed to. A recent report from the insurer Hiscox claimed that nearly three-quarters (73%) of global firms are “cyber-novices” when it comes to the quality and execution of their security strategy. If it’s the case (and it is) that cyber insurance policies are confusing and have room for improvement, the best thing a company can do is first to understand the cyber risks they face, and then secure a bespoke policy to meet their needs.



Collateral Damage: When Cyberwarfare Targets Civilian Data

Unfortunately, this is par for the course for private-sector businesses and NGOs. Sometimes the breach is to get a critical piece of political or military information to be used later. Sometimes it's to steal intellectual property or research so that the hacking nation can get a competitive boost in economic and/or military might. Sometimes it's to cull some personal information about someone with the right security clearance — which may mean orchestrating a super-breach, compromising several million other accounts along the way. Notably, these breaches aren't about anything so pedestrian as identity theft or credit card fraud. Instead, the goal is to use the information gleaned as a jumping-off point — to allow escalated access to yet more critical information. This is especially the case with healthcare organizations, where the right juicy health-record tidbit about a well-placed employee (or family member thereof) of a government arm can be used to extort some small amount of extra information or escalated access, turning that employee into an inside-attack threat.


How AI and Quantum Computing May Alter Humanity’s Future


König and the AI research team showed that quantum outperforms classical computing and that quantum effects can “enhance information-processing capabilities and speed up the solution of certain computational problems.” In their research, the team demonstrated that parallel quantum algorithms running in constant time outperform classical computers. The scientists showed that quantum computers only required a fixed number of steps for problem solving and were better at “solving certain linear algebra problems associated with binary quadratic forms.” Forward-thinking organizations recognize the synergistic boost that the combination of quantum computing and artificial intelligence may herald. Microsoft CEO Satya Nadella stated in a WSJ Magazine interview, “What’s the next breakthrough that will allow us to keep up this exponential growth in computing power and to solve problems—whether it’s about climate or food production or drug discovery?”


Bringing open-source rhyme and reason to edge computing: LF Edge

This isn't easy. Interoperability and standards simply don't exist in IoT or Edge Computing. This makes life miserable for anyone working in these areas. It is the LF Edge founders' hope that this pain will bring vendors, OEMs, and developers together to create true open standards. For the broader IoT industry to succeed, the fragmented edge technology players must work together to advance a common, constructive vision. Arpit Joshipura, the Linux Foundation general manager for Edge and IoT, said, "In order for the broader IoT to succeed, the currently fragmented edge market needs to be able to work together to identify and protect against problematic security vulnerabilities and advance a common, constructive vision for the future of the industry." LF Edge is realizing this vision with five projects. These support emerging Edge applications in non-traditional video and connected things that require lower latency (up to 20 milliseconds), faster processing, and mobility.


Balancing data privacy with ambitious IT projects for digital transformation

A global organisation that produces medical devices for the healthcare market used IoT technology to monitor and record the usage of every individual device for product development and preventative maintenance. Regardless of the relatively benign purpose, because of the nature of these medical devices and the broad approach to data collection, the usage data that the developers were collecting was inherently sensitive. Healthcare data is classified as “special category” data by GDPR as well as others, which brings with it additional prohibitions over its use and heightened penalties for its mishandling. More concerning was that neither the patients, the healthcare professionals nor the business were aware of the collection and use of the data. No framework was in place to govern its collection, use or storage. No processes were documented. Furthermore, the business had not yet appointed a data protection officer. Once the legal teams began their GDPR preparations, they quickly discovered this data use.


26 Regulatory Initiatives that Will Shape Fintech in Europe and Beyond

In the banking industry’s quest towards open banking, standardisation has now become the name of the game towards global applicability. There is a consistent push and pull between whether these standards should come from regulators or industry players. On one hand, regulators can future-proof standards in that they could design the standards based on principles that ensure safety in the ecosystem. On the other hand, industry players may be better suited to producing standards or platforms that could better encourage innovation and growth of the industry as they are often instrumental in making it happen. Many of the standards listed below only apply to one region or another, but as the interchange fee regulation in the EU being implemented in Australia shows, there is something to be said about the ripple effect of regulations, particularly when regulators attempt to implement what works in other countries. The following is a list of initiatives, regulations and standards that have been listed in the World Payments Report 2018, by Capgemini and BNP Paribas.


With cybersecurity threats looming, the government shutdown is putting America at risk

Employees who are considered “essential” are still on the job, but the loss of supporting staff could prove to be costly, in both the short and long term. More immediately, the shutdown places a greater burden on the employees deemed essential enough to stick around. These employees are tasked with both longer hours and expanded responsibilities, leading to a higher risk of critical oversight and mission failure, as weary agents find themselves increasingly stretched beyond their capabilities. The long-term effects, however, are, quite frankly, far more alarming. There’s a serious possibility our brightest minds in cybersecurity will consider moving to the private sector following a shutdown of this magnitude. Even ignoring that the private sector pays better, furloughed staff are likely to reconsider just how valued they are in their current roles. After the 2013 shutdown, a significant segment of the intelligence community left their posts for the relative stability of corporate America. The current shutdown bears those risks as well. A loss of critical personnel could result in institutional failure far beyond the present shutdown, leading to cascading security deterioration.


Three reasons why you need to modernise your legacy enterprise data architecture

Most data was of a similar breed in the past. By and large, it was structured and easy to collate. Not so today. Now, some data lives in on-premises databases while other data resides in cloud applications. A given enterprise might collect data that is structured, unstructured, and semi-structured. The variety keeps widening.  According to one survey, enterprises use around 1,180 cloud services, many of which produce unique data. In another example, we integrated over 400 applications for a major enterprise IT firm. The process of integrating all this wildly disparate data alone is too great a task for legacy systems. Within a legacy data architecture, you often have to hand-code your data pipelines, which then need repairing as soon as an API changes. You might also have to oversee an amalgam of integration solutions, ranging from limited point-to-point tools to bulky platforms that must be nurtured through scripting. These traditional approaches are slow, fraught with complexity, and ill-matched for the growing variety of data nowadays.



Quote for the day:


"It is easy to lead from the front when there are no obstacles before you, the true colors of a leader are exposed when placed under fire." -- Mark W. Boyer


Daily Tech Digest - January 25, 2019

Within the Microsoft Cyber Defense Operation Center (CDOC), we focus on these dependencies with teams that coordinate threat intelligence, security monitoring and incident response by exploiting both the common and unique capabilities of each specialization. It is here that we leverage our global workforce of more than 3,500 security professionals across our product development teams, information security groups, and legal teams to protect our cloud infrastructure and services, products and devices, and internal resources. The engineering teams behind our commercial security solutions, like Azure Security Center (ASC), also take advantage of the Cyber Defense Operation Center (CDOC) community to test hypotheses and pre-flight solutions in a real-world environment. This model is based on a closed-loop system of intelligence, defense, and control that streamlines our security capabilities for more than 200 cloud services, over 100 datacenters, millions of devices, and over a billion customers around the globe.


Stealthy New DDoS Attacks Target Internet Service Providers

An analysis of DDoS data during Q3 2018 by Nexusguard showed attackers trying to overwhelm targeted sites, and even entire ISP -- aka communications provider (CSP) -- networks, by spreading attack traffic across a large number of IP prefixes. Unlike a typical volumetric attack on a single IP address, many of the DDoS campaigns that Nexusguard analyzed involved attackers contaminating legitimate traffic across hundreds of IP addresses with small bits of junk. The attack traffic within each IP address was small enough to avoid detection by DDoS mitigation tools but big enough to take down a targeted site once converged, Nexusguard said in a report published this week. For example, the average attacks involved just 33.2 Mbps of traffic per targeted IP making it hard for service providers to detect and mitigate the traffic. In total, about 159 autonomous systems - most belonging to service providers - were targeted in "bit-and-piece" attacks in Q3 of 2018.
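
The detection difficulty can be illustrated with a toy aggregation sketch: per-IP traffic stays under the mitigation threshold, but grouping by prefix reveals the converged flood. The thresholds and flow counts below are invented; only the ~33 Mbps per-IP figure comes from the report:

```python
# Toy "bit-and-piece" illustration: small junk per IP, large flood per prefix.
from collections import defaultdict

PER_IP_THRESHOLD_MBPS = 100.0   # assumed per-target mitigation trigger
PREFIX_THRESHOLD_MBPS = 1000.0  # assumed capacity of the prefix as a whole

# (ip, attack traffic in Mbps): ~33 Mbps per address, spread over 100 IPs
flows = [(f"203.0.113.{i}", 33.2) for i in range(1, 101)]

per_prefix = defaultdict(float)
for ip, mbps in flows:
    assert mbps < PER_IP_THRESHOLD_MBPS     # no single IP trips the alarm
    prefix = ".".join(ip.split(".")[:3])    # group by /24
    per_prefix[prefix] += mbps

for prefix, total in per_prefix.items():
    if total > PREFIX_THRESHOLD_MBPS:
        print(f"{prefix}.0/24 saturated: {total:.0f} Mbps")  # 3320 Mbps
```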


Establish a configuration management strategy to guide transition


No matter the returns on the technology, enterprise IT organizations frequently encounter problems with a configuration management strategy. Various internal teams select different configuration management tools. Team members resist the burden of a steep learning curve with a new tool. Or people stick with their habits because they are simply too busy or distracted with existing work to change. "There are high performers who tend to have their own tastes, [and] there are others who are trying to catch up to that," said Suranjan Chatterjee, global head of the cloud apps, microservices and API unit at Tata Consultancy Services. He cited "a lot of tensions and sensitivities" when diverse teams in a large group must collaborate. To increase automation and win the war on configuration drift, IT organizations should prepare a solid configuration management strategy and evaluate tools specifically based on how easily they onboard and support users.


A human-centred agenda for the future of work

Technology, including artificial intelligence, robotics and sensors, entails countless opportunities to improve work. The extraction of knowledge through data mining can assist labour administrations to identify high-risk sectors and improve labour inspection systems. Blockchain technology could make it easier for companies and the social partners to monitor working conditions and labour-law compliance in supply chains. But digital technology also creates new challenges for decent work. Digital labour platforms provide new sources of income to many workers in different parts of the world, yet the dispersed nature of the work across international jurisdictions makes it difficult to establish workers’ rights. The work on platforms is sometimes poorly paid—even below prevailing minimum wages—and no official mechanisms are in place to address unfair treatment. Thus I introduced into the commission the idea of an international governance system for digital-labour platforms, which would require platforms (and their clients) to respect certain minimum rights and protections.


How can individual employees prepare for the future of work?


Individuals should be aware of traits that will help them prepare for the future and will also make it easier to develop these soft skill areas, one of which is self-awareness. “This self-awareness around purpose is a prime source of energy, resilience and clarity when it comes to dealing with all the choices, challenges and changes around us,” said Empey. “It also helps with a second key point, which is being ‘open’ to change, other points of view, other ways of doing things and so on.” A third strategy he suggests is authentic networking – that is, being quite deliberate in seeking a connection with those who can be of value to you, and helping them in a generous, mutual and non-favour-expecting way. Individual employees should also remember that it’s not all about the skills we need at work. “The last point I would make is around health and wellbeing,” said Empey.


Cybercriminals Home in on Ultra-High Net Worth Individuals

The conclusions drawn by Glasswall mirror research conducted by UK-based Campden Wealth, which found that 28% of the UHNW families reported having been the victim of one or more cyberattacks. While UHNW families have an estimated net worth of at least $30 million, Campden Wealth recommends that those setting up single-family offices have wealth of $150 million or more. Many of the families that open single-family offices have far in excess of $150 million, with their average net worth standing at $1.2 billion, according to the Campden Wealth/UBS Global Family Office Report. Dr. Rebecca Gooch, Campden Wealth's director of research, says phishing was the most common type of attack, followed by ransomware, malware infections, and social engineering. She says UHNW individuals are targeted in a variety of ways including via their operating businesses, family offices, or through the family members themselves. More than half the attacks were viewed as malicious.


Poor practices expose 24 million financial records


The records were stored in an Elasticsearch cluster which contained 51GB of what appeared to be OCR credit and mortgage reports, Diachenko said in a blog post. “The documents contained highly sensitive data, such as social security numbers, names, phones, addresses, credit history and other details which are usually part of a mortgage or credit report,” he said. “This information would be a goldmine for cyber criminals, who would have everything they need to steal identities, file false tax returns, get loans or credit cards.” The exposed data was eventually traced to a data and analytics company called Ascension in Fort Worth, Texas, with the help of TechCrunch, which first reported Diachenko’s findings. According to parent company Rocktop Partners, Ascension shut down the server in question after learning of a “server configuration error” that “may have led to exposure of some mortgage-related documents”.


Microservices and the Saga Pattern

Microservices are not new in the market; they have been a part of our day-to-day life for a while now. So here in this post, I would like to share with you my understanding of what microservices are and what the Saga Pattern is, which is widely used with microservices. We will start with (i) what exactly we mean when we say we need a microservice, (ii) what it means to be reactive, and then (iii) dig into the concept of Saga patterns with respect to a distributed system, along with an easy-to-understand real-life example. ... For example, if we are preparing a product like a Restaurant application, then we would be creating several small microservices like Orders, Customers, Reservations, etc., which would perform specific tasks around a specific functionality of the Restaurant application, and which would be interacting if and only if we need to have the functionalities come together, and that, too, only through their exposed APIs. For now, we can think of an API as the endpoints of a service which are exposed for use to the outside world.
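
As a toy sketch of the Saga pattern the post builds toward: each local step has a compensating action, and a failure part-way through triggers the compensations in reverse order. The services here are stubbed with plain functions; in a real system each step would be an API call to a separate microservice:

```python
# A minimal saga for the Restaurant example: order -> reserve -> charge,
# with compensations applied in reverse when a later step fails.
def create_order():     print("order created")
def cancel_order():     print("order cancelled (compensation)")
def reserve_table():    print("table reserved")
def release_table():    print("table released (compensation)")
def charge_customer():  raise RuntimeError("payment declined")

saga = [
    (create_order, cancel_order),
    (reserve_table, release_table),
    (charge_customer, None),
]

completed = []
try:
    for action, compensation in saga:
        action()
        completed.append(compensation)
except RuntimeError as err:
    print(f"step failed: {err}")
    for compensation in reversed(completed):
        if compensation:
            compensation()  # undo finished steps, restoring consistency
```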


Is customer information safer with a blockchain database?

Spring Labs is spearheading a group of prominent fintech lenders that will use a blockchain-based, peer-to-peer network to share consumer information to help with identification verification on loan applications. Avant, OnDeck Capital and SoFi are among 16 companies currently testing the network, called the Spring Protocol, which is scheduled to go live in the second half of this year. Part of the idea behind Spring’s system is to have a central database lenders can access without replicating critical consumer information on multiple systems, said John Sun, president and chief product officer for Spring Labs. While Spring doesn’t identify as a pure blockchain firm, he said it’s the best way to safeguard and store the information. “We have this solution that does certain things and we asked ourselves what is the best technology to build it that way,” Sun said. “It just so happens that for parts of the protocol and the technology stack, blockchain really is the best way to accomplish what we wanted to do.”


Adding Agile to Lean at Toyota Connected

Thurlow argued that the need to be more flexible, adaptable, and nimble is now a necessity, and no longer an option. Toyota needed to add agility into Lean Product Development. As Thurlow stated: "We took the best of breed agile learning and combined that with decades of lean thinking from the creators at Toyota and established an approach we currently call Scrum The Toyota Way." Every team member has had formal training, followed by continuous coaching in the workplace through a dedicated team of Scrum Masters and Coaches who are independent from the product delivery teams, he said. Coaches are embedded with the teams, but report externally ensuring they have checks and balances. Toyota Connected is building a pattern library of tools and techniques they have created or identified that work in various contexts. Just as The Toyota Production System never stops improving, Scrum The Toyota Way evolves endlessly, said Thurlow.



Quote for the day:


"Leadership is working with goals and vision; management is working with objectives." -- Russel Honore


Daily Tech Digest - January 24, 2019

The team used a graph convolutional neural network — an algorithm that operates on nodes, edges, properties, and other graph structures — to model the statistical relationship among parking locations, traffic flow, parking demand, road links, and parking blocks. Together with a recurrent neural network with long short-term memory (LSTM) — a type of AI algorithm capable of learning long-term dependencies — and a multi-layer decoder, the system extracted parking information from traffic-related data sources (such as parking meter transactions, traffic speed, and weather conditions) and output occupancy forecasts. The researchers trained it on data sourced from the Pittsburgh downtown area, which they note has 97 on-street parking meters across 39 street blocks. Historical parking stats came from the Pittsburgh Parking Authority, while connected car company Inrix’s Traffic Message Channel and Weather Underground’s API supplied traffic speed data and hourly weather reports, respectively.
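
Below is a minimal sketch of how such a pipeline could be wired together in PyTorch. Everything here (layer sizes, the single graph-convolution layer, the sigmoid occupancy head) is an illustrative assumption, not the paper's actual architecture:

```python
# Sketch: one graph-convolution step feeding an LSTM, decoding to per-block
# occupancy forecasts in [0, 1].
import torch
import torch.nn as nn

class ParkingForecaster(nn.Module):
    def __init__(self, n_blocks, n_features, hidden=64):
        super().__init__()
        self.gc = nn.Linear(n_features, hidden)  # shared graph-conv weights
        self.lstm = nn.LSTM(n_blocks * hidden, hidden, batch_first=True)
        self.decoder = nn.Linear(hidden, n_blocks)

    def forward(self, x, adj):
        # x: (batch, time, n_blocks, n_features)
        # adj: normalized (n_blocks, n_blocks) adjacency of the street graph
        b, t, n, f = x.shape
        h = torch.relu(self.gc(adj @ x))  # propagate features along road links
        out, _ = self.lstm(h.reshape(b, t, -1))
        return torch.sigmoid(self.decoder(out[:, -1]))  # occupancy per block

# e.g. the 39 street blocks from the study, with 5 assumed input features:
# model = ParkingForecaster(n_blocks=39, n_features=5)
```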


Evidence-Based Management Guide - Updated

Most organizations need to start by looking at the value they deliver today, or Current Value. Organizations often use revenue to measure this, and if you can measure it instantaneously, it’s not a bad measure; for example, if you are selling items online, knowing daily sales, or even moment-to-moment sales, can give an organization some sense of the value that customers experience. A better measure is actual customer satisfaction, since sometimes people buy things they never end up using, or buy things only because they have no better alternatives. Measures like Net Promoter Score (NPS), if measured as close to the actual experience as possible, can give a better indicator of value. Even measures that simply show how often a feature is used, and for how long, can give a better picture of what customers value, than does revenue. Going deeper, measures that give insight into why the customer is using the product are even better and can serve as true aligning measures of success. For example, we’ve worked with a company that helps organizations process their insurance claims.
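
For context, the NPS arithmetic works as follows (the standard definition, sketched in Python): promoters score 9-10, detractors 0-6, and the score is the percentage of promoters minus the percentage of detractors.

```python
# Net Promoter Score on the usual 0-10 survey scale.
def net_promoter_score(ratings):
    promoters = sum(1 for r in ratings if r >= 9)   # 9-10
    detractors = sum(1 for r in ratings if r <= 6)  # 0-6
    return 100.0 * (promoters - detractors) / len(ratings)

print(net_promoter_score([10, 9, 8, 7, 6, 3, 10]))  # ~14.3
```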


Multi-vector attacks target cloud-hosted technologies

Attackers often break in by exploiting unpatched vulnerabilities or insecure configurations in services like the Redis data structure store, the Apache Hadoop big-data processing toolset or the Apache ActiveMQ messaging middleware. They also launch brute-force password guessing attacks against a large number of services including MySQL, MongoDB, Memcached, CouchDB, PostgreSQL, Oracle Database, ElasticSearch, RDP, VNC, Telnet, RSync, RLogin, FTP, LDAP and more. One of the most commonly used malware tools observed in attacks against cloud-hosted services is the XBash worm, which first appeared in May 2018. This malware is used to infect both Windows and Linux servers and deploys additional payloads depending on which OS is running. XBash is typically associated with a cybercriminal group known in the security industry as Iron. However, another group called Rocke is also using an XBash variant and has recently been in the news after it started disabling cloud security and monitoring agents.


Linux’s Hyperledger to give developers supply chain building blocks

"What attracts many organizations to blockchain technology is the possibility of sharing data across corporate boundaries while maintaining a high degree of rigor and accuracy," said Robert Beideman, a vice president with the GS1 standards organization. Last week, SAP launched a blockchain-based a supply chain tracking service that will enable drug wholesalers to authenticate pharmaceutical packaging returned from hospitals and pharmacies. The Linux Foundation described its Hyperledger Grid project as a framework, not a blockchain or an application. "Grid is an ecosystem of technologies...that work together, letting application developers make the choice as to which components are most appropriate for their industry or market model," the Grid project said in a blog post. Grid includes a set of libraries, data models and SDKs to accelerate development for supply chain smart contracts and client interfaces. (Smart contracts are self-executing code based on pre-determined business agreements.


Financial Services and Social Value Can Mix

Many of the problems facing our society come from a lack of social cohesion. Social inequality affects us all. In global terms, economic conditions may have improved, but in real terms, when examined at an individual level within a particular country, inequality can be felt more and more. This perceived impact goes some way toward explaining the recent appearance of populist movements, which is one of the biggest threats to economic development. Any form of populism will always work against the stability that we need. Another significant concern, which is in effect also an opportunity, is social and technological disruption. If we don’t tackle this issue properly, it’ll put an end to insurance as we know it. Consumer profiles and society have changed dramatically, and people expect and demand more from companies. Young people expect companies to be much more committed, more socially active, and more transparent.


Business failing to see strategic value of cyber security


Security professionals said boards perceive them as functional but not as a force for competitive advantage, with 56% saying they feel restricted by the board and only 41% reporting that their organisations have a CISO in place on the board.  Although the security team can be instrumental in business transformation, only 44% believe that the C-suite sees them as a positive force for innovation, and just one in 10 respondents (13%) believe that the board sees them as helping the company to gain a competitive advantage.  The findings suggest that boards may be paying lip service to IT security teams, as there is a disparity between what the board says and how this translates into investment. While 87% of security professionals believe that the board listens to them and values their input, a considerable proportion (62%) believe that the board can’t always see the business case for security investments.


AIOps tools supplement -- not supplant -- DevOps pipelines

AIOps tools enable an IT organization's traditional development, test and operations teams to evolve into internal service providers to meet the current and future digital requirements of their customers -- the organization's employees. AIOps platforms can also help enterprises monitor data across hybrid architectures that span legacy and cloud platforms, Grabner said. These complex IT environments demand new tools and technologies, which both require and generate more data. Organizations need a new approach to capture and manage that data throughout the toolchain -- which, in turn, drives the need for AIOps tools and platforms. AIOps can also be perceived as a layer that runs on top of DevOps tools and processes, said Darren Chait, COO and co-founder of Hugo, a provider of team collaboration tools based in San Francisco. Organizations that want to streamline data-intensive, manual and repetitive tasks -- such as ticketing -- are good candidates for an AIOps platform proof-of-concept project.


Desktop-as-a-Service: The new frontier for end user computing?

The workspace strategy has evolved. It was static, but has been transcended through a hardware refresh. It is now in what Hill calls “an adaptive phase, which represents a shift to software modernisation and innovation”. Still, however, this transition to adaptive is a mixed experience if you try and traverse these platforms. In this adaptive phase, DaaS and VDI are often conflated. VDI is a technology and DaaS is a service — an off-premise workspace on-demand via a provider. Here, the responsibility for security can be unclear: does it lie with the customer, the service provider, the tooling provider or the infrastructure platform provider? ... Workspace analytics will drive innovation and transformation, by enabling the identification of new devices: monitoring them, assessing them and adapting to them. Machine learning will play a huge role in this more pervasive analytics strategy, which will enable the next stage of continuous improvement across three channels.


What is malware? Viruses, worms, trojans, and beyond

Antivirus software is the most widely known product in the category of malware protection products; despite "virus" being in the name, most offerings take on all forms of malware. While high-end security pros dismiss it as obsolete, it's still the backbone of basic anti-malware defense. Today's best antivirus software is from vendors Kaspersky Lab, Symantec and Trend Micro, according to recent tests by AV-TEST. When it comes to more advanced corporate networks, endpoint security offerings provide defense in depth against malware. They provide not only the signature-based malware detection that you expect from antivirus, but anti-spyware, personal firewall, application control and other styles of host intrusion prevention. Gartner offers a list of its top picks in this space, which include products from Cylance, CrowdStrike, and Carbon Black. It's fully possible—and perhaps even likely—that your system will be infected by malware at some point despite your best efforts.


Why Do We Need Architectural Diagrams?

The main beneficiary should be the team (developers, test engineers, business analysts, devops, etc.) who have direct involvement in the project. In my experience, outside of the team, there are very few stakeholders who really care about documentation. In the best case, they might be interested in one or two high-level diagrams (e.g. context diagram, application or software component diagram) which roughly describe the structure of the system and give a high-level understanding of it. However, most of the time we fail to identify the real beneficiaries and their real needs, and try to create too much documentation. This quickly becomes a burden to maintain and is soon outdated. In other cases, we simply omit to create any kind of diagram because there is no time, no specific interest, or nobody wants to take on this responsibility. Besides this, the Agile Manifesto prescribes that teams should value working software over comprehensive documentation, which discourages cumbersome documentation processes.



Quote for the day:



"A true dreamer is one who knows how to navigate in the dark" -- John Paul Warren