Daily Tech Digest - December 25, 2018

DevOps disruptors in 2019

In 2019, we will see a significant shift from commercial testing to open source tools, which will have a dramatic effect on the testing vendors in the market. There are several reasons for this. We all know that continuous testing is a critical component of optimising DevOps pipelines, and by definition, to test continuously, teams must be able to dramatically scale the number of tests being executed. This includes running full regression cycles nightly rather than at the end of the dev cycle, and a massive “shift-left” of testing, all the way to the pre-commit and per-commit level. However, traditional commercial solutions struggle to meet the demands of continuous testing in two ways. Firstly, they do not scale, nor do they have the reliability to meet continuous testing requirements. Secondly, with shift-left, the persona of the test author shifts from QA to Dev. All this means that yesterday’s commercial solutions are simply not a fit for today’s developers. Instead, open source solutions are a vital piece of making continuous testing a reality.


Web Portals: More Breaches Illustrate the Vulnerabilities

One factor contributing to security issues in web portals is that "most organizations don't think about the total cost of running the system/application," says Mark Johnson, a former healthcare CISO and shareholder at consulting firm LBMC Information Security. "Because of that, a newly reported vulnerability may not get patched, or they may be resource constrained and they make 'risky' configuration choices - like adding too many support people as system or application admins. Finally, they may not dedicate the resources necessary to monitor these systems as closely." Based on what BJC has publicly disclosed about its portal incident, it's unclear exactly what caused the breach, Johnson says. "If it was a problem with the portal software or some underlying system or middleware application configuration or patching, there are some basic things that everyone should look to do when they have interactive systems, especially portals, on the internet," he says. Those steps include understanding the requirements of the system or application and reviewing and then implementing security controls that need to be in place based on the "risk of the system or application" and the type of data involved.


Training machines sans bias will only augment humans: AWS executive

Humans are good at dealing with situations that involve ambiguous data points. "Humans are really good at learning quickly with very little information. ML models are the opposite. They require a lot of data inputs to be able to be trained. "I would argue that you show someone a bicycle a few times and you show them how to ride a bicycle after few times the human being is able to ride that bicycle pretty easily. To just train a robot to ride a bicycle takes millions of hours of training," explained Klein. In the past year, AWS has released over 200 ML services and features. As for Amazon Alexa now talking to humans, he said a lot of their customers are using the platform to do voice profiling for a variety of reasons. "For example, in the financial services industry, we have customers that are looking into voice profiling as an additional factor at their call centres. So if they want to verify if it's you, they can add voice profiling as an additional factor to further reduce fraudulent or impersonation calls," he explained.


NIST Risk Management Framework 2.0 Updates Cyber-Security Policy

"The RMF provides a dynamic and flexible approach to effectively manage security and privacy risks in diverse environments with complex and sophisticated threats, evolving missions and business functions, and changing system and organizational vulnerabilities," the RMF states. "The framework is policy and technology neutral, which facilitates ongoing upgrades to IT resources and to IT modernization efforts—to support and help ensure essential missions and services are provided during such transition periods." The RMF 2.0 includes a long list of tasks that includes an outline of risk management roles within an organization as well as strategy. Identifying common controls as well as having a continuous monitoring strategy is another key component that is part of RMF. Risk itself is at the core of RMF 2.0, with the requirement that organizations execute a risk assessment that includes all assets that need to be protected.


Business owners must understand that a one-size-fits-all approach to cybersecurity can leave substantial gaps, making their businesses vulnerable. The first step is to think about exposure: this includes the hardware and software you are using, as well as operations conducted via web- or cloud-based systems. You should also consider what unique threats there are to a particular system. An important note: it isn’t enough to think about your own business. What about the third-party vendors you’ve hired? Any of their vulnerabilities will affect you, too. Connectivity of systems, both internally and externally, has been a major driver of technological progress, and the advent of things like cloud-based storage and mobile payment options has made doing business easier. But while interconnected systems may make things run more efficiently, they can also increase risk: a vulnerability in one system can affect the connected ones as well. Keeping critical systems like payroll, business email, and point-of-sale (POS) separate can decrease the inherent risks of connectivity and help ensure that one cyber threat doesn’t compromise a business’ entire operation.


Digital KYC – why it’s finger-clickin’ good

The universal availability of electronic documentation, such as identity cards, is a fundamental building block without which a fully digitised, automated and near real-time KYC capability proves difficult. Progress towards this is being made, notably in developing nations where the challenge of undocumented segments of the population was tricky until digital solutions became available. The Unique Identification Authority of India (UIDIA) was established in 2008 to give a digital identity to every resident. This ‘Aadhaar’ ID now gives access to many key services, including banking ... “Estonia’s approach makes life efficient: taxes take less than an hour to file, and refunds are paid within 48 hours. By law, the state may not ask for any piece of information more than once, people have the right to know what data are held on them and all government databases must be compatible, a system known as the X-road. In all, the Estonian state offers 600 e-services to its citizens and 2,400 to businesses.”


Keeping AI Beneficial and Safe for Humanity


One analogy that Stuart Russell uses that I find helpful is bridges. When we ask a civil engineer to build a bridge, we don’t have to specify ‘make sure it’s safe’ or ‘make sure it doesn’t fall down’. These concepts are built-in when we talk about bridges. Similarly, CHAI would like to get the field of AI to the point where if we ask a software engineer to build an AI system, we don’t have to specify things like value alignment, ethics, and human-compatibility — they should be built right into the definition of AI. If AI is not beneficial to humans, it’s not actually achieving its purpose. Yet we currently have no guarantees that the systems that are in development at the moment are going to be beneficial, and some good reason to believe they won’t be by default — just as a bridge built without the right engineering expertise likely wouldn’t be safe. “I’m not sure we need to have ‘smarter than human’ AI for a system to be dangerous. Any system that is sufficiently competent could be dangerous, even if it doesn’t resemble something that we would recognise as human-like. ...”


2019 Security Predictions Report Released

This year’s security predictions span the categories of cloud, consumer, digital citizenship, security industry, SCADA/manufacturing, cloud infrastructure, and smart home. I won’t spoil your reading of it, but one of the predictions that jumped out for me was regarding Business Email Compromise (BEC) and how targeted threats will move lower down the org chart. This makes a lot of sense, given that CxOs are getting harder to exploit via BEC: they are becoming more aware of the threat, and more BEC safeguards are being deployed to protect them. An example of such a safeguard is machine learning that fingerprints executive writing styles, like our Writing Style DNA. This prediction is quite actionable, especially given that the tools and techniques being deployed to protect the C-suite can be expanded to protect their direct reports as this threat pivots.
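To make the idea of fingerprinting a writing style concrete, here is a minimal, illustrative sketch (not the actual Writing Style DNA technique, whose internals are not described here): it builds a character-trigram frequency profile from known-genuine messages and compares new messages to it with cosine similarity. The sample sentences and the 0.35 threshold are arbitrary choices for illustration.

```python
from collections import Counter
from math import sqrt

def trigram_profile(text):
    """Character-trigram frequency profile of a text."""
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine_similarity(p, q):
    """Cosine similarity between two frequency profiles (0.0 to 1.0)."""
    shared = set(p) & set(q)
    dot = sum(p[g] * q[g] for g in shared)
    norm = sqrt(sum(v * v for v in p.values())) * sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

# "Fingerprint" built from a known-genuine message of one executive.
known = trigram_profile(
    "Please review the attached quarterly figures before Friday's board call."
)

def looks_like_author(message, threshold=0.35):
    """Crude check: does this message resemble the known writing style?"""
    return cosine_similarity(trigram_profile(message), known) >= threshold
```

A real system would train on a large corpus of an executive's mail and use far richer stylometric features, but the flag-on-low-similarity principle is the same.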


Report: Over 300 British Blockchain Companies Shut Down in 2018

Putting a number on it, the U.K.’s Sky News found that at least 340 companies claiming to be involved with crypto or blockchain were shut down this year. It obtained these findings by analyzing publicly available figures from the databases of Companies House and Open Corporates. This figure is an increase of 144 percent from the 139 blockchain-related companies that went bust in 2017. The data shows that over 200 of those companies were established during 2017, and 60 percent of them closed down between June and November 2018 alone. On the other hand, the number of newly registered blockchain companies continued to rise throughout the year, reaching a total of 817 in November 2018, which means the market continued to grow overall. However, the report notes that, for the first time, the number of new companies is growing more slowly than the number of blockchain businesses shutting down. And of the companies which haven’t been shut down, over 50 have removed references to blockchain or crypto from their names.
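The year-over-year figure quoted above is easy to sanity-check; the variable names below are mine:

```python
closed_2017 = 139   # blockchain-related companies dissolved in 2017
closed_2018 = 340   # lower bound reported for 2018

increase_pct = (closed_2018 - closed_2017) / closed_2017 * 100
# increase_pct ≈ 144.6, consistent with the "144 percent" in the report
```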


Digital disruption may widen the gender gap: Jessie Qin, EY

Digital innovation is taking over workplaces, and now is the right time to build diversity. In digital innovation, you need the left brain and the right brain to work together. However, there are pain points and frustration that come from the hypothesis that digital disruption is likely to increase the gender gap. For instance, World Economic Forum data shows that across the world's 15 top economies, digital, robotics and AI will lead to job losses of about 5 million. Men will get one new job for every three jobs they lose; women, on the other hand, will get only one job for every five jobs they lose. What’s even more alarming is that while the disruption stems from technology, women are far less digitally connected: the global Internet user gender gap grew from 11% in 2013 to 12% in 2016, according to data from the International Telecommunication Union. The gap remains large in the world’s Least Developed Countries (LDCs), at 31%.



Quote for the day:


"Leadership is liberating people to do what is required of them in the most effective and humane way possible." -- Max DePree


Daily Tech Digest - December 24, 2018


Prioritizing TD over MVP and vice versa needs to be someone’s responsibility; otherwise, who would manage delivery time around this balance? Thanks to my project management knowledge, now I do. I’m the one who should forecast when it makes sense to spend more time on better engineering, because I know my stakeholder well enough to predict his or her next move towards a brand-new MVP. Let me quickly change contexts for didactic purposes: ordering pizza while savagely hungry at home, I expect it to arrive hot and within an hour at most. I know a lot of stuff might go wrong on the way to my house. Some of it I would probably embrace; some I wouldn’t. If the pizza arrives two hours later, I won’t accept it. If it arrives merely warm, it’s fine. The same applies to projects. What is most valuable to stakeholders won’t be perfect engineering unless that engineering pays its cost, which means delivering right on time or sooner.



Automated Cyber Attacks Are the Next Big Threat. Ever Hear of 'Review Bombing'?
This is not a theoretical risk, either. It is already happening. Recent incidents involving Dunkin' Donuts' DD Perks program, CheapAir and even the security firm Cybereason's honeypot test showed just a few of the ways automated attacks are emerging “in the wild” and affecting businesses. In November, three top antivirus companies also sounded similar alarms. Malwarebytes, Symantec and McAfee all predicted that AI-based cyber attacks would emerge in 2019 and become an increasingly significant threat in the next few years. What this means is that we are on the verge of a new age in cybersecurity, where hackers will be able to unleash formidable new attacks using self-directed software tools and processes. These automated attacks will be able to find and breach even well-protected companies on their own, and in vastly shorter time frames than human hackers can manage. Automated attacks will also reproduce, multiply and spread in order to massively elevate the damage potential of any single breach.



A data inventory is key to maintaining data privacy compliance

Building and maintaining a comprehensive data inventory can enhance overall data quality and help create a path to streamlined compliance efforts, which in turn reduces risk through the creation of an effective controls framework. Additionally, identifying processes that can be automated creates opportunities for better regulatory reporting, in both accuracy and efficiency. Improved accuracy supports improved data security. Clear data maps and inventories can support more effective and proactive security measures that address critical issues, such as which specific business processes the data touches and the related risks of that interaction. Accurate data also enables complete data lineage capability, allowing for a cohesive approach by audit, security, and compliance groups alike.


Network management must evolve in order to scale container deployments

Highly containerized environments are subject to something called “container sprawl.” Unlike VMs, which can take hours to boot, containers can be spun up almost instantly and then run for a very short period of time. This increases the risk of container sprawl, where containers can be created by almost anyone at any time without the involvement of a centralized administrator. Also, IT organizations typically run about eight to 10 VMs per physical server but about 25 containers per server, so it’s easy to see how fast container sprawl can occur. A new approach to managing the network is required — one that can provide end-to-end, real-time intelligence from the host to the switch. Only then will businesses be able to scale their container environments without the risk associated with container sprawl. Network management tools need to adapt and provide visibility into every trace and hop in the container journey instead of being device centric. Traditional management tools have a good understanding of the state of a switch or a router, but management tools need to see every port, VM, host, and switch to be aligned with the way containers operate.


Office 365, Outlook Credentials Most Targeted by Phishing Kits

The phishing kit used the most during the second half of the year was a multi-brand kit that mainly targets Office 365 and Outlook credentials, but which also supports spoofed pages for AOL, Bank of America, Chase, Daum, DHL, Dropbox, Facebook, Gmail, Skype, USAA, Webmail, Wells Fargo, and Yahoo. The second most popular phishing kit in the timeframe also targets Office 365, Cyren says. This tool, however, was specifically built for Office 365 phishing and packs built-in techniques to evade detection, including blocking IPs and security bots, as well as user agents to hide from phishing defenses. A PayPal phishing kit has emerged as the third most used, and employs new levels of sophistication, with several evasive techniques, the researchers say. Fourth in line comes a multi-brand phishing kit that can target almost anything from lifestyle brands to data, banking and email credentials, and more. Apple, Netflix, Dropbox, Excel, Gmail, Yahoo, Chase, PayPal and Bank of America are among the targeted brands.


5 Cloud Trends That Will Dominate 2019

Despite the hubbub being raised about the job-stealing nature of automation, you should expect automation services to keep rising in popularity as 2019 unfolds. Automation platforms are more efficient today than ever before, which means that businesses of all shapes and sizes have a sizable economic incentive to digitize their operations to the greatest extent possible. While human capital will always be vital in the cloud marketplace, it’s growing quite obvious that the future of the cloud will at least partly be determined by clever algorithms that do some of our thinking for us. Major corporations like Amazon and Microsoft are already beginning to cash in on this trend; Amazon Web Services offers a wide variety of cloud automation services, for instance, including automatic testing to locate weak security points. As digital privacy and network security grow more important to the public, especially as new data breaches continue to occur, automation will be viewed as a way of securing the cloud and making it a more reliable place to store our sensitive information.


Understanding Blockchain Basics and Use Cases

The use cases for blockchains are still being hotly debated. There is the obvious example of censorship-resistant digital currencies. However, the volatility and fragmentation seen in the cryptocurrency market during 2018 seem to suggest that the actual applicability of trustless digital currencies is limited. From the enterprise perspective, it is becoming clear that blockchains can also be used to create systems or networks deployed as a shared construct between multiple entities that don't necessarily trust each other yet want to share data and maintain a form of consensus about concerns that all parties care about. These use cases, where a centralized authority is unacceptable to the participants, or too costly to set up, are still emerging. This is despite the time, effort and venture capital that has been deployed into the wide array of blockchain projects created to date. As more projects come to market in 2019, it remains to be seen whether the promise of blockchain will ever amount to the major impact that its advocates have now been promising for quite some time.


AI Inspires a Healthcare Revolution


Heart surgeons are employing data and analytics alongside scalpels and stents as they carry out intricate operations, using digital replicas of human hearts and AI to predict the likely outcomes of treatments. In the future, we may all have these replicas—known as digital twins—that are continuously fed data about our bodies and can help predict when we may become ill, and suggest preventive therapy and the most effective treatments. Digital twin technology has the potential to make significant improvements in diagnosis and treatment of a range of conditions. Building a digital replica of a heart requires collecting reams of data about the patient’s physiological condition, fitness levels and lifestyle. In one case, cardiologists created a digital version of the heart of a patient suffering from an irregular heartbeat, to test whether the patient was among the 70 percent likely to respond to a particular treatment.


When the Tide Goes Out: Big Questions for Crypto in 2019

Debates have raged around the globe about how cryptocurrencies, and particularly ICOs, fit within existing securities, commodities and derivatives laws. Many contend that so-called ‘utility tokens’ sold for future consumption are not investment contracts – but this is a false distinction. By their very design, ICOs mix economic attributes of both consumption and investment. ICO tokens’ realities – their risks, expectation of profits, reliance on the efforts of others, manner of marketing, exchange trading, limited supply, and capital formation — are attributes of investment offerings. In the U.S., nearly all ICOs would meet the Supreme Court’s ‘Howey Test’ defining an investment contract under securities laws. As poet James Whitcomb Riley wrote over 100 years ago: “When I see a bird that walks like a duck and swims like a duck and quacks like a duck, I call that bird a duck.” In 2019, we’re likely to continue seeing high ICO failure rates while funding totals decline.


Embracing agile development: Don’t let technical debt get in the way of innovation

By prioritising this type of approach first, you can begin to reduce debt and then resolve other portions as part of a long-term strategy. Remember: this isn’t going to resolve itself overnight. Get the whole team on board before committing to across-the-board technical debt reduction, because unlike a Waterfall approach, Agile changes are small and frequent, so everyone will need to commit to the new method. Again, teams can tackle this by adopting an EAD approach, so they can focus on moving slowly and deliberately to avoid including any new changes that might introduce new debt or increase existing debt. An EAD approach also helps to ensure teams are committed to testing throughout the DevOps process, which in turn creates a more collaborative environment and promotes transparency. With Agile, each successive version of the software builds directly on the previous version. It also allows for repeat work that improves upon previously completed activities.



Quote for the day:


"The greatest leader is not necessarily the one who does the greatest things. He is the one that gets the people to do the greatest things." -- Ronald Reagan


Daily Tech Digest - December 23, 2018

Blockchain Data Network
Some critics have been quick to disparage real efforts to create digital voting with strictly theoretical worries. In reality, the rollout in West Virginia is a very focused solution to a specific issue: low overseas voter participation. The current system is broken. A blockchain-driven digital voting app is a clear solution. Anyone but critics of progress should eagerly support West Virginia’s efforts until there is an actual reason to worry. Once any blockchain application is embraced in sufficient numbers by both the using and accepting sides, the impressive software will become an invaluable and ubiquitous tool. More widespread adoption of blockchain’s most beneficial use cases will trigger network effects that will multiply the benefits. Let’s remember that we are in the early days of blockchain. Many industry observers seem to be in a rush to declare blockchain a mainstream technology. As enthusiastic as I am in my support of blockchain, I would not yet call it mainstream. The interconnectedness of the world means its adoption will probably take root and bloom quickly.


Data Analyst and Business Analyst- A contrast

A business analyst is required to have expertise in the industry in which they function. A business analyst working for a finance company must be good with numbers and understand calculations such as the payback period and internal rate of return, as both are needed for the calculation of ROI (return on investment). They use various tools to analyse and manipulate data. They should also possess excellent communication skills so that they can convey technical messages to clients in a way that is understandable even to those who lack technical knowledge. ... Data analysts are required to possess sharp technical knowledge coupled with excellent industry knowledge. They act as the company's security guards, keeping its data safe, and possess a strong and thorough understanding of the relationships that the organisation’s databases hold. They use complex query statements and technologically advanced database tools to extract information from these databases.
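The payback period and internal rate of return mentioned above are straightforward to compute. A minimal sketch, using illustrative cash flows of my own choosing (an outlay of 1,000 followed by three inflows of 500):

```python
def payback_period(cashflows):
    """Index of the first period at which cumulative cash flow turns
    non-negative; cashflows[0] is the (negative) initial investment."""
    total = 0.0
    for period, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return period
    return None  # investment never recovered

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-7):
    """Internal rate of return: the discount rate at which net present
    value is zero, found by bisection (NPV is decreasing in the rate
    for a conventional outflow-then-inflows profile)."""
    def npv(rate):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

flows = [-1000, 500, 500, 500]  # illustrative project cash flows
```

For these flows the payback period is two periods and the IRR works out to roughly 23 percent; in practice analysts would reach for a spreadsheet or a financial library rather than hand-rolling the solver.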


Banking with APIs 101


Communication over the phone is no longer necessary thanks to open banking and APIs (Application Programming Interfaces), pieces of software allowing seamless interaction between clients and banks. Not only retail and corporate clients, but an entire ecosystem of internal stakeholders, software suppliers, brokers, asset managers, fintechs, etc. may now benefit from business models shaped around open banking and alternative ways of generating revenues. But what are APIs essentially for? APIs enable communication and data exchange between clients (data requesters) and servers (data holders) in a secure and consistent manner. With applications and data unbundled in modern architectures, the bank is now required under open banking regulations to share data. In other words, the most valuable asset the bank possesses has to be openly and securely shared. APIs can fulfil these needs in the most effective manner. Banks do not, of course, need to expose all sorts of data; they only need to provide access to the specific information needed or required.
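A client-side call to such an API typically looks like an authenticated HTTPS request. A minimal sketch: the base URL, path, and token below are entirely hypothetical (real open-banking APIs each define their own endpoints and OAuth flows), and the request is only constructed here, not sent.

```python
import urllib.request

# Hypothetical endpoint and token, for illustration only.
BASE_URL = "https://api.example-bank.com/open-banking/v1"
ACCESS_TOKEN = "example-oauth-token"

def balance_request(account_id):
    """Build an authenticated GET request for one account's balances."""
    return urllib.request.Request(
        f"{BASE_URL}/accounts/{account_id}/balances",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/json",
        },
        method="GET",
    )

req = balance_request("12345")
# Against a real server you would send it with urllib.request.urlopen(req)
# and parse the JSON body of the response.
```

The point is the shape of the exchange: the client asks for one specific, scoped piece of data, and the bank's server decides whether the presented token authorizes it.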


Deep automation in machine learning

Automation doesn’t stop when the model is “finished”; in any real-world application, the model can never be considered “finished.” Any model’s performance will degrade over time: situations change, people change, products change, and the model may even play a role in driving that change. We expect to see new tools for automating model testing, either alerting developers when a model needs to be re-trained or starting the training process automatically. And we need to go even further: beyond simple issues of model accuracy, we need to test for fairness and ethics. Those tests can’t be automated completely, but tools can be developed to help domain experts and data scientists detect problems of fairness. For example, such a tool might generate an alert when it detects a potential problem, like a significantly higher loan rejection rate from a protected group; it might also provide tools to help a human expert analyze the problem and make a correction.
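The loan-rejection alert described above can be sketched in a few lines. This is only an illustration of the idea, not any particular product: the record format (group label, approved flag) and the 1.25x disparity threshold are assumptions of mine.

```python
def rejection_rates(decisions):
    """Rejection rate per group from (group, approved) records."""
    totals, rejected = {}, {}
    for group, approved in decisions:
        totals[group] = totals.get(group, 0) + 1
        if not approved:
            rejected[group] = rejected.get(group, 0) + 1
    return {g: rejected.get(g, 0) / totals[g] for g in totals}

def fairness_alerts(decisions, max_ratio=1.25):
    """Flag groups whose rejection rate exceeds the lowest group's
    rate by more than max_ratio — a candidate for human review."""
    rates = rejection_rates(decisions)
    baseline = min(rates.values())
    return [g for g, r in rates.items()
            if baseline > 0 and r / baseline > max_ratio]
```

As the passage notes, the alert is only the automated half: deciding whether a flagged disparity reflects genuine unfairness, and what correction to apply, remains a job for domain experts.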


Artificial Intelligence - Leading The Silent Revolution in HealthCare


The AI on the CherryHome device can monitor whether an elderly person goes into the bathroom and does not return, whether they fall, or whether their gait is abnormal. To protect the patient’s privacy, CherryHome turns them into a virtual skeleton and sends caregivers and family members real-time notifications of such anomalies. Also, all video footage is processed on-device, not sent to the cloud as is the case with most home assistants. Already in place is a pilot partnership between CherryHome, TheraCare, an in-home caregiving service, and TriCura, a tech ecosystem for care agencies. This represents another differentiator for AI, according to Goncharov. A lot of scientists in the AI space are working on fundamental problems, elderly care being just one of them. Looking forward, Goncharov says that AI will be further propelled as machine learning can be done with less and less data. The biggest hurdle to broader applications right now, he says, is the immense amount of data required to teach machines anything, another area where CherryHome is leading the way.


Transforming a Traditional Bank into an Agile Market Leader

In order to fix the environment, you basically boil it down to two big things. You’ve got to create an environment where you teach people and you give people the ability to get their hands dirty, learning by doing. Experimenting. And the second big thing is the fear of risk. In the professional environment, risk is extraordinarily high. At home, worst case is we get frustrated because some app didn’t work. At the bank, people could lose their jobs, they could lose their bonus. So if you figure out a way to learn by doing and make it OK to fail, then it’s OK to take risks. So how do you get this culture change and become like a startup? You have a central team that creates a culture of experimentation, which gives people an opportunity to work with other people [in a risk-free environment]. I was really surprised that in the first couple of years [of our change in mind-set] we started getting really huge traction. And we made it happen in every part of the company, including human resources, marketing and communications, everywhere.


Not all clouds are the same

There are different architectures on the cloud security market, some more readily equipped than others to ease the transition away from hardware. An advantage of containerised cloud architecture is streamlined migration to the cloud without sacrificing your network architecture or security posture. Some less sophisticated solutions may compromise on critical capabilities provided by legacy appliances. Consider, for instance, your company’s IP presence and how important it is to operations: an IP address associated with your organisation is used to identify your users to third-party vendors for whitelisting, and for preventing non-authorised users from accessing SAML authentication. Your traffic’s all-important IP identity is lost, however, when traversing typical shared-proxy security architectures. Think too of GDPR - cloud solutions that don’t offer a strong data centre presence, or the controls to keep data in the right place, can be little more than a liability.


Building a VPC with CloudFormation

This article describes how you can use AWS CloudFormation to create and manage a Virtual Private Cloud (VPC), complete with subnets, NATting, route tables, etc. The emphasis is on the use of CloudFormation and Infrastructure as Code to build and manage resources in AWS, less on the issues of VPC design. You may be wondering why we would use CloudFormation to build our VPC when we can create one via the VPC wizard in the management console. CloudFormation allows us to create a "stack" of "resources" in one step. Resources are the things we create (EC2 instances, VPCs, subnets, etc.); a set of these is called a stack. We can write a template that stands up a network stack exactly as we like it in one step. This is faster, more repeatable, and more consistent than manually creating our network via the management console or CLI. We can check our template into source control and use it any time we like, for any purpose we want.
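To give a feel for what such a template contains, here is a minimal one built as a Python dict and serialized to the JSON form CloudFormation accepts. The logical names and CIDR ranges are arbitrary choices of mine, and a real project would more likely keep the template as a YAML or JSON file in source control; the resource types and property names, though, are the standard CloudFormation ones.

```python
import json

# Minimal CloudFormation template: one VPC with one public subnet.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Minimal VPC with one public subnet",
    "Resources": {
        "MyVPC": {
            "Type": "AWS::EC2::VPC",
            "Properties": {
                "CidrBlock": "10.0.0.0/16",
                "EnableDnsSupport": True,
                "EnableDnsHostnames": True,
            },
        },
        "PublicSubnet": {
            "Type": "AWS::EC2::Subnet",
            "Properties": {
                # Ref ties the subnet to the VPC defined above.
                "VpcId": {"Ref": "MyVPC"},
                "CidrBlock": "10.0.1.0/24",
                "MapPublicIpOnLaunch": True,
            },
        },
    },
}

template_body = json.dumps(template, indent=2)
# Deploy with the AWS CLI (aws cloudformation create-stack --template-body ...)
# or with boto3's CloudFormation client; both accept this JSON body.
```

A full network stack would add an internet gateway, route tables, NAT, and security groups as further entries under `Resources`, all created or torn down together as one stack.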


European Banks Are Pushing the Adoption of Blockchain Technology

Led by Italy-based Associazione Bancaria Italiana, 14 banks, including BNP Paribas, contributed two months of data to a Corda-based blockchain network. The original press release, delivered in Italian, mentions the establishment of the first phase as a "basis for subsequent synergistic implementations of DLT technologies," which also includes a form of smart contracts that will regulate the transfer of data. With ABI Labs at the helm overseeing a million test transactions between the banks involved, reports show that the performances were satisfactory, which will allow the process to move forward to the next phase. This cooperation between European banks comes on the heels of a project led by the Polish bank PKO Bank Polski, in partnership with the tech company Coinfirm, that will see blockchain technology utilized to notify customers about changes to product terms. The project, titled Trudatum, was described as a "breakthrough on a global scale" by Pawel Kuskowski, President of Coinfirm. All those success stories inevitably attracted the attention of the European Union.


Machine Learning Explainability vs Interpretability

In the context of machine learning and artificial intelligence, explainability and interpretability are often used interchangeably. While they are very closely related, it’s worth unpicking the differences, if only to see how complicated things can get once you start digging deeper into machine learning systems. Interpretability is about the extent to which a cause and effect can be observed within a system. Or, to put it another way, it is the extent to which you are able to predict what is going to happen, given a change in input or algorithmic parameters. It’s being able to look at an algorithm and say, "Yep, I can see what’s happening here." Explainability, meanwhile, is the extent to which the internal mechanics of a machine or deep learning system can be explained in human terms. It’s easy to miss the subtle difference with interpretability, but consider it like this: interpretability is about being able to discern the mechanics without necessarily knowing why. Explainability is being able to quite literally explain what is happening.
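To make the distinction concrete, here is a toy example of an interpretable model: a one-feature linear regression fit in closed form. The data points and variable names are invented for illustration; the point is that the fitted coefficient directly answers "what happens to the output if I change the input?"

```python
# Toy interpretable model: one-feature ordinary least squares, fit in closed
# form. The data points and variable names are invented for this example.

xs = [1.0, 2.0, 3.0, 4.0]       # e.g. years of experience
ys = [30.0, 35.0, 40.0, 45.0]   # e.g. salary in $k

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form OLS for a single feature.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
    (x - mean_x) ** 2 for x in xs
)
intercept = mean_y - slope * mean_x

# Interpretability in action: each extra unit of x moves the prediction by
# exactly `slope` units, so cause and effect can be read straight off the model.
prediction_for_5 = slope * 5.0 + intercept
```

A deep network fitting the same data would likely predict just as well, but no single weight would admit this kind of direct reading, which is where explainability methods come in.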



Quote for the day:


"Don't focus so much on who is following you, that you forget to lead." -- E'yen A. Gardner


Daily Tech Digest - December 22, 2018

Digital transformation: Are your people just paying lip service?

The biggest mistake a company can make in digital transformation is starting the transformation journey without first getting the necessary commitment and support. Senior leaders and business stakeholders must commit to rethink and change organizational boundaries, policies, processes, talent and organizational structure as necessary to achieve the strategic intent or vision. If they’re not committed to doing that, the digital transformation effort will fail. Unfortunately, many companies get only lip service from leaders rather than long-term commitment to change. Company leaders can have a great meeting and talk about the need for change and a digital environment to create new competitive positioning, but not get real commitment to change. If your company starts down the path of trying to enable change, without that real commitment, you will face a high risk of pushback and debilitating, passive-aggressive behavior from managers and employees trying to maintain the status quo. The status quo – the existing business model or operating model and efforts to sustain it – represents the most formidable obstacle in your company’s path to digital transformation.


Machine vision can create Harry Potter–style photos for muggles

The code needs to see a head-to-toe cutout of a body seen from the front. It can handle some types of occlusion, such as an arm in front of the body, but cannot handle more complex occlusions, such as somebody sitting with legs crossed. Even so, mapping the cutout from a photograph onto a 3D skeleton does not produce realistic animations. That’s where Weng and co come in. Their main achievement is to develop a way to warp the 2D cutout in a way that creates a realistic 3D model of the body. “Our key technical contribution, then, is a method for constructing an animatable 3D model that matches the silhouette in a single photo,” they say. In the past, computer scientists have tried to solve this problem by deforming a three-dimensional body-shaped mesh to reflect the 2D cutout. That does not always work well, so Weng and co try a different approach. Their idea is to map the body-shaped mesh into 2D space and then align it with the 2D cutout using a warping algorithm.


Data Pipelines of Tomorrow


To grow to scale, data pipeline owners may need to make a few decisions about the data that they store at rest. In the future, the quantity of data generated even within a system will likely outgrow the capacity to store it all. Thus, data engineers of the future will need to consider the following questions: Which data is to remain volatile (in memory only) and temporary? Which data is kept persistent and stored somewhere? For the data that is stored, a pipeline's storage capacity will need to massively autoscale, while handling increasingly ambiguous formats. This explains why we now see data pipelines with several different kinds of data stores running side by side. Elasticsearch, for example, works great for storing unstructured (or semi-structured) text-based data, and might be run alongside Redis where super-fast lookups are needed, or a distributed database containing a ledger. ... On a similar note, we predict that the latency of access to data stores — and the time it takes to run queries — will continue to shrink.
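A minimal sketch of that routing decision, with in-memory stand-ins (plain dicts and lists) in place of real stores such as Redis or Elasticsearch; the record shapes and routing rules are our own illustration:

```python
from collections import deque

# In-memory stand-ins for the stores named above; in a real pipeline these
# would be clients for Redis, Elasticsearch, a ledger database, etc.
fast_lookup_store = {}                 # key-value store for super-fast lookups
text_search_store = []                 # document store for unstructured text
volatile_buffer = deque(maxlen=1000)   # memory only: oldest records drop off

def route(record: dict) -> str:
    """Decide where a record lives: volatile buffer, KV store, or doc store."""
    if record.get("ephemeral"):
        volatile_buffer.append(record)      # volatile: never persisted
        return "volatile"
    if "key" in record:
        fast_lookup_store[record["key"]] = record
        return "kv"
    text_search_store.append(record)        # unstructured text payloads
    return "doc"

route({"ephemeral": True, "metric": "cpu", "value": 0.93})
route({"key": "user:42", "name": "Ada"})
route({"body": "free-text log line to index"})
```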


How Do You Know If a Graph Database Solves the Problem?



If you have transactional data and do not care how it relates or connects to other transactions, people, etc., then graph is probably not the solution. There are cases where a technology simply stores data, and analysis of the connections and meanings among it is not important. You might have queries that rely on sequentially indexed data (the next record is stored next to the previous one), rather than relationship-indexed data (a record is stored nearest those it is related to). Searching for individual pieces of data, or even a list of items, also points to other solutions, as such queries are not concerned with the context of that data. Overall, graph solutions provide the most value for data that is highly connected and for analysis that looks for possible connections. If this doesn’t fit your use case, another kind of technology may suit it better. ... If you have constant, unchanging types of data that you are collecting, then graph may not be the most appropriate solution. Graphs are well suited to storing any or all elements and can easily adapt to changing business and data capture needs.
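A toy illustration of the difference, using a plain Python adjacency map rather than an actual graph database; the data is invented:

```python
# Invented follower data, stored as an adjacency map.
graph = {
    "alice": ["bob", "carol"],
    "bob": ["dave"],
    "carol": [],
    "dave": [],
}

# Lookup-style question: "does this record exist?" No context is needed,
# so a key-value or document store serves it just as well.
"alice" in graph  # True

# Graph-style question: "who is reachable from alice?" The answer depends on
# how records connect, which is where graph databases earn their keep.
def reachable(start: str) -> set:
    """Depth-first traversal collecting every node reachable from start."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for neigh in graph.get(node, []):
            if neigh not in seen:
                seen.add(neigh)
                stack.append(neigh)
    return seen
```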


Transparency Is Key to Building Trust in Business

Good governance is critical to building transparency and trust inside and outside an organization. For example, in Australia, the Royal Commission into Misconduct in the Banking, Superannuation and Financial Services Industry, which was established in late 2017, received more than 10,000 submissions, and its findings have revealed widespread misconduct in the sector. Better oversight is clearly necessary and would go a long way in rebuilding consumer trust. But transparent reporting is only a prerequisite for effective board engagement on this and other such issues. It’s also important for the board to engage in robust debate and, when appropriate, challenge the CEO and other leaders, the decisions they make, and the outcomes. CEOs need to think about the imperative for better oversight as a positive development — good governance promotes a healthy organization, and a healthy organization is one that people have confidence in.


Preparing millennials for the age of automation

Even as technology substitutes some forms of work, new types of work will be created.
Given the interplay of all these factors, it is difficult to make predictions, but possible to develop scenarios. Our analysis suggests that in India the growth in demand for work, barring extreme scenarios, could more than offset the number of jobs lost to automation. On jobs lost, we find that some 9% of India’s current work activity hours could be automated by 2030 in a “midpoint” automation adoption scenario, and up to 19% in the “rapid” adoption scenario. But India can, in fact, create enough new jobs to offset automation and employ new entrants, if it undertakes the investments required. Most occupational categories have the potential to grow as India’s economy expands. As many as 100 million new jobs could be created for Indians—net of automation—if the country’s rising prosperity creates demand for construction, retail, and healthcare and education services, and therefore, jobs.


Artificial intelligence, machine learning momentum continues to build

The report digs into mentions of ML and AI in the Canadian and UK parliaments, as well as mentions in the US Congressional Record. From 1995 to 2015, there were fewer than 25 mentions of the technology each year in the US Congress. In 2018 there were 100 mentions. In the UK, the technologies were barely mentioned until 2015, while in 2018 mentions skyrocketed to nearly 300. The report also tracks human-level performance milestones of AI. In 1997 IBM's Deep Blue beat chess champion Garry Kasparov, and in 2011 IBM Watson won Jeopardy. By 2016, Google DeepMind's AlphaGo beat leading Go player Lee Sedol. This year, a DeepMind agent reached human-level performance in the 3D multiplayer first-person game Quake III Arena's Capture the Flag mode. Notably absent from the report is any analysis of military use of AI and government spending on the technology. As noted by UNSW Sydney AI researcher Toby Walsh, some governments, including the UK, France, and Germany, have committed billions to AI.


The Intelligent Edge: What it is, what it’s not, and why it’s useful

The three Cs of the Intelligent Edge: connect, compute, and control
Now consider how an employee with a smartphone app entering a large office building or campus with wireless location services can find a conference room, printer, or people without asking directions. This immediate insight into where the employee resides in relation to these other connected things greatly enhances the experience in this smart building. It's very similar to the retail shopping experience offered by many large retailers, where customers can access turn-by-turn directions on their phones to locate products, figure out what's on sale, or find the restroom. The media and telecom industries face growing distribution pressures from increased video resolution, new formats, expanding bandwidth, and the need for better security and reliability. As a result, telecom service providers are placing sophisticated compute and control systems in businesses and homes. These distributed intelligent edges make the services more competitive and improve customer experiences.


Ethereum thinks it can change the world. It’s running out of time to prove it.

Ethereum is already the most famous cryptocurrency after Bitcoin and the third largest in total value. Unlike the others, however, it aims to serve as a general-purpose computing platform that could, its adherents believe, make possible entirely new forms of social organization. The central topic of Devcon is “Ethereum 2.0,” a radical upgrade that would finally allow the network to realize its true power. The nagging truth, though, is that all the positivity in Prague masks daunting questions about Ethereum’s future. The handful of idealistic researchers, developers, and administrators in charge of maintaining its software are under increasing pressure to overcome technical limitations that stymie the network’s growth. At the same time, well-funded competitors have emerged, claiming that their blockchains perform better. Crackdowns by regulators, and a growing understanding of how far most blockchain applications are from being ready for prime time, have scared many cryptocurrency investors away.


APT10 Indictments Show Expansion of MSP Targeting, Cloud Hopper Campaign

The allegations are not new but are almost certain to put further pressure on the already strained relationship between the US and China. The Washington Post last week, in fact, had described the then forthcoming indictments as part of an intensifying US campaign to confront China over its economic espionage activities. Planned actions include sanctions against individuals responsible for the activities and declassification of information related to the breaches. How far such measures will go to deter China remains an open question. Though China famously signed an agreement with the US in 2015 promising not to engage in cyber activities for economic espionage, there's no evidence that hacking activity out of the country has abated, much less stopped. Dave Weinstein, vice president of threat research at Claroty, sees the latest actions as yet another example of the effort law enforcement is putting into investigating and holding accountable those responsible for such attacks.



Quote for the day:


"What you do makes a difference, and you have to decide what kind of difference you want to make." -- Jane Goodall


Daily Tech Digest - December 21, 2018

GDPR: EU Sees More Data Breach Reports, Privacy Complaints
The number of data breach reports filed since GDPR went into effect has hit about 3,500 in Ireland, over 4,600 in Germany, 6,000 in France and 8,000 in the U.K. GDPR also gives Europeans the ability to file class-action lawsuits against breached organizations, and some law firms have already been exploring these types of actions. And under article 77 of GDPR - "Right to complain to a supervisory authority" - Europeans can also file complaints with regulators about organizations' data protection practices, as they were also able to do before enactment of the new regulation. Regulators say these complaints have also been increasing. Numerous national data protection authorities say they have seen an increase in both complaints as well as breach reports. But as information security expert Brian Honan has told Information Security Media Group, the increase in data breach reports does not mean there has been a surge in data breaches.


Everything you need to know about the CDO explained

Because the role is so reliant on the use of technology, there is an overlap with the CIO position -- and there's some competition as a result, says Ellis. Yet rather than being experts in IT implementation, CDOs are commonly characterised as change agents. "Where CDOs can be very effective, and can initiate new approaches quickly, is where they buy cloud services and avoid in-house IT development in a traditional sense," says Ellis. "CIOs remain the owners of the technology infrastructure of any company." CDOs tend to be strong communicators. They talk about the power of disruption and get people to buy into change. Darren Curry, CDO at NHS Business Services Authority, says the role is about more than implementing digital services. "I support people, identify a vision and enable our people to do their very best," he says. "I see myself as a leader who removes the blockers and barriers to allow our people to achieve their aims for our services. That's what I feel any leader -- whether that's a CDO or another senior role -- should be working to achieve."


Want to use AI and machine learning? You need the right infrastructure
Regardless of use case, AI/ML success depends on making the right infrastructure choice, which requires understanding the role of data. AI and ML success is largely based on the quality of data fed into the systems. There’s an axiom in the AI industry stating that “bad data leads to bad inferences”— meaning businesses should pay particular attention to how they manage their data. One could extend the axiom to “good data leads to good inferences,” highlighting the need for the right type of infrastructure to ensure the data is “good.” Data plays a key role in every use case of AI, although the type of data used can vary. For example, innovation can be fueled by having machine learning find insights in the large data lakes being generated by businesses. In fact, it’s possible for businesses to cultivate new thinking inside their organization based on data sciences. The key is to understand the role data plays at every step in the AI/ML workflow.


The Role Of Data Governance In An Effective Compliance Program

Data governance becomes more important the more systems and applications a compliance function uses. Compliance officers want systems that store data in a single repository with standardized data formats because strong data governance ensures accurate reports. From there, compliance officers can make accurate decisions based on what the data tells them. Here’s the rub: The current landscape of compliance technology is composed of many disparate systems that don’t integrate with each other. Compliance officers are often stuck searching for critical data and don’t have a connected approach to the technology that supports their program. They want and need a system that stores data in a single repository with standardized data. How can data governance fix this problem? Automating a compliance program’s many tasks helps to create a unified operations environment. In this paradigm, the compliance function goes beyond its tasks of third-party due diligence and training. 


Scaling Observability at Uber

Srivatsan states that "high cardinality has always been the biggest challenge for our alerting platform." As Aaron Sun writes, "cardinality in the context of monitoring systems is defined as the number of unique metric time series stored in your system's time series database." Originally, Uber handled their high cardinality by having alert queries return multiple series and having rules that trigger only if enough series crossed a threshold. This worked well with queries that returned a bounded number of series with well-defined dependencies. However, once teams started writing queries to alert on a per city, per product, and per app version to support their new product lines, the queries no longer fit this constraint. The team began leveraging Origami to help with these more complicated queries. As noted above, Origami is capable of deduplication and rollup of alerts. It is also capable of creating alerts on combinations of city, product, and app version which are then triggered on aggregate policies.
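The original rule Srivatsan describes (fire only if enough of the returned series cross a threshold) can be sketched as follows; the function, field, and series names are our own illustration, not Uber's actual API:

```python
def alert_fires(series_values: dict, threshold: float, min_breaching: int) -> bool:
    """series_values maps each series returned by one alert query to its
    latest value; the alert fires only when at least min_breaching of those
    series exceed the threshold."""
    breaching = sum(1 for v in series_values.values() if v > threshold)
    return breaching >= min_breaching

# A query that returns one latency series per city (invented data, in ms).
latencies = {
    "api.latency.city.sf": 120,
    "api.latency.city.nyc": 480,
    "api.latency.city.la": 510,
}

# Fires: two of the three series exceed the 400ms threshold.
alert_fires(latencies, threshold=400, min_breaching=2)
```

The scheme works while a query returns a bounded, well-understood set of series; once teams slice by city, product, and app version at once, the series count explodes, which is the cardinality problem the Origami work addresses.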


5 steps to getting started with robotic process automation

At the extremes, some businesses go big and “all in” right away, while others are more measured with an individual use case to provide proof points before further deployment. Many others take a hybrid approach that lies somewhere in between. Getting started with RPA may look different from business to business, but designing a proof-of-concept project is often the best way to jumpstart RPA efforts in your organization. Depending on the structure of your organization, change may not always come swiftly. Executives need proof points when making major decisions such as augmenting or flat-out reimagining long-standing processes. When it comes to RPA, using these five steps to assess your organization's processes and determine which would make for a high-impact proof of concept will set you up for both short- and long-term automation success. And remember — it’s not about replacing jobs. It’s more about handling mundane or time-consuming tasks in a more efficient manner to enable your teams to spend more time concentrating on meaningful work.


Hackers Bypass Gmail, Yahoo 2FA at Scale

Amnesty discovered several credential phishing campaigns, likely run by the same attacker, targeting hundreds of individuals across the Middle East and North Africa. One campaign went after Tutanota and ProtonMail accounts; another hit hundreds of Google and Yahoo users. The latter was a targeted phishing campaign designed to steal text-based second-factor codes. Throughout 2017 and 2018, human rights defenders (HRDs) and journalists from the Middle East and North Africa shared suspicious emails with Amnesty, which reports most of this campaign's targets seem to come from the United Arab Emirates, Yemen, Egypt, and Palestine. Most targets initially receive a fake security alert warning them of potential account compromise and instructing them to change their password. It's a simple scheme but effective with HRDs, who have to be on constant high alert for physical and digital security. From there, targets are sent to a convincing but fake Google or Yahoo site to enter their credentials; then they are redirected to a page where they learn they've been sent a two-step verification code.


FBI kicks some of the worst ‘DDoS for hire’ sites off the internet

Several seizure warrants granted by a California federal judge went into effect Thursday, removing several of these “booter” or “stresser” sites off the internet “as part of coordinated law enforcement action taken against illegal DDoS-for-hire services.” The orders were granted under federal seizure laws, and the domains were replaced with a federal notice. Prosecutors have charged three men, Matthew Gatrel and Juan Martinez in California and David Bukoski in Alaska, with operating the sites, according to affidavits filed in three U.S. federal courts, which were unsealed Thursday. “DDoS for hire services such as these pose a significant national threat,” U.S. Attorney Bryan Schroder said in a statement. “Coordinated investigations and prosecutions such as these demonstrate the importance of cross-District collaboration and coordination with public sector partners.” The FBI had assistance from the U.K.’s National Crime Agency and the Dutch national police, and the Justice Department named several companies, including Cloudflare, Flashpoint and Google, for providing authorities with additional assistance.


Connecting Business Challenges and Emerging Technologies

Robotic Process Automation (RPA) can be used to automate tasks previously done by human beings, said O’Carroll. It is often applied to repetitive and mundane tasks – the ones often seen as boring. With RPA you can have a robot doing them for you, she said. Solutions based on RPA technology have decisions built in, which frees you to do creative work. She explained how you could train a robot to do purchase orders by building rules to extract information from an email, enter the information into the purchase order system, and generate the purchase order. O’Carroll mentioned several use cases for RPA: case management (for instance, in healthcare); HR, for administering joiners, movers, and leavers; and banking. It can be cheaper to do these activities with robots, and automation can give people more time to spend with customers, she argued. Machine learning (ML) and artificial intelligence (AI) are a different kind of technology, she said, inspired by how our brain works, using neural networks. It’s about predicting the right answer and getting better at it.
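A hedged sketch of the purchase-order rules O'Carroll describes: fixed patterns extract fields from an email body before a bot enters them into the PO system. The email format and field names are invented for illustration:

```python
import re

# Invented email format for the sketch.
EMAIL = """\
From: supplier@example.com
Subject: Order request

Item: Widget A
Quantity: 12
Unit price: 3.50
"""

# One fixed rule per field, as in the rule-building O'Carroll describes.
RULES = {
    "item": re.compile(r"^Item:\s*(.+)$", re.MULTILINE),
    "quantity": re.compile(r"^Quantity:\s*(\d+)$", re.MULTILINE),
    "unit_price": re.compile(r"^Unit price:\s*([\d.]+)$", re.MULTILINE),
}

def extract_purchase_order(email_body: str) -> dict:
    """Apply each rule; a real bot would escalate to a human when a rule
    fails to match instead of silently skipping the field."""
    fields = {}
    for name, pattern in RULES.items():
        match = pattern.search(email_body)
        if match:
            fields[name] = match.group(1)
    return fields

order = extract_purchase_order(EMAIL)
```

The extracted dict would then be fed to the purchase-order system; the fixed, rule-based nature of this step is what distinguishes RPA from the ML/AI approaches discussed next.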


How AI-powered commerce will change shopping

If you think AI is over-hyped from a commerce point of view, think again. Research shows that customers are 9.5X more likely to view AI as revolutionary versus insignificant. Within the next five years, 87 percent of customers believe AI will have transformed their expectations of companies. But how, exactly, is AI changing expectations? While pop culture sometimes paints AI with a scary science-fiction hue, the truth is that many AI-driven experiences are winning customer appreciation, if not affection. A majority of customers say they like or love AI-powered capabilities like credit card fraud detection, personalized recommendations, and voice-activated personal assistants. And today, "personalized recommendations" doesn't mean merely adding an individual's name to an email subject line. We're talking about uber-personalized communications; 59 percent of customers say tailored engagement based on past interactions is very important to winning their business.



Quote for the day:


"Leaders think and talk about the solutions. Followers think and talk about the problems." -- Brian Tracy