Daily Tech Digest - December 26, 2018

Digital Transformation: How to Create an Intelligent Company


For a company to shift towards becoming intelligent, it needs more than just the technology to enable the transformation. It requires significant changes in the way employees think about data and how it can be effectively processed and acted on, i.e. a change in culture and in the way employees go about their daily business. In particular, data scientist Ronald van Loon has identified the following areas as key to creating intelligent processes that augment the abilities and efficiency of employees: Design thinking is part of a broad methodology that amalgamates elements of imagination, intuition, holistic reasoning, and logic to explore all the probable solutions for a given problem. It includes the identification of all unarticulated needs expressed by a consumer. After identifying the needs, the team creates solutions that address all those needs and end up creating the “wow” effect. The solutions are generated creatively and analytically. Design thinking should always be more solution-oriented than problem-oriented.



6 types of cyber security risks you need to know about


“Technology can’t help a human problem which involves someone manipulating an employee or contractor to perform an action or divulge confidential material. “In one instance, a stranger came onto the premises for an alleged job interview, told the receptionist he had spilled coffee on his CV, handed her a USB and asked her to print it for him. Once the USB was inserted into her computer the attacker gained remote access to that machine and from there, the entire network,” said Dicks. Physical security is a basic but often overlooked form of defence, said Dicks. “Staff must report all strangers they see in the office that are not clearly marked with a visitor’s access card. Access to the building needs to be rigorously managed. “Unknown USBs may not be used and sensitive information should be shredded. Password protection policies must be strictly adhered to – people are still writing their passwords on a piece of paper.”


Disruptive Effects of Cloud Native Machine Learning Systems and Tools


Automated machine learning (AutoML) goes one step further. It can completely automate training a machine learning model and serve it out in production. It accomplishes this by training models from labeled columns (say, images) and automatically evaluating the best model. Next, an AutoML system registers an API that allows for predictions against that trained model. Finally, the model will have many diagnostic reports available that allow a user to debug the created model—all without writing a single line of code. Tools like this drive AI adoption in the enterprise by empowering employees and democratizing AI across the organization. Often, important business decisions are siloed away in the hands of the small group of people with the technical skills to generate models. AutoML systems put that same ability directly into the hands of decision makers, who can create AI solutions with the same ease that they use a spreadsheet.
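The train-candidates, pick-the-best, serve-predictions loop described above can be sketched in a few lines. This is a deliberately toy illustration under stated assumptions: two trivial hand-rolled "models" and a hold-out score stand in for the large search spaces real AutoML services explore, and all function names here are hypothetical.

```python
# Toy sketch of the AutoML loop: fit several candidate models on labeled
# data, score each on held-out examples, keep the best, and serve
# predictions through the winning model.

def train_mean_model(xs, ys):
    # Baseline candidate: always predict the training mean.
    mean = sum(ys) / len(ys)
    return lambda x: mean

def train_linear_model(xs, ys):
    # Ordinary least squares for y = a*x + b on a single feature.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    b = my - a * mx
    return lambda x: a * x + b

def mse(model, xs, ys):
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def auto_select(train_x, train_y, holdout_x, holdout_y):
    candidates = [train_mean_model, train_linear_model]
    models = [fit(train_x, train_y) for fit in candidates]
    # "Automatically evaluating the best model": lowest holdout error wins.
    return min(models, key=lambda m: mse(m, holdout_x, holdout_y))

best = auto_select([1, 2, 3, 4], [2, 4, 6, 8], [5, 6], [10, 12])
print(round(best(7)))  # the linear candidate generalises; predicts 14
```

A production system would additionally wrap `best` behind a prediction API and attach the diagnostic reports the excerpt mentions.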


YES Bank Unveils 20 Data Driven Products at YES Datathon

The top 20 models identified by the Bank will be taken live within a month and the remaining will be moved to the Bank’s product library to be iteratively developed and taken live within a period of three months. ... Talking about the event, Rajat Monga, Senior Group President, Financial Markets, YES BANK, said “YES BANK embarked on a data centered business model as part of our TechTonic initiative and now has a full stack of technology and talent capability built up. In order to leapfrog on this data-native transformation, YES Datathon provides us with an opportunity to engage with 6000+ data scientists. It has helped us identify newer use cases as well as statistical techniques and also incorporate cross-industry best practices. Going forward, YES BANK will also host AI/ML challenges and data engineering workshops to deepen practical and technical knowhow of future technology leaders and to facilitate this, has partnered with top IITs and BITS Bombay to further develop the data science ecosystem, allowing students the opportunity to build algorithms and data models in a deployment-ready environment.”


Cybersecurity Is Providing Information And Solutions Not Selling Fear


Even the most sophisticated companies with massive code review bureaucracies and elaborate deployment checklists can inadvertently push a bad update out. The issue here is not that SiteLock sent an errant malware alert to Domain.com’s customers. Rather, the issue is that the email did not contain any actionable information for the user to triage the situation, non-SiteLock customers had no ability to access any information about the reported malware and the company waited more than 24 hours to send a correction email to affected users, while Domain.com did absolutely nothing to assist its customers. A website that is actively serving malware to visitors is an incredibly serious situation and could indicate that the site has been breached and that customer data may be stolen as well. Waiting more than an entire day before telling users that a malware alert was in error is immensely irresponsible in today’s day and age.


Enterprise SBCs: Why They Matter

Today, securing VoIP sessions and applications has become a huge challenge. With a growing number of calls and collaborative sessions using VoIP on public and private networks, service providers must respond to enterprises’ increasing concerns about security. Session border controllers (SBCs) have always been the backbone of secure, quality VoIP. Today, enterprise session border controllers (E-SBCs) are making it possible for even the most mission-critical, massive enterprise VoIP systems to securely connect with SIP trunks, over-the-top trunks, and cloud-based unified communications (UC) technology. There are different types of SBCs, each serving similar but different purposes in a network. Essentially, SBCs are guardians at the gate: They make sure that only certain people are allowed in or out of a network domain. An E-SBC is a type of SBC that is specifically deployed to manage SIP traffic access – including VoIP, video, or instant messaging traffic – between SIP trunks and the enterprise network or between a UC service and the enterprise network.
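The "guardian at the gate" role can be reduced to a tiny admission check. The sketch below is purely illustrative: the trunk names and media types are hypothetical, and a real E-SBC also handles NAT traversal, encryption, media policy and much more.

```python
# Minimal sketch of an E-SBC admission decision: sessions are only let
# through the enterprise border if they arrive from a known SIP trunk or
# UC service and carry a permitted media type.

ALLOWED_TRUNKS = {"sip-trunk.provider.example", "uc.cloud.example"}
ALLOWED_MEDIA = {"voip", "video", "im"}

def admit_session(source_domain, media_type):
    """Return True only for known trunks carrying permitted media."""
    return source_domain in ALLOWED_TRUNKS and media_type in ALLOWED_MEDIA

print(admit_session("sip-trunk.provider.example", "voip"))  # True
print(admit_session("unknown.example", "voip"))             # False
```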


Hyperledger Sawtooth 1.1 Adds New Consensus Algorithms

As a result of rearchitecting its consensus engine API, consensus protocols are now implemented as “consensus engines”, which improves their modularity. This required a new implementation of the Proof of Elapsed Time (PoET) consensus algorithm, one of Sawtooth’s main tenets, which strives to achieve minimal resource consumption. PoET is a form of Nakamoto-style consensus, where a leader is elected through some form of lottery to choose a block to be added to a chain of previously committed blocks. While in Bitcoin the lottery is won by the first participant to solve a cryptographic puzzle, PoET leverages Intel Software Guard Extensions (SGX), which are becoming widely available in consumer and enterprise processors. SGX allows applications to create a trusted-code enclave. In short, each participant in PoET requests a wait time from the enclave and claims the leader role at the end of the wait. The first participant to claim the leader role wins.
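The election logic just described can be simulated in a few lines. Note the caveat: in real Sawtooth the wait time comes from an attested SGX enclave, which is what makes the lottery trustworthy; the ordinary random draw below only illustrates the mechanics, not the trust model.

```python
import random

# Toy simulation of one PoET round: each validator "requests a wait time
# from the enclave" (here: a plain exponential random draw), and the
# participant whose wait expires first claims the leader role.

def poet_round(validators, rng):
    waits = {v: rng.expovariate(1.0) for v in validators}
    leader = min(waits, key=waits.get)  # shortest wait finishes first
    return leader, waits

leader, waits = poet_round(["alice", "bob", "carol"], random.Random(42))
print(leader == min(waits, key=waits.get))  # the shortest wait always wins
```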


Are Countries Finally Outgrowing Their Fear of Blockchain?

Switzerland is not the first country to offer a national blockchain program. Australia, Malta, Cyprus, the United Arab Emirates, Ireland, Russia, Brazil, China and India, to name a few, have announced a slew of programs. Some countries, like India, have taken a dichotomous approach to cryptocurrency and blockchain. The Reserve Bank of India (RBI) is still “evaluating” the legality of cryptocurrencies, like bitcoin, while mulling over its own version, the Laxmi coin. On the contrary, the incumbent central government, headed by Prime Minister Narendra Modi, and various tech-savvy state governments like Maharashtra, Andhra Pradesh and Karnataka, have latched on to the blockchain bandwagon. Blockchain is now appearing in nations’ strategies, lawmakers’ lingo and on the agendas of think-tank sessions. Governments seem to have begun emerging from the fear psychosis surrounding cryptocurrencies.


The opportunities and challenges of a freelance data scientist

Freelancing gives me the opportunity to work with people from all over the world — for example, I have worked with clients in Italy, India, Amsterdam and Belgium. The variety means I get lots of opportunities to learn about new fields and new techniques. These kinds of learning experiences are vital to providing clients with quality deliverables. Perhaps the most successful project I have worked on as a freelancer was with a team that was evaluating survey data for an international authors’ journal. The survey had hundreds of questions and results for several thousand authors and editors, who were located in five different countries. I developed smart techniques for analysing such a vast amount of data and the client was delighted with the outcome. I particularly enjoy helping clients frame their projects, helping them to ask the right questions of the data and pointing out the value of generating testable hypotheses.


Is Fintech Recruitment Heading for Troubling Times?

To attract and retain talent, both from abroad and at home, employers will have to do everything they can to vie for it. Fintech is a competitive recruitment market. To put it simply, companies need people who are smart, talented, and innovative, with a wide range of skills. Unfortunately, talent like this is hard to come by. Such candidates know their worth and are choosy about who they want to work for. To combat this, employers will need to focus on creating a compelling offer for potential employees. In our opinion, a key strategy for attracting Fintech talent has to be offering flexible working at the point of hire. Much of today’s workforce wants to work flexibly. In a recent PowWowNow study, 70% of workers felt that offering flexible working makes a job more attractive to them. This is especially apparent amongst millennial talent, who make up a large proportion of the Fintech sector. Tech talent want to be in control of their working hours. They don’t want to waste time commuting if they don’t have to, and they want to work when they are most productive.



Quote for the day:


"When a man assumes leadership, he forfeits the right to mercy." -- Gennaro Angiulo


Daily Tech Digest - December 25, 2018

DevOps disruptors in 2019

In 2019, we will see a significant shift from commercial testing tools to open source tools, which will have a dramatic effect on the testing vendors in the market. There are several reasons for this. We all know that continuous testing is a critical component for optimising DevOps pipelines, and by its definition, to test continuously teams must be able to dramatically scale the number of tests being executed, including running full regression cycles nightly rather than at the end of the dev cycle, and a massive “shift-left” of testing, all the way to the pre-commit and per-commit level. However, traditional commercial solutions struggle to meet the demands of continuous testing in two ways. Firstly, they do not scale, nor do they have the reliability to meet continuous testing requirements. Secondly, with shift-left, the persona of the test author shifts from QA to dev. All this means that yesterday’s commercial solutions are simply not a fit for today’s developers. Instead, open source solutions are a vital piece of making continuous testing a reality.


Web Portals: More Breaches Illustrate the Vulnerabilities

One factor contributing to security issues in web portals is that "most organizations don't think about the total cost of running the system/application," says Mark Johnson, a former healthcare CISO and shareholder at consulting firm LBMC Information Security. "Because of that, a newly reported vulnerability may not get patched, or they may be resource constrained and they make 'risky' configuration choices - like adding too many support people as system or application admins. Finally, they may not dedicate the resources necessary to monitor these systems as closely." Based on what BJC has publicly disclosed about its portal incident, it's unclear exactly what caused the breach, Johnson says. "If it was a problem with the portal software or some underlying system or middleware application configuration or patching, there are some basic things that everyone should look to do when they have interactive systems, especially portals, on the internet," he says. Those steps include understanding the requirements of the system or application and reviewing and then implementing security controls that need to be in place based on the "risk of the system or application" and the type of data involved.


Training machines sans bias will only augment humans: AWS executive

Machines
When it comes to humans, they are good at dealing with situations that involve ambiguous data points. "Humans are really good at learning quickly with very little information. ML models are the opposite. They require a lot of data inputs to be able to be trained. "I would argue that if you show someone a bicycle a few times and show them how to ride it, after a few tries the human being is able to ride that bicycle pretty easily. To just train a robot to ride a bicycle takes millions of hours of training," explained Klein. In the last year, AWS has released over 200 ML services and features. When it comes to Amazon Alexa now talking to humans, he said a lot of their customers are using the platform to do voice profiling for a variety of reasons. "For example, in the financial services industry, we have customers that are looking into voice profiling as an additional factor at their call centres. So if they want to verify if it's you, they can add voice profiling as an additional factor to further reduce fraudulent or impersonation calls," he explained.


NIST Risk Management Framework 2.0 Updates Cyber-Security Policy

"The RMF provides a dynamic and flexible approach to effectively manage security and privacy risks in diverse environments with complex and sophisticated threats, evolving missions and business functions, and changing system and organizational vulnerabilities," the RMF states. "The framework is policy and technology neutral, which facilitates ongoing upgrades to IT resources and to IT modernization efforts—to support and help ensure essential missions and services are provided during such transition periods." The RMF 2.0 includes a long list of tasks that includes an outline of risk management roles within an organization as well as strategy. Identifying common controls as well as having a continuous monitoring strategy is another key component that is part of RMF. Risk itself is at the core of RMF 2.0, with the requirement that organizations execute a risk assessment that includes all assets that need to be protected.


Business owners must understand that having a one-size-fits-all approach to cybersecurity can leave substantial gaps making their businesses vulnerable. The first step is to think about exposure: this includes the hardware and software you are using as well as operations conducted via web or cloud-based systems. You should also consider what unique threats there are to a particular system. An important note: it isn’t enough to think about your own business. What about the third-party vendors you’ve hired? Any of their vulnerabilities will affect you, too. Connectivity of systems both internally and externally has been a major driver of technological progress, and the advent of things like cloud-based storage and mobile payment options have made doing business easier. But while interconnected systems may make things run more efficiently, it also can increase the risk – a vulnerability in one system can affect the connected ones as well.  Keeping critical systems like payroll, business email, and point-of-sale (POS) separate can decrease the inherent risks of connectivity and help ensure that one cyber threat doesn’t compromise a business’ entire operation.


Digital KYC – why it’s finger-clickin’ good

The universal availability of electronic documentation, such as identity cards, is a fundamental building block without which a fully digitised, automated and near real-time KYC capability proves difficult. Progress towards this is being made, notably in developing nations where the challenge of undocumented segments of the population was tricky until digital solutions became available. The Unique Identification Authority of India (UIDIA) was established in 2008 to give a digital identity to every resident. This ‘Aadhaar’ ID now gives access to many key services, including banking ... “Estonia’s approach makes life efficient: taxes take less than an hour to file, and refunds are paid within 48 hours. By law, the state may not ask for any piece of information more than once, people have the right to know what data are held on them and all government databases must be compatible, a system known as the X-road. In all, the Estonian state offers 600 e-services to its citizens and 2,400 to businesses.”


Keeping AI Beneficial and Safe for Humanity


One analogy that Stuart Russell uses that I find helpful is bridges. When we ask a civil engineer to build a bridge, we don’t have to specify ‘make sure it’s safe’ or ‘make sure it doesn’t fall down’. These concepts are built-in when we talk about bridges. Similarly, CHAI would like to get the field of AI to the point where if we ask a software engineer to build an AI system, we don’t have to specify things like value alignment, ethics, and human-compatibility — they should be built right into the definition of AI. If AI is not beneficial to humans, it’s not actually achieving its purpose. Yet we currently have no guarantees that the systems that are in development at the moment are going to be beneficial, and some good reason to believe they won’t be by default — just as a bridge built without the right engineering expertise likely wouldn’t be safe. “I’m not sure we need to have ‘smarter than human’ AI for a system to be dangerous. Any system that is sufficiently competent could be dangerous, even if it doesn’t resemble something that we would recognise as human-like. ...”


2019 Security Predictions Report Released

This year’s security predictions span the categories of cloud, consumer, digital citizenship, security industry, SCADA/manufacturing, cloud infrastructure, and smart home. I won’t spoil your reading of it, but one of the predictions that jumped out for me was regarding Business Email Compromise (BEC) and how targeted threats will go lower down in the org chart. This makes a lot of sense given that CxOs are getting harder to exploit via BEC. They are becoming more aware of the threat and more BEC safeguards are deployed to protect them. An example of such a safeguard is machine learning to fingerprint executive writing styles, like our Writing Style DNA. This prediction is quite actionable, especially given there are tools and techniques being deployed to protect the C-suite, that can be expanded to protect their direct reports as this threat pivots.
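The idea of fingerprinting an executive's writing style can be illustrated with a naive stylometric comparison. This is only a concept sketch under loud assumptions: real products such as Writing Style DNA use far richer feature sets and trained models, whereas the toy below just compares character-trigram frequency profiles with cosine similarity, and all sample texts are invented.

```python
from collections import Counter
import math

# Naive stylometry sketch: a text's "fingerprint" is its character-trigram
# frequency profile; two profiles are compared with cosine similarity.

def profile(text):
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine(p, q):
    dot = sum(p[k] * q[k] for k in p)
    norm = math.sqrt(sum(v * v for v in p.values())) * \
           math.sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

exec_sample = "Please process the attached invoice by end of day."
same_author = "Please process the wire transfer by end of day."
impostor = "URGENT!!! send gift cards NOW reply fast $$$"

# A genuine message should score closer to the executive's profile than
# a BEC attempt written in a very different style.
print(cosine(profile(exec_sample), profile(same_author)) >
      cosine(profile(exec_sample), profile(impostor)))  # True
```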


Report: Over 300 British Blockchain Companies Shut Down in 2018

Putting a number on it, the U.K.’s Sky News has found that at least 340 companies claiming to be involved with crypto or blockchain were shut down this year. It obtained these findings by analyzing publicly available figures from the databases of Companies House and Open Corporates. This figure is an increase of 144 percent from the 139 blockchain-related companies that went bust in 2017. The data shows that over 200 of those companies were established during 2017, and 60 percent of them closed down between June and November 2018 alone. On the other side, the number of newly-registered blockchain companies continued to rise throughout the year, reaching a total of 817 in November 2018, which means the market continued to grow overall. However, the report notes that the number of new companies is now, for the first time, growing more slowly than the number of blockchain businesses shutting down. And of the companies which haven’t been shut down, over 50 have removed references to blockchain or crypto from their name.


Digital disruption may widen the gender gap: Jessie Qin, EY

Digital innovation is taking over workplaces, and now is the right time to build diversity. In digital innovation, you need the left brain and the right brain to work together. However, there are pain points and frustration that come from the hypothesis that digital disruption is likely to increase the gender gap. For instance, World Economic Forum data shows that if you look at the 15 top economies of the world, digital, robotics and AI will lead to job losses of about 5 million. Men will get one new job for every three jobs they lose; women, on the other hand, will get only one new job for every five jobs they lose. What’s even more alarming is that while the disruption stems from technology, women are far less digitally connected: the global Internet user gender gap grew from 11% in 2013 to 12% in 2016, according to data from the International Telecommunication Union. The gap remains large in the world’s Least Developed Countries (LDCs), at 31%.



Quote for the day:


"Leadership is liberating people to do what is required of them in the most effective and humane way possible." -- Max DePree


Daily Tech Digest - December 24, 2018


Prioritizing TD over MVP and vice versa needs to be someone’s responsibility; otherwise, who would manage delivery time around this balance? Thanks to my project management knowledge, I now do. I’m the one who should forecast when it makes sense to spend more time on better engineering, because I know my stakeholder well enough to predict his or her next move towards a brand-new MVP. Let me quickly change contexts for didactic purposes: ordering pizza while savagely hungry at home, I expect it to arrive hot and within an hour at most. I know a lot of stuff might go wrong on the way to my house. Some of it I would probably embrace, and some I wouldn’t. If the pizza arrives two hours later, I won’t accept it. If it arrives merely warm, it’s fine. The same applies to projects. What is most valuable to stakeholders won’t be perfect engineering if it doesn’t pay its cost, which means delivering right on time or sooner.



Automated Cyber Attacks Are the Next Big Threat. Ever Hear of 'Review Bombing'?
This is not a theoretical risk, either. It is already happening. Recent incidents involving Dunkin’ Donuts’ DD Perks program, CheapAir and even the security firm Cybereason’s honeypot test showed just a few of the ways automated attacks are emerging “in the wild” and affecting businesses. In November, three top antivirus companies also sounded similar alarms. Malwarebytes, Symantec and McAfee all predicted that AI-based cyber attacks would emerge in 2019, and become more and more of a significant threat in the next few years. What this means is that we are on the verge of a new age in cybersecurity, where hackers will be able to unleash formidable new attacks using self-directed software tools and processes. These automated attacks on their own will be able to find and breach even well-protected companies, and in vastly shorter time frames than human hackers can. Automated attacks will also reproduce, multiply and spread in order to massively elevate the damage potential of any single breach.



A data inventory is key to maintaining data privacy compliance

Building and maintaining a comprehensive data inventory can enhance overall data quality and help create a path to streamline the compliance efforts, which helps in the effort of reducing risk through the creation of an effective controls framework. Additionally, identifying potential processes that can be automated creates opportunity for better regulatory reporting in both accuracy and efficiency. Improved accuracy supports improved data security. Clear data maps and inventories can support more effective and proactive security measures that address critical issues, such as which specific business processes the data touches and the related risks of that interaction. Complete data lineage capability is also enabled through data accuracy, allowing for a cohesive approach by audit, security, and compliance groups alike.
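One way to make the "data map and inventory" idea concrete is a minimal inventory record that captures where a dataset lives, which business processes touch it, and its upstream sources so lineage can be traced end to end. All field and dataset names below are hypothetical; this is a sketch, not a compliance tool.

```python
from dataclasses import dataclass, field

# Minimal sketch of a data-inventory entry plus a lineage walk over
# upstream_sources, as a comprehensive inventory would enable.

@dataclass
class InventoryEntry:
    dataset: str
    owner: str
    storage: str
    business_processes: list = field(default_factory=list)
    upstream_sources: list = field(default_factory=list)
    contains_personal_data: bool = False

def lineage(entries, name):
    """Walk upstream_sources recursively to reconstruct full lineage."""
    by_name = {e.dataset: e for e in entries}
    seen, stack = [], [name]
    while stack:
        current = stack.pop()
        if current in by_name and current not in seen:
            seen.append(current)
            stack.extend(by_name[current].upstream_sources)
    return seen

entries = [
    InventoryEntry("crm_raw", "sales", "s3://lake/crm", ["sales"], []),
    InventoryEntry("customer_360", "marketing", "warehouse",
                   ["marketing", "compliance"], ["crm_raw"],
                   contains_personal_data=True),
]
print(lineage(entries, "customer_360"))  # ['customer_360', 'crm_raw']
```

With records like these, the audit, security and compliance groups mentioned above can query one shared structure instead of three separate spreadsheets.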


Network management must evolve in order to scale container deployments

Highly containerized environments are subject to something called “container sprawl.” Unlike VMs, which can take hours to boot, containers can be spun up almost instantly and then run for a very short period of time. This increases the risk of container sprawl, where containers can be created by almost anyone at any time without the involvement of a centralized administrator. Also, IT organizations typically run about eight to 10 VMs per physical server but about 25 containers per server, so it’s easy to see how fast container sprawl can occur. A new approach to managing the network is required — one that can provide end-to-end, real-time intelligence from the host to the switch. Only then will businesses be able to scale their container environments without the risk associated with container sprawl. Network management tools need to adapt and provide visibility into every trace and hop in the container journey instead of being device centric. Traditional management tools have a good understanding of the state of a switch or a router, but management tools need to see every port, VM, host, and switch to be aligned with the way containers operate.
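The density figures quoted above imply a simple back-of-the-envelope calculation: at roughly 10 VMs versus 25 containers per physical server, the same fleet produces about 2.5 times as many objects for the network to track, before accounting for containers' much shorter lifetimes.

```python
# Quick check of the sprawl arithmetic: ~8-10 VMs per server versus
# ~25 containers per server, scaled across a (hypothetical) 100-server fleet.

def workloads(servers, per_server):
    return servers * per_server

servers = 100
vms = workloads(servers, 10)         # 1,000 VMs at the high end
containers = workloads(servers, 25)  # 2,500 containers
print(containers / vms)  # 2.5x more objects to monitor per fleet
```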


Office 365, Outlook Credentials Most Targeted by Phishing Kits

The phishing kit used the most during the second half of the year was a multi-brand kit that mainly targets Office 365 and Outlook credentials, but which also supports spoofed pages for AOL, Bank of America, Chase, Daum, DHL, Dropbox, Facebook, Gmail, Skype, USAA, Webmail, Wells Fargo, and Yahoo. The second most popular phishing kit in the timeframe also targets Office 365, Cyren says. This tool, however, was specifically built for Office 365 phishing and packs built-in techniques to evade detection, including blocking IPs and security bots, as well as user agents to hide from phishing defenses. A PayPal phishing kit has emerged as the third most used, and employs new levels of sophistication, with several evasive techniques, the researchers say. Fourth in line comes a multi-brand phishing kit that can target almost anything from lifestyle brands to data, banking and email credentials, and more. Apple, Netflix, Dropbox, Excel, Gmail, Yahoo, Chase, PayPal and Bank of America are among the targeted brands.


5 Cloud Trends That Will Dominate 2019

Despite the hubbub being raised about the job-stealing nature of automation, you should expect automation services to keep rising in popularity as 2019 unfolds. Automation platforms are more efficient today than ever before, which means that businesses of all shapes and sizes have a sizable economic incentive to digitize their operations to the greatest extent possible. While human capital will always be vital in the cloud marketplace, it’s growing quite obvious that the future of the cloud will at least partly be determined by clever algorithms that do some of our thinking for us. Major corporations like Amazon and Microsoft are already beginning to cash in on this trend; Amazon Web Services offers a wide variety of cloud automation services, for instance, including automatic testing to locate weak security points. As digital privacy and network security grow more important to the public, especially as new data breaches continue to occur, automation will be viewed as a way of securing the cloud and making it a more reliable place to store our sensitive information.


Understanding Blockchain Basics and Use Cases

The use cases for blockchains are still being hotly debated. There is the obvious example of censorship-resistant digital currencies. However, the volatility and fragmentation seen in the cryptocurrency market during 2018 seem to suggest that the actual applicability of trustless digital currencies is limited. From the enterprise perspective, it is becoming clear that they can also be used to create systems or networks that are deployed as a shared construct between multiple entities that don't necessarily trust each other yet want to share data and maintain a form of consensus about concerns that all parties care about. These use cases, where a centralized authority is unacceptable to the participants, or too costly to set up, are still emerging. This is despite the time, effort and venture capital that has been deployed into the wide array of blockchain projects created to date. As more projects come to market as we move into 2019, it remains to be seen whether the promise of blockchain will ever amount to the major impact that its advocates have now been promising for quite some time.


AI Inspires a Healthcare Revolution


Heart surgeons are employing data and analytics alongside scalpels and stents as they carry out intricate operations, using digital replicas of human hearts and AI to predict the likely outcomes of treatments. In the future, we may all have these replicas—known as digital twins—that are continuously fed data about our bodies and can help predict when we may become ill, and suggest preventive therapy and the most effective treatments. Digital twin technology has the potential to make significant improvements in diagnosis and treatment of a range of conditions. Building a digital replica of a heart requires collecting reams of data about the patient’s physiological condition, fitness levels and lifestyle. In one case, cardiologists created a digital version of the heart of a patient suffering from an irregular heartbeat, to test whether the patient was among the 70 percent likely to respond to a particular treatment.


When the Tide Goes Out: Big Questions for Crypto in 2019

Debates have raged around the globe about how cryptocurrencies, and particularly ICOs, fit within existing securities, commodities and derivatives laws. Many contend that so-called ‘utility tokens’ sold for future consumption are not investment contracts – but this is a false distinction. By their very design, ICOs mix economic attributes of both consumption and investment. ICO tokens’ realities – their risks, expectation of profits, reliance on the efforts of others, manner of marketing, exchange trading, limited supply, and capital formation — are attributes of investment offerings. In the U.S., nearly all ICOs would meet the Supreme Court’s ‘Howey Test’ defining an investment contract under securities laws. As poet James Whitcomb Riley wrote over 100 years ago: “When I see a bird that walks like a duck and swims like a duck and quacks like a duck, I call that bird a duck.” In 2019, we’re likely to continue seeing high ICO failure rates while funding totals decline.


Embracing agile development: Don’t let technical debt get in the way of innovation

By prioritising this type of approach first, you can begin to reduce debt and then resolve other portions as part of a long-term strategy. Remember: this isn’t going to resolve itself overnight. Get the whole team on board before committing to across-the-board technical debt reduction because, unlike a Waterfall approach, Agile changes are small and frequent, so everyone will need to commit to the new method. Again, teams can tackle this by adopting an EAD approach, so they can focus on moving slowly and deliberately to avoid including any new changes that might introduce new debt or increase existing debt. An EAD approach also helps to ensure teams are committed to testing throughout the DevOps process, which in turn creates a more collaborative environment and promotes transparency. With Agile, each successive version of the software builds directly on the previous version. It also allows for repeat work that improves upon previously completed activities.



Quote for the day:


"The greatest leader is not necessarily the one who does the greatest things. He is the one that gets the people to do the greatest things." -- Ronald Reagan


Daily Tech Digest - December 23, 2018

Blockchain Data Network
Some critics have been quick to disparage real efforts to create digital voting with strictly theoretical worries. In reality, the rollout in West Virginia is a very focused solution to a specific issue: low overseas voter participation. The current system is broken. A blockchain-driven digital voting app is a clear solution. Anyone but critics of progress should eagerly support West Virginia’s efforts until there is an actual reason to worry. Once any blockchain application is embraced in sufficient numbers by both the using and accepting sides, the impressive software will become an invaluable and ubiquitous tool. More widespread adoption of blockchain’s most beneficial use cases will trigger network effects that will multiply the benefits. Let’s remember that we are in the early days of blockchain. Many industry observers seem to be in a rush to declare blockchain a mainstream technology. As enthusiastic as I am in my support of blockchain, I would not yet call it mainstream. The interconnectedness of the world means its adoption will probably take root and bloom quickly.


Data Analyst and Business Analyst- A contrast

A business analyst is required to have expertise in the industry in which they function. A business analyst working for a finance company must be good with numbers and understand calculations for a payback period and internal rate of return, as both are needed for the calculation of ROI (return on investment). They use various tools to analyse and manipulate data. They should also possess excellent communication skills so that they can easily convey technical data messages to clients in a way that is understandable even to those who might lack technical knowledge. ... Data analysts are required to possess sharp technical knowledge coupled with excellent industry knowledge. They act like security guards of the company, keeping the data safe, and also possess a strong and thorough understanding of the relationships that the organisation's databases hold. They use complex query statements and technologically advanced database tools to extract information from these databases.


Banking with APIs 101


Communication over the phone is no longer necessary thanks to open banking and APIs (Application Programming Interfaces), pieces of software allowing seamless interaction between clients and banks. Not only retail and corporate clients, but an entire ecosystem of internal stakeholders, software suppliers, brokers, asset managers, fintechs, etc. may now benefit from business models shaped around open banking and alternative ways of generating revenues. But what are APIs essentially for? APIs enable communication and data exchange between clients (data requesters) and servers (data holders) in a secure and consistent manner. With applications and data unbundled in modern architectures, the bank is now required under open banking regulations to share data. In other words, the most valuable asset the bank possesses has to be openly and securely shared. APIs can fulfil these needs in the most effective manner. Of course, banks do not need to expose all of their data, only to provide access to the specific information that is needed.
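As a rough illustration of the requester/holder exchange described above, the sketch below builds an authenticated request for a client's account list. The endpoint URL, token, and path are invented for illustration; real open-banking APIs (e.g. under PSD2 or the UK Open Banking standard) additionally require consent flows and mutual TLS.

```python
import urllib.request

# Hypothetical open-banking endpoint -- an illustrative assumption,
# not any real bank's API.
BASE_URL = "https://api.examplebank.com/open-banking/v1"

def build_accounts_request(access_token: str) -> urllib.request.Request:
    """Build an authenticated GET request for the client's account list.

    The bearer token identifies the data requester to the data holder;
    the holder returns only the specific information consented to.
    """
    return urllib.request.Request(
        url=f"{BASE_URL}/accounts",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Accept": "application/json",
        },
        method="GET",
    )

req = build_accounts_request("demo-token")
print(req.full_url)                     # where the data holder is asked
print(req.get_header("Authorization"))  # how the requester is identified
```

The point of the sketch is the shape of the exchange: the client never sees the bank's internal systems, only a narrow, consistent interface.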


Deep automation in machine learning

Automation doesn’t stop when the model is “finished”; in any real-world application, the model can never be considered “finished.” Any model’s performance will degrade over time: situations change, people change, products change, and the model may even play a role in driving that change. We expect to see new tools for automating model testing, either alerting developers when a model needs to be re-trained or starting the training process automatically. And we need to go even further: beyond simple issues of model accuracy, we need to test for fairness and ethics. Those tests can’t be automated completely, but tools can be developed to help domain experts and data scientists detect problems of fairness. For example, such a tool might generate an alert when it detects a potential problem, like a significantly higher loan rejection rate from a protected group; it might also provide tools to help a human expert analyze the problem and make a correction.


Artificial Intelligence - Leading The Silent Revolution in HealthCare


The AI on the CherryHome device can monitor whether an elderly person goes into the bathroom and does not return, whether they fall, or whether their gait is abnormal. To protect the patient's privacy, CherryHome turns them into a virtual skeleton and sends caregivers and family members real-time notifications of such anomalies. Also, all video footage is processed on-device—not sent to the cloud, as is the case with most home assistants. Already in place is a pilot partnership between CherryHome, TheraCare, an in-home caregiving service, and TriCura, a tech ecosystem for care agencies. This represents another differentiator for AI, according to Goncharov. A lot of scientists in the AI space are working on fundamental problems—elderly care being just one of them. Looking forward, Goncharov says that AI will be further propelled as machine learning can be done with less and less data. The biggest hurdle to broader applications right now, he says, is the immense amount of data required to teach machines anything—another way that CherryHome is leading the way.


Transforming a Traditional Bank into an Agile Market Leader

In order to fix the environment, you basically boil it down to two big things. You’ve got to create an environment where you teach people and you give people the ability to get their hands dirty, learning by doing. Experimenting. And the second big thing is the fear of risk. In the professional environment, risk is extraordinarily high. At home, worst case is we get frustrated because some app didn’t work. At the bank, people could lose their jobs, they could lose their bonus. So if you figure out a way to learn by doing and make it OK to fail, then it’s OK to take risks. So how do you get this culture change and become like a startup? You have a central team that creates a culture of experimentation, which gives people an opportunity to work with other people [in a risk-free environment]. I was really surprised that in the first couple of years [of our change in mind-set] we started getting really huge traction. And we made it happen in every part of the company, including human resources, marketing and communications, everywhere.


Not all clouds are the same

There are different architectures on the cloud security market, some more readily equipped than others to ease the transition away from hardware. An advantage of containerised cloud architecture is streamlined migration to the cloud without sacrificing your network architecture or security posture. Some less sophisticated solutions may compromise on critical capabilities provided by legacy appliances. Consider, for instance, your company’s IP presence and how important it is to operations: an IP address associated with your organisation is used to identify your users to third-party vendors for whitelisting, and for preventing non-authorised users from accessing SAML authentication. Your traffic’s all-important IP identity is lost, however, when traversing typical shared-proxy security architectures. Think too of GDPR - cloud solutions that don’t offer a strong data centre presence, or the controls to keep data in the right place, can be little more than a liability.


Building a VPC with CloudFormation

This article describes how you can use AWS CloudFormation to create and manage a Virtual Private Cloud (VPC), complete with subnets, NATting, route tables, etc. The emphasis is on the use of CloudFormation and Infrastructure as Code to build and manage resources in AWS, less on the issues of VPC design. You may be wondering why we would use CloudFormation to build our VPC when we can create one via the VPC wizard in the management console. CloudFormation allows us to create a "stack" of "resources" in one step. Resources are the things we create (EC2 instances, VPCs, subnets, etc.); a set of these is called a stack. We can write a template that stands up a network stack exactly as we like it in one step. This is faster, more repeatable, and more consistent than manually creating our network via the management console or CLI. We can check our template into source control and use it any time we like for any purpose we want.
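To make the template idea concrete, here is a minimal sketch of a CloudFormation template that stands up a VPC with an internet gateway and a single public subnet. The logical names and CIDR ranges are arbitrary choices for illustration; a real network stack would add NAT gateways, route tables, and additional subnets.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative minimal VPC with one public subnet
Resources:
  VPC:
    Type: AWS::EC2::VPC
    Properties:
      CidrBlock: 10.0.0.0/16
      EnableDnsSupport: true
      EnableDnsHostnames: true
  InternetGateway:
    Type: AWS::EC2::InternetGateway
  AttachGateway:
    Type: AWS::EC2::VPCGatewayAttachment
    Properties:
      VpcId: !Ref VPC
      InternetGatewayId: !Ref InternetGateway
  PublicSubnet:
    Type: AWS::EC2::Subnet
    Properties:
      VpcId: !Ref VPC
      CidrBlock: 10.0.1.0/24
      MapPublicIpOnLaunch: true
```

Deploying this file (for example with `aws cloudformation create-stack`) creates all four resources as one stack, and deleting the stack removes them together, which is the repeatability the article is after.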


European Banks Are Pushing the Adoption of Blockchain Technology

Led by Italy-based Associazione Bancaria Italiana, 14 banks, including BNP Paribas, contributed two months of data to a Corda-based blockchain network. The original press release, delivered in Italian, mentions the establishment of the first phase as a "basis for subsequent synergistic implementations of DLT technologies," which also includes a form of smart contracts that will regulate the transfer of data. With ABI Labs at the helm overseeing a million test transactions between the banks involved, reports show that the performances were satisfactory, which will allow the process to move forward to the next phase. This cooperation between European banks comes on the heels of a project led by the Polish bank PKO Bank Polski, in partnership with the tech company Coinfirm, that will see blockchain technology utilized to notify customers about changes to product terms. The project, titled Trudatum, was described as a "breakthrough on a global scale" by Pawel Kuskowski, President of Coinfirm. All those success stories inevitably attracted the attention of the European Union.


Machine Learning Explainability vs Interpretability

In the context of machine learning and artificial intelligence, explainability and interpretability are often used interchangeably. While they are very closely related, it’s worth unpicking the differences, if only to see how complicated things can get once you start digging deeper into machine learning systems. Interpretability is about the extent to which a cause and effect can be observed within a system. Or, to put it another way, it is the extent to which you are able to predict what is going to happen, given a change in input or algorithmic parameters. It’s being able to look at an algorithm and go yep, I can see what’s happening here. Explainability, meanwhile, is the extent to which the internal mechanics of a machine or deep learning system can be explained in human terms. It’s easy to miss the subtle difference with interpretability, but consider it like this: interpretability is about being able to discern the mechanics without necessarily knowing why. Explainability is being able to quite literally explain what is happening.
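The distinction can be made concrete with a small example. A linear scorer is interpretable in exactly the sense described: you can predict the effect of an input change by reading the coefficients, without probing the system. The feature names and weights below are invented for illustration.

```python
# A linear scoring model: each coefficient states exactly how the
# output responds to a unit change in its input.
weights = {"income": 0.5, "debt": -0.8, "years_employed": 0.3}

def score(applicant):
    """Weighted sum of the applicant's features."""
    return sum(weights[f] * applicant[f] for f in weights)

a = {"income": 4.0, "debt": 2.0, "years_employed": 5.0}
b = dict(a, debt=3.0)  # same applicant with one more unit of debt

# The change in score equals the debt coefficient -- no experiment
# needed, which is what makes the model interpretable.
print(score(b) - score(a))  # -0.8 (up to float rounding)
```

A deep network gives no such shortcut: you can only observe input/output pairs, and explaining *why* it produced an output requires separate explainability tooling layered on top.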



Quote for the day:


"Don't focus so much on who is following you, that you forget to lead." -- E'yen A. Gardner


Daily Tech Digest - December 22, 2018

Digital transformation: Are your people just paying lip service?

The biggest mistake a company can make in digital transformation is starting the transformation journey without first getting the necessary commitment and support. Senior leaders and business stakeholders must commit to rethink and change organizational boundaries, policies, processes, talent and organizational structure as necessary to achieve the strategic intent or vision. If they’re not committed to doing that, the digital transformation effort will fail. Unfortunately, many companies get only lip service from leaders rather than long-term commitment to change. Company leaders can have a great meeting and talk about the need for change and a digital environment to create new competitive positioning, but not get real commitment to change. If your company starts down the path of trying to enable change, without that real commitment, you will face a high risk of pushback and debilitating, passive-aggressive behavior from managers and employees trying to maintain the status quo. The status quo – the existing business model or operating model and efforts to sustain it – represents the most formidable obstacle in your company’s path to digital transformation.


Machine vision can create Harry Potter–style photos for muggles

The code needs to see a head-to-toe cutout of a body seen from the front. It can handle some types of occlusion, such as an arm in front of the body, but cannot handle more complex occlusions, such as somebody sitting with legs crossed. Even so, mapping the cutout from a photograph onto a 3D skeleton does not produce realistic animations. That's where Weng and co come in. Their main achievement is to develop a way to warp the 2D cutout so as to create a realistic 3D model of the body. "Our key technical contribution, then, is a method for constructing an animatable 3D model that matches the silhouette in a single photo," they say. In the past, computer scientists have tried to solve this problem by deforming a three-dimensional body-shaped mesh to reflect the 2D cutout. That does not always work well, so Weng and co try a different approach. Their idea is to map the body-shaped mesh into 2D space and then align it with the 2D cutout using a warping algorithm.


Data Pipelines of Tomorrow


To grow to scale, data pipeline owners may need to make a few decisions about the data that they store at rest. In the future, the quantity of data generated even within a system will likely outgrow the capacity to store it all. Thus, data engineers of the future will need to consider the following questions: Which data is to remain volatile (in memory only) and temporary? Which data is kept persistent and stored somewhere? For the data that is stored, a pipeline's storage capacity will need to massively autoscale, while handling increasingly ambiguous formats. This explains why we now see data pipelines with several different kinds of data stores running side by side. Elasticsearch, for example, works great for storing unstructured (or semi-structured) text-based data, and might be run alongside Redis where super-fast lookups are needed, or a distributed database containing a ledger. ... On a similar note, we predict that the latency of access to data stores — and the time it takes to run queries — will continue to shrink.
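The volatile-versus-persistent decision can be sketched as a tiny tiered store. This is an illustrative toy, not a pipeline component: real systems would put the volatile tier in something like Redis and the durable tier in a database or log service; a local file stands in for the durable tier here.

```python
import json
import os
import tempfile

class TieredStore:
    """Toy two-tier store: every record lands in memory; only records
    marked persistent are also appended to a durable log on disk."""

    def __init__(self, log_path):
        self.memory = {}          # volatile tier: fast, lost on restart
        self.log_path = log_path  # persistent tier: durable, slower

    def put(self, key, record, volatile=True):
        self.memory[key] = record
        if not volatile:
            with open(self.log_path, "a") as f:
                f.write(json.dumps({key: record}) + "\n")

    def get(self, key):
        return self.memory.get(key)

path = os.path.join(tempfile.mkdtemp(), "pipeline.log")
store = TieredStore(path)
store.put("evt1", {"clicks": 3}, volatile=True)   # memory only
store.put("evt2", {"order": 97}, volatile=False)  # memory + disk
with open(path) as f:
    print(sum(1 for _ in f))  # 1: only the persistent record hit disk
```

The design choice the article anticipates is exactly this routing step: classifying each record at write time so that storage capacity is spent only on data worth keeping.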


How Do You Know If a Graph Database Solves the Problem?



If you have transactional data and do not care how it relates or connects to other transactions, people, etc., then graph is probably not the solution. There are cases where a technology simply stores data, and analysis of the connections and meanings among it is not important. You might have queries that rely on sequentially indexed data (the next record is stored next to the previous one in storage), rather than relationship-indexed data (a record is stored nearest those it is related to). Searching for individual pieces of data, or even a list of items, also points to other solutions, as such searches are not interested in the context of that data. Overall, graph solutions provide the most value from data that is highly connected and from analysis that looks for possible connections. If this doesn't fit your use case, another kind of technology may suit it better. ... If you have constant, unchanging types of data that you are collecting, then graph may not be the most appropriate solution. Graphs are well suited to storing any or all elements and can easily adapt to changing business and data capture needs.
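The relationship-indexed idea above can be sketched with a toy adjacency list, where each record points directly at the records it relates to, so a connected query like "everyone within two hops" becomes a simple traversal. The names and the `follows` relation are invented for illustration.

```python
from collections import deque

# Toy relationship-indexed data: each node stores direct links to
# the nodes it relates to, rather than relying on row order.
follows = {
    "ana": ["ben", "cai"],
    "ben": ["cai", "dee"],
    "cai": ["dee"],
    "dee": [],
}

def within_hops(start, max_hops):
    """Everyone reachable from `start` in at most `max_hops` links."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for nxt in follows.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    seen.discard(start)
    return sorted(seen)

print(within_hops("ana", 2))  # ['ben', 'cai', 'dee']
```

In a sequentially or key-indexed store the same question would require repeated self-joins or full scans; graph databases make this traversal the native access pattern, which is why connected-analysis workloads are their sweet spot.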


Transparency Is Key to Building Trust in Business

Good governance is critical to building transparency and trust inside and outside an organization. For example, in Australia, the Royal Commission into Misconduct in the Banking, Superannuation and Financial Services Industry, which was established in late 2017, received more than 10,000 submissions, and its findings have revealed widespread misconduct in the sector. Better oversight is clearly necessary and would go a long way in rebuilding consumer trust. But transparent reporting is only a prerequisite for effective board engagement on this and other such issues. It’s also important for the board to engage in robust debate and, when appropriate, challenge the CEO and other leaders, the decisions they make, and the outcomes. CEOs need to think about the imperative for better oversight as a positive development — good governance promotes a healthy organization, and a healthy organization is one that people have confidence in.


Preparing millennials for the age of automation

Even as technology substitutes some forms of work, new types of work will be created.
Given the interplay of all these factors, it is difficult to make predictions, but possible to develop scenarios. Our analysis suggests that in India the growth in demand for work, barring extreme scenarios, could more than offset the number of jobs lost to automation. On jobs lost, we find that some 9% of India’s current work activity hours could be automated by 2030 in a “midpoint” automation adoption scenario, and up to 19% in the “rapid” adoption scenario. But, India can, in fact, create enough new jobs to offset automation and employ new entrants, if it undertakes the investments required. Most occupational categories have the potential to grow as India’s economy expands. As many as 100 million new jobs could be created for Indians—net of automation—if the country’s rising prosperity creates demand for construction, retail, and healthcare and education services, and therefore, jobs.


Artificial intelligence, machine learning momentum continues to build

The report digs into mentions of ML and AI in the Canadian and UK parliaments, as well as mentions in the US Congressional Record. From 1995 to 2015, there were fewer than 25 mentions of the technology each year in the US Congress; in 2018 there were 100 mentions. In the UK, the technologies were barely mentioned until 2015, while in 2018 mentions skyrocketed to nearly 300. The report also tracks human-level performance milestones of AI. In 1997 IBM's Deep Blue beat chess champion Garry Kasparov, and in 2011 IBM Watson won Jeopardy. By 2016, Google DeepMind's AlphaGo beat leading Go player Lee Sedol. This year, a DeepMind agent reached human-level performance at Capture the Flag in the 3D multiplayer first-person game Quake III Arena. Notably absent from the report is any analysis of military use of AI and government spending on the technology. As noted by UNSW Sydney AI researcher Toby Walsh, some governments including the UK, France, and Germany have committed billions to AI.


The Intelligent Edge: What it is, what it’s not, and why it’s useful

The 3 Cs of the Intelligent Edge: connect, compute and control
Now consider how an employee with a smartphone app entering a large office building or campus with wireless location services can find a conference room, printer, or people without asking directions. This immediate insight into where the employee resides in relation to these other connected things greatly enhances the experience in this smart building. It's very similar to the retail shopping experience offered by many large retailers, where customers can access turn-by-turn directions on their phones to locate products, figure out what's on sale, or find the restroom. The media and telecom industries face growing distribution pressures from increased video resolution, new formats, expanding bandwidth, and the need for better security and reliability. As a result, telecom service providers are placing sophisticated compute and control systems in businesses and homes. These distributed intelligent edges make the services more competitive and improve customer experiences.


Ethereum thinks it can change the world. It’s running out of time to prove it.

Ethereum is already the most famous cryptocurrency after Bitcoin and the third largest in total value. Unlike the others, however, it aims to serve as a general-purpose computing platform that could, its adherents believe, make possible entirely new forms of social organization. The central topic of Devcon is “Ethereum 2.0,” a radical upgrade that would finally allow the network to realize its true power. The nagging truth, though, is that all the positivity in Prague masks daunting questions about Ethereum’s future. The handful of idealistic researchers, developers, and administrators in charge of maintaining its software are under increasing pressure to overcome technical limitations that stymie the network’s growth. At the same time, well-funded competitors have emerged, claiming that their blockchains perform better. Crackdowns by regulators, and a growing understanding of how far most blockchain applications are from being ready for prime time, have scared many cryptocurrency investors away.


APT10 Indictments Show Expansion of MSP Targeting, Cloud Hopper Campaign

The allegations are not new but are almost certain to put further pressure on the already strained relationship between the US and China. The Washington Post last week, in fact, had described the then forthcoming indictments as part of an intensifying US campaign to confront China over the economic espionage activities. Planned actions include sanctions against individuals responsible for the activities and declassification of information related to the breaches. How far such measures will go to deter China remains an open question. Though China famously signed an agreement with the US in 2015 promising not to engage in cyber activities for economic espionage, there's no evidence that hacking activity out of the country has even abated, far less stopped. Dave Weinstein, vice president of threat research at Claroty, sees the latest actions as yet another example of the effort law enforcement is putting into investigating and holding accountable those responsible for such attacks.



Quote for the day:


"What you do makes a difference, and you have to decide what kind of difference you want to make." -- Jane Goodall