Daily Tech Digest - December 23, 2018

Blockchain Data Network

Some critics have been quick to disparage real efforts to create digital voting with strictly theoretical worries. In reality, the rollout in West Virginia is a very focused solution to a specific issue: low overseas voter participation. The current system is broken. A blockchain-driven digital voting app is a clear solution. Anyone but critics of progress should eagerly support West Virginia's efforts until there is an actual reason to worry.

Once any blockchain application is embraced in sufficient numbers by both the using and accepting sides, the impressive software will become an invaluable and ubiquitous tool. More widespread adoption of blockchain's most beneficial use cases will trigger network effects that will multiply the benefits. Let's remember that we are in the early days of blockchain. Many industry observers seem to be in a rush to declare blockchain a mainstream technology. As enthusiastic as I am in my support of blockchain, I would not yet call it mainstream. The interconnectedness of the world means its adoption will probably take root and bloom quickly.

Data Analyst and Business Analyst - A Contrast

A business analyst is required to have expertise in the industry in which they function. A business analyst working for a finance company must be good with numbers and understand calculations such as the payback period and internal rate of return, as both are needed to calculate ROI (return on investment). They use various tools to analyse and manipulate data. They should also possess excellent communication skills so that they can convey technical findings to clients in a way that is understandable even to those who lack technical knowledge. ... Data analysts are required to possess sharp technical knowledge coupled with excellent industry knowledge. They act as guardians of the company's data, keeping it safe, and possess a strong and thorough understanding of the relationships that the organisation's databases hold. They use complex query statements and technologically advanced database tools to extract information from these databases.
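The two metrics mentioned above can be sketched in a few lines. This is a minimal illustration, not a standard finance library; the cash-flow figures are hypothetical, and the IRR is found by simple bisection on NPV.

```python
# Sketch of two ROI inputs: payback period and internal rate of return (IRR).
# Cash flows are illustrative values, not real project data.

def payback_period(initial_outlay, annual_inflows):
    """Years until cumulative inflows recover the initial outlay
    (interpolated within the recovery year)."""
    cumulative = 0.0
    for year, inflow in enumerate(annual_inflows, start=1):
        cumulative += inflow
        if cumulative >= initial_outlay:
            return year - (cumulative - initial_outlay) / inflow
    return None  # outlay never recovered

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return: the discount rate at which NPV is zero,
    found by bisection (NPV is decreasing in the rate)."""
    def npv(rate):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

flows = [-1000, 400, 400, 400]          # year-0 outlay, then inflows
print(payback_period(1000, flows[1:]))  # 2.5 years
print(round(irr(flows), 4))             # a bit under 10%
```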

Banking with APIs 101

Communication over the phone is no longer necessary thanks to open banking and APIs (Application Programming Interfaces), pieces of software that allow seamless interaction between clients and banks. Not only retail and corporate clients, but an entire ecosystem of internal stakeholders, software suppliers, brokers, asset managers, fintechs, etc., may now benefit from business models shaped around open banking and alternative ways of generating revenue. But what are APIs essentially for? APIs enable communication and data exchange between clients (data requesters) and servers (data holders) in a secure and consistent manner. With applications and data unbundled in modern architectures, the bank is now required to share data under open banking regulations. In other words, the most valuable asset the bank possesses has to be openly and securely shared. APIs can fulfil these needs in the most effective manner. Banks do not, of course, need to expose all their data; they only need to provide access to the specific information that is needed or required.
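That last point — exposing only the specific information required — is essentially a whitelisting step in the API layer. The sketch below illustrates the idea; the field names, scope name, and record are hypothetical assumptions, not taken from any real open-banking standard.

```python
# Sketch: the open-banking API never returns the full internal record;
# it serves only the fields the granted scope permits.
# All names and values here are hypothetical illustrations.
import json

INTERNAL_RECORD = {
    "iban": "DE89370400440532013000",
    "balance": 2500.75,
    "currency": "EUR",
    "internal_risk_score": 0.82,    # internal only, never exposed
    "branch_cost_centre": "CC-104"  # internal only, never exposed
}

# Fields permitted per scope, e.g. granted via a consent flow
PERMITTED_FIELDS = {"accounts:read": ["iban", "balance", "currency"]}

def api_response(record, scope):
    """Serialise only the whitelisted fields for the granted scope."""
    allowed = PERMITTED_FIELDS.get(scope, [])
    return json.dumps({k: record[k] for k in allowed})

print(api_response(INTERNAL_RECORD, "accounts:read"))
```

An unknown scope yields an empty object, so nothing leaks by default — a deliberate fail-closed choice.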

Deep automation in machine learning

Automation doesn’t stop when the model is “finished”; in any real-world application, the model can never be considered “finished.” Any model’s performance will degrade over time: situations change, people change, products change, and the model may even play a role in driving that change. We expect to see new tools for automating model testing, either alerting developers when a model needs to be re-trained or starting the training process automatically. And we need to go even further: beyond simple issues of model accuracy, we need to test for fairness and ethics. Those tests can’t be automated completely, but tools can be developed to help domain experts and data scientists detect problems of fairness. For example, such a tool might generate an alert when it detects a potential problem, like a significantly higher loan rejection rate from a protected group; it might also provide tools to help a human expert analyze the problem and make a correction.
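The two kinds of check described above — a retraining alert on degraded accuracy and a fairness alert on disparate rejection rates — can be sketched as plain functions. Thresholds, the alert shape, and the sample data are illustrative assumptions, not any real monitoring product's API.

```python
# Sketch of automated post-deployment checks a monitoring tool might run.
# Thresholds and data are illustrative, not from a real system.

def accuracy_alert(y_true, y_pred, threshold=0.90):
    """Flag the model for retraining when live accuracy drops below threshold."""
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return acc < threshold, acc

def fairness_alert(decisions, groups, protected, max_gap=0.10):
    """Flag when the protected group's rejection rate exceeds the overall
    rejection rate by more than max_gap."""
    def rejection_rate(ds):
        return sum(d == "reject" for d in ds) / len(ds)
    overall = rejection_rate(decisions)
    prot = rejection_rate(
        [d for d, g in zip(decisions, groups) if g == protected])
    return prot - overall > max_gap, prot, overall

# Illustrative live decisions, tagged with a (hypothetical) group attribute
decisions = ["approve", "reject", "reject", "approve", "reject", "reject"]
groups    = ["A",       "B",      "B",      "A",       "A",      "B"]
flag, prot, overall = fairness_alert(decisions, groups, protected="B")
print(flag, prot, overall)
```

As the text notes, the flag is only a trigger: a human expert still has to analyse why group "B" is rejected more often and decide on a correction.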

Artificial Intelligence - Leading The Silent Revolution in Healthcare

The AI on the CherryHome device can monitor whether an elderly person goes into the bathroom and does not return, whether they fall, or whether their gait is abnormal. To protect the patient's privacy, CherryHome turns them into a virtual skeleton and sends caregivers and family members real-time notifications of such anomalies. Also, all video footage is processed on-device rather than sent to the cloud, as is the case with most home assistants. Already in place is a pilot partnership between CherryHome, TheraCare (an in-home caregiving service) and TriCura (a tech ecosystem for care agencies). This represents another differentiator for AI, according to Goncharov. A lot of scientists in the AI space are working on fundamental problems, elderly care being just one of them. Looking forward, Goncharov says that AI will be further propelled as machine learning can be done with less and less data. The biggest hurdle to broader applications right now, he says, is the immense amount of data required to teach machines anything; reducing that requirement is another way CherryHome is leading the way.

Transforming a Traditional Bank into an Agile Market Leader

In order to fix the environment, you basically boil it down to two big things. You've got to create an environment where you teach people and give them the ability to get their hands dirty, learning by doing. Experimenting. And the second big thing is overcoming the fear of risk. In the professional environment, the risk is extraordinarily high. At home, the worst case is we get frustrated because some app didn't work. At the bank, people could lose their jobs, they could lose their bonus. So if you figure out a way to learn by doing and make it OK to fail, then it's OK to take risks. So how do you get this culture change and become like a startup? You have a central team that creates a culture of experimentation, which gives people an opportunity to work with other people [in a risk-free environment]. I was really surprised that in the first couple of years [of our change in mind-set] we started getting really huge traction. And we made it happen in every part of the company, including human resources, marketing and communications, everywhere.

Not all clouds are the same

There are different architectures on the cloud security market, some more readily equipped than others to ease the transition away from hardware. An advantage of containerised cloud architecture is streamlined migration to the cloud without sacrificing your network architecture or security posture. Some less sophisticated solutions may compromise on critical capabilities provided by legacy appliances. Consider, for instance, your company's IP presence and how important it is to operations: an IP address associated with your organisation is used to identify your users to third-party vendors for whitelisting, and to keep unauthorised users out of SAML-authenticated services. Your traffic's all-important IP identity is lost, however, when it traverses typical shared-proxy security architectures. Think too of GDPR - cloud solutions that don't offer a strong data centre presence, or the controls to keep data in the right place, can be little more than a liability.

Building a VPC with CloudFormation

This article describes how you can use AWS CloudFormation to create and manage a Virtual Private Cloud (VPC), complete with subnets, NAT, route tables, etc. The emphasis is on using CloudFormation and Infrastructure as Code to build and manage resources in AWS, less on the issues of VPC design. You may be wondering why we would use CloudFormation to build our VPC when we can create one via the VPC wizard in the management console. CloudFormation allows us to create a "stack" of "resources" in one step. Resources are the things we create (EC2 instances, VPCs, subnets, etc.); a set of these is called a stack. We can write a template that stands up a network stack exactly as we like it in one step. This is faster, more repeatable, and more consistent than manually creating our network via the management console or CLI. We can check our template into source control and use it any time we like for any purpose we want.
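To make the template idea concrete, here is a deliberately minimal CloudFormation template for a VPC with one public subnet, built as a Python dict and emitted as JSON. The CIDR ranges and logical names are illustrative; a real template would also declare an internet gateway, route tables, and NAT, as the article goes on to do.

```python
# Sketch: a minimal CloudFormation template (JSON form) for a VPC
# with a single public subnet. Names and CIDRs are illustrative.
import json

template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Minimal VPC stack sketch",
    "Resources": {
        "MyVPC": {
            "Type": "AWS::EC2::VPC",
            "Properties": {
                "CidrBlock": "10.0.0.0/16",
                "EnableDnsSupport": True,
                "EnableDnsHostnames": True
            }
        },
        "PublicSubnetA": {
            "Type": "AWS::EC2::Subnet",
            "Properties": {
                "VpcId": {"Ref": "MyVPC"},  # intrinsic Ref ties the subnet to the VPC
                "CidrBlock": "10.0.1.0/24",
                "MapPublicIpOnLaunch": True
            }
        }
    }
}

# The template body you would hand to CloudFormation
print(json.dumps(template, indent=2))
```

Saved to a file, this is the body you would pass to `aws cloudformation create-stack --stack-name my-vpc --template-body file://vpc.json`, and the same file is what you check into source control.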

European Banks Are Pushing the Adoption of Blockchain Technology

Led by Italy-based Associazione Bancaria Italiana, 14 banks, including BNP Paribas, contributed two months of data to a Corda-based blockchain network. The original press release, delivered in Italian, mentions the establishment of the first phase as a "basis for subsequent synergistic implementations of DLT technologies," which also includes a form of smart contracts that will regulate the transfer of data. With ABI Labs at the helm overseeing a million test transactions between the banks involved, reports show that performance was satisfactory, allowing the process to move forward to the next phase. This cooperation between European banks comes on the heels of a project led by the Polish bank PKO Bank Polski, in partnership with the tech company Coinfirm, that will see blockchain technology utilized to notify customers about changes to product terms. The project, titled Trudatum, was described as a "breakthrough on a global scale" by Pawel Kuskowski, President of Coinfirm. All these success stories inevitably attracted the attention of the European Union.

Machine Learning Explainability vs Interpretability

In the context of machine learning and artificial intelligence, explainability and interpretability are often used interchangeably. While they are very closely related, it's worth unpicking the differences, if only to see how complicated things can get once you start digging deeper into machine learning systems. Interpretability is about the extent to which a cause and effect can be observed within a system. Or, to put it another way, it is the extent to which you are able to predict what is going to happen, given a change in input or algorithmic parameters. It's being able to look at an algorithm and say, "Yep, I can see what's happening here." Explainability, meanwhile, is the extent to which the internal mechanics of a machine or deep learning system can be explained in human terms. It's easy to miss the subtle difference with interpretability, but consider it like this: interpretability is about being able to discern the mechanics without necessarily knowing why. Explainability is being able to quite literally explain what is happening.
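A linear model makes the distinction concrete: you can predict exactly how the output moves when an input changes (interpretability), and you can state each feature's contribution in human terms (explainability). The weights, features, and applicant values below are invented for illustration.

```python
# Sketch: a linear scoring model is both interpretable and explainable.
# Weights and inputs are hypothetical illustration values.

weights = {"income": 0.5, "debt": -0.8, "age": 0.1}
bias = 1.0

def predict(x):
    """Linear score: bias plus the weighted sum of features."""
    return bias + sum(weights[f] * x[f] for f in weights)

def explain(x):
    """Per-feature contributions to the score, in human-readable form."""
    return {f: weights[f] * x[f] for f in weights}

applicant = {"income": 4.0, "debt": 2.0, "age": 3.0}
score = predict(applicant)

# Interpretability: a unit increase in "debt" moves the score by exactly
# weights["debt"], and we can predict that before running the model.
bumped = dict(applicant, debt=applicant["debt"] + 1)
print(predict(bumped) - score)

# Explainability: each feature's contribution, stated in plain terms.
print(explain(applicant))
```

A deep network fails both tests out of the box, which is why separate explanation tooling exists for such models at all.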

Quote for the day:

"Don't focus so much on who is following you, that you forget to lead." -- E'yen A. Gardner
