Daily Tech Digest - October 14, 2018


According to the sources, global fintech companies reportedly sought an extension of the October 15 deadline, but it seems that the RBI is not inclined to relax the norms. Data localisation requires that data about residents be collected, processed, and stored inside the country, often before being transferred internationally, and usually transferred only after meeting local privacy or data protection laws. Although domestic companies have welcomed the guidelines, global companies fear an increase in their expenses from having to set up local servers. To avoid this rise in cost, global companies, in a recent meeting with the RBI, proposed providing mirrored data instead of the original data, to which the central bank did not agree, the sources said. Last week, Finance Minister Arun Jaitley met RBI Deputy Governor B P Kanungo to discuss the RBI’s data localisation norms. The meeting was also attended by Economic Affairs Secretary Subhash Chandra Garg, Financial Services Secretary Rajiv Kumar and IT Secretary Ajay Prakash Sawhney.



The Data Quality Tipping Point

It’s clear that data is no longer simply harvested and stored. Data isn’t left to rest any longer. It is the lifeblood that flows through every department in the business. It’s not just the result of a decision: it’s the driving force for your next move. Old, inaccurate and messy data can’t support the marketing department. If the data is old, it cannot be used as a concrete and reliable resource. And if you aren’t continually cleaning new data as it comes in, you can’t capitalise on trends, or make decisions on what is and isn’t working. So we’re clear that data quality initiatives must run in parallel to business activities, rather than being carried out sporadically, and there needs to be a constant and attentive process to keep data clean. That means there’s a need for an ongoing investment in data governance, within the parameters of your budget. It would be wonderful to conduct data reviews every morning, or implement highly elaborate verification and enhancement programs, but few businesses have the budget to put such extravagant data management processes in place.
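
One way to keep that cleaning continuous without an elaborate programme is a lightweight automated screen that runs on every batch of incoming records. The sketch below is a minimal Python illustration of that idea; the field names (email, last_updated) and the 180-day staleness threshold are assumptions invented for the example, not taken from the article.

```python
import re
from datetime import datetime, timedelta

# Hypothetical incoming contact records; field names are illustrative only.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
MAX_AGE = timedelta(days=180)  # assumed staleness threshold

def screen_records(records, now=None):
    """Split an incoming batch into clean records and records flagged for review."""
    now = now or datetime.utcnow()
    clean, flagged = [], []
    for rec in records:
        problems = []
        if not EMAIL_RE.match(rec.get("email", "")):
            problems.append("invalid email")
        last_updated = rec.get("last_updated")
        if last_updated is None or now - last_updated > MAX_AGE:
            problems.append("stale or missing timestamp")
        if problems:
            flagged.append((rec, problems))
        else:
            clean.append(rec)
    return clean, flagged

# Run the screen on every batch as it arrives, rather than waiting for a
# periodic clean-up project.
batch = [
    {"email": "ana@example.com", "last_updated": datetime(2018, 9, 30)},
    {"email": "not-an-email", "last_updated": datetime(2016, 1, 2)},
]
clean, flagged = screen_records(batch, now=datetime(2018, 10, 14))
print(len(clean), "clean,", len(flagged), "flagged")
```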


Creating a Culture that Works for Data Science and Engineering


While both groups on the team are turning out great code, it’s challenging as a project manager to follow two different streams of work. Sometimes the two groups are working on similar things, but sometimes the data scientists are working on something that is still in the very distant future for the engineers. The most important thing a cross-functional team can do is have everyone come to stand-up every day. When we first told the data scientists about our daily “meetings,” they went pale in the face. “Every day?” they asked, with a look of panic in their eyes. I stood firm. It was the right call. Our daily meetings allow the engineers on our team to quickly start working from an informed place when R&D introduces a new project. Furthermore, we are benefiting from the best parts of agile with this approach; I love hearing everyone bounce ideas off each other in stand-up. My favorite is when there’s a cross-functional “Ooo, did you think about taking this approach?” We work better as a team and we have found a way to leverage everyone’s expertise.



The tech supply chain is more vulnerable than ever


It’s a great business model — especially when you consider that only 38 percent of companies are actively monitoring and managing their software supply chain hygiene. Today, the game has changed. Organizations now must contend with the fact that hackers are intentionally planting vulnerabilities directly into the supply of open source components. In one such example from February 2018, a core contributor to the conventional-changelog ecosystem (a common JavaScript code package) had his commit credentials compromised. A bad actor, using these credentials, published a malicious version of conventional-changelog (version 1.2.0) to npmjs.com. While the intentionally compromised component was only available in the supply chain for 35 hours, estimates are that it was downloaded and installed more than 28,000 times. Some percentage of these vulnerable components were assembled into applications that were then released into production. The result is that these organizations unwittingly released a Monero cryptocurrency miner into the wild — and the perpetrators of the supply chain hack profited handsomely.
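
One practical lesson from the conventional-changelog incident is that a loose semver range (for example ^1.1.0) will happily resolve to a newly published malicious release. As a rough sketch of supply-chain hygiene, the Python script below flags dependencies in a project's package.json that are not pinned to an exact version; the package.json layout is standard npm, but the "exact pins only" policy is an assumption made for illustration, not something the article prescribes.

```python
import json
import re
from pathlib import Path

# A version is treated as "pinned" only if it is an exact x.y.z semver,
# with no ^, ~, ranges, or dist-tags that could float to a newer release.
EXACT_VERSION = re.compile(r"^\d+\.\d+\.\d+$")

def unpinned_dependencies(package_json_path="package.json"):
    """Return {name: spec} for dependencies that could resolve to a future release."""
    manifest = json.loads(Path(package_json_path).read_text())
    findings = {}
    for section in ("dependencies", "devDependencies"):
        for name, spec in manifest.get(section, {}).items():
            if not EXACT_VERSION.match(spec):
                findings[name] = spec
    return findings

if __name__ == "__main__":
    for name, spec in unpinned_dependencies().items():
        print(f"{name}: '{spec}' can silently pull in a future (possibly hijacked) version")
```

In practice, a committed lockfile (package-lock.json or yarn.lock) plus registry tooling such as npm audit covers much of the same ground; the script is only meant to make the floating-version risk concrete.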



How to use machine learning to build a predictive algorithm

You also have to make sure you're integrating not only data and platforms, but domain experts who bring invaluable information and skills to the data science team, according to David Ledbetter, a data scientist at Children's Hospital Los Angeles. "The machine learning community often isolates themselves and thinks they can solve all the problems, but domain experts bring value," Ledbetter said during a panel discussion at the AI World Conference & Expo in Boston in December. "Every time we meet with the clinical team, we learn something about what's going on with the data." The project team, with its mix of skills, also needs to identify good vs. bad outcomes based on the business problem you're trying to solve with a predictive algorithm. "It's important to set clear success criteria at the beginning of a project, and [to] pick something that has a reasonable likelihood of success," said William Mark, president of SRI International, a research and development firm that works on AI projects for customers, during the same panel discussion at AI World.
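
William Mark's point about setting success criteria up front can be made concrete: agree on a metric and a minimum acceptable value before any modelling starts, then judge the candidate model against both that threshold and a naive baseline. The sketch below uses scikit-learn with a bundled dataset; the dataset, the AUC metric, and the 0.90 threshold are illustrative assumptions rather than anything from the panel.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Success criterion agreed with domain experts *before* modelling starts
# (the 0.90 value here is purely illustrative).
TARGET_AUC = 0.90

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Naive baseline: always predict the majority-class probability.
baseline = DummyClassifier(strategy="prior").fit(X_train, y_train)
model = LogisticRegression(max_iter=5000).fit(X_train, y_train)

baseline_auc = roc_auc_score(y_test, baseline.predict_proba(X_test)[:, 1])
model_auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

print(f"baseline AUC: {baseline_auc:.2f}, model AUC: {model_auc:.2f}")
print("meets success criterion" if model_auc >= TARGET_AUC else "does not meet criterion")
```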


Cloud-agnostic container platforms – it’s all to play for

Container-as-a-service (CaaS) products from the major cloud vendors, notably AWS EKS and Fargate, Azure AKS and Container Instances, and Google Cloud Container Engine, present classic trade-offs between convenience and dependence. With their ability to tap into a plethora of cloud data, security and developer services that are unique in implementation if not conception, container products from the big three vendors can trap users in a maze of platform dependencies with no easy exit path. As container use in the enterprise moves from developer sandboxes to production systems, the desire for multi-environment portability presents an opportunity to devise standards, software, and automation systems that facilitate platform-agnostic container platforms. The idea is to ensure easy migration between private and public container environments. Recent announcements from Cisco, Google, and Pivotal Software are important milestones on the road to platform-agnostic container infrastructure.


Welcome to Banking-as-a-Service

The underlying theme of this kind of disruption is the unbundling of supply and service. Banking has come late to the unbundling revolution. But now, the sector is ripe for it - for unbundling, or disaggregation - and ripe for its own Software-as-a-Service transformation that will allow customers to pick and choose and pay for applications as they use them. Software-as-a-Service (SaaS) businesses delivered by APIs have a low-touch sales model. These companies don’t sell; buyers help themselves. Low-touch sales combined with recurring revenues and lack of customer concentration are the three hallmarks of a SaaS business. In many cases these businesses are just better in all senses. But these three essential ingredients on their own will not be enough. The winners in this field are likely to be nimble specialists capable of creating plug-in-and-play APIs to allow anything to be processed anywhere, rather than the large - slow - generalists of the past. Starling is well-placed in this regard. We have built Starling with a set of public APIs that are freely available for anyone to use through our developer portal.


5 Tips to Boost Your Company's Digital Transformation With BPM


With tools such as artificial intelligence and machine learning, reams of data can be processed in the blink of an eye, providing insights into how an organization can better meet customer needs. Often, this optimization is a product of changes in business process management, or BPM. Even the most basic organizations function through processes. There might be a process for acquiring leads, a process for vetting them, and a process for making a sale. After you convert a prospect, there's a process for invoicing the customer, one for fulfilling the order, and one for delivering the product. There are also strictly internal processes, such as those triggered when employees ask for time off or request tech support. BPM refers to the management of these procedures, such as ensuring they are effective and determining how to combine them in the most efficient way. When implemented effectively, BPM helps organizations streamline their day-to-day processes, making work more efficient. But implementing BPM or other digital transformations without full buy-in from your team can lead to a lack of teamwork or other disadvantages. 
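
To make the idea of chained processes concrete, here is a minimal Python sketch that models the lead-to-delivery flow described above as an explicit, declared sequence of steps; the step names follow the article, while the function bodies and data shape are invented for illustration.

```python
# Each step takes the record for a prospect/order and returns an updated copy.
def acquire_lead(record):   return {**record, "stage": "lead"}
def vet_lead(record):       return {**record, "stage": "vetted", "qualified": True}
def make_sale(record):      return {**record, "stage": "sold"}
def invoice_customer(record): return {**record, "stage": "invoiced"}
def fulfil_order(record):   return {**record, "stage": "fulfilled"}
def deliver_product(record): return {**record, "stage": "delivered"}

# BPM in miniature: because the process is declared in one place, it can be
# measured, reordered, or combined with other processes deliberately.
SALES_PROCESS = [acquire_lead, vet_lead, make_sale,
                 invoice_customer, fulfil_order, deliver_product]

def run_process(record, steps=SALES_PROCESS):
    for step in steps:
        record = step(record)
        print(f"{step.__name__}: {record['stage']}")
    return record

run_process({"customer": "Acme Ltd"})
```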


APIs In Banking: Unlocking Business Value With Banking As A Platform (BaaP)

Banking as a Platform (BaaP), sometimes referred to as Banking as a Service (BaaS), occurs when a bank acts as an infrastructure provider to external third parties. Variations include other banks white-labeling the BaaP platform for faster time to market, fintech firms leveraging the BaaP provider’s banking license to provision bank accounts, and banks and fintechs using the BaaP platform for testing purposes. Banks like CBW, Fidor, JB Financial, solarisBank, and wirecard built their BaaP architecture from scratch, without the constraint of legacy systems, creating modular application stacks broken into discrete services. The modular banking services on a BaaP platform serve as building blocks, accessible to third parties through an API management layer, where they can be mixed and matched to create new products and services tailored to the third party’s business model.
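
The "building blocks behind an API management layer" idea can be sketched in a few lines of Python: discrete services such as account opening, KYC checks and payments are exposed as independent callables in a catalogue, and a third party composes only the ones its product needs. The service names and interfaces below are hypothetical; they are not drawn from CBW, Fidor, JB Financial, solarisBank or wirecard.

```python
# Hypothetical modular banking services, each callable independently behind
# an API management layer (represented here as a plain registry).
def open_account(customer_id):
    return {"account_id": f"acct-{customer_id}", "status": "open"}

def run_kyc_check(customer_id):
    return {"customer_id": customer_id, "kyc_passed": True}

def initiate_payment(account_id, amount, currency="EUR"):
    return {"account_id": account_id, "amount": amount,
            "currency": currency, "state": "accepted"}

SERVICE_CATALOGUE = {
    "accounts.open": open_account,
    "kyc.check": run_kyc_check,
    "payments.initiate": initiate_payment,
}

# A fintech "mixes and matches" only the blocks it needs for its product.
def onboard_and_pay(customer_id, amount):
    kyc = SERVICE_CATALOGUE["kyc.check"](customer_id)
    if not kyc["kyc_passed"]:
        raise ValueError("KYC failed")
    account = SERVICE_CATALOGUE["accounts.open"](customer_id)
    return SERVICE_CATALOGUE["payments.initiate"](account["account_id"], amount)

print(onboard_and_pay("c-123", 25.0))
```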


Life Is Dirty. So Is Your Data. Get Used to It.

As Dr. Hammond suggests, it's difficult to determine if data is ever clean. Even scientific constants are only known to a certain degree of accuracy. They are "good enough," but not perfect. Data's ultimate purpose is to drive decisions. Bad data means bad decisions. As data professionals, it is up to us to help keep data "good enough" for use by others. We have to think of ourselves as data janitors. But nobody goes to school to become a data janitor. Let's talk about options for cleaning dirty data. Here's a handful of techniques that you should consider when working with data. Remember: all data is dirty, and you won't be able to make it perfect. Your focus should be making it "good enough" to pass along to the next person. The first thing you should do when working with a dataset is to examine the data. Ask yourself, "does this data make sense?" That's what we did in the example above. We looked at the first few rows of data and found that both the city and country were listed inside one column.
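
The city-and-country example lends itself to a short pandas sketch: look at the first few rows, notice that one column is doing two jobs, and split it. The column name and sample values below are invented to mirror the article's description; they are not the actual dataset.

```python
import pandas as pd

# Invented sample mirroring the article's "city and country in one column" case.
df = pd.DataFrame({"location": ["Paris, France", "Boston, USA", "Mumbai, India"]})

# Step 1: examine the data and ask "does this make sense?"
print(df.head())

# Step 2: split the combined column into the two fields it actually contains.
df[["city", "country"]] = df["location"].str.split(",", n=1, expand=True)
df["city"] = df["city"].str.strip()
df["country"] = df["country"].str.strip()

print(df[["city", "country"]])
```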



Quote for the day:


"Courage is more exhilarating than fear and in the long run it is easier." -- Eleanor Roosevelt

