Daily Tech Digest - October 17, 2018

Microsoft Surface Pro 6
This time around, the major changes are inside: a bump up in the processor to an 8th-generation Core chip, some weird adjustments in pricing, and a new color, black, separate the new from the old. There's actually a downgrade of sorts in the GPU compared to the Surface Pro (2017), which is a bit of a disappointment. The Performance section of our review shows the clearest differences among the three generations. We've given the Surface Pro 6 what some would consider an "average" score of 3.5 stars, a lower score than we've given some other tablet PCs we've reviewed recently. But we're also giving it an Editor's Choice, like those other products. Despite being underwhelmed by the Surface Pro 6's failure to break new ground (or even add USB-C), we will give it this: it delivers a nice, long 8.5 hours of battery life in our tests, an area that has been an Achilles' heel for competing tablets we've reviewed. It is still one of the best-designed Windows tablets you can buy, and its pricing is competitive with similarly configured products.



AI Common Sense Reasoning

To focus this new effort, MCS will pursue two approaches for developing and evaluating different machine common sense services. The first approach will create computational models that learn from experience and mimic the core domains of cognition as defined by developmental psychology. This includes the domains of objects (intuitive physics), places (spatial navigation), and agents (intentional actors). Researchers will seek to develop systems that think and learn as humans do in the very early stages of development, leveraging advances in the field of cognitive development to provide empirical and theoretical guidance. “During the first few years of life, humans acquire the fundamental building blocks of intelligence and common sense,” said Gunning. “Developmental psychologists have found ways to map these cognitive capabilities across the developmental stages of a human’s early life, providing researchers with a set of targets and a strategy to mimic for developing a new foundation for machine common sense.”


Digital business projects in a quagmire? Hack your culture!

Changing mindsets is a key enabler of new technologies and one of the ways Gartner recommended that IT executives change the culture of their companies. “Hack your culture to change your culture,” said Kristin Moyer, research vice president and distinguished analyst at Gartner. “By culture hacking, we don’t mean finding a vulnerable point to break into a system. It’s about finding vulnerable points in your culture and turning them into real change that sticks.” Hacking is about taking smaller actions that usually get overlooked, Moyer said. Great hacks also trigger emotional responses, have immediate results and are visible to lots of people at once, she said. Gartner says culture is identified by 46 percent of CIOs as the largest barrier to realizing the benefits of digital business. Achieving culture change is tied closely to another key direction organizations should strive toward: the ability to embrace change and adopt technology in a new way, or what Gartner calls “dynamism.”


AI is fueling smarter collaboration

The first is improving the ability of individuals to access data. "Today, finding a document could be tedious [and] analyzing data may require writing a script or form," Lazar said. With AI, a user could perform a natural language query -- such as asking the Salesforce.com customer relationship management (CRM) platform to display third quarter projections and how they compare with the second quarter -- and generate a real-time report. Then, asking the platform to share this information with the user's team and get its feedback could launch a collaborative workspace, Lazar said. The second possible benefit is predictive. "The AI engine could anticipate needs or next steps, based on learning of past activities," Lazar said. "So if it knows that every Monday I have a staff call to review project tasks, it may have required information ready at my fingertips before the call. Perhaps it suggests things that I'll need to focus on, such as delays or anomalies from the past week."


Automation and employment debate takes a new turn


What gives machines -- and process automation -- the edge over humans? In addition to their ability to integrate data, machines, Levav noted, lack biases such as the illusion of validity, which leads people to overestimate their forecasting prowess. Yet, humans are still required in process automation, because only they can decide the important parameters, he added. "You will have a job because machines can't pick the variables that are relevant to a problem," he said. Scott Hartley, partner at venture capital firm Two Culture Capital, shared a similar view regarding the impact of AI on jobs. His take on AI-infused automation and employment takes a cue from Voltaire. Hartley's 2017 book, The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, cites a statement attributed to the 18th century philosopher to support his view that asking the right questions about data is central to acquiring knowledge. Making AI and machine learning work, Hartley said during a UiPath panel discussion, is "still fundamentally rooted in our ability to create diverse teams and ask questions from a multiplicity of angles."


Strengthening the CIO – Corporate Board Relationship

But on a positive note, more board members see how technology is unlocking new business models and spurring growth. They are convinced of the growing need to focus on speed, agility, innovation, and customer obsession, and see that it requires new approaches to business operations and to IT investment. Technology and cybersecurity have historically been seen as compliance issues under the purview of the board’s audit committee. However, given the increasing capability of technology to affect revenue and the business model, there is a greater recognition of its strategic importance. This has led to an increase in the number of CIOs and other technology experts being appointed to boards. Still, though, the majority of boards lack the technology prowess needed to successfully guide today’s digital-era company. What, then, can the CIO do to bridge the gap and develop a great relationship with the board?


Steel yourself for the cloud hangover

We’re at what I call the hangover phase, where a night of cloud-hyped indulgence has led to many self-administered pats on the back, which obscured the reality that transitioning to the cloud is harder than people originally thought. But the effort is still worth it. The budget overruns are no surprise, given that not much cost planning takes place during initial large cloud computing projects. Indeed, these initial projects fail to illustrate the true costs of using a public cloud, and if you look carefully you can see that the private clouds many such initial efforts focus on are just new cages of servers in data centers that cost more than the old cages of servers. Moreover, people costs are always higher than expected, and few enterprises plan to run both cloud and on-premises systems—but the reality is that you need to. What troubled me is that only 48 percent of the mid-sized businesses and only 36 percent of the large enterprises agree that cloud actually improved the business. I suspect that those who do not see the value have yet to complete a project’s successful journey to the cloud. But still, this figure should be higher.


Five steps for getting started in machine learning: Top data scientists share their tips

"If someone has programming fundamentals then, from a technical point of view, I think that's enough for them to dive into machine learning," he says. "You're not gonna get very far if you can't program at all, because that's ultimately how you configure machine-learning frameworks: through programming." "I think strong math was probably more essential before than it is now. It's certainly helpful to have mathematical knowledge if you want to develop custom layers or if you're really going very, very deep on a problem. But for people starting out, it's not critical." In some respects, it's just as important to have a willingness to seek out new information, says Yangqing Jia, director of engineering at Facebook. "As long as you keep an exploratory mindset there's such an abundance of tools nowadays you'll be able to learn a lot of things yourself, and you have to learn things yourself because the field is growing really fast."


Researchers expose security vulnerabilities in terahertz data links

“In microwave communications, an eavesdropper can put an antenna just about anywhere in the broadcast cone and pick up the signal without interfering with the intended receiver,” Mittleman said. “Assuming that the attacker can decode that signal, they can then eavesdrop without being detected. But in terahertz networks, the narrow beams would mean that an eavesdropper would have to place the antenna between the transmitter and receiver. The thought was that there would be no way to do that without blocking some or all of the signal, which would make an eavesdropping attempt easily detectable by the intended receiver.” Mittleman and colleagues from Brown, Rice University and the University at Buffalo set out to test that notion. They set up a direct line-of-sight terahertz data link between a transmitter and receiver, and experimented with devices capable of intercepting the signal. They were able to show several strategies that could steal the signal without being detected, even when the data-carrying beam is very directional, with a cone angle of less than 2 degrees.
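To put the geometry in perspective: a conical beam's footprint grows linearly with distance, so a sub-2-degree terahertz beam stays only a few meters wide even far from the transmitter, while a typical microwave broadcast cone covers a vastly larger area. A back-of-the-envelope sketch (the 100 m range and 60-degree microwave cone are illustrative assumptions, not figures from the study):

```python
import math

def beam_diameter(distance_m: float, cone_angle_deg: float) -> float:
    """Diameter of a conical beam's footprint at a given distance."""
    half_angle = math.radians(cone_angle_deg / 2)
    return 2 * distance_m * math.tan(half_angle)

# A 2-degree terahertz beam spreads to ~3.5 m across at 100 m,
# while a hypothetical 60-degree microwave cone exceeds 115 m.
print(f"{beam_diameter(100, 2):.2f} m")   # narrow terahertz beam
print(f"{beam_diameter(100, 60):.2f} m")  # wide broadcast cone
```

The narrow footprint is why interception was assumed to require physically sitting in the beam path.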


The Three Dimensions of the Threat Intelligence Scale Problem

There is a massive amount of external TI that organizations can access to improve cyber defense. While cost can be a constraint for expensive commercial threat feeds, there are plenty of lower-cost and even free threat feeds available from open source, government, and industry sources. While access to external TI is not an issue, the scale problem lies in managing, maintaining, and making effective use of TI. Some of these challenges include: managing multiple threat feeds that come in different formats; ensuring your threat feeds are constantly up to date; and integrating TI into your security operations so that you can use it to improve security. The process of integrating TI into security operations is particularly interesting because it directly leads into another dimension of the network security TI scale problem. While organizations can turn to external TI to make up for the lack of access that a next-generation firewall provides, this same limitation hits you on the other side by hindering your ability to take action based on external TI. It's like a double firewall TI whammy!
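The feed-format challenge above is concrete: one provider ships CSV, another JSON, and the indicators overlap. A minimal sketch of normalizing two hypothetical feeds into one deduplicated indicator set (the field names and feed layouts are invented for illustration):

```python
import csv
import io
import json

def parse_csv_feed(text):
    """Hypothetical CSV feed: one 'indicator' column per row."""
    return {row["indicator"].strip().lower() for row in csv.DictReader(io.StringIO(text))}

def parse_json_feed(text):
    """Hypothetical JSON feed shaped as {"iocs": [{"value": ...}, ...]}."""
    return {entry["value"].strip().lower() for entry in json.loads(text)["iocs"]}

csv_feed = "indicator\n198.51.100.7\nEvil.Example.COM\n"
json_feed = '{"iocs": [{"value": "198.51.100.7"}, {"value": "203.0.113.9"}]}'

# Normalizing to lowercase lets duplicates across feeds collapse.
indicators = parse_csv_feed(csv_feed) | parse_json_feed(json_feed)
print(sorted(indicators))
```

Real deployments would map many more feed schemas (STIX, MISP exports, plain blocklists) onto the same normalized form before pushing indicators into security controls.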



Quote for the day:


“The only thing worse than training your employees and having them leave is not training them and having them stay.” -- Henry Ford


Daily Tech Digest - October 16, 2018

The future of fintech will require security, not just innovation

But this innovation must not come at the expense of security. The evolving technology and regulatory landscape mean that cloud technologies must have security baked in at their core. Financial services must also not overlook the security risk associated with the creation of banking apps in the open banking environment – in particular, API security. As developers within banks and fintech companies use APIs to connect technologies (most commonly apps, but also platforms and systems), they create new digital banking innovations and remove barriers to allow more efficient, simpler ways to kickstart innovative programs. But while the value of interconnected applications is undeniable, there are also significant risks. APIs provide open connections between platforms; a failure to protect these connections gives hackers the opportunity to attack API services with stolen or invalid credentials. It is essential that developers and security teams within these organisations pay close attention to securing APIs. To illustrate this, if you visualise opening a door, you want to make sure only the right people (or in this case, apps) have the correct keys.
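As a small illustration of the "right keys" point, one basic defensive habit when checking presented API credentials is to store only hashes and compare them in constant time, so an attacker probing with stolen or guessed keys learns nothing from response timing. A hedged sketch (the key store and key values are hypothetical):

```python
import hashlib
import hmac

# Hypothetical store of hashed API keys (never store raw keys).
VALID_KEY_HASHES = {
    hashlib.sha256(b"demo-key-123").hexdigest(),
}

def is_authorized(presented_key: str) -> bool:
    """Constant-time check of a presented API key against the store."""
    digest = hashlib.sha256(presented_key.encode()).hexdigest()
    return any(hmac.compare_digest(digest, stored) for stored in VALID_KEY_HASHES)

print(is_authorized("demo-key-123"))  # True
print(is_authorized("stolen-guess"))  # False
```

Production API security layers far more on top of this (OAuth scopes, rate limiting, key rotation), but the constant-time comparison is a representative low-level habit.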



Five Ways You’re Already Using Machine Learning: A Day with AI

Whether you’re team Lyft or prefer to hop in an Uber, both services, much like Google Maps, power decisions with AI. Driver assignments, driver ETA, and your ETA at your final destination are all calculated by algorithms that are constantly tested and refined in real time, using machine learning and the massive quantities of data from drivers and customers. One important thing for many: rideshare companies are using machine learning to help beat the dreaded ‘surge price’. Surge pricing, or time-limited price hikes, currently compensates for times when there are not enough cars on the road to supply all the passengers who want rides. Ideally, machine learning could anticipate times of high demand (say, commute times in April on the East Coast when it frequently rains) and incent cars to be on the road, in advance. ... Simple rules-based filters are used for the spam filter. Think of the words and phrases “pharmacy”, “you’ve won the lottery” or “Nigerian prince”. While you may very well be friends with a Nigerian prince, if the message seems suspicious, and is coming from an unknown sender, then it will probably get flagged and kicked to spam.
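The rules-based filtering described above can be sketched in a few lines: flag a message when it contains a suspicious phrase and comes from an unknown sender (the phrase list and known-sender set here are illustrative):

```python
SPAM_PHRASES = ("pharmacy", "you've won the lottery", "nigerian prince")
KNOWN_SENDERS = {"friend@example.com"}

def is_spam(sender: str, body: str) -> bool:
    """Flag messages containing suspicious phrases from unknown senders."""
    suspicious = any(phrase in body.lower() for phrase in SPAM_PHRASES)
    return suspicious and sender not in KNOWN_SENDERS

print(is_spam("stranger@example.net", "A Nigerian prince needs your help"))  # True
print(is_spam("friend@example.com", "Remember that Nigerian prince joke?"))  # False
```

Real spam filters long ago layered statistical and learned models on top of rules like these, but the rule component still handles the obvious cases cheaply.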


Good data governance is good business


“We need to move away from ‘digitised’ to ‘digital’ because a lot of companies have been focused on digitising their processes instead of making them truly digital,” she said. “In an attempt to create a digital channel, they are doing things like putting forms online, instead of thinking about tokenising identity, access, authentication and authorisation, as well as about how to remove friction and improve compliance.” As a result, there are some new roles that are emerging in the ecosystem, said Dow. “There is more and more need for a relying party to feel that they can trust where the data is coming from, trust its provenance, and trust who is supplying it and who is vouching for it.” With the explosion of internet-connected devices making up the internet of things, Dow said there are now many more things to trust, authenticate and authorise. “We need to get the utility layer right around consent, otherwise it becomes just another bad cookie policy,” she said. “In addition, we desperately need better governance and transparency around how data is collected and protected.”


Artificial Intelligence Needs a Strong Data Foundation

It is far easier (and less risky) if a bank or credit union wants to use data insights for internal purposes. Analyzing customer acquisition, attrition, product utilization and cross-selling for department managers has less risk than using this same analysis to communicate with the customer. During this learn and optimize stage, Rogati states, “We need to have a (however primitive) A/B testing or experimentation framework in place, so we can deploy incrementally to avoid disasters and get a rough estimate of the effects of the changes before they affect everybody.” Rogati also stresses the need for establishing a baseline to measure results against. Simple machine learning algorithms like logistic regression are also recommended at this stage to ensure all needed insights are included in the dataset. Again, time spent on this stage will reduce challenges and improve results down the road. There is no guarantee that machine learning and AI will improve your results. Similar to a turbocharged car with bad wheel alignment or bad brakes, the most advanced data analytics tools may simply get you to the wrong outcome faster.
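Rogati's baseline point is easy to make concrete: before crediting a model, compare it against the dumbest possible predictor, such as always guessing the majority class. A minimal sketch with hypothetical attrition labels:

```python
from collections import Counter

# Hypothetical attrition labels: 1 = customer left, 0 = stayed.
labels = [0, 0, 0, 1, 0, 0, 1, 0, 0, 0]

def majority_baseline_accuracy(y):
    """Accuracy of always predicting the most common class."""
    most_common_count = Counter(y).most_common(1)[0][1]
    return most_common_count / len(y)

baseline = majority_baseline_accuracy(labels)
print(f"baseline accuracy: {baseline:.0%}")  # 80%
# A model scoring 82% barely beats always guessing "stayed";
# the baseline keeps that comparison honest.
```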


Disruption coming to our cities, roads and skies

panasonic-disruption-connected-worlds
Connected vehicles with advanced driver-assistance systems are already entering our roadways. Built with equipment such as top view camera systems that provide a 360-degree bird’s eye view – plus sensors that detect airbag deployment, windshield wiper operation, engagement of brakes, etc. – these vehicles can transmit data about their status to other connected vehicles on the roadways and emerging intelligent roadway infrastructure. Technology decision makers in Connected World industries (automotive, aviation, government transportation agencies) expect connected vehicles to deliver a whole host of important benefits: fewer collisions, reduced congestion, less pollution, and more connected and informed driving (and riding) experiences. Autonomous vehicle technology builds on this connected vehicle foundation. Many tech decision makers in our Connected World industries think riding in an autonomous vehicle will be exciting, while a few admit it might terrify them.


Survey Reveals That Enterprises Are Entering the Third Era of IT

“The ability to support greater scale is being invested in and developed in three key areas: volume, scope and agility. All aim at encouraging consumers to interact with the organization,” Mr. Rowsell-Jones explained. “For example, increasing the scope means providing a variety of digital services and actions to the consumer. In general, the greater the variety of interactions that are available via digital channels, the more engaged a consumer becomes and the lower the costs to serve them are.” The transformation toward digital business is supported by steady IT budget growth. Globally, CIOs expect their IT budgets to grow by 2.9 percent in 2019. This is only slightly less than the 2018 average growth rate of 3 percent. A look at the regional differences shows that the regions are moving closer together: The leader in budget growth is once again Asia/Pacific with an expected growth of 3.5 percent. However, this is a significant cut from the 5.1 percent projected budget increase in 2018.


What Is A Data Lake? A Super-Simple Explanation For Anyone


Some mistakenly believe that a data lake is just the 2.0 version of a data warehouse. While they are similar, they are different tools that should be used for different purposes. James Dixon, the CTO of Pentaho, is credited with naming the concept of a data lake. He uses the following analogy: “If you think of a datamart as a store of bottled water – cleansed and packaged and structured for easy consumption – the data lake is a large body of water in a more natural state. The contents of the data lake stream in from a source to fill the lake, and various users of the lake can come to examine, dive in, or take samples.” A data lake holds data in an unstructured way and there is no hierarchy or organization among the individual pieces of data. It holds data in its rawest form—it’s not processed or analyzed. Additionally, a data lake accepts and retains all data from all data sources and supports all data types; schemas (the way the data is stored in a database) are applied only when the data is ready to be used. ... A data warehouse stores data in an organized manner with everything archived and ordered in a defined way.
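That "schema applied only when the data is ready to be used" property is often called schema-on-read. A minimal sketch: raw records land in the lake in whatever shape they arrive, and a uniform schema is imposed only at query time (the record layout is invented for illustration):

```python
import json

# A data lake keeps records raw, whatever shape they arrived in.
raw_lake = [
    '{"user": "ana", "clicked": "yes", "ts": "2018-10-17"}',
    '{"user": "bo", "clicks": 3}',  # different fields, still accepted
]

def read_with_schema(raw_records):
    """Apply a schema only at read time (schema-on-read)."""
    for line in raw_records:
        record = json.loads(line)
        yield {
            "user": record.get("user", "unknown"),
            "clicks": int(record.get("clicks", 1 if record.get("clicked") == "yes" else 0)),
        }

print(list(read_with_schema(raw_lake)))
```

A warehouse would instead reject or transform the second record at load time (schema-on-write), which is the core operational difference the analogy gestures at.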


Spray-on antennas will revolutionize the Internet of Things

The way the concept works is that titanium carbide compounds are dissolved in water to make the paint. The compound derives from a type of materials-science product called MXene (invented at Drexel in 2011 and pronounced "maksens"), which is basically an inorganic, super-thin material only a few atoms thick that combines conductive metal with water-dissolving characteristics. In the lab tests, the material is then sprayed onto the object using a craft-style airbrush. When the water evaporates, the antenna remains. “The exceptional conductivity of the material enables it to transmit and direct radio waves, even when it’s applied in a very thin coating.” It’s extremely conductive, the researchers say. Thinness, such as the tens-of-nanometers-to-microns thickness the group has obtained with the transparent antennas, would also reduce weight for IoT devices. That’s crucial for some tracking sensors, such as those used in shipping. The lightness could also have a knock-on effect in reducing sensor power consumption — the lighter a drone is


Pushing the Boundaries of Computer Vision

Although augmented reality has occasionally been described as a bridge to true virtual reality, AR is actually more difficult to implement in some ways. Nevertheless, the technology has evolved rapidly in recent years, thanks in part to computer vision advances. At the core of AR is a challenge relevant to other fields of computer vision: object recognition. Small variations in objects can prove challenging for image recognition software, and even a change in lighting can cause mismatches. Experts at Facebook and other companies have made tremendous progress through deep learning and other artificial intelligence fields, and these advances have the potential to make AR and other vision fields dependent on object recognition more powerful in the coming years. Another transformative use case is predicted to be agriculture. Agricultural science is charged with feeding the world, and computers have been making major strides in the field in recent years.


4 ways AI will impact the financial job market

In the new wave of AI, opportunities and challenges exist at the same time. On the positive side, AI could increase automation, support intelligent analysis and decision-making, and create new business models and industries. But AI also carries a series of risks. In the financial industry, potential risks include micro-financial risk and macro-financial risk. The former could influence the stability of markets, causing turmoil. The latter could trigger risk around market concentration, market loopholes, connection and technology. Language and vision have been the two major breakthroughs in AI so far, according to research from the BCG Henderson Institute. Machine vision and speech recognition give machines cognitive skills, allowing AI to be applied in real-world contexts, which will change all aspects of society in the future. The research also reveals that industry users understand AI from three dimensions: data, processes and actions. AI improves workflows by processing structured data as well as unstructured language and image information to deliver new products and services, and provide data or physical feedback.



Quote for the day:


"A single question can be more influential than a thousand statements." -- Bo Bennett


Daily Tech Digest - October 15, 2018

We Need to be Examining the Ethics and Governance of Artificial Intelligence


Recently, the role that pre-crime and artificial intelligence can play in our world has been explored in episodes of the popular Netflix TV show Black Mirror, focusing on the debate between free will and determinism. Working in counter-terrorism, I know that the use of artificial intelligence in the security space is fast becoming a reality. After all, decisions and choices previously made by humans are being increasingly delegated to algorithms, which can advise, and decide, how data is interpreted and what actions should result. Take the example of new technology that can recognize not just our faces but also determine our mood and map our body language. Such systems can even tell a real smile from a fake one. Being able to utilize this in predicting the risk of a security threat in a crowded airport or train station, and prevent it from occurring, for example, would be useful. Some conversations I have had with individuals working in cyber-security indicate that it is already being done.



UK gov launches 'world's first' Code of Practice for IoT security
The Code defines 13 guidelines for manufacturers, service providers, developers and retailers to implement in order to ensure that IoT products are safe to use. They are: no default passwords; implement a vulnerability disclosure policy; keep software updated; securely store credentials and security-sensitive data; communicate securely; minimise exposed attack surfaces; ensure software integrity; ensure that personal data is protected; make systems resilient to outages; monitor system telemetry data; make it easy for consumers to delete personal data; make installation and maintenance of devices easy; and validate input data. HP Inc. and Centrica Hive are the first companies to sign up to the new Code. Minister for Digital Margot James said that these pledges are "a welcome first step," but "it is vital other manufacturers follow their lead to ensure strong security measures are built into everyday technology from the moment it is designed."




The so-called password-less authentication, if implemented literally, would lead us to a world where we are deprived of the chances and means to get our volition confirmed in having our identity authenticated. It would be a 1984-like world, one not compatible with the values of democratic societies. Some people allege that passwords can and will be eliminated by biometrics or a PIN. But logic tells us it can never happen, because the former requires a password/PIN as a fallback means and the latter is no more than the weakest form of numbers-only password. Various debates over ‘password-less’ or ‘beyond-password’ authentication only make it clear that the solution to the password predicament can be found only inside the family of broadly defined passwords. ... If a PIN or PINCODE, which is the weakest form of numbers-only password, had the power to kill the password, a small sedan should be able to kill the automobile. Advocates of this idea seem to claim that a PIN is stronger than a password when it is linked to a device while the password is not.
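The "weakest form of numbers-only password" claim is straightforward to quantify: a secret's entropy grows with both alphabet size and length, and a 4-digit PIN sits far below even a short mixed-character password. A quick sketch:

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy for a uniformly random secret."""
    return length * math.log2(alphabet_size)

# 10 digits vs. 62 mixed-case letters and digits.
print(f"4-digit PIN:       {entropy_bits(10, 4):.1f} bits")   # ~13.3
print(f"8-char mixed pass: {entropy_bits(62, 8):.1f} bits")   # ~47.6
```

Device binding changes the threat model (an attacker needs the device), but it does not change the raw guessing space of the secret itself, which is the point the passage is making.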



Juniper advances network automation community, skillsets

“Since a critical part of automated operations is the individual engineers and processes they follow, Juniper has put deliberate investment into these areas by introducing many formal and informal training programs, cloud-based lab services, testing as a service, free trials, live throwdowns and [the new] Juniper Engineering Network (EngNet),” Koley wrote. Juniper EngNet is a portal that includes a variety of automation tools, resources and social communities. According to the vendor, the site features API documentation, access to Juniper Labs, virtual resources, a learning portal and an automation exchange of useful network automation tools. “Juniper Engineering Network is aimed at elevating the entire networking community to move beyond incumbent CLI knowledge and toward an automated, abstracted, self-driving technology. The networking community, including Juniper customers and partners, can contribute to the Automation Exchange within the community," Juniper stated.


AI is no silver bullet for cyber security


“AI is not a silver bullet – when you look at the technology, you have to make sure that senior management is aware of its risks and you don’t invest in it unless you already have good cyber hygiene – starting with people,” said Pereira. User education is crucial, he said, because successful cyber attackers often exploit human weaknesses and emotions through social engineering and spear phishing to penetrate a system. “Those who don’t know how phishing attacks work will fall prey to them,” he said. “The panacea and antidote for phishing attacks is cyber education, which, when tailored for a person or function, is more effective than technology in stopping such attacks in many cases.” In deciding when and how to adopt AI to improve cyber security, Pereira said organisations should start with projects that address human and people risks, followed by processes and technology. “And when you get to the technology part, AI shouldn’t come first, but rather look at it as a way to enhance security processes, such as making it faster to review logs,” he said.


Deloitte says CIOs need to adapt or perish

Deloitte CIO survey
Deloitte says that in 2018 CIOs need a better grasp on the big picture, and that means looking ‘inward’, ‘across’ and ‘beyond’ the business. “The digital era presents CIOs with the opportunity to look inward and reinvent themselves by breaking out of the trusted operator mould,” says the report. “We note, as in previous surveys, the importance of strong relationships to the CIO’s business success. This year we suggest that developing a technology fluency programme can help create a solid foundation for these relationship-building efforts. A tech fluency programme can provide organisations with knowledge about technology trends, scalability of emerging technologies and complexities of managing legacy core systems – while enabling CIOs to understand internal and external customer perspectives. “CIOs can also look across the IT organisation and transform it, particularly by focusing on the IT operating model, funding priorities and budget allocation, and tech talent and culture at the heart of their digital agendas.”


Why your machine-learning team needs better feature-engineering skills


The skill of feature engineering — crafting data features optimized for machine learning — is as old as data science itself. But it’s a skill I’ve noticed is becoming more and more neglected. The high demand for machine learning has produced a large pool of data scientists who have developed expertise in tools and algorithms but lack the experience and industry-specific domain knowledge that feature engineering requires. And they are trying to compensate for that with better tools and algorithms. However, algorithms are now a commodity and don’t generate corporate IP. Generic data is becoming commoditized and cloud-based Machine Learning Services (MLaaS) like Amazon ML and Google AutoML now make it possible for even less experienced team members to run data models and get predictions within minutes. As a result, power is shifting to companies that develop an organizational competency in collecting or manufacturing proprietary data — enabled by feature engineering. Simple data acquisition and model building are no longer enough.
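A small example of what feature engineering means in practice: turning a raw timestamp into features that encode domain knowledge a generic algorithm would never infer on its own (the event data is hypothetical):

```python
from datetime import datetime

raw_events = [
    {"user": "ana", "ts": "2018-10-15T09:05:00"},  # a Monday morning
    {"user": "ana", "ts": "2018-10-20T23:40:00"},  # a Saturday night
]

def engineer_features(event):
    """Turn a raw timestamp into features a model can actually use."""
    ts = datetime.fromisoformat(event["ts"])
    return {
        "user": event["user"],
        "day_of_week": ts.weekday(),  # 0 = Monday
        "hour": ts.hour,
        "is_weekend": ts.weekday() >= 5,
        "is_business_hours": 9 <= ts.hour < 17 and ts.weekday() < 5,
    }

for e in raw_events:
    print(engineer_features(e))
```

The `is_business_hours` flag is exactly the kind of industry-specific judgment call (which hours? whose calendar?) that no AutoML service will make for you.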


How blockchain technology is transforming healthcare cybersecurity

An additional critical feature of blockchain technology is that every member of a blockchain generally can access and audit the entire ledger. This allows all interested parties to confirm and update the information contained in individual blocks. Another significant benefit is that laws and regulations can be programmed into the blockchain as smart contracts. Smart contracts are logical rules programmed into the blockchain. They are self-executing contracts where the built-in agreement is enforced on all members. Smart contracts mimic traditional contracts and laws, and can be used to program in obligations and consequences. In this way, the requirements of specific data privacy and security laws, such as the Health Insurance Portability and Accountability Act of 1996 or the European Union General Data Protection Regulation, can be embedded in the blockchain. Innovators are already experimenting with blockchain use cases in the healthcare context that demonstrate many of the blockchain security benefits.
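The auditability described above rests on each block embedding a hash of its predecessor, so any retroactive edit breaks the chain for every member to see. A minimal illustration (a toy hash chain, not a real blockchain: no consensus, signatures or smart contracts):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev_hash": prev_hash}

def audit(chain):
    """Any member can verify no block was altered after the fact."""
    for prev, current in zip(chain, chain[1:]):
        if current["prev_hash"] != block_hash(prev):
            return False
    return True

genesis = make_block({"record": "patient consent granted"}, prev_hash="0" * 64)
second = make_block({"record": "record accessed by Dr. A"}, prev_hash=block_hash(genesis))
chain = [genesis, second]

print(audit(chain))                     # True
genesis["data"]["record"] = "tampered"  # any edit breaks the chain
print(audit(chain))                     # False
```

Smart contracts add executable rules on top of this same tamper-evident ledger, which is what lets regulatory obligations be enforced programmatically.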


How To Integrate AI Into The Enterprise


Overcoming ignorance is a good place to start, and the tutorial given by Hammond was a pleasing break from many technology events that are largely attended by people from whatever discipline the event covers. Data scientists attend data events, roboticists attend robotics events, and so on. At the O'Reilly event, however, techies were in the minority, with most of the attendees from managerial functions. The session began by providing an overview of what AI is, with a whistle-stop tour of machine learning, and specifically the nature of learning itself, which feeds into the supervised, unsupervised and reinforcement learning models used by all machine learning systems today. Machine learning is, of course, just one aspect of AI, with McKinsey recently identifying five distinct forms, including physical AI, computer vision, natural-language processing, natural-language generation, and machine learning. Understanding what each of these is, even on a basic level, can help you make informed choices and not be suckered in by hype.


Criminals' Cryptocurrency Addiction Continues

"With the increasing, malicious focus on cryptocurrency-related threats, attacks and exploits, it is clear that criminal innovation in this space continues unabated," Ferguson tells Information Security Media Group. "Starting from attacks targeting cryptocurrency wallets on individual users' machines - either directly or as an add-on to some widespread ransomware variants - attackers have rapidly diversified into direct breaches of cryptocurrency exchanges, malware for mining on traditional, mobile and even IoT devices, and developed attack methodologies specifically designed to target the mechanics of blockchain-based transactions, such as the 51 percent attack." The 51 percent attack gives attackers who can control more than 50 percent of a network's hash rate - or computing power - the power to reverse transactions on the blockchain or double-spend coins. The first half of this year saw five successful 51 percent attacks leading to "direct financial losses ranging from $0.55 million to $18 million," Moscow-based cybersecurity firm Group-IB says in a recently released cybercrime trends report.
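The arithmetic behind the 51 percent threshold can be sketched using the random-walk analysis from the original Bitcoin whitepaper: an attacker mining a secret fork from z blocks behind catches up with probability (q/p)^z when their share q of the hash rate is below one half, and with certainty otherwise:

```python
# Why 51 percent matters: probability that an attacker mining a secret
# fork ever overtakes the honest chain from z blocks behind, following
# the random-walk analysis in the Bitcoin whitepaper.

def catch_up_probability(q: float, z: int) -> float:
    """q = attacker's share of total hash rate, z = blocks of deficit."""
    p = 1.0 - q                      # honest miners' share
    return 1.0 if q >= p else (q / p) ** z

# Below half the hash rate, the chance decays exponentially with z...
assert catch_up_probability(0.30, 6) < 0.01
# ...but at 51 percent, reversal is guaranteed given enough time.
assert catch_up_probability(0.51, 6) == 1.0
```

This is why exchanges typically wait for several confirmations before crediting a deposit, and why majority hash power makes double-spending a matter of patience rather than luck.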



Quote for the day:


"Leaders should influence others in such a way that it builds people up, encourages and edifies them so they can duplicate this attitude in others." -- Bob Goshen


Daily Tech Digest - October 14, 2018


According to the sources, global fintech companies reportedly sought an extension of the October 15 deadline, but it seems that the RBI is not inclined to relax the norms. Data localisation requires that data about residents be collected, processed, and stored inside the country, often before being transferred internationally, and usually transferred only after meeting local privacy or data protection laws. Although domestic companies have welcomed the guidelines, global companies fear an increase in their expenses from the creation of local servers. To avoid this rise in cost, global companies in a recent meeting with the RBI proposed to provide mirror data instead of original data, to which the central bank did not agree, the sources said. Last week, Finance Minister Arun Jaitley met RBI Deputy Governor B P Kanungo to discuss the RBI’s data localisation norms. The meeting was also attended by Economic Affairs Secretary Subhash Chandra Garg, Financial Services Secretary Rajiv Kumar and IT Secretary Ajay Prakash Sawhney.



The Data Quality Tipping Point

It’s clear that data is no longer simply harvested and stored. Data isn’t left to rest any longer. It is the lifeblood that flows through every department in the business. It’s not just the result of a decision: it’s the driving force for your next move. Old, inaccurate and messy data can’t support the marketing department. If the data is old, it cannot be used as a concrete and reliable resource. And if you aren’t continually cleaning new data as it comes in, you can’t capitalise on trends or make decisions about what is and isn’t working. So we’re clear that data quality initiatives must run in parallel with business activities, rather than being carried out sporadically, and there needs to be a constant and attentive process to keep data clean. That means there’s a need for an ongoing investment in data governance, within the parameters of your budget. Few businesses have the budget to put extravagant data management processes in place. It would be wonderful to conduct data reviews every morning, or to implement highly elaborate verification and enhancement programs.


Creating a Culture that Works for Data Science and Engineering


While both groups on the team are turning out great code, it’s challenging as a project manager to follow two different streams of work. Sometimes the two groups are working on similar things, but sometimes the data scientists are working on something in the very distant future for the engineers. The most important thing a cross-functional team can do is have everyone come to stand up every day. When we first told the data scientists about our daily “meetings,” they went pale in the face. “Every day?” they asked, with a look of panic in their eyes. I stood firm. It was the right call. Our daily meetings allow the engineers on our team to quickly start working from an informed place when R&D introduces a new project. Furthermore, we are benefiting from the best parts of agile with this approach; I love hearing everyone bounce ideas off each other in stand up. My favorite is when there’s a cross-functional “Ooo did you think about taking this approach?” We work better as a team and we have found a way to leverage everyone’s expertise.



The tech supply chain is more vulnerable than ever


It’s a great business model — especially when you consider that only 38 percent of companies are actively monitoring and managing their software supply chain hygiene. Today, the game has changed. Organizations now must contend with the fact that hackers are intentionally planting vulnerabilities directly into the supply of open source components. In one such example from February 2018, a core contributor to the conventional-changelog ecosystem (a common JavaScript code package) had his commit credentials compromised. A bad actor, using these credentials, published a malicious version of conventional-changelog (version 1.2.0) to npmjs.com. While the intentionally compromised component was only available in the supply chain for 35 hours, estimates are that it was downloaded and installed more than 28,000 times. Some percentage of these vulnerable components were then assembled into applications that were then released into production. The result is that these organizations then unwittingly released a Monero cryptocurrency miner into the wild — and the perpetrators of the supply chain hack profited handsomely.
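One common mitigation, sketched below under invented names and contents, is to pin exact versions and verify artifact digests before install, so a maliciously republished package fails verification even if it carries the expected version number:

```python
# Supply chain hygiene in miniature: pin the digest of the artifact you
# reviewed, and refuse anything that differs. Names and bytes are invented.

import hashlib

def sha256_hex(payload: bytes) -> str:
    return hashlib.sha256(payload).hexdigest()

# Record the digest of the artifact at review time...
good = b"legitimate package contents"
pinned = {"pkg-1.2.0.tgz": sha256_hex(good)}

def verify(name: str, payload: bytes) -> bool:
    """Refuse to install anything whose digest differs from the pin."""
    return pinned.get(name) == sha256_hex(payload)

assert verify("pkg-1.2.0.tgz", good)
# A republished, tampered 1.2.0 has a different digest and is rejected.
assert not verify("pkg-1.2.0.tgz", b"tampered contents")
```

Modern package managers implement exactly this idea through lockfiles with integrity hashes; the sketch just makes the mechanism explicit.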



How to use machine learning to build a predictive algorithm

You also have to make sure you're integrating not only data and platforms, but domain experts who bring invaluable information and skills to the data science team, according to David Ledbetter, a data scientist at Children's Hospital Los Angeles. "The machine learning community often isolates themselves and thinks they can solve all the problems, but domain experts bring value," Ledbetter said during a panel discussion at the AI World Conference & Expo in Boston in December. "Every time we meet with the clinical team, we learn something about what's going on with the data." The project team, with its mix of skills, also needs to identify good vs. bad outcomes based on the business problem you're trying to solve with a predictive algorithm. "It's important to set clear success criteria at the beginning of a project, and [to] pick something that has a reasonable likelihood of success," said William Mark, president of SRI International, a research and development firm that works on AI projects for customers, during the same panel discussion at AI World.
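"Clear success criteria" in practice often means fixing a metric and a baseline before any modeling begins. The sketch below, with invented labels and hypothetical model predictions, compares a model against a majority-class baseline:

```python
# Success criteria in miniature: define the metric and the baseline up
# front, then require the model to beat it. Labels and predictions invented.

def accuracy(preds, labels):
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

labels = [1, 0, 1, 1, 0, 1, 0, 1]               # observed good/bad outcomes
majority = max(set(labels), key=labels.count)    # naive always-predict-1 baseline
baseline = accuracy([majority] * len(labels), labels)

model_preds = [1, 0, 1, 1, 0, 0, 0, 1]           # hypothetical model output
assert accuracy(model_preds, labels) > baseline  # the success criterion
```

If a model cannot clear even this trivial bar, the project's "reasonable likelihood of success" deserves a second look before more is invested.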


Cloud-agnostic container platforms – it’s all to play for

Container-as-a-service (CaaS) products from the major cloud vendors, notably AWS EKS and Fargate, Azure AKS and Container Instances and Google Cloud Container Engine, present classic trade-offs between convenience and dependence. With their ability to tap into a plethora of cloud data, security and developer services that are unique in implementation if not conception, container products from the big three vendors can trap users in a maze of platform dependencies with no easy exit path. As container use in the enterprise moves from developer sandboxes to production systems, the desire for multi-environment portability presents an opportunity to devise standards, software, and automation systems that facilitate platform-agnostic container platforms. The idea is to ensure easy migration between private and public container environments. Recent announcements from Cisco, Google, and Pivotal Software are important milestones on the road to platform-agnostic container infrastructure.


Welcome to Banking-as-a-Service

The underlying theme of this kind of disruption is the unbundling of supply and service. Banking has come late to the unbundling revolution. But now, the sector is ripe for it - for unbundling, or disaggregation - and ripe for its own Software-as-a-Service transformation that will allow customers to pick and choose and pay for applications as they use them. Software-as-a-Service (SaaS) businesses delivered by APIs have a low-touch sales model. These companies don’t sell; buyers help themselves. Low-touch sales combined with recurring revenues and lack of customer concentration are the three hallmarks of a SaaS business. In many cases these businesses are just better in all senses. But combining these three essential ingredients on their own will not be enough. The winners in this field are likely to be nimble specialists capable of creating plug-in-and-play APIs to allow anything to be processed anywhere, rather than the large - slow - generalists of the past. Starling is well-placed in this regard. We have built Starling with a set of public APIs that are freely available for anyone to use through our developer portal. 


5 Tips to Boost Your Company's Digital Transformation With BPM


With tools such as artificial intelligence and machine learning, reams of data can be processed in the blink of an eye, providing insights into how an organization can better meet customer needs. Often, this optimization is a product of changes in business process management, or BPM. Even the most basic organizations function through processes. There might be a process for acquiring leads, a process for vetting them, and a process for making a sale. After you convert a prospect, there's a process for invoicing the customer, one for fulfilling the order, and one for delivering the product. There are also strictly internal processes, such as those triggered when employees ask for time off or request tech support. BPM refers to the management of these procedures, such as ensuring they are effective and determining how to combine them in the most efficient way. When implemented effectively, BPM helps organizations streamline their day-to-day processes, making work more efficient. But implementing BPM or other digital transformations without full buy-in from your team can lead to a lack of teamwork or other disadvantages. 


APIs In Banking: Unlocking Business Value With Banking As A Platform (BaaP)

Banking as a Platform (BaaP), sometimes referred to as Banking as a Service (BaaS), occurs when a bank acts as an infrastructure provider to external third parties. Variations include other banks white-labeling the BaaP platform for faster time to market, fintech firms leveraging the BaaP provider’s banking license to provision bank accounts, and banks and fintechs using the BaaP platform for testing purposes. Banks like CBW, Fidor, JB Financial, solarisBank, and Wirecard built their BaaP architecture from scratch, without the constraint of legacy systems, creating modular application stacks broken into discrete services. The modular banking services on a BaaP platform serve as building blocks, accessible to third parties through an API management layer, where they can be mixed and matched to create new products and services tailored to the third party’s business model.


Life Is Dirty. So Is Your Data. Get Used to It.

As Dr. Hammond suggests, it's difficult to determine if data is ever clean. Even scientific constants are known only to a certain degree of accuracy. They are "good enough," but not perfect. Data's ultimate purpose is to drive decisions. Bad data means bad decisions. As data professionals, it is up to us to help keep data "good enough" for use by others. We have to think of ourselves as data janitors. But nobody goes to school to become a data janitor. Let's talk about options for cleaning dirty data. Here's a handful of techniques that you should consider when working with data. Remember, all data is dirty; you won't be able to make it perfect. Your focus should be on making it "good enough" to pass along to the next person. The first thing you should do when working with a dataset is to examine the data. Ask yourself, "Does this data make sense?" That's what we did in the example above. We looked at the first few rows of data and found that both the city and country were listed inside one column.
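The city-and-country problem can be handled with exactly this "good enough" mindset: split on the delimiter where the pattern holds, and route the exceptions to a human instead of guessing. A minimal sketch with invented sample data:

```python
# "Good enough" cleaning: split a combined city/country column on its
# delimiter and flag rows that don't fit the pattern. Sample data invented.

rows = ["Paris, France", "Tokyo, Japan", "Springfield"]

clean, suspect = [], []
for value in rows:
    parts = [p.strip() for p in value.split(",")]
    if len(parts) == 2:
        clean.append({"city": parts[0], "country": parts[1]})
    else:
        suspect.append(value)  # route to a human; don't invent a country

assert len(clean) == 2
assert suspect == ["Springfield"]
```

The `suspect` bucket is the janitorial part of the job: the data is not perfect, but it is now explicit about where it is dirty.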



Quote for the day:


"Courage is more exhilarating than fear and in the long run it is easier." -- Eleanor Roosevelt


Daily Tech Digest - October 13, 2018

Of the survey respondents who report a blockchain project in the pilot stage, 54 percent say the effort sometimes or often hasn’t been justified by the result. This should be a call to more effective action. To help executives answer that call, the report offers four strategies that can be used to build trust.  ... The participants in a blockchain ecosystem need to decide what the operating standards will be and what various users will be able to see and do. The design begins with the strategic business model, which includes making decisions about whether the blockchain will be permissionless, and thus available to everyone, or permissioned (having various levels of permissions). Permissions determine participants’ roles and engagement with the blockchain, which can vary from entering information or transactions to only viewing information. The choice of model isn’t automatic; organizations will decide based on design and use case considerations. They will also need to consider the type of network to establish. Forty percent of survey respondents report that they are using permissioned blockchains, 34 percent are working with permissionless chains, and 26 percent are taking a hybrid approach.


How to put cybersecurity threats into a business context

Focusing on business impact is a different way to think about cybersecurity, and it requires a different mindset than that of tactically responding to cybersecurity threats. Cybersecurity used to be all about preventing attacks, and a breach either occurred or it didn't. "Now, most organizations understand that cybersecurity is not a problem to be solved but a risk to be managed," says Andrew Morrison, US leader of cyberstrategy defense and response at Deloitte & Touche. "Most of the market is acclimated to the fact that it's no longer if an attack will occur but when an attack will occur and how we will manage it. That entails a totally different mindset." "Risks, by nature, can be accepted, mitigated, or transferred," he says. ... A business-focused description of the same problem, however, might be that patching the vulnerability will reduce the probability of a breach to a particular database, which, if exposed, will cost a particular amount of money in lost business, fines and remediation expenses.


Big data processing techniques to streamline analytics


Addressing big data processing techniques requires innovative algorithms and programming, rather than simply adding hardware power. A widely used solution is indexing and partitioning the data to provide better access. GeoSpock's infin8 uses data indexing to process and organize data for subsecond retrieval, ingesting and processing raw data at any scale, then creating an organized index that preserves every record of the original data set. Making the algorithms smarter has another interesting effect, too: it allows companies to reliably harvest data from images, video and audio, opening the door to new generations of applications that can "look and hear." These advancements let machines scan footage and tag the objects or people they detect. They can also be used as part of a company's intelligence-gathering arsenal. Artificial intelligence provides big benefits in this realm. Advancements in artificial intelligence require large amounts of data to operate properly, and these AI tools provide a better view of the data, showing which parts of the data set are most useful and which have less value and can be deprioritized.
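The indexing idea in miniature: build an index over a partition key once, then answer lookups without scanning every record. The records and the partition key below are invented:

```python
# Indexing/partitioning in miniature: group records by a key once so a
# query touches only its own bucket instead of scanning everything.
# Records and the "region" key are invented for illustration.

records = [
    {"id": 1, "region": "eu"},
    {"id": 2, "region": "us"},
    {"id": 3, "region": "eu"},
]

index = {}
for r in records:
    index.setdefault(r["region"], []).append(r)

# A lookup now reads one partition, not the whole data set.
assert [r["id"] for r in index["eu"]] == [1, 3]
```

Production systems apply the same principle at vastly larger scale, with the index built during ingestion rather than at query time.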


Why Business Leaders Shouldn’t Have Blind Faith in AI

Most machine learning algorithms are also bad at thinking about what Athey calls “what-if” scenarios. Like what would happen if a company were to change its prices, or if it hadn’t run a certain ad campaign. And here is where misguided faith in the accuracy of machine learning can become problematic in practice. Consider an algorithm designed to predict hotel-room occupancy based on observed prices, Athey says. It would look at historical occupancy rates and prices and draw the correct conclusion that the hotel is full when prices are high. However, if that predictive model was applied to optimize prices, it would lead to the conclusion that in order to get more people into your hotel, you should raise prices. “Which is of course wrong,” Athey says. “Just because higher prices are correlated with a full hotel doesn’t mean if you change your price you will sell more hotel rooms.”
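Athey's hotel example can be simulated in a few lines: when hidden demand drives both price and occupancy, the two are positively correlated in the observed data even though the causal effect of raising prices is negative. All numbers below are invented for illustration:

```python
# Correlation vs. causation, Athey's hotel example: hidden demand drives
# both price and occupancy, so the observed correlation is positive even
# though raising prices (holding demand fixed) lowers bookings.

import random
random.seed(0)

data = []
for _ in range(1000):
    demand = random.uniform(0, 1)                      # hidden confounder
    price = 100 + 200 * demand                         # hotel raises prices when busy
    occupancy = 0.2 + 0.7 * demand - 0.001 * (price - 100)
    data.append((price, occupancy))

# Observed covariance between price and occupancy is positive...
mean_p = sum(p for p, _ in data) / len(data)
mean_o = sum(o for _, o in data) / len(data)
cov = sum((p - mean_p) * (o - mean_o) for p, o in data) / len(data)
assert cov > 0
# ...yet the causal price coefficient in the generating process (-0.001)
# is negative: the "what-if" a purely predictive model gets backwards.
```

A model trained on this data would happily predict high occupancy at high prices, and be exactly wrong about what happens if the hotel raises them.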


Regulators can do more to encourage fintech innovation

A lot of work has been done in this area by the Consumer Financial Protection Bureau’s Office of Innovation and Project Catalyst, its predecessor, and by states like Arizona, which became the first state in the United States to adopt a regulatory sandbox statute. Yet these efforts, while welcome, fall short because they are largely focused on each agency’s policies and procedures and participant eligibility. Fintechs need more than process-oriented frameworks. To be successful, regulatory sandboxes require clearly articulated safeguards, terms of use and expectations on transparency. These matters are too important to be left to one-off negotiations. Regulatory sandboxes sound like a great idea, but what actually is a regulatory sandbox? In order for regulatory sandboxes to succeed, stakeholders need to have a common understanding — and acceptance — of the basic concept. First, regulatory sandboxes need a better name. Terms like “clinical trial,” “experiment,” or “lab” may better convey what is really needed.


This AI can predict your personality just by looking at your eyes

The project used artificial intelligence to track and monitor the eye movements of 42 individuals using tools from SensoMotoric Instruments. Those findings were then cross-checked with well-established questionnaires that define personality traits. Of the five key traits – openness, conscientiousness, extraversion, agreeableness and neuroticism – the technology easily identified four: neuroticism, extraversion, agreeableness and conscientiousness. The 42 people were fitted with an eye tracker and given five Australian dollars and 10 minutes to make a purchase in a university campus shop. When they returned they removed the eye tracker and filled in personality and curiosity questionnaires. The findings were analyzed to show how trait-specific eye movements vary across activities. While the study used a small sample and the authors said the predictions aren’t yet accurate enough for practical applications, it does shed light on the close link between personality and eye movements. Pupil diameter, for example, was important for predicting neuroticism.


Meet Your New Colleague: AI


How potential employees actually speak to AI is a different conversation than how potential employees should speak to AI, he added. That is, it’s unclear whether how a person treats a machine says anything about how that person would treat other people, and it’s unclear whether something like a person being rude to a machine agent should impact their job prospects. “We can certainly agree that we do care if it’s a human recruiting coordinator,” Mortenson said. But machines have no feelings or emotions and cannot be offended, so it would be easy to argue why employers shouldn’t care. Ultimately, “I do think we should care even if it is a machine,” Mortenson said. “I understand why we might care a little bit less, but I don’t think we can just discard that as a signal.” He gave the example of a report that found this technology could have implications for how kids learn to communicate, teaching them that speaking harshly or impolitely to people has no consequences.


How digital technology is changing the world

Hitachi is working with major manufacturers on their digitisation journey, moving away from the conventional customer/supplier relationship and focusing on digital innovation through co-creation. This approach is already delivering results. Swedish ferry operator Stena Line, for instance, wanted to optimise many aspects of its operations, to reduce costs and inefficiencies such as excess fuel consumption. Hitachi gathered data from the ships’ operations and functions and used it to develop an AI algorithm that calculated an optimal way of steering Stena’s vessels and reducing fuel consumption. Mr Ramachander says: “We couldn’t have done that in isolation, without the shipping company. Co-creation is about working with our clients to solve their problems. In a move away from the traditional customer/supplier relationship, we are aiming to become their digital innovation business partner.”


Managing to the Next Century - The 5 Big Things For Agile Transitions


In the new agile world, it is no longer possible to tell people to do a particular task, or to plan at the same level of breadth or depth. Work is defined, managed and executed by empowered teams who are focused not on the task, but on the outcome they are trying to achieve. Quality, including technical debt, is treated in the same way that value is treated, allowing the team and the business to make explicit, transparent decisions on trade-offs. But moving away from traditionally managed work to a more agile approach requires more than managers stepping away ... At the very heart of the agile organization is a collection of teams, self-organized and empowered to make decisions. They have all the right skills to deliver value and are supported by an organization that fills in any gaps and helps them to get better. At scale that means teams of teams and the adoption of practices to ensure that dependencies are effectively managed.


Banking on artificial intelligence

Automation and handling masses of data are very valuable indeed, but front-line services are also receiving attention, and it is here, when married with human intervention, that excitement lies around the use of AI. The concept lies in being able to enhance the service provided to customers via virtual assistants, chatbots, robo-advisors and other analytical tools, all of which can be made more effective when machine learning and AI are applied. Providing better customer service is a good use for AI and something that all banks are focused on. Indeed, banks are commonly using chatbots and voicebots to interact with customers and solve basic problems without the need for human backup. Avika says: “Banks are using machine learning to improve customer engagement in order to increase customer satisfaction. For example, applying machine learning to unstructured complaints data can help a bank to group the complaints into categories, allowing them to tackle the areas that will have the biggest customer impact first. ...”
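A crude sketch of the complaint-grouping idea Avika describes: bucket free-text complaints by keyword so the largest categories surface first. The keywords and complaints below are invented; a real system would apply machine learning to far richer features than substring matches:

```python
# Grouping unstructured complaints into categories so the biggest buckets
# surface first. Categories, keywords and complaints are all invented.

from collections import Counter

CATEGORIES = {
    "fees": ["fee", "charge"],
    "cards": ["card", "declined"],
    "app": ["app", "login", "crash"],
}

def categorise(text: str) -> str:
    t = text.lower()
    for label, words in CATEGORIES.items():
        if any(w in t for w in words):
            return label
    return "other"

complaints = [
    "Unexpected overdraft fee on my account",
    "My card was declined abroad",
    "The app crashes at login",
    "Branch was closed on Saturday",
]
counts = Counter(categorise(c) for c in complaints)
assert counts["fees"] == 1 and counts["other"] == 1
```

Sorting `counts` by frequency gives exactly the prioritisation described: tackle the category with the biggest customer impact first.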



Quote for the day:


"You may be disappointed if you fail, but you are doomed if you don't try." -- Beverly Sills