Daily Tech Digest - October 18, 2018

How Financial Institutions Can Put Risk Management Back in the Driver’s Seat

The benefits of putting the business clearly in charge of risk management — and holding it accountable — are significant. First, holding the first line accountable for risk management aligns the interests of internal revenue generators with those of the overall firm. When first-line salespeople do their own risk generation “driving,” they gain an understanding of their firm’s position and reputation that they otherwise might not get. They are thus less likely to try to on-board a questionable client or put together loan proposals that may be rejected. In general, the business needs to be clearly accountable for managing the risks it takes in pursuit of its objectives. A system of first-line front-seat drivers also encourages people to keep an eye out for risks wherever they pop up, rather than relying on the oversight specialist — the backseat driver — to point them out. This improves performance by allowing the business to spot some risks sooner, manage them more nimbly, and react more quickly when things do go wrong.



Too many business executives view the analytics transformation too narrowly: they see it as centering on tools or as driven by the hiring of "big data" analysts or machine learning expertise. Almost without exception, every successful analytics transformation that I've seen or experienced has started at the top; this transformation is driven as much, perhaps even more, by cultural change as by the hiring of new resources and technical expertise. Even today, however, too many business leaders and executives tend to be intimidated by analytics and math. My guidance to you (and you know who you are) is to get educated, fast. That does not mean that you have to get the equivalent of a master's degree in operations research or machine learning. It does mean seeking out an expert who can frame these capabilities in the language and vernacular of a senior executive. This is the most perilous part of the journey, because too many senior executives delegate this transformation to lower levels of the organization, where it gets lost in a sea of other priorities.



The Future of the Cloud Depends on Magnetic Tape


Although the century-old technology has disappeared from most people’s daily view, magnetic tape lives on as the preferred medium for safely archiving critical cloud data in case, say, a software bug deletes thousands of Gmail messages, or a natural disaster wipes out some hard drives. The world’s electronic financial, health, and scientific records, collected on state-of-the-art cloud servers belonging to Amazon.com, Microsoft, Google, and others, are also typically recorded on tape around the same time they are created. Usually the companies keep one copy of each tape on-site, in a massive vault, and send a second copy to somebody like Iron Mountain. Unfortunately for the big tech companies, the number of tape manufacturers has shrunk over the past three years from six to just two—Sony Corp. and Fujifilm Holdings Corp.—and each seems to think that’s still one too many. The Japanese companies have said the tape business is a mere rounding error as far as they’re concerned, but each has spent millions of dollars arguing before the U.S. International Trade Commission to try to ban the other from importing tapes to America.


Solving the cloud infrastructure misconfiguration problem

The threats to cloud infrastructure are automated, so automated remediation is a requirement to effectively manage misconfiguration risk. His advice to CISOs is to set up a team that includes developers who understand cloud APIs and can automate every repetitive aspect of cloud security, starting with cloud configuration. “In order to be effective, the CISO needs to view their security team as an internal tool vendor in the cloud ecosystem. Development teams need support from security to move quickly, but also require good guard rails and feedback for how to do cloud securely,” he opines. “This security automation team led by the CISO needs to work closely with development teams to establish known-good configuration baselines using a whitelist approach that conforms with compliance and security policy. Once you have a known-good baseline, you can automate the remediation process for misconfiguration without running the risk of false positives leading to bad changes that can cause system downtime events.”
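The known-good-baseline approach described above can be sketched in a few lines. This is a minimal illustration, not any vendor's API: the baseline keys and resource settings are hypothetical examples of the kind of whitelist a security automation team might maintain.

```python
# A minimal sketch of baseline-driven misconfiguration detection and
# remediation. The setting names below are hypothetical, not tied to any
# specific cloud provider's API.

KNOWN_GOOD_BASELINE = {
    "s3_public_access": False,
    "encryption_at_rest": True,
    "logging_enabled": True,
}

def find_drift(resource_config: dict) -> dict:
    """Return the settings that deviate from the known-good baseline."""
    return {
        key: resource_config.get(key)
        for key, expected in KNOWN_GOOD_BASELINE.items()
        if resource_config.get(key) != expected
    }

def remediate(resource_config: dict) -> dict:
    """Reset every baseline-controlled setting to its known-good value."""
    fixed = dict(resource_config)
    fixed.update(KNOWN_GOOD_BASELINE)
    return fixed
```

Because remediation only ever writes values from the whitelist, it cannot introduce a configuration outside the approved baseline, which is the point of the known-good approach: no false positives triggering bad changes.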


Quantum Computing: Why You Should Pay Attention


As databases continue to grow in size, this improvement can make it more feasible to handle the large volumes of data expected to come online in the coming years and decades as we reach physical limits in storage device latencies. Another practical advantage lies in our understanding of the world. Simulating quantum effects is notoriously difficult using the computers we rely on today, as the very fundamentals of quantum mechanics are vastly at odds with today’s devices. Using quantum computers, simulating these effects will be far simpler, allowing us to better unravel the mysteries of quantum mechanics. Even when quantum computing becomes common, it’s difficult to envision it completely replacing traditional computer devices. The types of applications at which quantum computers excel don’t seem to have much practical use for typical computer users. Furthermore, it will take some time for quantum computers to become smaller and more affordable, and there may be barriers preventing their widespread use.
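The scale of the search improvement the article alludes to is easy to quantify. For unstructured search, a classical machine examines about half the entries on average, while Grover's quantum algorithm needs roughly (π/4)·√N oracle queries — a quadratic speedup. A small back-of-the-envelope comparison:

```python
import math

def classical_queries(n: int) -> int:
    # Unstructured search: on average, half the entries are examined.
    return n // 2

def grover_queries(n: int) -> int:
    # Grover's algorithm needs roughly (pi/4) * sqrt(N) oracle queries.
    return math.ceil(math.pi / 4 * math.sqrt(n))
```

For a database of a trillion entries, the classical expectation is five hundred billion lookups versus under a million Grover iterations — which is why growing data volumes make the quantum approach attractive.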


Authentication Bypass in libSSH Leaves Servers Vulnerable

“Careful reading of code for the affected libSSH library indicated that it was possible to bypass authentication by presenting to the server an SSH2_MSG_USERAUTH_SUCCESS message in place of the SSH2_MSG_USERAUTH_REQUEST message which the server would expect to initiate authentication. The SSH2_MSG_USERAUTH_SUCCESS handler is intended only for communication from the server to the client,” an advisory by Peter Winter-Smith of NCC Group, who discovered the bug, says. In other words, an attacker can connect to any vulnerable server, without authentication, just by sending one message to the server. ... “Not all libSSH servers will necessarily be vulnerable to the authentication bypass; since the authentication bypass sets the internal libSSH state machine to authenticated without ever giving any registered authentication callbacks an opportunity to execute, servers developed using libSSH which maintain additional custom session state may fail to function correctly if a user is authenticated without this state being created,” the advisory says.
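The root cause described in the advisory — a server-side handler that flips the session state to "authenticated" on receipt of a message only a server should ever send — can be illustrated with a toy state machine. The message numbers below match the SSH protocol (50 for USERAUTH_REQUEST, 52 for USERAUTH_SUCCESS); everything else is a simplified model, not libSSH code.

```python
# Toy model of the flaw behind the libSSH bypass: the dispatcher accepts
# SSH2_MSG_USERAUTH_SUCCESS (52) from the *client* and marks the session
# authenticated without ever running an authentication callback.

SSH2_MSG_USERAUTH_REQUEST = 50
SSH2_MSG_USERAUTH_SUCCESS = 52

class VulnerableServer:
    def __init__(self):
        self.authenticated = False

    def handle(self, msg_type: int, credentials=None) -> bool:
        if msg_type == SSH2_MSG_USERAUTH_REQUEST:
            # Proper path: credentials are checked (stand-in for the
            # registered authentication callbacks).
            self.authenticated = credentials == "correct-password"
        elif msg_type == SSH2_MSG_USERAUTH_SUCCESS:
            # The bug: this server-to-client message should be rejected
            # when it arrives from a client, but instead it sets the
            # internal state machine to authenticated.
            self.authenticated = True
        return self.authenticated
```

A fixed dispatcher would simply refuse client-originated USERAUTH_SUCCESS messages, so the only route to the authenticated state runs through the credential check.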


Learn Why Doctors Look To Data To Increase Patient Engagement


We’re positively swimming in data. But all that noise stands a good chance of confusing or distracting patients from their ultimate goal of ongoing good health if doctors and patients don’t come to the table together with a plan and a common understanding of which data points are meaningful in context and which are not.  There’s no doubt anymore: Big data is going to revolutionize the way we administer health care throughout the world and help us achieve financial savings. But as doctors look to leverage modern tools for interacting with and sharing patient health data, there are several factors to remember and several key advantages worth checking out. Here’s a rundown. Regrettably, we still lack a cure for many chronic diseases. Therefore, doctors and their patients must instead “manage” these conditions. It’s possible to live a full and active life while undergoing treatment for severe diseases and conditions, but only with the right levels of vigilance and engagement. Patients with chronic illnesses must maintain their motivation, their attention to treatment and medication schedules and their general knowledgeability about their condition.


Wärtsilä Opens World's First International Maritime Cyber Centre

“Cyber is such a critical topic to all players in marine. Taking stewardship in something as important as this shows that Wärtsilä is committed to transform and digitalize the marine industry. This is the next step in our Smart Marine vision and supports our Oceanic Awakening and Sea20 initiatives,” says Marco Ryan, Chief Digital Officer at Wärtsilä. “There are three main drivers for the maritime industry to collaborate in improving our cyber resiliency: the vast attack surface that the maritime industry offers to cyber criminals; the inclusion of maritime into the critical national infrastructure of nation states; and the pending cyber security regulation by the International Maritime Organisation in 2021,” says Mark Milford, Vice President, Cyber Security at Wärtsilä. The MCERT is an international cyber intelligence and incident support platform enhancing cyber resilience for the entire maritime ecosystem. It provides international intelligence feeds, advice and support, including real-time assistance to members on cyber attacks and incidents, and a Cyber Security Reporting Portal (CSRP) for its members.


Arm’s Neoverse will be the infrastructure for a trillion intelligent devices


Arm’s Neoverse intellectual property will take advantage of high-end manufacturing equipment in chip factories. The Ares platform will debut in 2019 with 30 percent per-generation performance improvements, Henry said. The designs will be flexible for customer purposes, and security will be a crucial part of the platform, Henry said. Ares will be built on seven-nanometer circuitry in the newest chip factories. Follow-up chips include the Zeus at seven nanometers and the Poseidon at five nanometers. The Arm Neoverse will include advanced processor designs as well as solutions and support for hardware, software, tools, and services. The company announced the platform at the Arm TechCon event in San Jose, California. “Arm has been more successful in infrastructure than many knew. This makes sense as many networking and storage systems have Arm-based chips inside, albeit with smaller cores,” said Patrick Moorhead, analyst at Moor Insights & Strategy, in an email.


What Innovative CEOs and Leaders Need to Know about AI

The authors view AI as “performing tasks, not entire jobs.” Out of the 152 AI projects, 71 involved the automation of digital and physical tasks, 57 used algorithms to identify patterns for business intelligence and analytics, and 24 engaged employees and customers through machine learning, intelligent agents, and chatbots. In the Harvard Business Review article, a 2017 Deloitte survey of 250 executives who were familiar with their companies’ AI initiatives revealed that 51 percent said their primary goal was to improve existing products, while 47 percent identified integrating AI with existing processes and systems as a major obstacle. ... Early adopters of AI in the enterprise are reporting benefits: 83 percent indicated their companies have already achieved “moderate (53 percent) or substantial (30 percent) economic benefits.” Fifty-eight percent of respondents are using in-house resources rather than outside expertise to implement AI, and 58 percent are using AI software from vendors.



Quote for the day:


"Leadership is a potent combination of strategy and character. But if you must be without one, be without the strategy." -- Norman Schwarzkopf


Daily Tech Digest - October 17, 2018

Microsoft Surface Pro 6
This time around, the major changes are inside: a bump up in the processor to an 8th-generation Core chip, some odd adjustments in pricing, and a new color, black, separate the new from the old. There's actually a downgrade of sorts in the GPU compared to the Surface Pro (2017), which is a bit of a disappointment. The Performance section of our review shows the clearest differences among the three generations. We've given the Surface Pro 6 what some would consider an "average" score of 3.5 stars, a lower score than we've given some other tablet PCs we've reviewed recently. But we're also giving it an Editor's Choice, like those other products. Despite being underwhelmed by the Surface Pro 6's failure to break new ground (or even add USB-C), we will give it this: it also has a nice, long 8.5 hours of battery life in our tests, which has been an Achilles' heel for much of the competition we've reviewed. It is still one of the best-designed Windows tablets you can buy, and its pricing is competitive with similarly configured products.



AI Common Sense Reasoning

To focus this new effort, MCS will pursue two approaches for developing and evaluating different machine common sense services. The first approach will create computational models that learn from experience and mimic the core domains of cognition as defined by developmental psychology. This includes the domains of objects (intuitive physics), places (spatial navigation), and agents (intentional actors). Researchers will seek to develop systems that think and learn as humans do in the very early stages of development, leveraging advances in the field of cognitive development to provide empirical and theoretical guidance. “During the first few years of life, humans acquire the fundamental building blocks of intelligence and common sense,” said Gunning. “Developmental psychologists have found ways to map these cognitive capabilities across the developmental stages of a human’s early life, providing researchers with a set of targets and a strategy to mimic for developing a new foundation for machine common sense.”


Digital business projects in a quagmire? Hack your culture!

Changing mindsets is a key enabler of new technologies and one of the ways Gartner recommended that IT executives change the culture of their companies. “Hack your culture to change your culture,” said Kristin Moyer, research vice president and distinguished analyst at Gartner. “By culture hacking, we don’t mean finding a vulnerable point to break into a system. It’s about finding vulnerable points in your culture and turning them into real change that sticks.” Hacking is about doing smaller actions that usually get overlooked, Moyer said. Great hacks also trigger emotional responses, have immediate results and are visible to lots of people at once, she said. Gartner says culture is identified by 46 percent of CIOs as the largest barrier to getting the benefits of digital business. Achieving culture change is tied closely to another key direction organizations should strive to achieve — the ability to embrace change and adopt technology in a new way, or what Gartner calls “dynamism.”


AI is fueling smarter collaboration

The first is improving the ability of individuals to access data. "Today, finding a document could be tedious [and] analyzing data may require writing a script or form," Lazar said. With AI, a user could perform a natural language query -- such as asking the Salesforce.com customer relationship management (CRM) platform to display third quarter projections and how they compare with the second quarter -- and generate a real-time report. Then, asking the platform to share this information with the user's team and get its feedback could launch a collaborative workspace, Lazar said. The second possible benefit is predictive. "The AI engine could anticipate needs or next steps, based on learning of past activities," Lazar said. "So if it knows that every Monday I have a staff call to review project tasks, it may have required information ready at my fingertips before the call. Perhaps it suggests things that I'll need to focus on, such as delays or anomalies from the past week."


Automation and employment debate takes a new turn


What gives machines -- and process automation -- the edge over humans? In addition to their ability to integrate data, machines, Levav noted, lack biases such as the illusion of validity, which leads people to overestimate their forecasting prowess. Yet, humans are still required in process automation, because only they can decide the important parameters, he added. "You will have a job because machines can't pick the variables that are relevant to a problem," he said. Scott Hartley, partner at venture capital firm Two Culture Capital, shared a similar view regarding the impact of AI on jobs. His take on AI-infused automation and employment takes a cue from Voltaire. Hartley's 2017 book, The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, cites a statement attributed to the 18th century philosopher to support his view that asking the right questions about data is central to acquiring knowledge. Making AI and machine learning work, Hartley said during a UiPath panel discussion, is "still fundamentally rooted in our ability to create diverse teams and ask questions from a multiplicity of angles."


Strengthening the CIO – Corporate Board Relationship

But on a positive note, more board members see how technology is unlocking new business models and spurring growth. They are convinced of the growing need to focus on speed, agility, innovation, and customer obsession, and see that it requires new approaches to business operations and to IT investment. Technology and cybersecurity have historically been seen as compliance issues under the purview of the board’s audit committee. However, given the increasing capability of technology to affect revenue and the business model, there is a greater recognition of its strategic importance. This has led to an increase in the number of CIOs and other technology experts being appointed to boards. Still, though, the majority of boards lack the technology prowess needed to successfully guide today’s digital era company.  What, then, can the CIO do to bridge the gap and develop a great relationship with the board?


Steel yourself for the cloud hangover

We’re at what I call the hangover phase, where a night of cloud-hyped indulgence has led to many self-administered pats on the back, which obscured the reality that transitioning to the cloud is harder than people originally thought. But the effort is still worth it. The budget overruns are no surprise, given that not much cost planning takes place during initial large cloud computing projects. Indeed, these initial projects fail to illustrate the true costs of using a public cloud, and if you look carefully you can see that the private clouds many such initial efforts focus on are just new cages of servers in data centers that cost more than the old cages of servers. Moreover, people costs are always higher than expected, and few enterprises plan to run both cloud and on-premises systems—but the reality is that you need to. What troubled me is that only 48 percent of the mid-sized businesses and only 36 percent of the large enterprises agree that cloud actually improved the business. I suspect that those who do not see the value have yet to complete a project’s successful journey to the cloud. But still, this figure should be higher.


Five steps for getting started in machine learning: Top data scientists share their tips

"If someone has programming fundamentals then, from a technical point of view, I think that's enough for them to dive into machine learning," he says. "You're not going to get very far if you can't program at all, because programming is ultimately how you configure the machine-learning frameworks. "I think strong math was probably more essential before than it is now. It's certainly helpful to have mathematical knowledge if you want to develop custom layers or if you're really going very, very deep on a problem. But for people starting out, it's not critical." In some respects, it's just as important to have a willingness to seek out new information, says Yangqing Jia, director of engineering at Facebook. "As long as you keep an exploratory mindset there's such an abundance of tools nowadays you'll be able to learn a lot of things yourself, and you have to learn things yourself because the field is growing really fast."


Researchers expose security vulnerabilities in terahertz data links

“In microwave communications, an eavesdropper can put an antenna just about anywhere in the broadcast cone and pick up the signal without interfering with the intended receiver,” Mittleman said. “Assuming that the attacker can decode that signal, they can then eavesdrop without being detected. But in terahertz networks, the narrow beams would mean that an eavesdropper would have to place the antenna between the transmitter and receiver. The thought was that there would be no way to do that without blocking some or all of the signal, which would make an eavesdropping attempt easily detectable by the intended receiver.” Mittleman and colleagues from Brown, Rice University and the University at Buffalo set out to test that notion. They set up a direct line-of-sight terahertz data link between a transmitter and receiver, and experimented with devices capable of intercepting the signal. They were able to show several strategies that could steal signal without being detected — even when the data-carrying beam is very directional, with a cone angle of less than 2 degrees.
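A quick geometric check shows why even a sub-2-degree beam leaves room for interception hardware. The beam footprint radius at distance d for a cone of full angle θ is d·tan(θ/2); a small helper (a sketch for illustration, not part of the researchers' setup) makes the numbers concrete:

```python
import math

def beam_radius(distance_m: float, cone_angle_deg: float = 2.0) -> float:
    """Radius of the beam footprint at a given distance, for a cone with
    the given full opening angle (default 2 degrees, per the article)."""
    return distance_m * math.tan(math.radians(cone_angle_deg) / 2)
```

At 100 meters the 2-degree cone is already about 3.5 meters across, so a small flat reflector or scattering object can sit inside the beam while still passing most of the power through to the intended receiver.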


The Three Dimensions of the Threat Intelligence Scale Problem

There is a massive amount of external TI that organizations can access to improve cyber defense. While cost can be a constraint for expensive commercial threat feeds, there are plenty of lower-cost and even free threat feeds available, from open source, government, and industry sources. While access to external TI is not an issue, the scale problem lies in managing, maintaining, and making effective use of TI. Some of these challenges include: managing multiple threat feeds that come in different formats; ensuring your threat feeds are constantly up to date; and integrating TI into your security operations so that you can use it to improve security. The process of integrating TI into security operations is particularly interesting because it directly leads into another dimension of the network security TI scale problem. While organizations can turn to external TI to make up for the lack of access that a next-generation firewall provides, this same limitation hits you on the other side by hindering your ability to take action based on external TI. It's like a double firewall TI whammy!
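The first challenge on that list — managing multiple feeds that arrive in different formats — boils down to a normalization layer. The sketch below is an illustration under assumed formats (the two feed schemas are hypothetical, not real vendor formats), showing how heterogeneous entries can be mapped into one internal shape and deduplicated:

```python
# Normalize threat-intel indicators from differently shaped feeds into a
# single internal record format, then merge with deduplication. The feed
# schemas below are hypothetical examples.

def normalize(feed_name: str, entry: dict) -> dict:
    if feed_name == "feed_a":   # assumed shape: {"ioc": ..., "kind": ...}
        return {"indicator": entry["ioc"], "type": entry["kind"], "source": feed_name}
    if feed_name == "feed_b":   # assumed shape: {"value": ..., "category": ...}
        return {"indicator": entry["value"], "type": entry["category"], "source": feed_name}
    raise ValueError(f"unknown feed: {feed_name}")

def merge_feeds(feeds: dict) -> list:
    """Deduplicate indicators across feeds, keeping the first source seen."""
    seen, merged = set(), []
    for name, entries in feeds.items():
        for entry in entries:
            record = normalize(name, entry)
            if record["indicator"] not in seen:
                seen.add(record["indicator"])
                merged.append(record)
    return merged
```

With indicators in one canonical shape, the downstream steps — keeping feeds current and pushing blocks into security operations — can run against a single pipeline rather than one per vendor format.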



Quote for the day:


“The only thing worse than training your employees and having them leave is not training them and having them stay.” -- Henry Ford


Daily Tech Digest - October 16, 2018

The future of fintech will require security, not just innovation

But this innovation must not come at the expense of security. The evolving technology and regulatory landscape has meant that cloud technologies must have security baked in at their core. Financial services must also not overlook the security risk associated with the creation of banking apps in the open banking environment — in particular, API security. As developers within banks and fintech companies use APIs to connect technologies (most commonly apps, but also platforms and systems), they create new digital banking innovations and remove barriers to allow more efficient, simpler ways to kickstart innovative programs. But while the value of inter-connected applications is undeniable, there are also significant risks. APIs provide open connections between platforms, and a failure to protect these connections will give hackers the opportunity to attack API services with stolen or invalid credentials. It is essential that developers and security teams within these organisations pay close attention to securing APIs. To illustrate this, if you visualise opening a door, you want to make sure only the right people (or in this case, apps) have the correct keys.



Five Ways You’re Already Using Machine Learning: A Day with AI

Whether you’re team Lyft or prefer to hop in an Uber, both services, much like Google Maps, power decisions with AI. Driver assignments, driver ETA, and your ETA at your final destination are all calculated by algorithms that are constantly tested and refined in real time, using machine learning and the massive quantities of data from drivers and customers. One important thing for many: rideshare companies are using machine learning to help beat the dreaded ‘surge price’. Surge pricing, or time-limited price hikes, currently compensates for times when there are not enough cars on the road to supply all the passengers who want rides. Ideally, machine learning could anticipate times of high demand (say, commute times in April on the East Coast when it frequently rains) and incent cars to be on the road, in advance. ... Simple rules-based filters are used for the spam filter. Think of the words and phrases “pharmacy”, “you’ve won the lottery” or “Nigerian prince”. While you may very well be friends with a Nigerian prince, if the message seems suspicious, and is coming from an unknown sender, then it will probably get flagged and kicked to spam.


Good data governance is good business


“We need to move away from ‘digitised’ to ‘digital’ because a lot of companies have been focused on digitising their processes instead of making them truly digital,” she said. “In an attempt to create a digital channel, they are doing things like putting forms online, instead of thinking about tokenising identity, access, authentication and authorisation, as well as about how to remove friction and improve compliance.” As a result, there are some new roles that are emerging in the ecosystem, said Dow. “There is more and more need for a relying party to feel that they can trust where the data is coming from, trust its provenance, and trust who is supplying it and who is vouching for it.” With the explosion of internet-connected devices making up the internet of things, Dow said there are now many more things to trust, authenticate and authorise. “We need to get the utility layer right around consent, otherwise it becomes just another bad cookie policy,” she said. “In addition, we desperately need better governance and transparency around how data is collected and protected.”


Artificial Intelligence Needs a Strong Data Foundation

It is far easier (and less risky) if a bank or credit union wants to use data insights for internal purposes. Analyzing customer acquisition, attrition, product utilization and cross-selling for department managers has less risk than using this same analysis to communicate with the customer. During this learn-and-optimize stage, Rogati states, “We need to have a (however primitive) A/B testing or experimentation framework in place, so we can deploy incrementally to avoid disasters and get a rough estimate of the effects of the changes before they affect everybody.” Rogati also stresses the need for establishing a baseline to measure results against. Simple machine learning algorithms like logistic regression are also recommended at this stage to ensure all needed insights are included in the dataset. Again, time spent on this stage will reduce challenges and improve results down the road. There is no guarantee that machine learning and AI will improve your results. Similar to a turbocharged car with bad wheel alignment or bad brakes, the most advanced data analytics tools may simply get you to the wrong outcome faster.
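The "however primitive" experimentation framework Rogati describes starts with one mechanism: deterministically assigning each customer to a control or treatment bucket so a change can be rolled out incrementally and measured against the baseline. A minimal sketch (the 10 percent default split is an illustrative choice, not from the article):

```python
import hashlib

def assign_bucket(customer_id: str, treatment_share: float = 0.1) -> str:
    """Hash the customer ID so each customer always lands in the same
    bucket, then place the given share of the hash space in treatment."""
    digest = hashlib.sha256(customer_id.encode()).hexdigest()
    fraction = int(digest[:8], 16) / 0xFFFFFFFF   # uniform-ish in [0, 1]
    return "treatment" if fraction < treatment_share else "control"
```

Hashing rather than random assignment matters: the same customer sees the same experience on every visit, and the experiment can be replayed or audited later, which is exactly what lets you "get a rough estimate of the effects of the changes before they affect everybody."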


Disruption coming to our cities, roads and skies

Connected vehicles with advanced driver-assistance systems are already entering our roadways. Built with equipment such as top view camera systems that provide a 360-degree bird’s eye view – plus sensors that detect airbag deployment, windshield wiper operation, engagement of brakes, etc. – these vehicles can transmit data about their status to other connected vehicles on the roadways and emerging intelligent roadway infrastructure. Technology decision makers in Connected World industries (automotive, aviation, government transportation agencies) expect connected vehicles to deliver a whole host of important benefits: fewer collisions, reduced congestion, less pollution, and more connected and informed driving (and riding) experiences. Autonomous vehicle technology builds on this connected vehicle foundation. Many tech decision makers in our Connected World industries think riding in an autonomous vehicle will be exciting, while a few admit it might terrify them.


Survey Reveals That Enterprises Are Entering the Third Era of IT

“The ability to support greater scale is being invested in and developed in three key areas: volume, scope and agility. All aim at encouraging consumers to interact with the organization,” Mr. Rowsell-Jones explained. “For example, increasing the scope means providing a variety of digital services and actions to the consumer. In general, the greater the variety of interactions that are available via digital channels, the more engaged a consumer becomes and the lower the costs to serve them are.” The transformation toward digital business is supported by steady IT budget growth. Globally, CIOs expect their IT budgets to grow by 2.9 percent in 2019. This is only slightly less than the 2018 average growth rate of 3 percent. A look at the regional differences shows that the regions are moving closer together: The leader in budget growth is once again Asia/Pacific with an expected growth of 3.5 percent. However, this is a significant cut from the 5.1 percent projected budget increase in 2018.


What Is A Data Lake? A Super-Simple Explanation For Anyone


Some mistakenly believe that a data lake is just the 2.0 version of a data warehouse. While they are similar, they are different tools that should be used for different purposes. James Dixon, the CTO of Pentaho, is credited with naming the concept of a data lake. He uses the following analogy: “If you think of a datamart as a store of bottled water – cleansed and packaged and structured for easy consumption – the data lake is a large body of water in a more natural state. The contents of the data lake stream in from a source to fill the lake, and various users of the lake can come to examine, dive in, or take samples.” A data lake holds data in an unstructured way, and there is no hierarchy or organization among the individual pieces of data. It holds data in its rawest form—it’s not processed or analyzed. Additionally, a data lake accepts and retains all data from all data sources and supports all data types, and schemas (the way the data is stored in a database) are applied only when the data is ready to be used. ... A data warehouse stores data in an organized manner with everything archived and ordered in a defined way.
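The schema-on-read idea — store everything raw, apply structure only at consumption time — is worth seeing in miniature. A toy illustration (not any particular lake product's API): the "lake" is just a list of raw records, and a schema is imposed only when a consumer reads:

```python
import json

# Toy schema-on-read store: ingest keeps records raw and unvalidated;
# a schema (here, a list of fields) is applied only at read time.

lake = []  # raw records of any shape, no hierarchy, no validation

def ingest(raw: str) -> None:
    lake.append(raw)

def read_with_schema(fields: list) -> list:
    """Apply a schema at read time, keeping only the requested fields;
    records missing a field simply yield None for it."""
    out = []
    for raw in lake:
        record = json.loads(raw)
        out.append({f: record.get(f) for f in fields})
    return out
```

Contrast this with a warehouse, where the schema is enforced at write time and a record missing a required field would be rejected or transformed before it was ever stored.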


Spray-on antennas will revolutionize the Internet of Things

The way the concept works is that titanium carbide compounds are dissolved in water to make the paint. The compound derives from a type of materials-science product called MXene (invented at Drexel in 2011 and pronounced "maksens"), which is basically an inorganic, super-thin material only a few atoms thick that combines conductive metal with water-dissolving characteristics. The material in the lab tests is then actually sprayed onto the object using a craft-style airbrush. When the water evaporates, the antenna remains. “The exceptional conductivity of the material enables it to transmit and direct radio waves, even when it’s applied in a very thin coating.” It’s extremely conductive, the researchers say. Slimming, such as the tens-of-nanometers to microns thick that the group have obtained with the transparent antennas, would provide IoT weight reduction, too. That’s crucial for some tracking sensors, such as those used in shipping. The lightness could also have a knock-on effect in reducing sensor power consumption — the lighter a drone is ...


Pushing the Boundaries of Computer Vision

Although augmented reality has occasionally been described as a bridge to true virtual reality, AR is actually more difficult to implement in some ways. Nevertheless, the technology has evolved rapidly in recent years, thanks in part to computer vision advances. At the core of AR is a challenge relevant to other fields of computer vision: object recognition. Small variations in objects can prove challenging for image recognition software, and even a change in lighting can cause mismatches. Experts at Facebook and other companies have made tremendous progress through deep learning and other artificial intelligence fields, and these advances have the potential to make AR and other vision fields dependent on object recognition more powerful in the coming years. Another transformative use-case is predicted to be agriculture. Agricultural science is charged with feeding the world, and computers have been making major strides in the field in recent years.


4 ways AI will impact the financial job market

In the new wave of AI, opportunities and challenges exist at the same time. On the positive side, AI could increase automation, support intelligent analysis and decision-making, and create new business models and industries. But AI also carries a series of risks. In the financial industry, potential risks include micro-financial risk and macro-financial risk. The former could influence the stability of markets, causing turmoil. The latter could trigger risk around market concentration, market loopholes, connection and technology. Language and vision have been the two major breakthroughs in AI so far, according to research from the BCG Henderson Institute. Machine vision and speech recognition give machines cognitive skills, allowing AI to be applied in real-world contexts, which will change all aspects of society in the future. The research also reveals that industry users understand AI from three dimensions: data, processes and actions. AI improves workflows by processing structured data as well as unstructured language and image information to deliver new products and services, and provide data or physical feedback.



Quote for the day:


"A single question can be more influential than a thousand statements." -- Bo Bennett


Daily Tech Digest - October 15, 2018

We Need to be Examining the Ethics and Governance of Artificial Intelligence


Recently, the role that pre-crime and artificial intelligence can play in our world has been explored in episodes of the popular Netflix TV show Black Mirror, focusing on the debate between free will and determinism. Working in counter-terrorism, I know that the use of artificial intelligence in the security space is fast becoming a reality. After all, decisions and choices previously made by humans are being increasingly delegated to algorithms, which can advise, and decide, how data is interpreted and what actions should result. Take the example of new technology that can recognize not just our faces but also determine our mood and map our body language. Such systems can even tell a real smile from a fake one. Being able to utilize this in predicting the risk of a security threat in a crowded airport or train station, and prevent it from occurring, for example, would be useful. Some conversations I have had with individuals working in cyber-security indicate that it is already being done.



UK gov launches 'world's first' Code of Practice for IoT security
The Code defines 13 guidelines for manufacturers, service providers, developers and retailers to implement in order to ensure that IoT products are safe to use. They are: no default passwords; implement a vulnerability disclosure policy; keep software updated; securely store credentials and security-sensitive data; communicate securely; minimise exposed attack surface; ensure software integrity; ensure that personal data is protected; make systems resilient to outages; monitor system telemetry data; make it easy for consumers to delete personal data; make installation and maintenance of devices easy; and validate input data. HP Inc. and Centrica Hive are the first companies to sign up to the new Code. Minister for Digital Margot James said that these pledges are "a welcome first step," but "it is vital other manufacturers follow their lead to ensure strong security measures are built into everyday technology from the moment it is designed."
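One way to operationalise guidelines like these is as a compliance checklist that can be audited programmatically. The sketch below is purely illustrative: the shortened guideline labels and the device profile are invented, not part of the Code itself:

```python
# The Code's 13 guidelines, paraphrased into short audit-checklist labels.
GUIDELINES = [
    "no default passwords",
    "vulnerability disclosure policy",
    "software kept updated",
    "credentials stored securely",
    "secure communication",
    "minimised attack surface",
    "software integrity ensured",
    "personal data protected",
    "resilience to outages",
    "telemetry monitored",
    "easy deletion of personal data",
    "easy installation and maintenance",
    "input data validated",
]

def audit(device: dict) -> list:
    """Return the guidelines a device profile does not claim compliance with."""
    met = set(device.get("compliant_with", []))
    return [g for g in GUIDELINES if g not in met]

# Hypothetical device that meets everything except the last guideline:
device = {"name": "smart-thermostat", "compliant_with": GUIDELINES[:12]}
gaps = audit(device)
# gaps == ["input data validated"]
```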




So-called password-less authentication, if implemented literally, would lead us to a world where we are deprived of the chance and the means to have our volition confirmed when our identity is authenticated. It would be a 1984-like world, incompatible with the values of democratic societies. Some people allege that passwords can and will be eliminated by biometrics or PINs. But logic tells us this can never happen, because the former requires a password or PIN as a fallback means, and the latter is no more than the weakest form of numbers-only password. Various debates over ‘password-less’ or ‘beyond-password’ authentication only make it clear that the solution to the password predicament can be found only inside the family of broadly defined passwords. ... If the PIN, the weakest form of numbers-only password, had the power to kill the password, a small sedan should be able to kill the automobile. Advocates of this idea seem to claim that a PIN is stronger than a password when it is linked to a device while the password is not.



Juniper advances network automation community, skillsets

“Since a critical part of automated operations is the individual engineers and processes they follow, Juniper has put deliberate investment into these areas by introducing many formal and informal training programs, cloud-based lab services, testing as a service, free trials, live throwdowns and [the new] Juniper Engineering Network (EngNet),” Koley wrote.  Juniper EngNet is a portal that includes a variety of automation tools, resources and social communities. According to the vendor, the site features API documentation, access to Juniper Labs, virtual resources, a learning portal and an automation exchange of useful network automation tools. “Juniper Engineering Network is aimed at elevating the entire networking community to move beyond incumbent CLI knowledge and toward an automated, abstracted, self-driving technology. The networking community, including Juniper customers and partners, can contribute to the Automation Exchange within the community," Juniper stated.


AI is no silver bullet for cyber security


“AI is not a silver bullet – when you look at the technology, you have to make sure that senior management is aware of its risks and you don’t invest in it unless you already have good cyber hygiene – starting with people,” said Pereira. User education is crucial, he said, because successful cyber attackers often exploit human weaknesses and emotions through social engineering and spear phishing to penetrate a system. “Those who don’t know how phishing attacks work will fall prey to them,” he said. “The panacea and antidote for phishing attacks is cyber education, which, when tailored for a person or function, is more effective than technology in stopping such attacks in many cases.” In deciding when and how to adopt AI to improve cyber security, Pereira said organisations should start with projects that address human and people risks, followed by processes and technology. “And when you get to the technology part, AI shouldn’t come first, but rather look at it as a way to enhance security processes, such as making it faster to review logs,” he said.


Deloitte says CIOs need to adapt or perish

Deloitte says that in 2018 CIOs need a better grasp on the big picture, and that means looking ‘inward’, ‘across’ and ‘beyond’ the business. “The digital era presents CIOs with the opportunity to look inward and reinvent themselves by breaking out of the trusted operator mould,” says the report. “We note, as in previous surveys, the importance of strong relationships to the CIO’s business success. This year we suggest that developing a technology fluency programme can help create a solid foundation for these relationship-building efforts. A tech fluency programme can provide organisations with knowledge about technology trends, scalability of emerging technologies and complexities of managing legacy core systems – while enabling CIOs to understand internal and external customer perspectives. “CIOs can also look across the IT organisation and transform it, particularly by focusing on the IT operating model, funding priorities and budget allocation, and tech talent and culture at the heart of their digital agendas.”


Why your machine-learning team needs better feature-engineering skills


The skill of feature engineering — crafting data features optimized for machine learning — is as old as data science itself. But it’s a skill I’ve noticed is becoming more and more neglected. The high demand for machine learning has produced a large pool of data scientists who have developed expertise in tools and algorithms but lack the experience and industry-specific domain knowledge that feature engineering requires. And they are trying to compensate for that with better tools and algorithms. However, algorithms are now a commodity and don’t generate corporate IP. Generic data is becoming commoditized, and cloud-based machine learning as a service (MLaaS) offerings like Amazon ML and Google AutoML now make it possible for even less experienced team members to run data models and get predictions within minutes. As a result, power is shifting to companies that develop an organizational competency in collecting or manufacturing proprietary data — enabled by feature engineering. Simple data acquisition and model building are no longer enough.
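To make the point concrete, here is a minimal sketch of domain-driven feature engineering. The scenario, data and thresholds are invented: a fraud analyst's knowledge that large night-time transactions are unusual gets encoded as features no generic tool would produce on its own:

```python
from datetime import datetime

# Raw transaction log: generic data anyone could collect.
transactions = [
    {"user": "u1", "amount": 120.0, "ts": datetime(2018, 10, 1, 3, 15)},
    {"user": "u1", "amount": 40.0,  "ts": datetime(2018, 10, 2, 14, 0)},
    {"user": "u1", "amount": 900.0, "ts": datetime(2018, 10, 2, 3, 45)},
]

def engineer_features(txns):
    """Turn domain knowledge into features: night-time spending patterns
    matter to a fraud analyst, so we compute them explicitly."""
    amounts = [t["amount"] for t in txns]
    night = [t for t in txns if t["ts"].hour < 6]   # "night" cutoff is illustrative
    return {
        "mean_amount": sum(amounts) / len(amounts),
        "max_amount": max(amounts),
        "night_txn_ratio": len(night) / len(txns),
        "max_night_amount": max((t["amount"] for t in night), default=0.0),
    }

features = engineer_features(transactions)
# e.g. features["max_night_amount"] == 900.0, a signal raw columns hide
```

The raw columns alone carry the same information, but only the engineered features put it in a form a model can use directly, which is where the proprietary value lies.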


How blockchain technology is transforming healthcare cybersecurity

An additional critical feature of blockchain technology is that every member of a blockchain generally can access and audit the entire ledger. This allows all interested parties to confirm and update the information contained in individual blocks. Another significant benefit is that laws and regulations can be programmed into the blockchain as smart contracts. Smart contracts are logical rules programmed into the blockchain. They are self-executing contracts where the built-in agreement is enforced on all members. Smart contracts mimic traditional contracts and laws, and can be used to program in obligations and consequences. In this way, the requirements of specific data privacy and security laws, such as the Health Insurance Portability and Accountability Act of 1996 or the European Union General Data Protection Regulation, can be embedded in the blockchain. Innovators are already experimenting with blockchain use cases in the healthcare context that demonstrate many of the blockchain security benefits.
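As a loose illustration of the self-executing idea, here is a minimal Python sketch of a smart-contract-style access rule. The roles, purposes and policy table are entirely hypothetical, and a real smart contract would live in chain code rather than ordinary application logic:

```python
# A toy "smart contract" enforcing a data-access rule on every ledger member.
# Policy loosely models the idea of embedding HIPAA/GDPR-style constraints
# in chain logic; the roles and purposes are invented for illustration.
POLICY = {
    "treatment": {"doctor", "nurse"},
    "billing": {"billing_clerk", "doctor"},
}

def access_contract(requester_role: str, purpose: str) -> bool:
    """Self-executing rule: grant access only if the role is allowed for the purpose."""
    return requester_role in POLICY.get(purpose, set())

log = []

def request_record(role: str, purpose: str) -> bool:
    granted = access_contract(role, purpose)
    log.append((role, purpose, granted))   # every decision is auditable by all members
    return granted

allowed = request_record("doctor", "treatment")     # permitted by policy
denied = request_record("marketing", "treatment")   # blocked by the same rule
```

The key property mirrored here is that the rule, once programmed in, applies identically to every member and leaves an auditable trail, which is what the excerpt means by enforcement being "built in".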


How To Integrate AI Into The Enterprise


Overcoming ignorance is a good place to start, and the tutorial given by Hammond was a pleasing break from many technology events that are largely attended by people from whatever discipline the event covers. Data scientists attend data events, roboticists attend robotics events, and so on. At the O'Reilly event, however, techies were in the minority, with most of the attendees from managerial functions. The session began by providing an overview of what AI is, with a whistle-stop tour of machine learning, and specifically the nature of learning itself, which feeds into the supervised, unsupervised and reinforcement learning models used by all machine learning systems today. Machine learning is, of course, just one aspect of AI, with McKinsey recently identifying five distinct forms, including physical AI, computer vision, natural-language processing, and machine learning. Understanding what each of these is, even on a basic level, can help you to make informed choices, and not be suckered in by hype.


Criminals' Cryptocurrency Addiction Continues

"With the increasing, malicious focus on cryptocurrency-related threats, attacks and exploits, it is clear that criminal innovation in this space continues unabated," Ferguson tells Information Security Media Group. "Starting from attacks targeting cryptocurrency wallets on individual users' machines - either directly or as an add-on to some widespread ransomware variants - attackers have rapidly diversified into direct breaches of cryptocurrency exchanges, malware for mining on traditional, mobile and even IoT devices, and developed attack methodologies specifically designed to target the mechanics of blockchain-based transactions, such as the 51 percent attack." The 51 percent attack gives attackers who can control more than 50 percent of a network's hash rate - or computing power - the power to reverse transactions on the blockchain or double-spend coins. The first half of this year saw five successful 51 percent attacks leading to "direct financial losses ranging from $0.55 million to $18 million," Moscow-based cybersecurity firm Group-IB says in a recently released cybercrime trends report.
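The arithmetic behind the 51 percent figure comes from the gambler's-ruin analysis in the original Bitcoin paper: an attacker with a minority of the hash rate has an exponentially shrinking chance of rewriting history, while one with a majority eventually succeeds with certainty. A small sketch of that formula:

```python
def catch_up_probability(q: float, z: int) -> float:
    """Probability that an attacker controlling a fraction q of the hash rate
    ever catches up from z blocks behind (gambler's ruin, as analysed in the
    Bitcoin whitepaper). With q >= 0.5, a 51 percent attack, success is certain."""
    p = 1.0 - q   # honest network's share
    if q >= p:
        return 1.0
    return (q / p) ** z

# Minority attacker: the chance of rewriting 6 confirmations fades fast.
minority = catch_up_probability(0.30, 6)   # (0.3/0.7)**6, well under 1 percent

# Majority attacker: guaranteed to outpace the honest chain eventually.
majority = catch_up_probability(0.51, 6)   # 1.0
```

This is why the attacks Group-IB describes required controlling more than half the network's computing power: below that threshold, deep reorganisations are statistically out of reach.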



Quote for the day:


"Leaders should influence others in such a way that it builds people up, encourages and edifies them so they can duplicate this attitude in others." -- Bob Goshen


Daily Tech Digest - October 14, 2018


According to the sources, global fintech companies reportedly sought an extension of the October 15 deadline, but it seems that the RBI is not inclined to relax the norms. Data localisation requires that data about residents be collected, processed and stored inside the country, often before being transferred internationally, and usually transferred only after meeting local privacy or data protection laws. Although domestic companies have welcomed the guidelines, global companies fear an increase in their expenses for the creation of local servers. To avoid this rise in cost, global companies, in a recent meeting with the RBI, proposed to provide mirror data instead of original data, a proposal to which the central bank did not agree, the sources said. Last week, Finance Minister Arun Jaitley met RBI Deputy Governor B P Kanungo to discuss RBI’s data localisation norms. The meeting was also attended by Economic Affairs Secretary Subhash Chandra Garg, Financial Services Secretary Rajiv Kumar and IT Secretary Ajay Prakash Sawhney.



The Data Quality Tipping Point

It’s clear that data is no longer simply harvested and stored. Data isn’t left to rest any longer. It is the lifeblood that flows through every department in the business. It’s not just the result of a decision: it’s the driving force for your next move. Old, inaccurate and messy data can’t support the marketing department. If the data is old, it cannot be used as a concrete and reliable resource. And if you aren’t continually cleaning new data that comes in, you can’t capitalise on trends, or make decisions on what is and isn’t working. So we’re clear that data quality initiatives must run in parallel to business activities, rather than being carried out sporadically, and there needs to be a constant and attentive process to keep data clean. That means there’s a need for an ongoing investment in data governance, within the parameters of your budget. Few businesses have the budget to put extravagant data management processes in place. It would be wonderful to conduct data reviews every morning, or to implement highly elaborate verification and enhancement programs.


Creating a Culture that Works for Data Science and Engineering


While both groups on the team are turning out great code, it’s challenging as a project manager to follow two different streams of work. Sometimes the two groups are working on similar things, but sometimes the data scientists are working on something in the very distant future for the engineers. The most important thing a cross-functional team can do is have everyone come to stand-up every day. When we first told the data scientists about our daily “meetings,” they went pale in the face. “Every day?” they asked, with a look of panic in their eyes. I stood firm. It was the right call. Our daily meetings allow the engineers on our team to quickly start working from an informed place when R&D introduces a new project. Furthermore, we are benefiting from the best parts of agile with this approach; I love hearing everyone bounce ideas off each other in stand-up. My favorite is when there’s a cross-functional “Ooo, did you think about taking this approach?” We work better as a team and we have found a way to leverage everyone’s expertise.



The tech supply chain is more vulnerable than ever


It’s a great business model — especially when you consider that only 38 percent of companies are actively monitoring and managing their software supply chain hygiene. Today, the game has changed. Organizations now must contend with the fact that hackers are intentionally planting vulnerabilities directly into the supply of open source components. In one such example from February 2018, a core contributor to the conventional-changelog ecosystem (a common JavaScript code package) had his commit credentials compromised. A bad actor, using these credentials, published a malicious version of conventional-changelog (version 1.2.0) to npmjs.com. While the intentionally compromised component was only available in the supply chain for 35 hours, estimates are that it was downloaded and installed more than 28,000 times. Some percentage of these vulnerable components were then assembled into applications that were then released into production. The result is that these organizations then unwittingly released a Monero cryptocurrency miner into the wild — and the perpetrators of the supply chain hack profited handsomely.
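One standard defence against this kind of supply-chain tampering is hash pinning: recording the digest of each reviewed artifact and refusing anything that doesn't match. The sketch below is illustrative only; the pinned digest is derived from a made-up byte string, not the real package, and real pins would come from a lockfile you generate and review:

```python
import hashlib

# Pinned digests for approved package versions (values here are fabricated
# for illustration; in practice they come from a reviewed lockfile).
PINNED = {
    ("conventional-changelog", "1.1.0"):
        hashlib.sha256(b"known-good-tarball").hexdigest(),
}

def verify(name: str, version: str, tarball: bytes) -> bool:
    """Refuse any artifact whose hash doesn't match the reviewed pin.
    A hijacked release, like the malicious 1.2.0, simply has no pin."""
    expected = PINNED.get((name, version))
    if expected is None:
        return False   # unpinned means untrusted
    return hashlib.sha256(tarball).hexdigest() == expected

ok = verify("conventional-changelog", "1.1.0", b"known-good-tarball")
bad = verify("conventional-changelog", "1.2.0", b"miner-payload")
# ok is True; bad is False
```

Tools like pip's `--require-hashes` mode and npm lockfiles implement this idea at scale; the point is that stolen publishing credentials alone cannot forge a matching digest for an artifact you have already pinned.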



How to use machine learning to build a predictive algorithm

You also have to make sure you're integrating not only data and platforms, but domain experts who bring invaluable information and skills to the data science team, according to David Ledbetter, a data scientist at Children's Hospital Los Angeles. "The machine learning community often isolates themselves and thinks they can solve all the problems, but domain experts bring value," Ledbetter said during a panel discussion at the AI World Conference & Expo in Boston in December. "Every time we meet with the clinical team, we learn something about what's going on with the data." The project team, with its mix of skills, also needs to identify good vs. bad outcomes based on the business problem you're trying to solve with a predictive algorithm. "It's important to set clear success criteria at the beginning of a project, and [to] pick something that has a reasonable likelihood of success," said William Mark, president of SRI International, a research and development firm that works on AI projects for customers, during the same panel discussion at AI World.
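The advice about setting success criteria up front can be sketched as follows. The thresholds and the tiny evaluation set are invented for illustration; the point is that the bar is agreed before modelling starts and the model is judged against it, not the other way around:

```python
# Success criteria agreed with domain experts *before* any modelling begins.
SUCCESS = {"min_recall": 0.80, "max_false_alarm_rate": 0.20}

def evaluate(predictions, labels):
    """Score binary predictions against the pre-agreed criteria."""
    tp = sum(1 for p, y in zip(predictions, labels) if p and y)
    fp = sum(1 for p, y in zip(predictions, labels) if p and not y)
    fn = sum(1 for p, y in zip(predictions, labels) if not p and y)
    tn = sum(1 for p, y in zip(predictions, labels) if not p and not y)
    recall = tp / (tp + fn) if tp + fn else 0.0
    far = fp / (fp + tn) if fp + tn else 0.0
    return {
        "recall": recall,
        "false_alarm_rate": far,
        "meets_criteria": recall >= SUCCESS["min_recall"]
                          and far <= SUCCESS["max_false_alarm_rate"],
    }

labels      = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
predictions = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]
report = evaluate(predictions, labels)
# recall is 3/4 = 0.75, so the model fails the agreed 0.80 bar
# even though its false-alarm rate is comfortably low
```

Having the criteria fixed in advance makes "good vs. bad outcomes" an explicit, testable decision rather than a post-hoc judgment.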


Cloud-agnostic container platforms – it’s all to play for

Container-as-a-service (CaaS) products from the major cloud vendors, notably AWS EKS and Fargate, Azure AKS and Container Instances and Google Cloud Container Engine, present classic trade-offs between convenience and dependence. With their ability to tap into a plethora of cloud data, security and developer services that are unique in implementation if not conception, container products from the big three vendors can trap users in a maze of platform dependencies with no easy exit path. As container use in the enterprise moves from developer sandboxes to production systems, the desire for multi-environment portability presents an opportunity to devise standards, software, and automation systems that facilitate platform-agnostic container platforms. The idea is to ensure easy migration between private and public container environments. Recent announcements from Cisco, Google, and Pivotal Software are important milestones on the road to platform agnostic container infrastructure.


Welcome to Banking-as-a-Service

The underlying theme of this kind of disruption is the unbundling of supply and service. Banking has come late to the unbundling revolution. But now, the sector is ripe for it - for unbundling, or disaggregation - and ripe for its own Software-as-a-Service transformation that will allow customers to pick and choose and pay for applications as they use them. Software-as-a-Service (SaaS) businesses delivered by APIs have a low-touch sales model. These companies don’t sell; buyers help themselves. Low-touch sales combined with recurring revenues and lack of customer concentration are the three hallmarks of a SaaS business. In many cases these businesses are just better in all senses. But combining these three essential ingredients on their own will not be enough. The winners in this field are likely to be nimble specialists capable of creating plug-in-and-play APIs to allow anything to be processed anywhere, rather than the large - slow - generalists of the past. Starling is well-placed in this regard. We have built Starling with a set of public APIs that are freely available for anyone to use through our developer portal. 


5 Tips to Boost Your Company's Digital Transformation With BPM


With tools such as artificial intelligence and machine learning, reams of data can be processed in the blink of an eye, providing insights into how an organization can better meet customer needs. Often, this optimization is a product of changes in business process management, or BPM. Even the most basic organizations function through processes. There might be a process for acquiring leads, a process for vetting them, and a process for making a sale. After you convert a prospect, there's a process for invoicing the customer, one for fulfilling the order, and one for delivering the product. There are also strictly internal processes, such as those triggered when employees ask for time off or request tech support. BPM refers to the management of these procedures, such as ensuring they are effective and determining how to combine them in the most efficient way. When implemented effectively, BPM helps organizations streamline their day-to-day processes, making work more efficient. But implementing BPM or other digital transformations without full buy-in from your team can lead to a lack of teamwork or other disadvantages. 


APIs In Banking: Unlocking Business Value With Banking As A Platform (BaaP)

Banking as a Platform (BaaP), sometimes referred to as Banking as a Service (BaaS), occurs when a bank acts as an infrastructure provider to external third parties. Variations include other banks white-labeling the BaaP platform for faster time to market, fintech firms leveraging the BaaP provider’s banking license to provision bank accounts, and banks and fintechs using the BaaP platform for testing purposes. Banks like CBW, Fidor, JB Financial, solarisBank, and wirecard built their BaaP architecture from scratch, without the constraint of legacy systems, creating modular application stacks broken into discrete services. The modular banking services on a BaaP platform serve as building blocks, accessible to third parties through an API management layer, where they can be mixed and matched to create new products and services tailored to the third party’s business model.


Life Is Dirty. So Is Your Data. Get Used to It.

As Dr. Hammond suggests, it's difficult to determine if data is ever clean. Even scientific constants have a degree of accuracy. They are "good enough," but not perfect. Data's ultimate purpose is to drive decisions. Bad data means bad decisions. As data professionals, it is up to us to help keep data "good enough" for use by others. We have to think of ourselves as data janitors. But nobody goes to school to become a data janitor. Let's talk about options for cleaning dirty data. Here's a handful of techniques that you should consider when working with data. Remember: all data is dirty, and you won't be able to make it perfect. Your focus should be making it "good enough" to pass along to the next person. The first thing you should do when working with a dataset is to examine the data. Ask yourself, "does this data make sense?" That's what we did in the example above. We looked at the first few rows of data and found that both the city and country were listed inside one column.
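A minimal sketch of the kind of cleanup described above: splitting a combined city/country column while staying "good enough" rather than perfect. The sample rows are invented, and note that the record with no country part is kept with an explicit gap rather than a guessed value:

```python
# Rows where "City, Country" was crammed into a single field, the kind of
# issue the "does this data make sense?" first pass surfaces.
rows = [
    {"location": "Paris, France"},
    {"location": "Toronto, Canada"},
    {"location": "Singapore"},   # no country part: keep it, don't guess
]

def split_location(row):
    """Split the combined field into city and country columns."""
    parts = [p.strip() for p in row["location"].split(",", 1)]
    row["city"] = parts[0]
    row["country"] = parts[1] if len(parts) == 2 else None   # good enough, not perfect
    return row

cleaned = [split_location(dict(r)) for r in rows]
# cleaned[0] gains city "Paris" and country "France";
# cleaned[2] keeps country as None for a human (or a later rule) to resolve
```

Leaving the unresolvable value as `None` instead of inventing one is exactly the "good enough to pass along" standard the excerpt argues for.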



Quote for the day:


"Courage is more exhilarating than fear and in the long run it is easier." -- Eleanor Roosevelt