Daily Tech Digest - August 26, 2018

There’s good reason for Gartner’s confidence. AI has been moving fast and is already a key area of research and development for many organizations. The fruits of these efforts can already be seen in algorithms that influence things such as social media feeds, autonomous vehicles, apps and even some call centers. That rapid progress means we’ll start seeing AI technology appear “virtually everywhere” over the next 10 years as it becomes available to the masses. Gartner says movements and trends such as cloud computing, open-source technology projects and the “maker” community are fueling AI’s rise. It adds that AI is most prevalent in technologies such as AI platform as a service, artificial general intelligence, autonomous driving, autonomous mobile robots, flying autonomous vehicles, smart robots, conversational AI platforms, deep neural networks and virtual assistants.



What It Takes To Disrupt A Massive Industry

The strategy involves finding a gap in an incumbent’s product line and creating something that will augment it. This essentially makes you a friend, not a foe. To this end, Nir initially built an analytics layer on top of existing platforms to help with cyber fraud. “And it worked out,” he said. “We became partners with the major players. We talked to their partners, sold to their customers.” But of course, this strategy must go to the next level if a company is to get to scale. “We poured the profits from the helper app back into engineering, building our next-gen SIEM platform,” said Nir. “Not only did the sales fund development, but we had access to customers who were using all the major SIEM platforms. We had a front-row seat and saw all the problems these customers were having with the legacy products. Needless to say, the big SIEM vendors weren’t pleased, but we had enough sales and momentum to go it alone. Also, we knew from experience how much better our next-gen platform was. When we launched our SIEM, we already had customers committed.”


Is it time to automate politicians?


A robot could take over every politician’s favourite task of cutting ribbons to inaugurate new buildings. We already cede decision-making responsibility on health and finances to algorithms, so why not with voting? An automated democracy could replace both politicians and ballot boxes. That may be extreme. Yet, comical though it sounds, parts of our politics have already been technified. Consider reach. Both Narendra Modi, India’s prime minister, and the French presidential candidate Jean-Luc Mélenchon have beamed holograms of themselves to speak to several groups of thousands of people simultaneously. Next, there’s the message. In America’s 2016 election, candidates used social-media advertising to target different voters with different messages. The growing automation of our government is no longer sci-fi. Instead, it’s a reality we are only beginning to grasp. So to the question, can we replace politicians with robots? The answer is a soft yes.


Cyber Resilience: Where do you rank?


It is imperative for businesses to be proactive rather than reactive when it comes to cyber security. Businesses need to ensure every part of their enterprise is protected, not only from outsiders but also from employees, contractors and supply chain partners. A small vulnerability can be hugely detrimental to a business’s cyber security. An effective method of preventing cyber attacks is to develop a culture of resilience within a business. Cyber security should not be the exclusive domain of the IT department; it has companywide consequences, and it should be the responsibility of the C-suite to drive a cyber-safe culture within their organisations. As cyber crime presents itself in a variety of forms, businesses can combat the risk of a cyber attack by training staff to spot potential cyber attacks, like phishing emails, establishing password strength and change requirements, mandating software updates and data back-ups to secure their data, and restricting what data employees can access and share.


How artificial intelligence is transforming the financial ecosystem

Artificial intelligence is fundamentally changing the physics of financial services. It is weakening the bonds that have held together the component parts of incumbent financial institutions, opening the door to entirely new operating models and ushering in a new set of competitive dynamics that will reward institutions focused on the scale and sophistication of data much more than the scale or complexity of capital. A clear vision of the future financial landscape will be critical to good strategic and governance decisions as financial institutions around the world face growing competitive pressure to make major strategic investments in AI and policy makers seek to navigate the challenging regulatory and social uncertainties emerging globally. Building on the World Economic Forum’s past work on disruptive innovation in financial services, this report provides a comprehensive exploration of the impact of AI on financial services.


Predictive Analytics: The Future of Financial Marketing


As we move from traditional analytics to predictive analytics, we can leverage new technology to deliver marketing messages to customers. Beyond direct mail, email, and even digital marketing, new touchpoints such as chatbots and voice-first interactive assistants will provide new ways to engage with a consumer. “Artificial intelligence (AI) that is fueled by predictive analytics, machine learning, and natural language processing will be the brains behind the face,” states Aite Group. Predictive analytics is the future of financial institution marketing, predicting when a consumer will experience a life event or need a financial service solution. This advanced form of needs analysis, once only available to the largest organizations, is now financially and operationally available to organizations of all sizes. The combination of predictive analytic tools and advanced digital delivery options can guide the customer to the best financial solution at the most opportune time … sometimes before the consumer even realizes they have a need.


New DevOps Study Offers A Reality Check for Financial Services

Despite reasons for optimism, there are still potential hurdles to making the most of DevOps practices. Among those whose businesses have already migrated to DevOps, 71 percent of respondents claimed to have experienced challenges. In 26 percent of cases, IT leaders found that the operating teams were limiting the transition to DevOps. Another 26 percent reported difficulty due to a management structure lacking clear business objectives, which made defining a DevOps strategy difficult. The survey’s results paint a clear picture: Organizations need to have a unified approach in order to make the most of their DevOps goals. Speaking about areas where financial services organizations should focus, Hayes-Warren stated that IT leaders should lead the drive to automate processes and applications by leveraging the cloud. Furthermore, companies need to focus on their culture to ensure they’re adopting organization-wide practices that integrate members of their IT departments.


Will Machine Learning AI Make Human Translators An Endangered Species?


Training a neural machine to translate between languages requires nothing more than feeding a large quantity of material, in whichever languages you want to translate between, into the neural net algorithms. To adapt to this rapid transformation, One Hour Translation has developed tools and services designed to distinguish between the different translation services available, and pick the best one for any particular translation task. "For example, for travel and tourism, one service could be great at translating from German to English, but not so good at Japanese. Another could be great at French but poor at German. So we built an index to help the industry and our customers. We can say, in real time, which is the best engine to use, for any type of material, in any language." This work – comparing the quality of NMT-generated translations across engines – gives a clue as to how human translators could see their jobs transform in coming years.
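To make the idea of such an index concrete, here is a minimal sketch in Python; the engine names, domains and scores are hypothetical, and the real One Hour Translation index would be fed by continuous quality evaluation of each engine's output:

    # Hypothetical quality scores keyed by (domain, source language, target language).
    QUALITY_INDEX = {
        ("travel", "de", "en"): {"engine_a": 0.91, "engine_b": 0.74},
        ("travel", "ja", "en"): {"engine_a": 0.62, "engine_b": 0.88},
        ("legal",  "fr", "en"): {"engine_a": 0.85, "engine_b": 0.80},
    }

    def best_engine(domain, src, tgt):
        """Pick the best-scoring engine for this material, in real time."""
        scores = QUALITY_INDEX.get((domain, src, tgt))
        if not scores:
            return "human_review"  # no evaluation data: fall back to a human translator
        return max(scores, key=scores.get)

Here best_engine("travel", "de", "en") routes German travel material to engine_a while Japanese travel material goes to engine_b – the per-material, per-language routing decision the article describes.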


Fintech Without Borders: Regulators Consult on Global Financial Innovation Network

The GFIN Regulators are encouraging responses from “innovative financial services firms, financial services regulators, technology companies, technology providers, trade bodies, accelerators, academia, consumer groups and other stakeholders keen on being part of the development of the GFIN.” Firms should submit responses to GFIN@fca.org.uk. Feedback submitted to this email address will be shared among the GFIN Regulators unless a firm specifically states otherwise. Alternatively, firms may provide feedback or arrange to discuss the Consultation with one or more particular GFIN Regulators. Contact details for these purposes are provided in the Consultation. A regulatory sandbox is a platform for firms to test innovative new products, services or business models on a limited scale before a full launch. A number of national regulators, including the FCA, the HKMA and the MAS, have developed such sandboxes.


How ‘Similar-Solution’ Information Sharing Reduces Risk at the Network Perimeter


Even when information is shared, it’s typically between identical solutions deployed across various sites within a company. While this represents a good first step, there is still plenty of room for improvement. Let us consider the physical security solutions found at a bank as an analogy for cybersecurity solutions. A robber enters a bank. The cameras don’t detect the intruder, who wears casual clothes and nothing identifying him or her as a criminal. The intruder goes to the first teller and asks for money. The teller closes the window. Next, the robber moves to a second window, demanding money, and that teller closes the window. The robber moves to the third window, and so on, until all available windows are closed. Is this the most effective security strategy? Wouldn’t it make more sense if the bank had a unified solution that shared information and shut down all of the windows after the first attempt?



Quote for the day:


"Added pressure and responsibility should not change one's leadership style, it should merely expose that which already exists." -- Mark W. Boyer


Daily Tech Digest - August 25, 2018


Biometrics aren't just being used at border control. Sydney Airport has announced it's teaming up with Qantas, Australia's largest airline, to use facial recognition to simplify the departure process. Under a new trial, passengers on select Qantas international flights can have their face and passport scanned at a kiosk when they check in. From then on, they won't need to present their passport to Qantas staff -- they'll be able to simply scan their face at a kiosk when they drop off luggage, enter the lounge and board their flight at the gate. Travellers will still need to go through regular airport security and official immigration processing, but all of their dealings with Qantas can be handled with facial recognition. "Your face will be your passport and your boarding pass at every step of the process," Geoff Culbert, Sydney Airport CEO, said of the new development. 


Google just gave control over data center cooling to an AI


Now, Google says, it has effectively handed control to the algorithm, which is managing cooling at several of its data centers all by itself. “It’s the first time that an autonomous industrial control system will be deployed at this scale, to the best of our knowledge,” says Mustafa Suleyman, head of applied AI at DeepMind, the London-based artificial-intelligence company Google acquired in 2014. The project demonstrates the potential for artificial intelligence to manage infrastructure—and shows how advanced AI systems can work in collaboration with humans. Although the algorithm runs independently, a person manages it and can intervene if it seems to be doing something too risky. The algorithm exploits a technique known as reinforcement learning, which learns through trial and error. The same approach led to AlphaGo, the DeepMind program that vanquished human players of the board game Go.


Overlook 5G security at your peril


Attacks can come in many different shapes and sizes: user malware, fraudulent calls, spam, viruses, data and identity theft, and denial of service, to name a few examples. The rise in security threats is partly due to the growing deployment of carrier Wi-Fi access infrastructures and small cells in public areas, offices and homes, and will increase exponentially with M2M. Historically, carrier-grade telecom networks have had an excellent record for user and network security; however, today’s communications infrastructure is far more vulnerable than its predecessors. And with security threats constantly evolving, service providers must invest in the right tools to keep on top of the issue. These increasing security risks are due to the move to the IP-centric LTE architecture. The flatter architecture is what exposed 4G networks, because there are fewer steps to the core network, and this will continue to be an issue with 5G networks.


Companies lack leadership capabilities for digital transformation projects

Yet, even after years of exponential growth in the digital and digital consulting arenas, new Capgemini research shows that the implementation of digital transformation projects is still in its nascent stages. According to the responses of more than 1,300 business leaders from some 750 organisations, only a relatively small number of companies have the digital (39%) and managerial (35%) capacities needed to make their digital transformation successful. While the fact that these figures remain below 50% is surprising, what is even more shocking is that, compared to exactly the same measurement six years ago, there has actually been a decline in firms’ general readiness for digital transformation. Capgemini found that organisations today feel less equipped with the right leadership skills: 35% in 2018, down from 45% in 2012. According to Vincent Fokke, Chief Technology Officer at Capgemini in the Benelux, this is an important point to note.


AI and Robots: Not What You Think


Depending on what you read – and choose to believe about what you read – AI-driven robots are able to autonomously make decisions about what work gets done, how it gets done and who does it, or there are decades of work yet to be done before we see a material impact. Personally, I think we’re somewhere in the middle, as manufacturers – pragmatists that they are – design and implement manufacturing strategies in a very deliberate way to achieve business requirements and then focus ongoing efforts on making key processes better and better. And I think that collaborative robots (cobots) will play a larger and larger role in accelerating progress. The AI that cobots possess makes them so much more than just machines for dirty, dull and dangerous work. So let the world watch and wait for artificial intelligence that will enable wholesale change in how we drive, care for our aged, teach our children and more. Manufacturers don’t have to wait for artificial intelligence-driven robots to help them make their operations better.


Serverless vs. Containers

Debate about serverless vs. containers often starts with control, or the lack thereof in the case of serverless. This is not new. In fact, I clearly remember the same debates around control when AWS was starting to gain traction way back in 2009. Now, 10 years later, the dust has settled on that original debate, but we have failed to learn our lesson. It's human nature to want control, but how much are you willing to pay for it? Do you know the total cost of ownership (TCO) you will take on? The ability to control your own infrastructure comes with a lot of responsibilities. To take on these responsibilities, you need to have the relevant skill sets in your organization. That means salaries (easily the biggest expense in most organizations), agency fees, and taking time away from your engineers and managers for recruitment and onboarding. Given the TCO involved, the goal of having that control has to be to optimize for something (for example, to achieve predictable performance for a business-critical workflow), not to have control for its own sake.


The Evolution of Internet of Things: New Business Models Need Interoperability


The predicted rate of connected-device growth often cited by Gartner, Deloitte and others is based upon the proliferation of data and the effect this rate of growth will have on businesses and the number of new businesses that will be created. But, if the current trend of siloed, single-use-case IoT solutions continues, these predictions for connected-device growth may not be realised. Open APIs between product and service providers are the key technology for resolving this issue. ... That is simply too expensive and time-consuming, particularly for smaller businesses, to maintain. Simply managing the connection with one partner will result in maintenance costs that, for a business with tight margins, might not be viable to sustain. Gartner predicted that 75 per cent of IoT projects will take twice the time allocated, because of the increasing complexity associated with developing this connectivity. So what is the solution?


Streamlining Data Science and Analytics Workflows for Maximum ROI

Despite the multitude of tasks associated with the data science position, its basic workflow (in terms of analytics) is readily codified into three steps. The first is data preparation, or data wrangling, where the data scientist starts with raw data and “just tries to make sense of it before they’re doing anything real with it,” Mintz explains. “Then there’s the actual model building when they’re building a machine learning model. Assuming they find something valuable, there’s getting that insight back into the hands of the people who can use it to make the business run better.” Typically, data scientists approach building a new analytics solution for a specific business problem by accessing raw data from what might be a plethora of sources. Next, they engage in a lengthy process to prepare the data for consumption. “So much time and energy goes into that,” says Mintz. “You look at the surveys of data scientists and they say 70-80% of my time goes to data cleaning.”
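As a rough sketch, the three steps Mintz describes might look like this in Python; the file names, columns and model choice are all hypothetical, and real pipelines are far messier (hence the 70-80% figure for cleaning):

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    # 1. Data preparation / wrangling: pull raw data from (possibly many)
    #    sources and make sense of it before doing anything "real" with it.
    raw = pd.read_csv("transactions.csv")        # hypothetical raw source
    clean = raw.dropna().drop_duplicates()       # stand-in for much heavier cleaning

    # 2. Model building: fit a machine learning model on the prepared data
    #    (assumes the remaining features are already numeric).
    X, y = clean.drop(columns=["churned"]), clean["churned"]   # hypothetical label
    model = RandomForestClassifier().fit(X, y)

    # 3. Delivery: get the insight back into the hands of the people who can
    #    use it - here a scored file, in practice an API, dashboard or CRM feed.
    clean.assign(risk=model.predict_proba(X)[:, 1]).to_csv("scores.csv", index=False)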


Stranded and in Need of Rescue: Your Enterprise Data


In today’s enterprise, it is still very common for data to be stored disparately in any number of locations and systems. Getting to a single version of the truth is virtually impossible to achieve with siloed data, and different areas of the business act and operate in different ways, depending upon which version of the truth they are subscribed to or have access to. In fact, in an upcoming report released later this month, IDC names data siloes as the number one challenge for Digital Transformation (DX). That’s because isolated data leads inexorably to isolated working practices, and those are the antithesis of an integrated strategy, which is what DX is all about: integration. No wonder, then, that figures such as those produced by Harvard Business Review and Forbes show that nearly two-thirds of DX initiatives are failing. Optimal use of information is a Critical Success Factor for today’s enterprise.


Network technologies are changing faster than we can manage them

Data breach and user experience are the two biggest network worries. About 33 percent of network professionals said a data breach worries them the most about their network. Given the almost daily data breaches, who can blame them? In an ideal world, network managers would like to see tools that combine network and security management. However, only about 40 percent of respondents said their organization was using the same stack of tools to manage both network performance and security. But network pros are also being overwhelmed by the huge proliferation of cloud and network management tools. Many organizations are trying combinations of tools to manage the challenge. Network traffic analytics appears to be the most commonly used, with just over 28 percent of network professionals using it to manage their network challenges.



Quote for the day:


"If you don’t have some self doubts and fears when you pursue a dream, then you haven’t dreamed big enough." -- Joe Vitale


Daily Tech Digest - August 23, 2018

Google Home at Work
It wouldn't make sense for every office environment, of course; having such a gadget in a crowded cubicle farm would probably lead to more annoyances (not to mention mischievous co-worker interference) than anything. But if you have a relatively isolated space in which you work, be it your own executive suite (look at you!) or a more humble home office (like mine), you might be surprised at how handy a Google Home or Smart Display could be. Now, is there a fair amount of overlap between what a Google Home or Smart Display on your desk can do and what you could already do with your phone? You'd better believe it. But performing a task on a permanent, stationary device can often be easier and more effective than futzing around with your phone. Using a smart speaker also doesn't wear down your precious mobile battery, and the device's standalone nature makes it better suited for certain types of tasks.


Service mesh architecture radicalizes container networking

A service mesh architecture uses sidecar containers to facilitate network traffic
To Thomas, true microservices are as independent as possible. Each service handles one individual method or domain function; uses its own separate data store; relies on asynchronous event-based communication with other microservices; and lets developers design, develop, test, deploy and replace this individual function without having to redeploy any other part of the application. "Plenty of mainstream companies are not necessarily willing to invest quite that much time and money into their application architecture," Thomas contended. "They're still doing things in a more coarse-grained manner, and they're not going to use a mesh, at least until the mesh becomes built into the platform as a service that they're using, or until we get brand-new development frameworks." Some early adopters of the service mesh architecture don't believe a slew of microservices is necessary to benefit from the technology. "It allows you to push traffic around in a centralized way that's consistent across many different environments and technologies, and I feel like that's useful at any scale," said Zack Angelo, director of platform engineering at BigCommerce, an e-commerce company based in Austin, Texas, that uses the Linkerd service mesh. "Even if you have 10 or 20 services, that's an immensely useful capability to have."


Redefining work in the digital age

To ensure a successful transition, experts say organizations must figure out the right intersection of humans and intelligent machines. Fifty-four percent of those surveyed by Accenture said human-machine collaboration is important to achieving strategic priorities, while 46 percent believe traditional job descriptions are now obsolete and 29 percent have already redesigned job roles extensively. “We’ve never seen change like this,” says Katherine Lavelle, managing director of Accenture’s Strategy, Talent & Organization practice in North America. “This is about generating new levels of capabilities and results for clients and customers augmented through smart automation and humans. Whoever figures out the collaboration between the two is poised to win the war.” Training and reskilling workers will be essential to creating an enhanced employee experience that redefines the nature of work. “In some ways, we’ll go back to the basics on things we put a value on prior to automation,” Lavelle says.


7 steps to better code reviews

Code review has been demonstrated to significantly speed up the development process. But what are the responsibilities of the code reviewer? When running a code review, how do you ensure constructive feedback? How do you solicit input that will expedite and improve the project? Here are a few tips for running a solid code review. ... Try to get to the initial pass as soon as possible after you receive the request. You don’t have to go into depth just yet. Just do a quick overview and have your team write down their first impressions and thoughts. Use a ticketing system. Most software development platforms facilitate comments and discussion on different aspects of the code. Every proposed change to the code is a new ticket. As soon as any team member sees a change that needs to be made, they create a ticket for it. The ticket should describe what the change is, where it would go, and why it’s necessary. Then the others on your team can review the ticket and add their own comments.


How advanced OCR found new life in big data systems

One reason that OCR was rarely used until recently is that it wasn’t especially reliable. Even when, in the early 2000s, the programs reached about 95% accuracy, businesses ran the risk that software would produce documents containing major mistakes – and particularly with numerals, such errors can be labor intensive to identify and correct. Analysts would do just as well entering the data by hand. However, now that the scan accuracy is significantly improved, the resultant data is more valuable, and analysts need only cross-reference the scans with original documents if something in the content doesn’t make sense. NLP has also helped increase the accuracy of OCR scans. For example, older OCR programs might read chart lines as the letter ‘L’ or number ‘1.’ NLP is context dependent, however, so it can identify if something is a chart or graph, whether it’s reading a bill or an invoice, and other types of nuanced content.


Climb the five steps of a continuous delivery maturity model


A maturity model describes milestones on the path of improvement for a particular type of process. In the IT world, the best known of these is the capability maturity model (CMM), a five-level evolutionary path of increasingly organized and systematically more mature software development processes. The CMM focuses on code development, but in the era of virtual infrastructure, agile automated processes and rapid delivery cycles, code release testing and delivery are equally important. Continuous delivery (CD), sometimes paired with continuous integration to make CI/CD, is an automated process for the rapid deployment of new software versions. A complicated process, CD includes several steps that span multiple departments. CI/CD and DevOps can prove daunting to organizations that view modernization as a dichotomy: Either you're DevOps or you're legacy. But continuous delivery is an efficiency improvement that can evolve in stages.


Analysis: Anthem Data Breach Settlement

"Credit monitoring itself as an award is frankly not that effective, at least in my personal view," DeGraw, who was not involved in the Anthem case, says in an interview with Information Security Media Group. "A persisting problem is that post-breach, [bad actors] can still potentially use the stolen records, including medical information, to cause harm." A more affective approach for most consumers, DeGraw says, is to put a credit freeze on their accounts "which is a bit more cumbersome at times ... but that's a more effective remedy." For breach victims, "there is no easy way to clean up your life," the attorney says. "You have a fair number of out-of-pocket costs, including taking a day off [from work] to file a report ... and maybe hire people to clean up your accounts and other things that have been opened in your name. It can be a hassle and it's time-consuming and it doesn't go away soon because we can't change our Social Security numbers or healthcare numbers relatively easily."


Testing Programmable Infrastructure - a Year On


Worse than the technical challenges, we faced cultural challenges too. Sysadmins and testers aren't used to working with one another! The project made it very clear to me that programmable infrastructure is becoming widespread. There are very specific domain issues that make testing it tricky. But it felt like nobody had the answers. Infrastructure resources are critical to successful software. If there's a problem with your database or your load balancer, it now could be due to committed code. That code is production code, so we should test it! Over a year has passed since I first presented that talk. Even longer since the project which inspired me to present it. I have been on a number of other projects since, and my thinking has changed too. ... When testing anything new, it’s important to revisit fundamentals. When I first gave my talk, I focused a lot on how we tested it, but not a lot on why we tested it. I cannot tell you what your cloud infrastructure landscape looks like. The topic is very broad and fast-changing.


Microsoft Office 365 Turns Data Storage Upside Down

In addition to syncing and storing your own private files, the OneDrive for Business client can also sync corporate data stored elsewhere in SharePoint. So this client provides access to files in both locations. Best of all, you can choose what you “see” in your OneDrive for Business client and what you are going to sync locally to your computer. To muddy the waters just a bit more, Microsoft recently announced that OneDrive for Business will soon start to offer the option to automatically sync your local profile default data locations, such as the documents and pictures folders. And it will also have one-button ransomware protection for your files. So now we’re storing personal data, bits of the user profile, and we’re syncing locally some or all of the data in SharePoint. But we’re still upside down from how business has historically stored data, because our corporate space is smaller than the personal space.


How security binding choices impact everything in global file search

Few non-software architects understand the outsized impact that security binding has on global file search performance, scalability, multi-tenancy, hardware, supporting infrastructure, capital expenditures, operating expenditures and the total cost of ownership. The wrong security binding choice can add hundreds of thousands to millions of dollars to the TCO. From additional expensive hardware, supporting infrastructure, maintenance, software licensing, training, power, cooling, shelf space, rack space, cables, conduit, transceivers and allocated overhead, the costs can be shockingly high. When we examine the pros, cons, tradeoffs, consequences, and workarounds for each of the three security binding choices – late binding, early binding, and real-time binding – we find that real-time binding provides the performance of early binding with the accuracy of late binding.
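The contrast is easiest to see side by side. Here is a rough Python sketch of the first two choices, using a hypothetical search index and ACL service; real-time binding, per the article, aims to combine the speed of the first with the accuracy of the second by checking cached entitlements against live identity data at query time:

    def early_binding_search(index, query, user):
        # Permissions were stamped onto each document when it was indexed:
        # filtering is cheap, but results go stale if ACLs change later.
        return [doc for doc in index.search(query)
                if user in doc.acl_recorded_at_index_time]

    def late_binding_search(index, query, user, acl_service):
        # Every candidate hit is checked against the live ACL source:
        # always accurate, but each check adds latency, load and cost.
        return [doc for doc in index.search(query)
                if acl_service.can_read(user, doc)]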



Quote for the day:


"Good leaders must first become good servants." -- Robert Greenleaf


Daily Tech Digest - August 22, 2018


I have been in the space of artificial intelligence for a while and am aware that multiple classifications, distinctions, landscapes, and infographics exist to represent and track the different ways to think about AI. However, I am not a big fan of those categorization exercises, mainly because I tend to think that the effort of classifying dynamic data points into predetermined fixed boxes is often not worth the benefits of having such a “clear” framework. I also believe this landscape is useful for people new to the space to grasp at-a-glance the complexity and depth of this topic, as well as for those more experienced to have a reference point and to create new conversations around specific technologies. What follows is then an effort to draw an architecture to access knowledge on AI and follow emergent dynamics, a gateway of pre-existing knowledge on the topic that will allow you to scout around for additional information and eventually create new knowledge on AI. I call it the AI Knowledge Map (AIKM).




Using innovation labs and accelerators as a form of R&D to learn about certain industries is a great idea, as long as leaders realise that R&D and innovation are not the same thing. Innovation is the combination of clever new ideas and technologies with sustainably profitable business models. So the question still remains – as we work with startups or internal teams to learn about new industries, how are we going to convert those learnings into long-term revenues for the company? We have to design our labs and accelerators to be able to extract insights and create value. Other companies are very explicit about using innovation labs to accomplish their bottom-line goals. These leaders are focused on balancing their portfolio, adding new business models and revenues to the company. The biggest challenge these leaders face is what to do with successful innovations. Not every product or service from the innovation lab or accelerator will be successful. But once we have something promising, we need to figure out a way to scale that product or service.



The Case for Work from Home and Flexible Working


To attract and retain employees under the old paradigm, the business must deliver an EVP that provides career opportunity and professional development, regularly sign-posted by role expansion and salary growth. ... An alternative operating model is required for the front line. An operating model that supports a new recruitment promise based on the provision of flexible working arrangements that allows employees to manage their work-life priorities. ... The flexible working operating models will be designed to reflect our evolving understanding of what motivates employees; this also forms an important part of the EVP. It is based on engaging the intrinsic motivators of autonomy, mastery and purpose. ... Flexible working enables the front line to be deployed dynamically to where customers choose to be, whether that be in store, online or on the phones. Creating opportunities to vary not just “where and when I work” but also “what I do” empowers employees to genuinely design their own work experience. Role variation provides a unique and highly competitive dimension to the EVP, and it enhances the business’s resilience to uncertainty.


Data management: Using NoSQL to drive business transformation

Because of our core architecture, it is very important to our customers, who are deploying applications, that they can do so at near real-time speed. So, when we think about the capabilities that we have layered together inside of our data platform, we're unlocking the power of NoSQL, but doing so in a way that enables application developers to very quickly learn the platform, and helps them become efficient in picking up applications to take advantage of it. Now, that core platform can run at any point in the cloud -- everything from the major public clouds to customers' private data centres -- and it can also run on premise. Now we've extended the power of the platform out to the edge. We have a solution that we have called Couchbase Lite. This is small enough that it can run inside an application on a mobile device, and you still get the full power of the platform, including the data structure and our ability to query and, very soon, you will be able to run operational analytics on top of that.


Balancing innovation and compliance for business success


Whether it is the need for greater transparency with user data, improved reporting methods for the regulator or enhanced security measures, any new technology being introduced will need to be carefully assessed so that businesses recognise and understand whether it is compliant with current legislation. While staff trials can often help to raise any last-minute concerns about the functionality of new IT solutions, management also needs to include the IT and compliance teams in this activity. In many cases, the IT department is left out of discussions regarding data management and compliance, making it hard for them to identify any potential conflicts in this area. In order to address this issue, IT needs to have a greater understanding of the wider business. In particular, the IT department needs to be as involved in the company’s wider compliance measures as it is with particular applications or systems, as this will make it much easier to establish what controls need to be put in place.


It’s Time for Token Binding

What is so great about token binding, you might ask? Token binding makes cookies, OAuth access tokens and refresh tokens, and OpenID Connect ID Tokens unusable outside of the client-specific TLS context in which they were issued. Normally such tokens are “bearer” tokens, meaning that whoever possesses the token can exchange the token for resources, but token binding improves on this pattern by layering in a confirmation mechanism to test cryptographic material collected at the time of token issuance against cryptographic material collected at the time of token use. Only the right client, using the right TLS channel, will pass the test. This process of forcing the entity presenting the token to prove itself is called “proof of possession”. It turns out that cookies and tokens can be used outside of the original TLS context in all sorts of malicious ways, whether via hijacked session cookies, leaked access tokens, or sophisticated MiTM attacks. This is why the IETF OAuth 2 Security Best Current Practice draft recommends token binding, and why we just recently doubled the rewards on our identity bounty program.
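A loose Python sketch of that confirmation pattern, modeled on the proof-of-possession "cnf" claim; it assumes the TLS stack exposes the client's token-binding public key, and it glosses over the real protocol's signing of channel-specific keying material:

    import hashlib, hmac

    def issue_bound_token(claims, tb_public_key):
        # At issuance: record a confirmation hash of the client's
        # token-binding public key, as presented on this TLS connection.
        claims["cnf"] = hashlib.sha256(tb_public_key).hexdigest()
        return claims

    def verify_binding(claims, presented_key):
        # At use: the presenter must show the same key on the current TLS
        # channel; a mismatch means the token left its original context.
        expected = claims.get("cnf", "")
        actual = hashlib.sha256(presented_key).hexdigest()
        return hmac.compare_digest(expected, actual)

A stolen token fails verify_binding on any other connection, because the thief cannot present the original client's key on its own TLS channel.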


Artificial General Intelligence Is Here, and Impala Is Its Name


AGI is a single intelligence or algorithm that can learn multiple tasks and exhibits positive transfer when doing so, sometimes called meta-learning. During meta-learning, the acquisition of one skill enables the learner to pick up another new skill faster because it applies some of its previous “know-how” to the new task. In other words, one learns how to learn — and can generalize that to acquiring new skills, the way humans do. This has been the holy grail of AI for a long time. As it currently exists, AI shows little ability to transfer learning towards new tasks. Typically, it must be trained anew from scratch. For instance, the same neural network that makes recommendations to you for a Netflix show cannot use that learning to suddenly start making meaningful grocery recommendations. Even these single-instance “narrow” AIs can be impressive, such as IBM’s Watson or Google’s self-driving car tech. 


David Chamberlain, general manager of Licensing Dashboard, says IT can sometimes be over-cautious. “IT people can be worried about things like an Exchange server going down, so there is a tendency to over-provision, and then people will forget to decommission the cloud service when it is no longer needed,” he says. “The cloud is very elastic and is easy to throttle up, which is a big change from on-premise servers.” For Witt, virtual machine (VM) sprawl has always been an issue on-premise, even when companies have had good processes in place. “It is always easier to spin a VM up than it is to decommission it,” he says. “Most datacentres will have a significant proportion of unused VMs – in my experience, it’s around 30-40%.” While on-premise only a fraction of storage and compute is consumed by these unused VMs, in the cloud the VM is charged per second, he points out. “You’re charged for the VM size regardless of whether it is fully utilised,” he says.


Reprogrammable quantum computers are the "ultimate goal" of current research.
The ultimate goal of quantum information programming – a device capable of being reprogrammed to perform any given function – is one step closer following the design of a new-generation silicon chip that can control two qubits of information simultaneously. The invention, by a team led by Xiaogang Qiang from the Quantum Engineering Technology Labs at the University of Bristol in the UK, represents a significant step towards the development of practical quantum computing. In a paper published in the journal Nature Photonics, Qiang and colleagues report proof-of-concept of a fully programmable two-qubit quantum processor “enabling universal two-qubit quantum information processing in optics”. The invention overcomes one of the primary obstacles facing the development of quantum computers. Using current technology, operations requiring just a single qubit (a unit of information that is in a superposition of simultaneous “0” and “1”) can be carried out with high precision.


How data breaches are affecting the retail industry

From phishing, vishing and smishing to acquiring consumers’ identification details, or full-blown criminal hacking, the flow of fresh news stories detailing the latest attacks clearly demonstrates the scale of this growing issue. Indeed, such are the risks of data breaches that they are no longer viewed as IT issues, but organisational issues that can derail day-to-day operations and have long-term reputational impact. So, what are the real business costs of a data breach? According to the 2018 Cost of a Data Breach Study by Ponemon Institute, the average cost of a data breach is $3.86 million, which is a 6.4% increase on the 2017 cost of $3.62 million. ... The harsh reality is that no organisation can ever deem itself completely safe and at zero risk of a data breach. However, what you can – and should – do is take a critical look at your infrastructure, processes, systems and controls, and ensure that you have taken steps to address risks and know what to do if you suffer a breach.



Quote for the day:


"Not all readers are leaders, but all leaders are readers." -- Harry S. Truman


Daily Tech Digest - August 21, 2018

Google and banks are being less than truthful about customer tracking

At least the banks, as far as I can tell, didn't say that they weren't tracking people. They merely said nothing either way. But bank app developers need to remember that banks are in a much more precarious position than Google, and they need to at least pretend to be trustworthy in a much more public fashion. Why? Google is still the most effective and comprehensive search engine on the planet. I'd love to be able to say that DuckDuckGo or other privacy-oriented engines are as good or better, but based on daily testing, Google still comes out far ahead. Bing, Yahoo and others long ago lost the search battle to Google. That means that an annoyed Google user can't leave Google without losing some serious search functionality. And on an Android phone, the reliance is even deeper and better integrated. But banks? Not even close. Disgruntled customers can easily take their money and data and move to the rival bank across the street, and they will likely suffer no disruption or degradation of services.



If you lose your Android phone or decide to move to another, there's a decent chance your existing text messages will vanish into the digital ether. That might be fine (and hey, who knows, maybe even a positive thing), but if you do want to back up and save your SMS data, it's pretty painless to do. The simplest way is to use a messaging app that does all the heavy lifting for you. If you have one of Google's Pixel phones, Google's own free Android Messages app will automatically back up some of your messages — up to 25MB worth, according to Google, and only SMS texts (not MMS media messages). It's preinstalled as the default messaging app on your device, so you don't have to do anything to get it up and running. If you're using a phone other than a Pixel — or if you're using a Pixel and want something a bit more robust — the third-party Pulse SMS app is an excellent next-level option. In addition to providing its own universally available automatic cloud backup and sync system, it offers plenty of opportunities for customization.



The biggest risk in cloud computing is not doing it

First, there are risks in changing any aspect of IT, as we saw when moving to the PC, LANs, client/server, mobile, and the web—all things that made us rethink IT yet again, as well as drive change that also drives risk. Second, if businesses did not take risks then nothing would change—and they would die. So, the cost of risk should always be offset by the value gained in taking the risk. In the case of cloud computing, its better operational efficiency leads to lower operational costs. And cloud computing also improves business agility to better react to market changes and expand quickly as the business grows. These are all game-changers and value drivers for cloud computing. Third, risk can be reduced with planning. That means taking the time to figure out what your issues are, how technology such as cloud computing can address your issues (if it can), and how to reduce the risks in doing so. Security, for example, is always a risk. But addressed with the right approaches and technologies, your cloud-based system will actually be more secure than your “as is” on-premises systems.


What is data deduplication, and how is it implemented?


The usual way that dedupe works is that data to be deduped is chopped up into what most call chunks. A chunk is one or more contiguous blocks of data. Where and how the chunks are divided is the subject of many patents, but suffice it to say that each product creates a series of chunks that will then be compared against all previous chunks seen by a given dedupe system. The way the comparison works is that each chunk is run through a deterministic cryptographic hashing algorithm, such as SHA-1, SHA-2, or SHA-256, which creates what is called a hash. For example, if one enters “The quick brown fox jumps over the lazy dog” into a SHA-1 hash calculator, you get the following hash value: 2FD4E1C67A2D28FCED849EE1BB76E7391B93EB12. If the hashes of two chunks match, they are considered identical, because even the smallest change causes the hash of a chunk to change. A SHA-1 hash is 160 bits. If you create a 160-bit hash for an 8 MB chunk, you save almost 8 MB every time you back up that same chunk. This is why dedupe is such a space saver.
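A minimal sketch of that loop in Python, assuming fixed-size chunks and an in-memory index (real products use content-defined chunk boundaries and persistent, scalable hash stores):

    import hashlib

    CHUNK_SIZE = 8 * 1024 * 1024  # 8 MB, matching the example above

    def dedupe_file(path, store):
        """Chunk a file and keep only chunks not already in `store`.

        `store` maps SHA-1 digest -> chunk bytes; returns bytes saved."""
        saved = 0
        with open(path, "rb") as f:
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                digest = hashlib.sha1(chunk).hexdigest()
                if digest in store:
                    saved += len(chunk)    # duplicate: only the 160-bit hash is kept
                else:
                    store[digest] = chunk  # first occurrence: store the chunk itself
        return saved

Running hashlib.sha1(b"The quick brown fox jumps over the lazy dog").hexdigest() reproduces the hash value quoted above.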



Cybersecurity is a proactive journey, not a destination

The topic itself is broad and expansive, and the true impact of this segment of computing will be around for generations to come. For strong perspective on where the industry stands in its current state, ISACA’s State of Cybersecurity 2018 research is a must-read. This report provides a great assessment of what needs to happen in the cybersecurity field to move from reactive to proactive. Challenges around cybersecurity are not new and have actually been around since the dawn of computing. However, it is now a topic that everyone talks about. It is a board topic, it is a public safety and livelihood topic, and it is a personal topic. Hitting this trifecta of impact has finally created the sense of urgency and the attention that is needed. Now, the key is that as an industry, as a country, and as a world of over 7 billion people, we need to effectively address these industry challenges to preserve the computing environment for the future.


Gartner recommends CIOs get skilled up on deep learning


“CIOs and technology leaders should always be scanning the market along with assessing and piloting emerging technologies to identify new business opportunities with high-impact potential and strategic relevance for their business.” Walker said CIOs or business decision makers can use predictions like the emerging trends of its Hype Cycle as a reality check, helping them to prioritise what areas are likely to become established in the near future. “Some of these capabilities are being delivered in a rapid fashion,” said Walker. Gartner’s predictions show that some technologies, particularly in the AI space, such as deep learning, virtual assistants and custom silicon for AI, are likely to become mainstream within two to five years, which does not give CIOs much time to get ready. As an example, Walker said the hospitality sector is being disrupted, such as at the Marriott hotel, which is building service bots to deliver room service.


Establish a data classification model for cloud encryption


Behind the scenes, a data classification model should include metadata that sticks with the newly created document throughout its life. This requires an organization to permanently link the document with immutable metadata – which is where information management systems, such as those from M-Files and FileHold, come into play. By having users choose a template with its associated metadata, data can then be encrypted as required before it hits any storage media, whether that is a local device, an on-site system or a cloud platform. Anything that isn't open/public – sticking with the example above – will then be encrypted or dealt with using virtual private networks (VPNs). This approach can also help a business determine if data should primarily be held on premises or in the cloud. Once the basic metadata is created in an immutable manner, users can add extra metadata for further classification.
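A minimal sketch of that flow in Python, assuming the cryptography package's Fernet recipe; the class, labels and key handling are illustrative, and a real deployment would sit behind an information management system and a key management service:

    from dataclasses import dataclass
    from cryptography.fernet import Fernet

    @dataclass(frozen=True)  # frozen: the classification metadata is immutable
    class DocMeta:
        doc_id: str
        classification: str  # e.g. "public", "internal", "confidential"

    def prepare_for_storage(doc: bytes, meta: DocMeta, key: bytes) -> bytes:
        """Encrypt anything that isn't open/public before it reaches any
        storage medium: local device, on-site system or cloud platform."""
        if meta.classification == "public":
            return doc
        return Fernet(key).encrypt(doc)

    # In practice the key comes from a KMS; for a local test:
    # key = Fernet.generate_key()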


This smart bandage can help diagnose and treat your injuries

The bandage is the culmination of over six years' work between Tufts and other higher education institutions to create a bandage that includes sensors to monitor a number of markers that show how well, or otherwise, a wound is healing, alongside a drug delivery mechanism – all in a form factor that's flexible enough to be wrapped around a wound. "Chronic wounds are a very biologically complex system, and you have to have the bandage interface in very close contact with the wound so you can monitor whether the wound is healing. At the same time, we wanted to find out if there was a way to intervene at the right time to accelerate wound healing," Sameer Sonkusale, professor of electrical and computer engineering at Tufts University's School of Engineering, told ZDNet. The bandage is a combination of a cloth layer and an electronics layer. The electronics layer includes sensors that track the pH and temperature of the wound -- a higher than normal pH or temperature indicates it's not healing well.


Fiber transmission range leaps to 2,500 miles, and capacity increases

Signal noise and distortion have always been behind the limits to traditional (and pretty inefficient) fiber transmission. They’re the main reason data-send distance and capacity are restricted using the technology. Experts believe, however, that if the noise that’s found in the amplifiers used for gaining distance could be cleaned up and the signal distortion inherent in the fiber itself could be eliminated, fiber could become more efficient and less costly to implement. Plus, if fiber could carry more traffic in single strands, it would be cheaper to power, and it would also keep up with rapidly escalating future internet growth. Those two areas of improvement are where many scientists are concentrating their fiber development efforts. The researchers at Chalmers University of Technology and Tallinn University of Technology said they can now send data 4,000 kilometers (nearly 2,500 miles) — or roughly the air-travel distance from Los Angeles to New York.


How to overcome the potential for unintended bias in data algorithms

What makes an algorithm “fair”? Let’s say I have a lot more data besides income – things like credit score, job history, etc. I have a large dataset of past outcomes to train an algorithm for future use. Aiming for accuracy alone will almost definitely result in different treatment of people along age, race, and gender lines. To be fair, should I aim to approve the same percentage of people from each class, even if that means taking some risks? Alternatively, I could train my algorithm to equalize the percentage of people from each class that get approved who actually paid back their loan (the true-positive rate, which we can estimate from historical data). A bit of a catch: if I do either of these things, I would have to hold the different groups to different standards. Specifically, I would have to say that I will issue a loan to someone of a certain class, but not to someone else of a different class with the exact same credentials, leading to yet another unfair scenario.
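The two fairness criteria in this paragraph correspond to two per-group statistics. A small illustrative Python sketch, with hypothetical field names, that computes both from historical records:

    from collections import defaultdict

    def per_group_fairness(records):
        """records: dicts with hypothetical keys 'group' (protected class),
        'approved' (model decision) and 'repaid' (historical outcome)."""
        stats = defaultdict(lambda: {"n": 0, "approved": 0, "repaid": 0, "tp": 0})
        for r in records:
            s = stats[r["group"]]
            s["n"] += 1
            s["approved"] += int(r["approved"])
            s["repaid"] += int(r["repaid"])
            s["tp"] += int(r["approved"] and r["repaid"])
        for group, s in stats.items():
            approval = s["approved"] / s["n"]                    # criterion 1: equal approval rates
            tpr = s["tp"] / s["repaid"] if s["repaid"] else 0.0  # criterion 2: equal true-positive rates
            print(f"{group}: approval rate={approval:.2f}, TPR={tpr:.2f}")

Equalizing either number across groups generally forces a different score threshold per group, which is exactly the catch the author describes.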



Quote for the day:


"Making those around you feel invisible is the opposite of leadership." -- Margaret Heffernan


Daily Tech Digest - August 20, 2018

No matter how good your internal IT security team is, no matter whether you have an internal or external pentesting team, you need a bug bounty program and responsible vulnerability disclosure program as a key part of your IT security. I’ve been with firms that decided, wrongly, they didn’t need a bug bounty program. Each, after years of negative lessons learned, started a bug bounty program. They could have saved themselves some pain by starting one earlier. Every company should consider and deploy all three of these types of programs. I’ve known many otherwise good-hearted hackers who grew frustrated, and even resentful, because a company didn’t have an easy way to report a bug they found, didn’t effectively respond to the outreach, or incorrectly told the hacker that their big find wasn’t a big deal. If you make it hard for good people to report serious things, you’re just asking for trouble. If you don’t already have these functions as a mature part of your organization, you can only benefit by getting involved with a company, crowdsourcing or not, that can help you to set them up.



Why CIOs Haven’t Mastered the Elusive Economics of Innovation

The good news is that cloud computing, including a new breed of cloud-based autonomous (self-tuning, self-repairing, self-updating) platform services powered by machine learning, finally gives CIOs the needed technical framework to start pulling their organizations out of the 80/20 spending rut and accelerate their pace of innovation. By offloading much of the onerous maintenance and security work to expert cloud service providers or to the cloud systems themselves, IT organizations can “free up their imaginations,” while getting access to a range of emerging technologies, says Oracle Senior Vice President Steve Daheb. Consider an HR example. At auto parts retailer AutoZone, newly automated processes for employee background checks and onboarding, made possible by its Oracle HCM Cloud application, already are freeing up company HR and store managers to do less administrative work and more value-added work, such as identifying candidates who are a good fit with the company’s distinctive, go-the-extra-mile customer service culture.


In UK: 1 out of every 3 Businesses had a Cryptojacking Malware Infection

Citrix Research, in its study, has revealed that a third of large UK companies were affected by cryptojacking incidents in July 2018. The survey, in which 750 British IT leaders participated, shows how cryptojacking steals processing cycles from workstations, servers, IoT devices and other computing devices in order to collectively mine cryptocurrency. Instead of an elaborate piece of malware with complex functionality, the cybercriminals create or take over a legitimate website and have it host a cryptojacking script, which makes hashing attempts in the hope of mining cryptocurrency at the expense of the visitor's machine. All of this mining happens without users realizing it is present, in stark contrast to ransomware, which by design needs to announce its existence to its victims. The period of time between infection and eventual detection is wider with cryptojacking malware. Bitcoin and its derivatives are mined using computing devices, but mining needs considerable time and processing power these days. The longer the detection time, the better the chance that the cryptojacking malware will successfully mine virtual coins.


The Usability of Cryptocurrency

If cryptocurrency is going to live up to its hype, it will need to attract users from all professions, backgrounds, and ages. Today, according to an eToro report, most cryptocurrency users are 18- to 35-year-old males working in sales, marketing, IT, and financial services. In other words, cryptocurrency is a trend for people who are already working in a tech-savvy environment. But if such a system’s orientation process targets only these users, anyone who is not tech savvy would likely be lost from the very beginning. Of course, this barrier to entry is just one part of a bigger problem. Crypto enthusiasts are aware that these currencies need to reach a critical mass of users before they are really useful as currencies. Currently, even those who are creating accounts and purchasing cryptocurrency frequently are not using cryptocurrency for its ostensible purpose—buying things! Some individuals may have gotten rich by treating cryptocurrency as an investment vehicle and riding early speculative fluctuations, but this actually presents yet another obstacle for potential users. Investment markets are not user friendly.


Australian Teenager Pleads Guilty to Hacking Apple

The teenager, who legally cannot be named because he is a juvenile offender, pleaded guilty in Australian Children's Court on Thursday to multiple hack attacks against Apple as well as to downloading 90 GB of sensitive information from the company and accessing customers' accounts, Melbourne, Australia-based daily newspaper The Age reported, citing statements made in court. The report says that the boy began his year-long hacking spree when he was 16 years old, motivated in part by his love of Apple gear and hope to one day work for the technology giant. The court heard that after a tipoff from the FBI, the Australian Federal Police last year obtained a search warrant and raided the teenager's family home in Melbourne. "Two Apple laptops were seized and the serial numbers matched the serial numbers of the devices which accessed the internal systems," a prosecutor told the court, The Age reported.


Brendan Eich on JavaScript’s blessing and curse

Being the creator of JavaScript has been a blessing and a curse for Brendan Eich. On the one hand, JavaScript has the distinction of being the most popular programming language in the world. On the other, no language has been the target of more snark. Eich is well aware of the language’s drawbacks—after all, in 1995, he worked around the clock to create JavaScript in a mere 10 days. In this lively interview with IDG’s Eric Knorr, Eich readily admits to JavaScript’s flaws and talks frankly about what he might have done better, while touching on JavaScript’s improvements over its 23-year lifespan. Warts and all, JavaScript has indeed become “the assembly language of the web.” ... WebAssembly supports more than 20 languages, not just JavaScript, opening to developers of all stripes the ability to write and compile fast web applications—and causing many to predict that WebAssembly will be central to the future of web development.
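For readers who haven't touched WebAssembly, the browser exposes a small JavaScript API for loading compiled modules, with JavaScript acting as the glue. A minimal TypeScript sketch, where the "add.wasm" module and its exported add function are assumed for illustration:

```typescript
// Fetch, compile and call a WebAssembly module from the browser.
// "add.wasm" could have been compiled from C, Rust, or any of the
// 20+ languages that target WebAssembly; only the export name matters here.
async function runWasm(): Promise<void> {
  const { instance } = await WebAssembly.instantiateStreaming(
    fetch("add.wasm")
  );
  const add = instance.exports.add as (a: number, b: number) => number;
  console.log(add(2, 3)); // JavaScript orchestrates; wasm does the work
}

runWasm().catch(console.error);
```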


Intel buys deep-learning startup Vertex.AI to join its Movidius unit

“There’s a large gap between the capabilities neural networks show in research and the practical challenges in actually getting them to run on the platforms where most applications run,” Ng noted in a statement on the company’s launch in 2016. “Making these algorithms work in your app requires fast enough hardware paired with precisely tuned software compatible with your platform and language. Efficient plus compatible plus portable is a huge challenge—we can help.” For Intel, this could mean using Vertex’s IP to help build its own applications, or potentially applications for its customers. It’s not clear how much funding Vertex.AI had raised. Investors included Curious Capital, which focuses on pre-seed and seed-stage funding for startups in the Pacific Northwest, and the Creative Destruction Lab, a Toronto-based accelerator focused on machine learning startups. Intel doesn’t break out revenues specifically for its Artificial Intelligence Product Group, a business unit it established in March 2017.


Can the police search your phone?

The Australian government on Tuesday proposed a law called the Assistance and Access Bill 2018. If it becomes law, the act would require people to unlock their phones for police or face up to ten years in prison (the current maximum is two years). It would empower police to legally bug or hack phones and computers. The bill would force carriers, as well as companies such as Apple, Google, Microsoft and Facebook, to give police access to the private encrypted data of their customers if technically possible. Failure to comply would result in fines of up to $7.3 million and prison time. Police would need a warrant to crack, bug or hack a phone. The bill may never become law. But Australia is just one of many nations affected by a new political will to end smartphone privacy when it comes to law enforcement. If you take anything away from this column, please remember this: The landscape for what’s possible in the realm of police searches of smartphones is changing every day.


GDPR: Data Protection Is Only The Tip Of The Iceberg

In almost every type of business process, unstructured information is created, required, or exchanged. And while the creator or recipient of that content will likely understand its full context and thus its importance, only too soon that memory fades, and the content is effectively lost to the organization. Even if an individual recollects the content’s existence and location, no connection is maintained between the content itself and the context of the business process that made it relevant in the first place. Further complicating matters, stakeholders – increasingly spread across various global locations – often collaborate using multiple environments or applications, making complete visibility nearly impossible. What’s more, because the majority of team communication occurs through email, a lot of project-relevant content and key audit-trail information is lost or invisible through normal productivity tools.


To succeed at digital transformation, do a better job of data governance

Regardless of the reason an organization undertakes a digital transformation—be it to glean operational insights, to change the way it engages with customers, or to set the stage for other emerging technologies such as machine learning and artificial intelligence—it needs reliable data as its foundation. And that requires robust data governance. Some consider data governance essential only for cross-departmental collaboration—such as sharing customer data. But it also plays a key role in taking seemingly unrelated sources of data and turning them into insightful information. Data governance uses a set of defined roles, processes and policies to help manage data assets and ensure their integrity, accuracy and security. Without these structures and controls, data assets lose much of their strategic value. Without effective data governance, no one can be certain about what data assets a company has, who controls them, what information they can provide and how they should be used.
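In practice, "defined roles, processes and policies" usually starts with a catalog record per data asset that names its owner, sensitivity and permitted uses. A minimal sketch of what such a record might capture, with field names that are illustrative rather than drawn from any particular governance tool:

```typescript
// A minimal data-catalog entry: who is accountable for an asset,
// how sensitive it is, and how it may be used. Illustrative only.
interface DataAsset {
  name: string;
  owner: string;          // accountable business owner
  steward: string;        // day-to-day data quality contact
  classification: "public" | "internal" | "confidential";
  allowedUses: string[];  // e.g. analytics, ML training
  retentionDays: number;
}

const customerOrders: DataAsset = {
  name: "customer_orders",
  owner: "VP Sales",
  steward: "data-team@example.com",
  classification: "confidential",
  allowedUses: ["analytics", "ml-training"],
  retentionDays: 365 * 7,
};

// With records like this, "what data do we have, who controls it, and
// how may it be used?" become questions with concrete answers.
```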



Quote for the day:


"Leadership is intangible, and therefore no weapon ever designed can replace it." -- Omar N. Bradley