Daily Tech Digest - March 31, 2019

Commonwealth Bank and Westpac cautious on using AI for compliance


"Regtech is at an early stage in its life cycle, but we are trying to get it mature, make it understood and get it linked into the strategic core – that is a big focus for us," Ms Cooper said. At the regtech event, a key frustration from start-ups was the time it took for new, outside technology to be considered by the ultimate decision-makers. "It's extremely hard. I've probably been in Westpac's building 30 times in the last year and a half or so," said SimpleKYC founder Eric Frost, whose system is being used by American Express to reduce customer on-boarding times. Neither he nor Westpac said they could disclose whether Westpac had started using the system. Ms Cooper said Westpac's senior management was encouraging collaboration with start-ups, but at the same time, compliance was an area where there was little room for failure. "There is a very strong strategic focus, right form the top, on partnering, not sitting in our ivory tower thinking we can build it ourselves, which creates the conditions for us to do things like 'minimum viable procurement'," she said.


Be Unreasonable in Pursuing Your Goals
The point is that you need a lot of quarters. You can't rely on one of anything. Big dreams start with money because it's measurable. Your parents told you to be reasonable, to play it safe. Rich people do not say "money is not everything." Don't be a victim. Don't assign blame to anyone other than yourself. Quit making excuses. Get your heart in the deal all the time. Have enough so that nothing can stop you. Embrace this thing called sales. Every business I ever started was built on making sales. No sales means no business. You don't need therapy, you need to take action. You don't need to write a business plan or organize your address book. Without money coming in you're dead in the water. The healing is in the doing, not the thinking. Sales is about doing. It's the ask, the follow-up. Were you told not to be too persistent when you were a kid? "Money doesn't grow on trees" is code for "I don't know how to bring money in." Replace the excuses with the truth. The truth is you're lost when it comes to income.


Wiring financial organisations for regulatory success

Technology can help tie regulations to internal processes. Structured data sets mean that it’s possible to connect the dots between policies/procedures and processes, systems, controls and products and services through structured content and ML tagging. A clear link to the broader risk-management framework, governance, and processes is necessary at all levels of the hierarchy, across both large and small companies. No longer is this something presented as a futuristic view at conferences and industry events, but a new reality which regtech is bringing to life. With the use of technology, a huge amount of data that offers significant insight into risk can be captured for evidencing and provided to regulators in a detailed structured format that is easy to understand. Needless to say, such a technology-driven holistic structured approach to data is fast becoming the only viable way to successfully manage policies and stay compliant in the current regulatory landscape.


How Insurers Can Tackle Cyber Threats in the Digital Age

Persistent knowledge gaps hinder the creation of effective cybersecurity cultures. Human error accounts for a significant share of cyber breaches, with phishing schemes alone responsible for three-quarters of the malware hitting organizations globally, according to NTT Data. And while there's broad recognition that cyber-attacks pose a major threat to organizations, there's a stark divide between IT professionals and corporate leadership regarding the effectiveness of organizational protocols. In one survey, 59 percent of corporate board members said that their organizations' cybersecurity governance practices were very effective, while only 18 percent of IT professionals agreed. Insurers can work with their clients to achieve a unified understanding of cybersecurity policies and terms. When all stakeholders operate according to a standardized cybersecurity framework, organizations can better manage risk, understand their vulnerabilities, respond to emerging threats and contain the fallout of breaches.


30+ Powerful Artificial Intelligence Examples you Need to Know

Trading algorithms are already used successfully in the world's markets, a recognition of the staggering speed with which computer systems have transformed stock trading. Even though automation rules the trading world, even the most complex algorithms use basic AI reasoning. Machine learning is poised to change that tradition by making decisions more grounded in hard data and less in trading theories. While humans will always play a role in regulation and in making the final decisions, more and more financial transactions are making their way to computer systems. Plus, given the competitive nature of this field, investment in AI and machine learning will be one of its most defining aspects. Luckily, these technologies have the potential to stabilize, not disrupt, the financial industry, resulting in better job stability (and even a reduced probability of market crashes).


Why a Digital Mindset Is Key to Digital Transformation

While infrastructure and technology are clearly important considerations, digital transformation is as much about people and changing the way they approach business problems and where they look for solutions. In fact, according to Gartner research analyst Aashish Gupta, many organizations forget to address the cultural shift needed to change the mindset of workers, without which no digital transformation project is going to succeed. "The culture aspect and the technology demand equal attention from the application leader, because culture will form the backbone of all change initiatives for their digital business transformation. Staff trapped in a 'fixed' mindset may slow down or, worse, derail the digital business transformation initiatives of the company," he said in a statement. To encourage a change in mindset from traditional to digital, Gartner has developed a four-step plan which it outlines in its report "Digital Business Requires a New Mindset, Not Just New Technology," due to be released soon.


The new third-party oversight framework: Trust but verify

There is a need to identify risk at different points in the third-party life cycle: at the commencement of the relationship, and on a regular basis thereafter, based on a number of factors that influence the risk the third party generates, such as privacy, regulatory compliance, business continuity planning, and information security. However, there also needs to be an early warning system that can alert management to a potential increase of risk outside of these scheduled assessments. This is where the link to risk appetite, key performance, and risk indicators comes into play. The risk oversight functions need to work together to build a set of factors that can assess the inherent risk associated with an activity, plus any increase in risk associated with outsourcing the activity to a third party, the mitigating effect of existing measures employed by the institution and the third party to control that risk, and the remaining, or residual, risk that the institution continues to bear.


What data dominance really means, and how countries can compete

A lot of the current debate approaches data from the supply side, asking about ownership and privacy. These are no doubt important questions. But countries need to think deeply about the demand side: are they growing local industries that will make use of data? If not, they will find themselves forever exporting raw data and importing expensive digital services. People say data is like oil. But it isn't, really. For one thing, data isn't "fungible": you can't swap one piece of information for something else. Knowing my Amazon purchase history won't help a self-driving car identify a stop sign. This is true even when data is of the exact same type: my browsing history may not be as valuable as yours. This non-fungible nature shows up in my estimates of Facebook's average monthly revenue per user, which show that the average Canadian user generates 100 times more revenue than the average Ethiopian user.


5 Ways Marketers Can Gain an Edge With Machine Learning

In the past -- and occasionally today -- these recommendations were manually curated by a human. For the past 10 years, they have often been driven by simple algorithms that display recommendations based on what other visitors have viewed or purchased. Machine learning can deliver substantial improvements over these simple algorithms. Machine learning can synthesize all the information you have available about a person, such as his past purchases, current web behavior, email interactions, location, industry, demographics, etc., to determine his interests and pick the best products or the most relevant content. Machine learning-driven recommendations learn which items or item attributes, styles, categories, price points, etc., are most relevant to each particular person based on his engagement with the recommendations -- so the algorithms keep improving over time. And machine learning-driven recommendations are not limited to products and content. You can recommend anything -- categories, brands, topics, authors, reviews vs. tech specs etc.
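
To make the contrast concrete, here is a minimal sketch of the kind of simple co-occurrence algorithm the article says preceded machine learning-driven recommendations. The session data and item names are invented for illustration; real systems operate on far larger logs and, in the ML case, on learned models over the full behavioral profile.

```python
from collections import Counter
from itertools import combinations

# Invented browsing sessions; each inner list is the items one visitor viewed.
sessions = [
    ["laptop", "mouse", "dock"],
    ["laptop", "mouse"],
    ["phone", "case"],
    ["laptop", "dock"],
]

# Count how often each pair of items appears in the same session.
co_views = Counter()
for items in sessions:
    for a, b in combinations(sorted(set(items)), 2):
        co_views[(a, b)] += 1

def recommend(item: str, k: int = 3) -> list:
    """Return the k items most often co-viewed with `item`."""
    scores = Counter()
    for (a, b), n in co_views.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [i for i, _ in scores.most_common(k)]

print(recommend("laptop"))  # ['dock', 'mouse']: the most co-viewed items
```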



Why DevOps Fails: Some Key Reasons To Consider

It is important to know why culture is important. Culture is the set of practices, standards, beliefs, and structures that reinforce the organizational structure. DevOps is not only a set of tools; you must create a culture of DevOps in your organization to get the results you seek. A U.S. government agency that adopted DevOps for continuous deployment failed to recognize the importance of people and process, which led to misconduct and confusion among developers and key people. ... Every organization is a technology-driven organization, regardless of domain. The journey from digital transformation to a continuous digital journey demands flexibility, agility, and quality as its central concerns. DevOps has become a necessity for organizations that deliver software or frequently release updates and new features in order to serve their customers with quality. There is no doubt that DevOps can make software development faster, but every organization has a different set of requirements, and each company's DevOps adoption must be tailored to that set of requirements.



Quote for the day:


"Leadership is the other side of the coin of loneliness, and he who is a leader must always act alone. And acting alone, accept everything alone." -- Ferdinand Marcos


Daily Tech Digest - March 30, 2019

As memory prices plummet, PCIe is poised to overtake SATA for SSDs

PCIe is several times faster and has much more parallelism, so throughput is more suited to the NAND format. It comes in two physical formats: an add-in card that plugs into a PCIe slot and M.2, which is about the size of a stick of gum and sits on the motherboard. PCIe is most widely used in servers, while M.2 is in consumer devices. There used to be a significant price difference between PCIe and SATA drives with the same capacity, but they have come into parity thanks to Moore’s Law, said Jim Handy, principal analyst with Objective Analysis, who follows the memory market. “The controller used to be a big part of the price of an SSD. But complexity has not grown with transistor count. It can have a lot of transistors, and it doesn’t cost more. SATA got more complicated, but PCIe has not. PCIe is very close to the same price as SATA, and [the controller] was the only thing that justified the price diff between the two,” he said. DigiTimes estimates that the price drop for NAND flash chips will cause global shipments of SSDs to surge 20 to 25 percent in 2019.


Edge computing is real. It's here, and companies have to have a strategy to handle the enormous influx of data coming in real time from devices globally. Analysts project there will be 50 billion telematics devices by 2020 and forecast the sum of the world's data will reach 175 zettabytes by 2025. Although edge computing is putting enormous pressure on IT infrastructure -- where legacy systems at the networking, storage, and application layers are straining today -- a new generation of systems is coming to market to help companies deal with the data explosion caused by edge computing. What is most exciting is the ability these new systems give companies to engage with customers in fundamentally new ways. There are examples of new business models being developed around the edge -- Netflix, Uber, and Amazon are notable examples -- but now many companies can adopt these new business models with next-generation, edge-aware systems emerging today.


The second-biggest improvement that Microsoft has made in HoloLens 2 is that the gesture control has been revamped. If I am to be completely honest, I have never had the best luck with getting HoloLens gestures to work. I always assumed that I was doing something wrong, because nobody else that I have talked to seems to have any trouble. From what I have heard about HoloLens 2, a new artificial intelligence (AI) processor and something called a time-of-flight depth sensor will collectively make it so that HoloLens will allow you to interact with holographic objects in the same way that you would interact with their real-world counterparts. This might mean being able to pick up a hologram and move it as if it were a physical object, as opposed to having to resort to using the convoluted gestures that are currently required. It remains to be seen how this new capability will actually be implemented, but I have high hopes that using HoloLens 2 will be far more intuitive than using its predecessor.


How to eliminate the security risk of redundant data
Most enterprises migrate their data to the public cloud in that second way: they just cart it all from the data center to the cloud. Often, there is no single source of truth in the on-premises databases, so all the data moved to the public cloud keeps all its redundancies. Although it’s an architectural no-no, the reality is that most systems are built in silos, which is where the redundancies come from. They often create their own versions of common enterprise data, such as customer data, order data, and invoice data. As a result, most enterprises have several security vulnerabilities that they have inadvertently moved to the cloud. ... The best solution to this problem is to not maintain redundant data. I’m sure the CRM system has APIs to allow for secure access to customer data that can be integrated directly into the inventory system. Or, the other way around. The goal is to maintain data in a single physical location, even if it is accessed by multiple systems. Even if you do eliminate most of the redundant data, all your data should be secured under a holistic security system that’s consistent from application to application and from database to database.
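
As a hedged illustration of that pattern, the sketch below shows an inventory system fetching customer data from a CRM's API at the moment it is needed, instead of keeping its own copy. The URL, token, and customer ID are hypothetical.

```python
import requests

def get_customer(customer_id: str) -> dict:
    """Fetch one customer record from the CRM, the single source of truth."""
    resp = requests.get(
        f"https://crm.example.com/api/customers/{customer_id}",  # hypothetical URL
        headers={"Authorization": "Bearer <token>"},  # secure access, per the article
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()  # used transiently; never copied into the inventory DB

customer = get_customer("c-1042")  # hypothetical customer ID
```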


Vulnerability management woes continue, but there is hope
Let data analytics be your guide. In other words, take all your vulnerability scanning data and analyze it across a multitude of parameters, including asset value, known exploits, exploitability, threat actors, CVSS score, similar vulnerability history, etc. This data analysis can be used to calculate risk scores, and these risk scores can help guide organizations on which vulnerabilities should be patched immediately, which ones require compensating controls until they can be patched, which ones can be patched on a scheduled basis, and which ones can be ignored. Of course, few organizations will have the resources or data science skills to put together the right vulnerability management algorithms on their own, but vendors such as Kenna Security, RiskSense, and Tenable are all over this space. Furthermore, SOAR vendors such as Demisto, Phantom, Resilient, ServiceNow, and Swimlane are working with customers on runbooks to better manage the operational processes.
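
As an illustration of this kind of scoring, here is a minimal, hypothetical sketch that combines CVSS score, exploit availability, and asset value into a single number used to bucket vulnerabilities by urgency. The weights and thresholds are invented assumptions, not any vendor's actual algorithm.

```python
def risk_score(cvss: float, exploit_public: bool, asset_value: int) -> float:
    """cvss: 0-10; asset_value: 1 (low) to 5 (crown jewels). Returns 0-100."""
    score = (cvss / 10) * 0.5                # base severity, weighted 50%
    score += 0.3 if exploit_public else 0.0  # a known exploit raises urgency
    score += (asset_value / 5) * 0.2         # business value of the asset
    return round(score * 100, 1)

def triage(score: float) -> str:
    if score >= 80:
        return "patch immediately"
    if score >= 60:
        return "compensating controls until patched"
    if score >= 40:
        return "patch on the scheduled cycle"
    return "accept the risk / ignore"

s = risk_score(cvss=9.8, exploit_public=True, asset_value=5)
print(s, "->", triage(s))  # 99.0 -> patch immediately
```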


7 tips for stress testing a disaster recovery plan

A disaster recovery plan is a bit like an insurance policy: we all agree we need it and we all hope we’ll never use it. And as with insurance, nobody wants to discover their DR plan doesn’t actually protect them when a disaster hits. Similarly, nobody wants to find out that their DR plan is overdone – meaning they’ve been spending too much time, money and energy maintaining it. But if you don’t regularly stress test your DR plan, you could find yourself in one of these situations. I’ve worked with a lot of businesses, and I’ve noticed that few conduct regular stress tests of their DR plans. That’s a problem: no disaster recovery plan is good enough to magically transform as a business changes – and realistically, no business remains static. At a previous firm, we tested quarterly and found changes and updates during every test! So how can you verify that your DR plan fits your current needs? Follow these seven steps.


Cisco rates both of those router vulnerabilities as “High” and describes the problems like this: one vulnerability is due to improper validation of user-supplied input. An attacker could exploit this vulnerability by sending malicious HTTP POST requests to the web-based management interface of an affected device. A successful exploit could allow the attacker to execute arbitrary commands on the underlying Linux shell as root. The second exposure is due to improper access controls for URLs. An attacker could exploit this vulnerability by connecting to an affected device via HTTP or HTTPS and requesting specific URLs. A successful exploit could allow the attacker to download the router configuration or detailed diagnostic information. Cisco said firmware updates that address these vulnerabilities are not available and no workarounds exist, but that it is working on a complete fix for both. On the IOS front, the company said six of the vulnerabilities affect both Cisco IOS Software and Cisco IOS XE Software, one of the vulnerabilities affects just Cisco IOS Software, and ten of the vulnerabilities affect just Cisco IOS XE Software.


VS Code Python Type Checker Is Microsoft 'Side Project'

Deemed a work in progress with no official support from Microsoft and much functionality yet to be implemented, the GitHub-based project is described as an attempt to improve on currently available Python type checkers, with mypy mentioned specifically. Of course, the increasingly popular Visual Studio Code editor already sports a widely used, Microsoft-backed, jack-of-all-trades Python extension (just updated) that boasts more than 35 million downloads and 7.3 million installations and does type checking and a whole lot more. But Pyright isn't aiming to compete with that tool, rather to improve on its type-checking capabilities, which are powered by the Microsoft Python Language Server that uses the language server protocol to provide IntelliSense and other advanced functionality for different programming languages in code editors and IDEs. "Pyright provides overlapping functionality but includes some unique features such as more configurability, command-line execution, and better performance," the GitHub project says.
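
For readers unfamiliar with Python type checkers, here is a small example of the class of error a tool like Pyright or mypy reports before the code ever runs. The file name and function are invented for illustration.

```python
# Saved as example.py; running `pyright example.py` (or `mypy example.py`)
# flags the second call below, which the interpreter alone would only
# catch at runtime.
from typing import List

def average(values: List[float]) -> float:
    return sum(values) / len(values)

average([1.0, 2.0, 3.0])  # passes the type checker
average("not a list")     # error: str is not compatible with List[float]
```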


Tapping security cameras for better algorithm training

For computer vision and facial recognition systems to work reliably, they need training datasets that approximate real-world conditions. So far, researchers have had access to only a small number of image datasets, many of which are heavily populated with still pictures of fair-skinned men. This limitation impacts the accuracy of the technology when it comes across types of images it's not familiar with – those of women or people of color, for instance. Another challenge is related to the varying quality of the images on video feeds available from surveillance cameras. Often the cameras' scope and angle, as well as the lighting or weather during a given recording, make it difficult for law enforcement to track or re-identify people from security camera footage as they try to reconstruct crimes, protect critical infrastructure and secure special events. To help solve this problem, the Intelligence Advanced Research Projects Activity has issued a request for information regarding video data that will help improve computer vision research in multicamera networks.


Huawei Security Shortcomings Cited by British Intelligence

The latest findings are contained in the fifth annual report to be issued by the NCSC's Huawei Cyber Security Evaluation Center, which the U.K. government launched in 2010 to review Huawei's business strategies and test all product ranges before they were potentially used in any setting that might have national security repercussions. The new report emphasizes that the findings should not imply that U.K. telecommunications networks are at any greater risk now than they were before. Rather, the findings are part of a high-level review to ensure that Britain's telecommunications networks remain as secure as possible. "We can and have been managing the security risk and have set out the improvements we expect the company to make. We will not compromise on the progress we need to see: sustained evidence of better software engineering and cybersecurity, verified by HCSEC," the NCSC spokeswoman says. "This report illustrates above all the need for improved cybersecurity in the U.K. telco networks, which is being addressed more widely by the digital secretary's review."



Quote for the day:



"Prosperity isn't found by avoiding problems, it's found by solving them." -- Tim Fargo


Daily Tech Digest - March 28, 2019

The firm's Risk in Review study said when risk management is at the top of its game, "leaders have a clear line of sight into threats for informed decision making." The report is based on a global survey of 2,073 CEOs, board members, and professionals in risk management, internal audit, and compliance, conducted in October and November 2018, and described six habits risk functions follow that help their companies set a course for sustainable growth. Digital transformations don't work well in isolation, the report said, because of the many connection points that can be exploited without proper controls. A well-thought-out and communicated digital strategy with growth targets and values anchors a risk culture. As organizations go all-in with transformations, the entire organization should prioritize items such as new technology, while risk functions set controls that map back to the strategy. In another survey recently conducted by the firm, CEOs globally said they expect the artificial intelligence (AI) "revolution to be bigger than the Internet revolution."


UK IoT research centre to tackle cyber risk


The centre’s research focus will be on the opportunities and threats that arise from edge computing, an innovative way to collect and analyse data in machine learning and artificial intelligence (AI) technology. When implemented successfully, edge computing can improve network performance by reducing latency, which is the time taken for data to traverse a system. “The centre’s ultimate aim is, by creating a trustworthy and secure infrastructure for the internet of things, to deliver a step change in socio-economic benefit for the UK with visible improvements for citizen wellbeing and quality of life,” said Jeremy Watson, Petras director and professor at University College London department of science, technology, engineering and public policy (STEaPP). “I expect productivity improvements and cost savings across a range of sectors including healthcare, transport and construction. In bringing together academics, industry technologists and government officials, our research will create accessible and relevant knowledge with clearly visible economic, societal or cultural impact that will help to cement the UK’s position as a world leader in this area.”


5 steps employers can take to retain project managers

Many project managers recognize their impact on the overall morale of the company and understand the need to remain positive and put on a "good face" for their teams, sponsors, and other stakeholders. The trouble is, employers may assume that what they see is what truly exists, and this can create a sense of complacency. As an employer, it's important to keep in regular contact with your project management professionals to ensure that no issues are impacting their job satisfaction. Although your project managers are likely to remain consummate professionals and push through to ensure that their projects are executed successfully, they could have concerns in some areas, yet not feel supported enough to say anything. In fact, many project managers feel a great deal of responsibility to put the needs of others ahead of their own, sometimes to their own detriment. Take the time to regularly sit down with your project management professionals and keep up to date with the issues that affect their level of job and employee satisfaction.


Mashreq Bank’s Lean Agile Journey

Snowdon stated that the goal of agile was to work in a more collaborative way, to get decisions closer to the customer, and to provide a better structure so that they could more quickly respond to customer-driven demand, rather than push products/services at them. Capaldi stated, "I believe in kaizen and kaikaku as central concepts all companies must value; I will therefore only get involved in a transformation if I see these. In this case I was pleasantly surprised by the passion Steve and his team had in wanting to fully understand agile and these concepts were clearly there, and the fact they also come from a lean background just like me helped." "The head of the division was also massively behind the transformation and we very quickly agreed the metrics that would track progress," Capaldi mentioned. He said that agile is a journey; he prefers to challenge his clients that they aren't really trying to be agile, and that it's OK to start with "fake agile". "Fake it till you make it!" he said.


Mind the overlap between GDPR and ePD, warns privacy lawyer


According to Ustaran and Campion, as the digital economy progresses, European data protection law is likely to lead to a more harmonised approach to its interpretation and enforcement, as reflected by the EDPB’s opinion. However, the situation going forward is far from clear-cut, as the ePD was initially intended to be replaced by the proposed European ePrivacy Regulation (ePR) in May 2018, but then was expected to be implemented at some point in 2019 and now looks likely to take a little longer. “The whole e-Privacy Directive / forthcoming Regulation and GDPR debate is one of the most complex legal conundrums going on at the moment in this space,” Ustaran told Computer Weekly. “The recent EDPB opinion is very helpful in terms of understanding the regulators’ thinking, but where the e-Privacy Regulation fits in is a big missing piece,” he said. According to Ustaran, the e-Privacy Regulation is unlikely to be fully effective before 2020, given that the European Council has not decided on a preferred draft, which will then need to be discussed in detail with the European Parliament and the European Commission before being formally adopted.


Site reliability engineer shift creates IT ops dilemma


In some ways, the transitional struggle described by the SREcon attendee is unavoidable, according to experienced SREs who presented here this week. "If you talk to experienced veterans in the field, they might get a faraway look in their eye and say, 'Oh, yes, I remember that,'" said Jaren Glover, infrastructure ghostwriter at Robinhood, a fintech startup in Palo Alto, Calif. "A bit of this pain is par for the course." There are, unfortunately, no easy solutions to the problem, SREs said, though support from employers to hire new engineers and scale up site reliability engineer teams is crucial. "It's also a matter of prioritization," said Arnaud Lawson, senior infrastructure software engineer at Squarespace, a website creation company in New York, in an interview after his SREcon presentation on service-level objectives. "Even if 80% of the team is dedicated to firefighting, the rest can tap into automation to get rid of tedious work." At large enough companies, such as the professional networking site LinkedIn, SREs are sometimes repurposed from other teams to help those that struggle to meet team performance targets or who are overwhelmed by pager alerts.


Shared learning: Establishing a culture of peers training peers

“After you walk your teammates through how you apply a skill, let them test it out on their own to see whether they can repeat the process you used and achieve the same or a similar result,” he says. With so many organizations relying on technology for training, this hands-on aspect is key. “We’re moving from a world where just watching online tutorials and going to classes was enough to one that emphasizes experiential learning. Just knowing isn’t enough — it’s about doing,” Schawbel says. “If you’re lucky, your organization will give you access to learning, training, educational materials or subscriptions to various resources, but they aren’t actually providing the hands-on, peer-to-peer learning, mentorship, situational and project-based knowledge.” ... Once your coworkers have attempted to complete a task using the skill you taught them, review it, Schawbel says, but understand that nowadays, people don’t even like using the word “feedback,” and prefer “suggestions for improvement.” Here, the key is starting with the positive.


Understanding the role and need of a data protection officer

The DPO works alongside the other C-suite officers at your firm and ensures compliance with Data Protection Authority rules and regulations. This means that they should be expert or well-versed in the GDPR and all of its requirements, but it also means that the DPO needs to understand other jurisdictional requirements around the world in places your business operates. This responsibility is a serious one, and you should review the information available at the International Association of Privacy Professionals (IAPP) for further clarity. The IAPP is the world’s largest information privacy community and provides comprehensive data privacy and regulatory certification training. Because you have gotten this far, you must believe that your business has opportunities to create value through your data and data partnerships. You have also certainly noticed the seemingly daily disastrous headlines about data breaches plaguing companies. There have been hundreds of different data breaches involving more than 30,000 records each; some of these breaches affected hundreds of millions of data subjects.


Identifying exceptional user experience (UX) in IoT platforms

Enterprises should pick IoT platforms with superlative access to on-platform configuration functionality, with an emphasis on declarative interfaces for configuration management. Although many platform administrators are capable of working with RESTful API endpoints, good UX design should not require that platform administrators use third-party tools to automate basic functionality or execute bulk tasks. Some programmatic interfaces, such as SQL syntax for limiting monitoring views or dashboards for setting event processing trigger criteria, are acceptable and expected, although a fully declarative solution that maintains similar functionality is preferred. ... In general, the UX should be focused on providing the information immediately required for the execution of day-to-day operational tasks while removing more complex functionality. These platforms should offer easy access to well-defined and well-constrained operational functions and data visualization. An effective UX should enable easy creation and modification of data views, graphs, dashboards, and other visualizations by allowing operators to select devices using a declarative interface rather than SQL or other programmatic interfaces.


How IoT can transform four industries this year

"Among providers, IoT enablement will be leveraged toward the triple aim of cost, quality, and population health," Khaled said. Simple, embedded digital tools are already being piloted at large scale to mitigate infection risk around replaceable medical instruments, while smart threads and sticker or patch sensors have improved in their fidelity, tracking everything from cardiac readouts to body chemistry and sleep patterns. Among payers, IoT presents a distinct opportunity to enable smarter population risk management and accompanying reimbursement rate adjustments. IoT-enabled, long-term care facilities will be able to negotiate better rates if their sensor data supports fall risk and infection likelihood mitigation, Khaled said. The growing ecosystem of wearable fitness devices will help insurers recognize members who are (literally) taking steps to actively change their individual risk. IoT technologies supporting patient medication adherence will help both of these groups see major cost-saving and health improvement opportunities.



Quote for the day:


"True success is a silence inner process that can empower the mind, heart and soul through strong aspiration." -- Nur Sakinah Thomas


Daily Tech Digest - March 27, 2019

5 things you can do in 5 minutes to boost your internet privacy


For websites and services where you need to ensure the security of your account, like your bank, passwords alone simply are not enough anymore. In this scenario, you need two-factor authentication (2FA) -- specifically, the kind where a mobile app generates login codes for you. Not the kind where you are sent an SMS text message, because those can be intercepted or just fail to arrive. With app-based 2FA, you log into an app or website like normal, then you open an app that generates a special six-digit code every 30 seconds. This authentication app is synced with the other app or service so that your code matches the one that the main app or service expects to get. You enter the code from the authenticator app into the app or website that's asking for it, and then your login is complete. Google makes its own free authenticator app for iOS and Android. Unfortunately, there isn't a standardized method for setting up your account with 2FA. Amazon, PayPal, eBay and your bank will all use slightly different systems and terminology.
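
Under the hood, app-based 2FA codes are typically generated with the TOTP algorithm (RFC 6238): both sides share a secret at pairing time, usually via a QR code, and derive the same six-digit code from that secret and the current 30-second window. Here is a minimal sketch; the secret is a made-up example, and real services generate one for you.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current time-based one-time password (RFC 6238)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval  # current 30-second window
    msg = struct.pack(">Q", counter)        # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F              # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Made-up example secret; a real service generates this when you pair the app.
print(totp("JBSWY3DPEHPK3PXP"))
```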



Scaling Microservices: Identifying Performance Bottlenecks

A bottleneck is the point in a system where contention occurs. In any system, these points usually surface during periods of high usage or load. Once identified, the bottleneck may be remedied, bringing performance levels back into an acceptable range. Utilizing synthetic load testing enables you to test specific scenarios and identify potential bottlenecks, although this only covers contrived situations. In most cases, it is better to analyze production metrics and look for outliers to help identify trouble on the horizon. Key performance indicators from your application include requests/sec, latency, and request duration. Indicators from the runtime or infrastructure include CPU time, memory usage, heap usage, garbage collection, etc. This list isn't exhaustive; there may be business metrics or other external metrics that factor into your optimizations as well.
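
As a small illustration of analyzing production metrics for outliers, the sketch below summarizes synthetic request durations with percentiles; a p99 far above the p50 is the classic signature of a bottleneck hitting a small slice of traffic that averages alone would hide.

```python
import random
import numpy as np

random.seed(42)
# Synthetic request durations (ms): mostly fast, with a slow 5% tail
# caused by some contended resource.
durations = [random.gauss(40, 5) for _ in range(950)]
durations += [random.gauss(400, 80) for _ in range(50)]

p50, p95, p99 = np.percentile(durations, [50, 95, 99])
print(f"p50={p50:.0f}ms  p95={p95:.0f}ms  p99={p99:.0f}ms")
# A p99 roughly 10x the p50 points at contention affecting a small
# slice of traffic that the average alone would hide.
```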


The devil is in the data, but machine learning can unlock its potential

Effective data governance can enable intelligent real-time business decision-making that will, in turn, drive organisations in a more profitable direction. One of the best approaches when it comes to unleashing big data’s potential is investing in a data lake: a central repository that allows organisations to collect everything — every bit of data, regardless of its structure and format — which can then be accessed, normalised, explored and enriched by users across multiple business units to reveal patterns across a shared infrastructure. The advantage of this approach is that organisations can gain end-to-end visibility of the enterprise data and actionable business insights. The disadvantage is that the data has to be kept up to date, which takes time and effort. Another downside is the GDPR compliance and data security risks that are associated with depositing the entirety of an organisation’s business-critical data into a data lake.


Insights for your enterprise approach to AI and ML ethics

The promise of AI is in augmenting and enhancing human intelligence, expertise and experience. Think helping an aircraft mechanic make better, more accurate and more timely repairs – not automating the mechanic out of the picture. But the scope of what you can do is tempered by inherent limitations in today’s AI systems. I like to frame this as a recognition that computers don’t “understand” the world the way we do (if at all). I don’t want to get into an epistemological discussion about the definition or nature of understanding, but here’s what I think is a very illustrative and accessible example. One common application of AI is in image processing problems, i.e., I show the machine an image – like what you might take with your phone – and the machine’s task is to report back what’s in the image. You build a system like this by feeding thousands, millions, or even billions of images into an AI program (such as a neural network) – you might hope that somehow, as a result of processing all of these images, the software builds some kind of semantic representation of the world.
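
As a concrete (and heavily simplified) sketch of that training loop, the snippet below fits a tiny convolutional network on labeled images, using MNIST digits as a stand-in dataset. The architecture is illustrative, not anyone's production model; the point is that the system learns statistical patterns in pixels, with no semantic understanding involved.

```python
import tensorflow as tf

# MNIST digits stand in for "thousands or millions of images".
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0  # add channel axis, scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),  # one score per class
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128)
```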


Alibaba's UC Browser can be used to deliver malware


Dr Web researchers note that for now UC Browser represents a "potential threat" but warn that all users could be exposed to malware due to its design.  "If cybercriminals gain control of the browser's command-and-control server, they can use the built-in update feature to distribute any executable code, including malware. Besides, the browser can suffer from MITM (man-in-the-middle) attacks," the security company notes. The MITM threat arises because UCWeb committed the security blunder of delivering updates to the browser over an unsecured HTTP connection. "To download new plug-ins, the browser sends a request to the command-and-control server and receives a link to file in response. Since the program communicates with the server over an unsecured channel (the HTTP protocol instead of the encrypted HTTPS), cybercriminals can hook the requests from the application," explains Dr Web.  "They can replace the commands with ones containing different addresses. ... "


Deep Learning for Speech Synthesis of Audio from Brain Activity

In three separate experiments, research teams used electrocorticography (ECoG) to measure electrical impulses in the brains of human subjects while the subjects listened to someone speaking, or while the subjects themselves spoke. The data was then used to train neural networks to produce speech sound output. The motivation for this work is to help people who cannot speak by creating a brain-computer interface or "speech prosthesis" that can directly convert signals in the user's brain into synthesized speech sound. The first experiment, which was run by a team at Columbia University, used data from patients undergoing treatment for epilepsy. The patients had electrodes implanted in their auditory cortex, and ECoG data was collected from these electrodes while the patients listened to recordings of short spoken sentences. The researchers trained a deep neural network (DNN) with Keras and TensorFlow, using the ECoG data as the input and a vocoder/spectrogram representation of the recorded speech as the target.
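
A hedged sketch of what such a setup might look like in Keras follows. The shapes, layer sizes, and random stand-in arrays are assumptions for illustration; the paper's actual architecture and data are not reproduced here.

```python
import numpy as np
import tensorflow as tf

N_ELECTRODES, N_FREQ_BINS, N_FRAMES = 128, 32, 10000  # assumed shapes
ecog = np.random.randn(N_FRAMES, N_ELECTRODES).astype("float32")  # stand-in ECoG
spec = np.random.rand(N_FRAMES, N_FREQ_BINS).astype("float32")    # stand-in target

model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu", input_shape=(N_ELECTRODES,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(N_FREQ_BINS),  # one output per spectrogram bin
])
model.compile(optimizer="adam", loss="mse")  # minimize spectrogram error
model.fit(ecog, spec, epochs=2, batch_size=64)
# A vocoder would then turn predicted spectrogram frames back into audio.
```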


An inside look at Tempo Automation's IIoT-powered ‘smart factory’
“There could be up to 20 robots, 400 unique parts, and 25 people working on the factory floor to produce one order start to finish in a matter of hours,” explained Shashank Samala, Tempo’s co-founder and vice president of product in an email. Tempo “employs IIoT to automatically configure, operate, and monitor” the entire process, coordinated by a “connected manufacturing system” that creates an “unbroken digital thread from design intent of the engineer captured on the website, to suppliers distributed across the country, to robots and people on the factory floor.” Rather than the machines on the floor functioning as “isolated islands of technology,” Samala added, Tempo Automation uses Amazon Web Services (AWS) GovCloud to network everything in a bi-directional feedback loop. “After customers upload their design to the Tempo platform, our software extracts the design features and then streams relevant data down to all the devices, processes, and robots on the factory floor,” he said. This loop then works the other way: As the robots build the products, they collect data and feedback about the design during production.



Using value stream management and mapping to boost business innovation

Value stream mapping purists may argue that the above exercise is not the real process, because traditional components such as the time metrics, activity ratios and future state were omitted. Fear not: these components are included in a full-blown formal value stream mapping exercise. However, teams such as Thrasher’s have made substantial improvements from shorter versions of the exercise by making work visible. The net result is a compelling change in the right direction. Value stream management is the practice of improving the flow of the activities that deliver and protect business value -- and proving it. It’s a nascent digital concept that measures work artifacts in real time to visualize the flow of business value and expose bottlenecks, in order to optimize business value. A significant strength of this practice centers on how and where work is undertaken. This activity is captured through the work items mentioned above in the toolchain, providing a traceable record of how software is planned, built and delivered.


Redis in a Microservices Architecture

Redis can be widely used in a microservices architecture. It is probably one of the few popular software solutions that may be leveraged by your application in so many different ways. Depending on the requirements, it can act as a primary database, a cache, or a message broker. Because it is also a key/value store, we can use it as a configuration server or discovery server in a microservices architecture. Although it is usually defined as an in-memory data structure store, we can also run it in persistent mode. ... If you have already built microservices with Spring Cloud, you probably have some experience with Spring Cloud Config. It is responsible for providing a distributed configuration pattern for microservices. Unfortunately, Spring Cloud Config does not support Redis as a property source's backend repository. That's why I decided to fork the Spring Cloud Config project and implement this feature. I hope my implementation will soon be included in the official Spring Cloud release, but, for now, you may use my forked repo to run it. It is available on my GitHub account: piomin/spring-cloud-config.
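
As a minimal illustration of Redis as a configuration store (independent of the Spring Cloud work described above), the sketch below writes and reads per-service settings as plain keys. The key names and values are invented, and it assumes a Redis instance on localhost plus the Python `redis` client.

```python
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# A deployer writes per-service configuration once...
r.set("config:order-service:db.url", "postgres://orders-db:5432/orders")
r.set("config:order-service:cache.ttl", "300")

# ...and each service instance reads it at startup.
db_url = r.get("config:order-service:db.url")
cache_ttl = int(r.get("config:order-service:cache.ttl"))
print(db_url, cache_ttl)
```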


Data visualization via VR and AR: How we'll interact with tomorrow's data

Data visualization in VR and AR could be the next big use case for the technologies. It's early days, but examples of 3D data visualizations hint at big changes to come in how we interact with data. Recently, I spoke with Simon Wright, Director of AR/VR for Genesys, about one such experiment. Genesys helps companies streamline their customer service experience with automated phone menus and chatbots, for example, but in his role Wright has a lot of latitude to push the boundaries of Mixed Reality technologies for enterprise customers. "One of the things I'm personally excited about is the ability to create hyper visualizations," Wright tells me. "We capture massive amounts of data, and we've created prototypes to almost magically bring up a 3D model of Genesys data. This is where there could be huge opportunities for AR, which has advantages over a 2D screen." For one recent project, Wright and his team wanted to project data pertaining to website analytics onto the wall of a restaurant in a beautiful way. "It started as a marketing-led project," he explains.




Quote for the day:


"Leadership to me means duty, honor, country. It means character, and it means listening from time to time." -- George W. Bush


Daily Tech Digest - March 26, 2019

Firms urged to gear up for new malware and tactics as threats proliferate


As network defences increase in sophistication, so does the anonymity of attacks, which now include the targeting of non-standard ports to ensure payloads are concealed upon delivery, the SonicWall researchers warned. Based on a sampling of more than 700 million malware attacks, SonicWall found that 19.2% of malware attacks used non-standard ports, up 8.7% compared with 2017. “The concern over security and privacy is more prevalent than ever before. Industry and government must collaborate to build a more secure environment, mitigate risk, and build citizen trust in government and consumer trust in business,” said Michael Chertoff, executive chairman and co-founder of The Chertoff Group, and former US secretary of homeland security. “This report provides critical analysis into the evolution of cyber adversaries’ threat tactics and methods. As organisations increasingly rely on metrics to understand and forecast risk, this intelligence will help enterprises and governments make informed decisions on their security investment.”



Maintaining security control in the age of the mobile workforce

The ability to do our jobs from outside the corporate walls keeps workers productive and helps businesses remain operational. Or in some cases, it saves the organization travel fees – especially those caused by rescheduling or canceling hotels and airfare during inclement weather. Beyond the seasonal spikes, many organizations are adopting more flexible work policies. The number of U.S. mobile workers is expected to grow to 105.4 million, or more than 70 percent of the workforce, by 2020. The composition of the modern workforce is changing. Not to mention that, as an always-on society, we have a problem disconnecting. Fast forward from this brutal winter: 42 million people are expected to travel over Memorial Day weekend, with a majority of them still tethered to work communications on their devices. When only 11 percent of end users access business applications from the corporate office 100 percent of the time, the growth of the mobile workforce places a lot of strain on data security. Data now sits on endpoints spread around the globe.


'Operation ShadowHammer' Shows Weakness of Supply Chains


While the exact scope and purpose of installing these backdoors remains unclear, researchers say the APT group does appear to have targeted a very specific set of Asus PC users. These victims were identified through their MAC addresses. Specifically, the attackers hardcoded MAC addresses into the trojanized software samples recovered by Kaspersky Lab, which says this list was used to target specific machines and their users. Once the backdoor was installed on a victim's machine, it would signal back to a command-and-control server and then receive additional malware to plant in the PC, according to Vice Motherboard, which first reported the story. If the PC was not on the target list, the malware did not initiate a call to the C&C server. One reason why the operation continued for so long without being detected is that attackers used legitimate certificates, such as "AsusTeK Computer Inc.," as part of the trojanized updates, researchers say. The updater software was also hosted on legitimate domains.


Have we reached “peak” chief digital officer?


Our latest CDO study found that CDOs with a market-facing background had dropped to 18 percent, down from 39 percent two years ago, and that 41 percent of organizations had attracted CDOs with a solid technology background, up from 32 percent in 2016. And 28 percent had a strategy, business development, or consulting background — a surge from 2016, when just 21 percent had such credentials. Indeed, a key study finding was that a third of CDO positions saw turnover in 2018. This reflects the need for new skill sets and experience as digital transformation programs move beyond pilots in specific corners of the business to play a central part in everyday operations. Looking ahead, we think that as transformation becomes part of the core business, the next step will be for the CDO to disappear. Digital transformation will become the responsibility of every member of the executive team. We are not there yet. But where organizations have a CDO in place today, the priority should be to ensure that the person in this role still has the appropriate perspective and capabilities to move the digital transformation agenda forward, by embedding it both deeply within operations and at scale across the organization.


How blockchain is becoming the 5G of the payment industry

On the heels of JPMorgan Chase & Co. creating its own stable coin token for use on blockchain distributed ledgers, IBM last week launched its Blockchain World Wire, which will enable banks to transfer tokens and cryptocurrency in near-real time, cutting out banking intermediaries and lowering capital costs and clearing fees. The distributed ledger technology (DLT) network will initially enable cross-border payments and settlements based on the Stellar protocol, a decentralized payment network that uses its own cryptocurrency, Stellar Lumens (XLM). While the IBM network will support XLM, it will primarily use stable coin backed one-for-one by the U.S. dollar and other national currencies. In other words, IBM will run the blockchain infrastructure – the computer nodes and software – and the banks will transmit digital tokens tied to fiat currency over the network. The Rizal Commercial Banking Corporation (RCBC), one of the Philippines' top 10 banks by assets, will be among the first of four using the Blockchain World Wire for remittance payments services. 


ProBeat: Google’s Stadia is all about the cloud and AI

The cloud conversation is straightforward. Stadia is a cloud gaming service, after all. But what about AI? The consumer-facing part is of course Google Assistant. Google is positioning YouTube as Stadia’s killer app, but Google Assistant also has a role to play. The Stadia Controller has a Google Assistant button. When you get stuck, instead of scouring YouTube for the right walkthrough explanation, you’ll supposedly be able to push the Google Assistant button for help. How rudimentary or how complex this is will depend on how good the AI is. The developer-facing part is Style Transfer ML. Style transfer is a machine learning technique that recomposes images in the aesthetic of other images, letting game developers map textures and color palettes from artwork, photos, and still images to game environments automatically. Google needs to woo developers to Stadia, and it’s naturally leaning on its AI chops to do so. But like everything Google does, Stadia is really about data collection.
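
For readers curious what "style transfer" computes, here is a hedged sketch of the core quantity in the classic technique (Gatys et al.): the Gram matrix of a convolutional feature map, which captures the texture and color statistics that style losses match. Stadia's Style Transfer ML internals are not public; this only illustrates the general technique, with random arrays standing in for real feature maps.

```python
import numpy as np

def gram_matrix(features: np.ndarray) -> np.ndarray:
    """features: (height, width, channels) activation map from a conv layer."""
    h, w, c = features.shape
    f = features.reshape(h * w, c)
    return f.T @ f / (h * w)  # channel-by-channel correlations: the "style"

# Random arrays stand in for real conv-layer activations.
style_features = np.random.rand(64, 64, 32).astype("float32")
generated_features = np.random.rand(64, 64, 32).astype("float32")

# The style loss pushes the generated image's Gram matrix toward the
# style image's; an optimizer updates the generated image to shrink it.
style_loss = np.mean((gram_matrix(generated_features) -
                      gram_matrix(style_features)) ** 2)
print(style_loss)
```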


Artificial intelligence making major inroads into Russian banking


According to the Expert RA study, those obstacles include discrepancies with data in information systems, but once the issue of data consistency is resolved, finding qualified personnel to process data is set to be a major challenge. Industry insiders have already been complaining about difficulties in finding qualified personnel to operate AI-based solutions. “The main factors that are impeding the adoption and development of AI are shortages of qualified professionals and problems with the infrastructure of information systems,” said Smirnov. Putyatinsky agreed, saying: “The acutest issue is training of qualified personnel.” To help resolve this challenge, CBoM has been running an internship programme called IB Universe for the past 12 months. “This allows students and recent graduates to acquire practical experience in various areas of investment business,” added Putyatinsky. According to Putyatinsky, educational programmes of that kind will eventually allow banks to train personnel in the working environment, producing a new wave of employees who will already be prepared to deal with new technologies, such as AI and machine learning.


EU Seeks Better Coordination to Battle Next Big Cyberattack


The news that Europol is trying to better prepare EU member states for the next big cyberattack comes as fresh warnings are being sounded that Russia is looking to interfere in upcoming EU parliamentary elections scheduled for May. On Thursday, CNBC reported that FireEye has found evidence that two advanced persistent threat groups are gearing up for more attacks in the coming months. The CNBC report specifically pointed to increasing activity from APT 28, the Russian-backed group that is also known as Fancy Bear and which is believed to have been involved in different disruption campaigns around the world, including Sandworm, which has been linked to the NotPetya wiper-malware attack that was unleashed in July 2017. To help governments better defend themselves against such attacks, numerous vendors - including Cloudflare, Google, Microsoft and Symantec - have moved to offer free services. In February, Microsoft announced that it would expand its AccountGuard, which provides protection and threat detection geared to blocking nation-state and APT activity, to 12 more European countries in preparation for the 2019 elections.


Pivoting to digital maturity


Transformation initiatives are only as valuable as the business impact they drive. In our analysis of the survey results, therefore, we adopted a simple measure of digital maturity: the extent to which respondents said an organization’s digital transformation efforts are delivering business benefit. We then classified respondents into three segments—lower, median, and higher—according to the degree of business benefit they said they were achieving from their actions. Digital transformation is a continual process, and digital maturity is a moving target. So we present these as relative rather than absolute classifications. ... Achieving data mastery can entail an organization wide effort, sometimes under the direction of a chief data officer, to identify and evaluate data assets and build or acquire the necessary platforms and competencies. Eighty-eight percent of higher-maturity companies in our survey reported that they were obtaining a significant positive impact from their use of data, compared to just 24 percent of lower-maturity companies.


Algorithms have already taken over human decision making


In fact, algorithms operating without human intervention now play a significant role in financial markets. For example, 85% of all trading in the foreign exchange markets is conducted by algorithms alone. The growing algorithmic arms race to develop ever more complex systems to compete in these markets means huge sums of money are being allocated according to the decisions of machines. On a small scale, the people and companies that create these algorithms are able to affect what they do and how they do it. But because much of artificial intelligence involves programming software to figure out how to complete a task by itself, we often don’t know exactly what is behind the decision-making. As with all technology, this can lead to unintended consequences that may go far beyond anything the designers ever envisaged. ... But the algorithms that amplified the initial problems didn’t make a mistake. There wasn’t a bug in the programming. The behaviour emerged from the interaction of millions of algorithmic decisions playing off each other in unpredictable ways, following their own logic in a way that created a downward spiral for the market.



Quote for the day:


"At the heart of great leadership is a curious mind, heart, and spirit." -- Chip Conley


Daily Tech Digest - March 25, 2019

Why Big Banks Are Losing To Tech Giants Over Open Banking

42% disagree that collaboration with fintechs is needed for retail banks to innovate faster. Michal Kissos Hertzog, CEO of Pepper, said of the research: "It highlights the size of the disconnect between traditional banks and their customers. Banks are not innovating fast enough, and the value proposition and consumer experience are nowhere near where they should be. It's not for lack of trying, but the reality is that banks are failing to go fully digital and are falling further behind. However, it's not all bad news - banks still retain consumer trust, which is a position of tremendous strength, and decision-makers understand how they need to improve. Only time will tell if they are able to deliver." For banks in the U.K., research shows that decision-makers believe traditional retail banks are struggling to compete in the digital era. The vast majority (82%) say banks aren't innovating fast enough to meet changing consumer demands for digital services, with almost half (48%) thinking that these banks are at least three years behind fintech rivals.



Finding real strength in numbers through data partnerships

Over the last two years, we’ve seen some form of the following paragraph on a presentation slide at almost every data-focused conference we’ve attended. The quote has been stolen, and re-stolen, from a TechCrunch article by Tom Goodwin, in which he said: "Uber, the world’s largest taxi company, owns no vehicles. Facebook, the world’s most popular media owner, creates no content. Alibaba, the most valuable retailer, has no inventory. And Airbnb, the world’s largest accommodation provider, owns no real estate. Something interesting is happening." Each of those companies has created massive value by crafting data partnership approaches and then delivering a value greater than any one dataset could provide on its own. This is data innovation, combined with masterfully executed consumer marketing and user experience. Each one embraced the Amazon vision of open data structures, internally and externally, to power its go-to-market value proposition. In other words, when it comes to data partnerships, the whole is often greater than the sum of its parts.


The Benefits Of Edge Computing In IoT


Edge computing in IoT implies having autonomous systems of devices at these endpoints (or the edge) that simultaneously gather information and respond to it without having to communicate with a remote data center. Instead of relying on remote data centers and computational servers, data can be processed right where it is collected, eliminating the need for constant connectivity to centralized control systems and the problems inherent in such setups. For instance, a software company that sells cloud-based mobile applications can have cloud servers in multiple locations closer to users, instead of in a single location that may lead to undesirable latency and a single point of failure in case of any mishap. If the centralized servers failed for some reason, all application users would lose their data and access to services at once. Additionally, the servers would have to handle heavy traffic, causing latency and inefficiency. By contrast, a decentralized system ensures that the data pertinent to specific users is hosted in whichever of several data centers is closest to them, minimizing latency and limiting the impact of any potential failure.
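A minimal sketch of that routing decision follows; the region names and latency figures are invented for illustration:

```python
# Minimal sketch of the edge-routing idea: send each user to the data
# center with the lowest measured latency instead of one central site.
# The region names and latency numbers here are invented for illustration.

def pick_closest(latencies_ms: dict[str, float]) -> str:
    """Return the region with the lowest round-trip latency."""
    return min(latencies_ms, key=latencies_ms.get)

# e.g. latencies a client measured by probing each candidate region
measured = {"eu-west": 18.0, "us-east": 95.0, "ap-south": 210.0}
region = pick_closest(measured)
print(f"route user to {region}")  # -> eu-west

# If that region fails, only users pinned to it are affected; a client
# can re-probe and fail over to the next-closest region.
measured.pop(region)
print(f"failover to {pick_closest(measured)}")  # -> us-east
```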


Cohesity plans to put backup data to good use

As with Isilon’s OneFS file system, Cohesity’s SpanFS distributes storage across several nodes, ensures data redundancy, indexes data by means of metadata and shares it across the network, NAS-style. SpanFS is not limited to physical nodes and can extend its capacity into the cloud. Its replication functionality allows activity to continue from a remote site or the cloud in case of an incident. In addition to NFS and SMB access, it can share data via the S3 object storage protocol, which is widely used by cloud applications. SpanFS is part of Cohesity’s DataPlatform, which provides access to admin functionality including configuration, deduplication, replication and monitoring. Also among these features is SnapTree, which allows cloned content to be used to, for example, run project tests with real data. DataPlatform software can come on hardware from HPE, Dell or Cisco as appliances, or in virtual appliance format. As an option, the Helios SaaS console centralises administration for multiple DataPlatform clusters across a number of cloud sites.
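Because SpanFS exposes data over the standard S3 protocol, a generic S3-compatible client should be able to read a shared view. The sketch below uses boto3 with a placeholder endpoint, credentials and view name; the exact values depend on how a given cluster is configured:

```python
# Hedged sketch: since SpanFS speaks the S3 protocol, a generic
# S3-compatible client such as boto3 should be able to list objects in a
# shared view. Endpoint, credentials and view/bucket name are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://cohesity-cluster.example.com:3000",  # placeholder
    aws_access_key_id="ACCESS_KEY",          # placeholder credentials
    aws_secret_access_key="SECRET_KEY",
)

# A SpanFS view shared over S3 appears to the client as a bucket.
for obj in s3.list_objects_v2(Bucket="my-view").get("Contents", []):
    print(obj["Key"], obj["Size"])
```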


Containers, cloud-centric app services boost support for DevOps world


The underlying proxy technology also provides transparent routing to multiple back-end components, Transport Layer Security (TLS) termination, and cross-cutting concerns (i.e. logging, security and data transfer) at the edge of systems. This is particularly valuable within an API gateway – the entry point into microservices-based applications from external API clients. Further, F5 is introducing a new cloud-native application services platform, specifically designed for the apps your DevOps and AppDev teams care about most. One significant innovation is its Service Mesh incubation, Aspen Mesh. “While container orchestration tools like Kubernetes have solved microservice build and deploy issues, many runtime challenges remain unsolved,” said Kara Sprague, senior vice president and general manager of the Application Services Business Unit at F5. “Our fully supported service mesh makes it easy to manage the complexity of microservice architecture.”
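To make the API-gateway pattern mentioned above concrete, here is a generic sketch (not F5's implementation): a single entry point applies one cross-cutting concern, logging, while routing request paths to back-end services. The routes and service names are invented, and TLS termination would sit in front of this logic:

```python
# Generic sketch of the API-gateway pattern (not F5's product): one entry
# point applies cross-cutting concerns (here, logging) and routes each
# request path to a back-end service. Routes and hosts are invented.
import logging

logging.basicConfig(level=logging.INFO)

ROUTES = {
    "/accounts": "http://accounts-svc:8080",   # hypothetical back ends
    "/payments": "http://payments-svc:8080",
}

def route(path: str) -> str:
    """Pick the back end for a request path, logging at the edge."""
    for prefix, backend in ROUTES.items():
        if path.startswith(prefix):
            logging.info("routing %s -> %s", path, backend)
            return backend
    raise LookupError(f"no back end for {path}")

print(route("/payments/123"))  # -> http://payments-svc:8080
```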


Are You Setting IT Up To Stifle Your Innovation?

The fact is that manufacturing organizations are a bit late to enterprise self-service analytics, or should I say self-service data management, compared to more centrally managed or highly regulated organizations like financial services or healthcare companies. Such organizations have already been dabbling in big data, cloud, and machine learning with varying degrees of success for a decade. Many deployed self-service analytics environments years ago. Nowadays, they are experiencing the “trough of disillusionment,” setting them up to finally realize the fruits of artificial intelligence (AI) adoption. They’ve learned that going back to basics around data quality, governance, cataloging, and cloud-based data integration to facilitate “data democratization” is needed to take full advantage of more advanced technologies. Manufacturers can avoid the mistakes and costly learnings of other industries by doing it right the first time. However, their traditional plant-centric approach and tactile-oriented innovation viewpoint permeate – and potentially limit – IT-related innovation.


IT needs to make mobile unified communications a priority

The need for safe, reliable, and easy-to-use communications tools has given rise to unified communications (UC), a strategy that integrates multiple communications modalities under a single management and security umbrella. The result is more effective communication, improved collaboration, and stronger security and regulatory compliance. Now that mobility is the primary networking vehicle for end users, it’s time for IT departments to make mobile unified communications (MUC) a priority. The most important benefit of MUC is that it lets organizations finally leave behind the uncontrolled, untracked mish-mash of consumer-centric, carrier, and third-party communications tools applied over the years. Communications are a critical organizational resource; MUC is far easier to manage and scale, and it offers the visibility and control that’s essential to enterprise IT deployments. These advantages should make MUC the dominant provisioning strategy and mechanism for organizational communications over the next five to 10 years.


Ransomware, Cryptojacking, and Fileless Malware: Which is Most Threatening?

The drama of the subtitle actually understates the danger of fileless malware. Of ransomware, cryptojacking, and fileless malware, fileless malware is both the youngest and perhaps the most dangerous. Fileless malware, as the name suggests, doesn’t behave like traditional malware. Malware usually downloads a file onto the victim device or enterprise environment, which allows legacy antivirus solutions to locate and remove it. Fileless malware doesn’t do this. Instead, it loads its code into a native process on the endpoint, such as Java or PowerShell, and forces that trusted program to run it, performing the malicious task concealed behind normal processes. Legacy endpoint security systems, which depend on traditional threat signatures, cannot detect these attacks. Often, fileless malware leaves no trace of itself behind. Hackers increasingly adopt fileless malware attacks because, especially against legacy solutions, they prove largely successful.
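This is also why defensive tooling has shifted toward behavioural checks: rather than scanning files for signatures, it watches what trusted native processes are doing. The sketch below, using the psutil library, flags PowerShell processes launched with command-line switches often abused in fileless attacks; the flag list is a rough, incomplete heuristic chosen purely for illustration:

```python
# Rough defensive sketch: because fileless attacks run inside trusted
# native processes, detection must look at behaviour (e.g. suspicious
# PowerShell command lines) rather than file signatures. The flag list
# below is a common but incomplete heuristic, not an authoritative one.
import psutil

SUSPICIOUS_FLAGS = {"-encodedcommand", "-enc", "-noprofile", "-windowstyle"}

for proc in psutil.process_iter(["name", "cmdline"]):
    name = (proc.info["name"] or "").lower()
    cmdline = [arg.lower() for arg in (proc.info["cmdline"] or [])]
    if "powershell" in name and SUSPICIOUS_FLAGS.intersection(cmdline):
        print(f"flag for review: pid={proc.pid} cmdline={' '.join(cmdline)}")
```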


How to make sure your artificial intelligence project is heading the right way


"The research highlights how everyone involved in the use of AI and big data must have wider discussions about the outcome you're looking for, such as better health, and then work backwards to issues like data sharing and information security. You should always start with the outcome," he says. Baker suggests business leaders looking to ensure they focus on the right objectives for AI and data should consider establishing a public ethics board. Just like companies have executive boards to make decisions, these ethics panels can help organisations that are using emerging technology to make publicly minded decisions. "We know some tech companies, like Deep Mind, already do this," says Baker. "Don't assume that you know what the public wants or that the market research you conduct into public opinions is correct. You need to actually have an ethics panel, and discuss what the issues are and what the needs of the public really are."


Small businesses hit hardest by cyber crime costs


The average cost of cyber attacks to small businesses was £65,000 in damaged assets, financial penalties and business downtime. This puts the total cost of cyber crime across all UK small businesses in 2018 at an estimated £13.6bn, representing 80% of the financial impact of cyber attacks on all UK businesses in the past year; a third of businesses overall reported being hit by cyber crime. The survey, conducted by research consultancy Opinium, found that while phishing emails claimed the greatest number of victims (25%), ransomware attacks were the most financially damaging, costing victims £21,000 each on average. Although large businesses continued to fall victim at the highest rate, with seven in every 10 companies of more than 250 people being hit, the rate at which small companies succumbed to cyber criminals reached its highest level since Beaming started surveying business leaders in 2016. Nearly two-thirds (63%) of small businesses reported being a victim of cyber crime in 2018, up from 47% in 2017 and 55% in 2016.



Quote for the day:


"Strategy is not really a solo sport even if you're the CEO." -- Max McKeown