Daily Tech Digest - March 25, 2019

Why Big Banks Are Losing To Tech Giants Over Open Banking

For banks in the U.K., research shows that decision-makers believe traditional retail banks are struggling to compete in the digital era. The vast majority (82%) say banks aren't innovating fast enough to meet changing consumer demands for digital services, with almost half (48%) believing these banks are at least three years behind fintech rivals. Meanwhile, 42% disagree that collaboration with fintechs is needed for retail banks to innovate faster. Michal Kissos Hertzog, CEO of Pepper, said of the research: "It highlights the size of the disconnect between traditional banks and their customers. Banks are not innovating fast enough, and the value proposition and consumer experience are nowhere near where they should be. It's not for lack of trying, but the reality is that banks are failing to go fully digital and are falling further behind. However, it's not all bad news - banks still retain consumer trust, which is a position of tremendous strength, and decision-makers understand how they need to improve. Only time will tell if they are able to deliver."



Finding real strength in numbers through data partnerships

Over the last two years, we've seen some form of the following paragraph on a presentation slide at almost every data-focused conference we've attended. The quote has been stolen, and re-stolen, from a TechCrunch article by Tom Goodwin, in which he said: "Uber, the world's largest taxi company, owns no vehicles. Facebook, the world's most popular media owner, creates no content. Alibaba, the most valuable retailer, has no inventory. And Airbnb, the world's largest accommodation provider, owns no real estate. Something interesting is happening." Each of those companies has created massive value by crafting data partnership approaches and then delivering value greater than any one dataset could provide on its own. This is data innovation, combined with masterfully executed consumer marketing and user experience. Each one embraced the Amazon vision of open data structures, internally and externally, to power its go-to-market value proposition. In other words, when it comes to data partnerships, the whole is often greater than the sum of its parts.


The Benefits Of Edge Computing In IoT


Edge computing in IoT implies having autonomous systems of devices at these endpoints (or the edge) that simultaneously gather information and respond to it without having to communicate with a remote data center. Instead of relying on remote data centers and computational servers, data can be processed right where it is collected, eliminating the need for constant connectivity to centralized control systems and the problems inherently associated with such setups. For instance, a software company that sells cloud-based mobile applications can have cloud servers based in multiple locations closer to users instead of in a single location that may introduce undesirable latency and presents a single point of failure in case of any mishap. If the centralized servers failed for some reason, all application users would lose their data and access to services at once. Additionally, the servers would have to deal with heavy traffic, causing latency and inefficiency. By contrast, a decentralized system would ensure that all the data pertinent to specific users is hosted in the data center closest to them, minimizing latency and limiting the impact of any potential failure.
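The "closest data center" idea above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation: the region names and latency figures are made up, and a real system would measure round-trip times rather than use a hard-coded table.

```python
# Hypothetical sketch: route a user to the nearest of several data
# centers, as a decentralized (edge-style) deployment would.
# Region names and latencies below are invented for illustration.

def pick_closest(latency_ms_by_region):
    """Return the region with the lowest measured latency."""
    return min(latency_ms_by_region, key=latency_ms_by_region.get)

# Simulated round-trip latencies from one user's location, in ms
measured = {"us-east": 95, "eu-west": 18, "ap-south": 210}

best = pick_closest(measured)
print(best)  # the region this user's requests would be served from
```

The same selection logic also bounds the blast radius of a failure: if one region's measurements disappear from the table, users are simply routed to the next-closest site.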


Cohesity plans to put backup data to good use

As with Isilon's OneFS file system, Cohesity's SpanFS distributes storage across several nodes, ensures redundancy of data, indexes data by means of metadata and shares it across the network, NAS-style. SpanFS is not limited to physical nodes and can integrate its capacity with the cloud. It has replication functionality that allows it to continue activity from a remote site or the cloud in case of an incident. In addition to NFS and SMB access, it can share data via the S3 object storage protocol, which is widely used by cloud applications. SpanFS is part of Cohesity's DataPlatform, which also provides admin functionality including configuration, deduplication, replication and monitoring. Also among these features is SnapTree, which allows cloned content to be used to, for example, run project tests with real data. DataPlatform software can come on hardware from HPE, Dell or Cisco as appliances, or in virtual appliance format. As an option, the Helios SaaS console allows centralised administration of multiple DataPlatform clusters across a number of cloud sites.


Containers, cloud-centric app services boost support for DevOps world


The underlying proxy technology also provides transparent routing to multiple back-end components, Transport Layer Security (TLS) termination, and crosscutting concerns (i.e. logging, security and data transfer) at the edge of systems. This is particularly valuable within an API gateway - the entry point into microservices-based applications from external API clients. Further, F5 is introducing a new cloud-native application services platform, specifically designed for the apps your DevOps and AppDev teams care about most. One significant innovation is its service mesh incubation, Aspen Mesh. "While container orchestration tools like Kubernetes have solved microservice build and deploy issues, many runtime challenges remain unsolved," said Kara Sprague, senior vice president and general manager of the Application Services Business Unit at F5. "Our fully supported service mesh makes it easy to manage the complexity of microservice architecture."
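The "transparent routing to multiple back-end components" that an API gateway performs boils down to matching a request path against a routing table. The sketch below is a minimal, hypothetical illustration of that idea; the service names and prefixes are invented, and a real gateway such as the F5/NGINX proxy described above would also handle TLS termination, logging and authentication in front of this step.

```python
# Minimal sketch of API-gateway-style path routing: map an incoming
# request path prefix to a back-end microservice. Service names and
# URLs are hypothetical, purely for illustration.

ROUTES = {
    "/orders":  "http://orders-svc:8080",
    "/users":   "http://users-svc:8080",
    "/billing": "http://billing-svc:8080",
}

def route(path):
    """Return the full back-end URL whose prefix matches the path."""
    for prefix, backend in ROUTES.items():
        if path.startswith(prefix):
            return backend + path
    raise LookupError(f"no route for {path}")

print(route("/orders/42"))  # -> http://orders-svc:8080/orders/42
```

Because every external request funnels through this single entry point, crosscutting concerns (access logs, rate limits, TLS) can be applied once here instead of in every microservice.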


Are You Setting IT Up To Stifle Your Innovation?

The fact is that manufacturing organizations are a bit late to enterprise self-service analytics, or should I say self-service data management, compared to more centrally managed or highly regulated organizations like financial services or healthcare companies. Such organizations have already been dabbling in big data, cloud, and machine learning with varying degrees of success for a decade. Many deployed self-service analytics environments years ago. Nowadays, they are experiencing the “trough of disillusionment,” setting them up to finally realize the fruits of artificial intelligence (AI) adoption. They’ve learned that going back to basics around data quality, governance, cataloging, and cloud-based data integration to facilitate “data democratization” is needed to take full advantage of more advanced technologies. Manufacturers can avoid the mistakes and costly learnings of other industries by doing it right the first time. However, their traditional plant-centric approach and tactile-oriented innovation viewpoint permeate – and potentially limit – IT-related innovation.


IT needs to make mobile unified communications a priority

The need for safe, reliable, and easy-to-use communications tools has given rise to unified communications (UC), a strategy that integrates multiple communications modalities under a single management and security umbrella. The result is more effective communication, improved collaboration, and a boost to security and regulatory policies. Now that mobility is the primary networking vehicle for end users, it’s time for IT departments to make mobile unified communications (MUC) a priority. The most important benefit of MUC is the ability of organizations to finally leave behind the uncontrolled, untracked mish-mash of consumer-centric, carrier, and third-party communications tools traditionally applied over the years. Communications are a critical organizational resource; MUC is a much easier vehicle to manage and scale, and MUC offers the visibility and control that’s essential to enterprise IT deployments. These advantages will enable MUC to become the dominant provisioning strategy and mechanism for organizational communications over the next five to 10 years.


Ransomware, Cryptojacking, and Fileless Malware: Which is Most Threatening?

The drama of the subtitle actually understates the danger of fileless malware. Of ransomware, cryptojacking, and fileless malware, fileless malware is both the youngest and perhaps the most dangerous. Fileless malware, as the name suggests, doesn't behave like traditional malware. Malware usually downloads a file onto the victim device or enterprise environment, which allows legacy antivirus solutions to locate and remove it. Fileless malware doesn't do this. Instead, it injects its code into a native process or tool on the endpoint, such as Java or PowerShell, and forces that native program to run the code, performing the malicious task concealed behind normal processes. Legacy endpoint security systems, which depend on traditional threat signatures, can't detect these attacks. Often, fileless malware leaves no trace of itself behind. Hackers increasingly adopt fileless malware attacks because, especially against legacy solutions, they prove largely successful.


How to make sure your artificial intelligence project is heading the right way


"The research highlights how everyone involved in the use of AI and big data must have wider discussions about the outcome you're looking for, such as better health, and then work backwards to issues like data sharing and information security. You should always start with the outcome," he says. Baker suggests business leaders looking to ensure they focus on the right objectives for AI and data should consider establishing a public ethics board. Just like companies have executive boards to make decisions, these ethics panels can help organisations that are using emerging technology to make publicly minded decisions. "We know some tech companies, like Deep Mind, already do this," says Baker. "Don't assume that you know what the public wants or that the market research you conduct into public opinions is correct. You need to actually have an ethics panel, and discuss what the issues are and what the needs of the public really are."


Small businesses hit hardest by cyber crime costs


The average cost of cyber attacks to small businesses was £65,000 in damaged assets, financial penalties and business downtime. This puts the total cost of cyber crime across all UK small businesses in 2018 at an estimated £13.6bn. This represents 80% of the financial impact of cyber attacks on all UK businesses in the past year, with a third reporting that they were hit by cyber crime. The survey, conducted by research consultancy Opinium, found that while phishing emails claimed the greatest number of victims (25%), ransomware attacks were the most financially damaging, costing victims £21,000 each on average. Although the trend for large businesses to fall victim at the highest rate continued, with seven in every 10 companies of more than 250 people being hit, the rate at which small companies succumbed to cyber criminals reached its highest level since Beaming started surveying business leaders in 2016. Nearly two-thirds (63%) of small businesses reported being a victim of cyber crime in 2018, up from 47% of small businesses in 2017 and 55% in 2016.
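A quick back-of-envelope check relates two of the figures above. Assuming the £13.6bn estimate is simply the £65,000 average multiplied by the number of affected firms (the article does not state how Beaming derived it), the implied victim count is:

```python
# Back-of-envelope check of the quoted figures. Assumption (ours, not
# the article's): total = average cost per victim x number of victims.
avg_cost_gbp = 65_000   # average cost per attacked small business
total_gbp = 13.6e9      # estimated total across UK small businesses, 2018

implied_victims = total_gbp / avg_cost_gbp
print(round(implied_victims))  # roughly 209,000 affected businesses
```

This is only a consistency check on the reported numbers, not a figure from the survey itself.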



Quote for the day:


"Strategy is not really a solo sport even if you're the CEO." -- Max McKeown

