Daily Tech Digest - January 10, 2022

The positive side of ransomware for data transformation

The new architecture should be more responsive and interface with enterprise data via a smart API that can automate the processes by which data is fetched, merged, transformed, secured, and delivered, all without users having to put in a request to the data team. It must work with application data generated outside the data warehouse and programmatically combine it with data from either side of the wall, as well as with sources external to the organization. It must also be fully programmable, eliminating the need to predetermine exactly what data is going to be combined with what data. The “smarts” of this smart API mean that users can make data requests that may seem simple to them but that set off complex automated processes that deliver data within seconds or minutes, not days or weeks. One of the important functions of this API must be to perform automated backups on an optimized and context-aware schedule, including rapid backups to the cloud as well as slower backups to physical media, whether in-house or remote. Such a system will enable enterprises to say, “No, thanks” to cybercriminals demanding payment to undo their ransomware’s damage to a company’s data.
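
To make the backup piece concrete, here is a minimal Python sketch of a context-aware backup scheduler along the lines described above. The class, function names, intervals, and targets are hypothetical illustrations, not an existing product's API.

```python
# Minimal sketch of a context-aware backup scheduler, as the article describes.
# All names (BackupTarget, choose_targets) are hypothetical, not a real product API.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class BackupTarget:
    name: str
    medium: str          # "cloud" or "physical"
    interval: timedelta  # how often this target should normally receive a backup
    last_backup: datetime

def choose_targets(targets, now, data_changed_recently):
    """Pick which targets are due, favouring rapid cloud backups when data is hot."""
    due = []
    for t in targets:
        overdue = now - t.last_backup >= t.interval
        # Hot data gets pushed to the cloud ahead of schedule; physical media keeps its slower cadence.
        if overdue or (data_changed_recently and t.medium == "cloud"):
            due.append(t)
    return due

if __name__ == "__main__":
    now = datetime.utcnow()
    targets = [
        BackupTarget("s3-snapshots", "cloud", timedelta(hours=1), now - timedelta(hours=2)),
        BackupTarget("tape-library", "physical", timedelta(days=1), now - timedelta(hours=5)),
    ]
    for t in choose_targets(targets, now, data_changed_recently=True):
        print(f"backing up to {t.name} ({t.medium})")
```

The point of such a design is that backup cadence becomes a policy the API can adapt on the fly, rather than a fixed schedule.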


How COVID-19 made zero trust the right approach to modernise networks

Zero trust networks operate under a principle of least privilege, meaning that all programmes, processes, devices, or users are limited to the minimum privilege required to carry out their functions. The best analogy is the government or military’s “need-to-know” policy. Access rights need not be overly restrictive, because they are balanced against privileges that can range from ‘full access’ to ‘no rights at all’, depending on the conditions. Naturally, as remote working became the norm, these protocols made the most sense for dispersed networking environments. However, many organisations may have quickly found that traditional detection and prevention technologies were insufficient to mitigate the threats posed by opportunistic actors in remote environments. While insight into how previous attacks were carried out provided some context, it could only go so far with this new way of working in a highly distributed and decentralised way. The key for many was to plan for the worst and assume a breach had occurred.
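
A toy example helps show what "minimum privilege required" looks like in practice. The sketch below is a deny-by-default access check in Python; the roles, actions, and context signals are made up for illustration and do not reflect any particular zero trust product.

```python
# A toy least-privilege check in the zero trust spirit: deny by default,
# grant only what the caller's role and context explicitly allow.
# Roles, actions, and conditions here are illustrative, not a real policy model.
POLICY = {
    "payroll-service": {"read:payroll"},                 # only what the service needs
    "contractor":      {"read:public-docs"},
    "hr-analyst":      {"read:payroll", "read:public-docs"},
}

def is_allowed(identity: str, action: str, device_compliant: bool) -> bool:
    """Grant access only if the action is in the identity's allow-list AND context checks pass."""
    allowed_actions = POLICY.get(identity, set())         # unknown identities get no rights at all
    if action not in allowed_actions:
        return False
    # Context can tighten, never widen, the grant: a non-compliant device loses sensitive reads.
    if action == "read:payroll" and not device_compliant:
        return False
    return True

print(is_allowed("contractor", "read:payroll", device_compliant=True))    # False
print(is_allowed("hr-analyst", "read:payroll", device_compliant=False))   # False
print(is_allowed("hr-analyst", "read:payroll", device_compliant=True))    # True
```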


Analyst predictions 2022: The future of data management

There’s a huge gap there in practice and knowledge. Also, to a lesser extent, there’s a technology gap, which is basically in the self-service technologies that will help teams govern data through the full life cycle: from selecting the data, to building the pipelines, to determining your access control, to looking at quality and at whether the data is fresh or whether it’s trending off course. So my prediction is that data mesh will receive its first harsh scrutiny this year. You are going to see some organizations and enterprises declare premature victory when they build some federated query implementations. You’re going to see vendors start to “data mesh-wash” their products, be it a pipelining tool, ELT [extract, load, transform], a catalog or a federated query tool. Vendors will be promoting how they support data mesh. Hopefully nobody’s going to call themselves a data mesh tool, because data mesh is not a technology. We’re going to see one other thing come out of this. And this harks back to the metadata that Sanjeev was talking about and the data catalog.
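
As a rough illustration of the "is the data fresh, is it trending off course" checks mentioned above, here is a small Python sketch; the SLA, thresholds, and sample values are assumptions for demonstration, not anyone's actual governance tooling.

```python
# Illustrative self-service governance checks: freshness against an SLA,
# and a crude drift check against a baseline. Thresholds are assumptions.
from datetime import datetime, timedelta
import statistics

def check_freshness(last_loaded_at: datetime, max_age: timedelta = timedelta(hours=24)) -> bool:
    """Flag the dataset as stale if its last load is older than the agreed SLA."""
    return datetime.utcnow() - last_loaded_at <= max_age

def check_drift(recent_values, baseline_values, tolerance: float = 3.0) -> bool:
    """Return True if the recent mean stays within `tolerance` baseline stdevs of the baseline mean."""
    base_mean = statistics.mean(baseline_values)
    base_std = statistics.stdev(baseline_values) or 1e-9
    return abs(statistics.mean(recent_values) - base_mean) <= tolerance * base_std

baseline = [102, 98, 101, 99, 100, 103, 97]
today = [140, 151, 138, 149]   # suspiciously high: likely "trending off course"
print("fresh:", check_freshness(datetime.utcnow() - timedelta(hours=3)))   # True
print("within expected range:", check_drift(today, baseline))              # False
```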


How Artificial Intelligence is Changing the Payment Gateway Industry

Artificial intelligence and machine learning in the payment industry can significantly improve fraud detection. An AI-powered payment gateway looks at a range of factors and produces a risk score. For example, suppose a merchant has a good record: it will be assigned a low risk score. But a transaction from a previously unknown time zone, location, or IP address will get a higher risk score. Several other factors are taken into account before the transaction is judged fraudulent or legitimate, which makes the data far easier to analyze. AI can go through a larger volume of data and detect even unexpected anomalies. A financial institution or payment provider using AI can process transactions in real time. The technology is continually evolving, finding new, more advanced ways to differentiate between fraudulent and genuine transactions. Thus, AI gives the payment gateway industry the ability to process a vast number of transactions accurately and with lower error rates.
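
A deliberately simplified Python sketch of the risk-scoring idea described above follows. Real gateways use trained models over many more signals; the weights, field names, and thresholds here are assumptions for illustration only.

```python
# Toy risk scoring: combine a few signals into a 0-1 score; higher means riskier.
# Weights and fields are illustrative assumptions, not a real gateway's model.
def risk_score(txn: dict, merchant_history: dict) -> float:
    score = 0.0
    if merchant_history.get("chargeback_rate", 0.0) > 0.02:
        score += 0.3                                   # merchants with a poor record start higher
    if txn["ip_country"] not in merchant_history.get("known_countries", set()):
        score += 0.3                                   # previously unseen location
    if txn["device_id"] not in merchant_history.get("known_devices", set()):
        score += 0.2                                   # new device / browser fingerprint
    if txn["amount"] > 5 * merchant_history.get("avg_amount", txn["amount"]):
        score += 0.2                                   # unusually large transaction
    return min(score, 1.0)

history = {"chargeback_rate": 0.001, "known_countries": {"GB"},
           "known_devices": {"dev-42"}, "avg_amount": 30.0}
txn = {"ip_country": "BR", "device_id": "dev-99", "amount": 400.0}
print(risk_score(txn, history))   # 0.7 -> would be routed to review or declined
```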


Eight resolutions to help navigate the new hybrid office model

The cloud and “on demand” models have become hugely important for helping users access the applications they need to do work from anywhere. While this shift to the cloud has its productivity benefits, it has not come without its security challenges. It is important to remember that cloud environments are not automatically secure when they are first created. Securing them requires knowledge and time. To keep business safe, security controls must span all environments – providing 360-degree application protection for both the application surface and the cloud application infrastructure. ... Know your suppliers: The SolarWinds vulnerability highlighted the need for companies to thoroughly evaluate the tools and services they integrate into their operations. This includes the careful installation and configuration of the product or service, tracking patches and new releases from the vendor, and monitoring for any suspicious behavior. In a highly sensitive environment, some companies may choose not to use third-party products or services.


Avoiding the two-tier workforce in the third workplace

Most crucial of all, though, Farque believes, is for leaders and managers to consciously set the tone and role-model the behaviour they expect to see among their staff. Because dealing with hybrid teams equitably is not part of the usual management training rulebook, however, she advises investing in additional learning and development here. Mildon, on the other hand, believes there are lots of “little, everyday bits of inclusion” that can make a big difference too. These include simple things, such as ensuring that when remote employees dial into a meeting where others are physically present, they are invited to speak first. Alternatively, everyone could attend the event virtually. Digital tools, such as whiteboards and instant messaging, are also useful in enabling team members to communicate, collaborate and keep each other in the loop. But as Mildon points out, while there are lots of tools on the market that can help here, “it’s not necessarily about the technology,” he says.


Why digital maturity is key to harnessing the Digital Twin

It’s a mistake to think that implementing a Digital Twin is a huge undertaking that will require the overhaul of technologies and processes. Fundamentally, so long as the business has data, and that data is captured, stored in an information management system, and understood by all, it has the right foundation to get started. However, ultimately, a Digital Twin can only be successful if it is fed by the right data. And that is not only down to information management systems – it requires the workforce to be able to read, understand, create, and communicate data as information. In other words, they need to be data literate. With the UK on the verge of a digital skills disaster, however, some business leaders may find that they’ll need to upskill their employees in this area. This will require the full backing of the C-level and a top-down leadership approach to encourage a data-driven culture that’s ingrained across the company. As the saying goes, you can have all the gear but no idea – and organisations that upskill their employees in data will be the ones capable of bringing advanced concepts such as Digital Twins to fruition and reaping the benefits.


How Database Virtualization Could Break Vendor Lock-in

While the concept of database virtualization has started attracting attention and several products have become available, there is still a lot of skepticism and confusion. To be fair, the notion of virtualization has been used loosely in the context of databases. From data virtualization to the abstraction of database files, the label “virtualization” is used liberally. The concept of data virtualization has been around for decades and may warrant a clean delineation. Data virtualization is the idea of using a universal query language across all applications. While this seems beneficial in the long run, it requires all applications to be rewritten to use this artificial language. This does not solve the original problem: in order to overcome vendor lock-in, the enterprise has to, well, overcome vendor lock-in in the first place. Put differently, data virtualization is certainly aspirational, but it does not solve the actual problem of moving existing applications between databases.
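
The trade-off described here is easy to see with any vendor-neutral query layer. The Python sketch below uses SQLAlchemy Core purely as an illustration of the idea (it is not the database virtualization product discussed in the article): the same neutral query compiles to different vendor dialects, but only applications already rewritten against the neutral layer get that benefit, which is exactly the objection raised above.

```python
# Illustration only: a vendor-neutral query layer (SQLAlchemy Core) compiles one query
# to different SQL dialects, but existing apps must first be rewritten against it.
from sqlalchemy import Column, Integer, MetaData, String, Table, select
from sqlalchemy.dialects import postgresql, sqlite

metadata = MetaData()
users = Table(
    "users", metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
)

query = select(users).where(users.c.name == "alice")

# The same neutral statement rendered for two different engines.
print(query.compile(dialect=postgresql.dialect()))
print(query.compile(dialect=sqlite.dialect()))
```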


2022 is shaping up to be an epic fight to protect data

COVID forever changed the way we work. Even in the highly unlikely event that the pandemic winds down this year, many people will keep working from home at least part of the time. Cybercriminals will be working, too. They'll be hunting for new ways to take advantage of the connections and devices that workers use to dial in remotely. NYU's Cappos says the cybersecurity industry will likely get a better handle on how to manage hybrid work situations, introducing new recommendations and products that boost security and make it easier for workers to connect. Consumers will also need to up their security game, Clay says. Good methods of two-factor authentication, such as biometrics and push notifications, are going to be a must. Simpler verification methods, like codes sent as SMS messages, just can't be trusted anymore. That goes for smartphones, too. Phishing, the practice of sending deceptive emails in order to get personal information, is going mobile. Similar attempts using SMS, known unimaginatively as smishing, and voice calls, which are called – you guessed it – vishing, will become more common this year as people move more of their online activity to mobile devices, Clay says.


Top tips to protect online networks in schools from cyber attacks

There’s a certain appeal that comes with free tools, especially if you have been looking at an alternative that is otherwise expensive to purchase. It’s easy to look at the free alternative and acknowledge that it isn’t exactly what you wanted, but it will do the trick. Avoid the temptation at all costs on a school set-up. An appealing free online tool could be malware or a virus in disguise, built to wreak havoc as soon as it has been downloaded. A lot of these come with adware, which isn’t particularly dangerous but clogs up and slows your system and can surface dangerous websites. Finally, there is the danger that a downloaded free tool will unleash spyware, which integrates into your system and steals valuable data. ... Updating a school’s Active Directory is quite possibly the most basic, yet laborious, task on this list. Regardless, its importance is undeniable. This article has largely been about preventing online hackers, scammers, and cyber attackers from getting into your system. However, in some cases, those attacks can come from within, namely from former students.
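
As a concrete example of what "updating the Active Directory" can involve, here is a hedged Python sketch that lists accounts with no recent logon so that leavers' accounts (for example, former students') can be reviewed and disabled. It assumes the third-party ldap3 package; the domain controller name, credentials, base DN, and the 180-day threshold are placeholders, not real values.

```python
# Hedged sketch: list AD accounts with no logon in ~180 days so stale accounts
# can be reviewed and disabled. Requires the ldap3 package; server, credentials,
# and base DN below are placeholders.
from datetime import datetime, timedelta
from ldap3 import Server, Connection, ALL

# lastLogonTimestamp is stored as 100-nanosecond intervals since 1601-01-01 (Windows filetime).
cutoff = datetime.utcnow() - timedelta(days=180)
cutoff_filetime = int((cutoff - datetime(1601, 1, 1)).total_seconds() * 10**7)

server = Server("dc01.school.example", get_info=ALL)               # placeholder domain controller
conn = Connection(server, user="SCHOOL\\audit", password="***", auto_bind=True)
conn.search(
    "dc=school,dc=example",
    f"(&(objectCategory=person)(objectClass=user)(lastLogonTimestamp<={cutoff_filetime}))",
    attributes=["sAMAccountName", "lastLogonTimestamp"],
)
for entry in conn.entries:
    print("stale account:", entry.sAMAccountName.value)            # candidates for review/disabling
```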



Quote for the day:

"Great leaders are almost always great simplifiers who can cut through argument, debate and doubt to offer a solution everybody can understand." -- General Colin Powell
