Daily Tech Digest - August 08, 2024

4 Common LCNC Security Vulnerabilities and How To Mitigate Them

While LCNC platforms allow access restrictions on the data, they are applied on the client side by default. Unfortunately, a user with access to the application can bypass these restrictions and gain unauthorized access to the underlying data sources. Citizen developers might not be aware of the risk associated with default settings when configuring access rules. This can cause an external breach if the application is accessible over the internet or a report is published on the web. ... Apps and automation created on LCNC platforms are not immune to traditional web application vulnerabilities such as SQL injection. Consider a form for collecting user complaints that can be exploited by injecting SQL code, allowing an attacker from the internet to retrieve sensitive data, including usernames and salaries, from the database. This vulnerability arises when developers include user input directly in SQL queries without proper parameterization. ... Citizen developers mistakenly use LCNC applications and automation to send sensitive data through personal emails, store corporate data insecurely in public network drives, and generate and distribute anonymous access links to corporate resources. 
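The injection flaw described above comes down to concatenating form input into a SQL string instead of binding it as a parameter. A minimal sketch, using an in-memory SQLite database with an illustrative table, shows both the exploit and the fix:

```python
import sqlite3

# Illustrative schema standing in for the app's backing database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (username TEXT, salary INTEGER)")
conn.execute("INSERT INTO employees VALUES ('alice', 90000)")

complaint = "x' OR '1'='1"  # attacker-controlled form input

# UNSAFE: user input concatenated directly into the query string.
# The injected OR clause makes the WHERE condition always true.
unsafe = conn.execute(
    "SELECT username, salary FROM employees WHERE username = '%s'" % complaint
).fetchall()

# SAFE: the same query with a bound parameter; the input is treated
# as a literal value, never as SQL.
safe = conn.execute(
    "SELECT username, salary FROM employees WHERE username = ?", (complaint,)
).fetchall()

print(unsafe)  # [('alice', 90000)] -- the injection dumped the table
print(safe)    # [] -- no user is literally named "x' OR '1'='1"
```

Most LCNC platforms expose some equivalent of the bound-parameter form; the fix is to use it everywhere user input reaches a query.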


EU’s DORA regulation explained: New risk management requirements for financial firms

The EU says that despite the financial sector’s increased reliance on IT firms, there is a lack of specific powers to address ICT risks arising from those third parties. The act puts critical ICT third-party service providers into the scope of regulators and subjects them to an oversight framework at the EU level. “DORA continues the impetus over the past decade in outsourced and third-party governance,” says Chaudhry, “with a focus on chain outsourcing and resiliency, with clarity that critical ICT third-party providers, including cloud service providers, need to be within the regulatory perimeter.” Under these rules, European Supervisory Authorities (ESAs) would have the right to access documents, carry out inspections, and subject third parties to fines if deemed necessary. ... In an early analysis of the regulation, Deloitte said that most firms in the sector would welcome the introduction of an oversight framework as it will provide more legal certainty around what is permissible, a level of assurance on the security of their assets in the cloud, and likely increase firms’ confidence and appetite for transitioning some of their activities to the cloud.


No god in the machine: the pitfalls of AI worship

The problem of theodicy has been a topic of debate among theologians for centuries. It asks: if an absolutely good God is omniscient, omnipotent and omnipresent, how can evil exist when God knows it will happen and can stop it? It radically oversimplifies the theological issue, but theodicy, too, is in some ways a kind of logical puzzle, a pattern of ideas that can be recombined in particular ways. I don’t mean to say that AI can solve our deepest epistemological or philosophical questions, but it does suggest that the line between thinking beings and pattern recognition machines is not quite as hard and bright as we may have hoped. The sense of there being a thinking thing behind AI chatbots is also driven by the now common wisdom that we don’t know exactly how AI systems work. What’s called the black box problem is often framed in mystical terms – the robots are so far ahead or so alien that they are doing something we can’t comprehend. That is true, but not quite in the way it sounds. New York University professor Leif Weatherby suggests that the models are processing so many permutations of data that it is impossible for a single person to wrap their head around it. 


Critical AWS Vulnerabilities Allow S3 Attack Bonanza

The researchers first uncovered Bucket Monopoly, an attack method that can significantly boost the success rate of attacks that exploit AWS S3 buckets — i.e., online storage containers for managing objects, such as files or images, and resources required for storing operational data. The issue is that the affected services generated S3 bucket names from predictable, easy-to-guess components such as the AWS account ID, rather than from a unique identifier like a hash or random qualifier. "Sometimes the only thing that an attacker needs to know about an organization is their public account ID for AWS, which is not considered sensitive data right now, but we recommend it is something that an organization should keep as a secret," Kadkoda says. To mitigate the issue, AWS changed the default configurations. "All of the services have been fixed by AWS in that they no longer create the bucket name automatically," he explains. "AWS now adds a random identifier or sequence number if the desired bucket name already exists." Security researchers and AWS customers have long debated whether AWS account IDs should be public or private.
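The mitigation AWS adopted — appending a random qualifier to otherwise predictable bucket names — can be sketched in a few lines. The naming scheme below is illustrative, not AWS's actual format:

```python
import secrets

def unique_bucket_name(service: str, account_id: str, region: str) -> str:
    """Build a bucket name that cannot be guessed or pre-claimed.

    A name composed only of predictable parts (service, account ID,
    region) lets an attacker register it first; a random suffix
    closes that window. Scheme is hypothetical.
    """
    suffix = secrets.token_hex(6)  # 12 random hex characters
    return f"{service}-{account_id}-{region}-{suffix}"

predictable = "cloudtrail-123456789012-us-east-1"  # guessable by anyone
hardened = unique_bucket_name("cloudtrail", "123456789012", "us-east-1")
print(hardened)  # e.g. cloudtrail-123456789012-us-east-1-a3f29c81d04e
```

The same idea applies to any resource whose name must be globally unique: never derive it solely from public identifiers.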


Data Ethics: New Frontiers in Data Governance

While morals concern subjective notions of good and bad, and laws concern the limits of what is socially acceptable, Aiken and Lopez define ethics as “the difference between what you have the right to do and what is the right thing to do.” Navigating that crucial difference is rarely cut and dried even in simple, day-to-day personal interactions. Still, within the world of data, ethical questions can quickly take on multiple dimensions and present challenges unique to the field. Assessing data ethics can be decidedly confusing, for as Lopez pointed out, “Not all things that are bad for data are actually bad for the world … and vice versa.” Whereas the ethical actions and judgments that we make as private individuals tend to play out within a limited set of factors, the implications of even the most innocuous events within large-scale data management can be huge. Company data exists in “space,” potentially flowing between departments and projects, but privacy agreements and other safeguards that apply for some purposes may not apply to others. Data from spreadsheets authored for in-house analytics, for example, might violate a client privacy agreement if it migrates to open cloud storage.


How network segmentation can strengthen visibility in OT networks

First, it’s crucial to have a comprehensive understanding of the data flow within the environment — knowing what information needs to move and where. Often, technical documentation about operational design is outdated or incomplete, missing details about current data flows and usage. Second, most visibility tools in this space require specific network configurations because traditional antivirus or endpoint protection software isn’t typically viable for these devices. Therefore, it’s necessary to have mechanisms for routing traffic to inspection points. Since many OT networks are designed for resilience and uptime rather than cybersecurity, reconfiguring them to enable traffic inspection can be challenging. Network segmentation projects are time-consuming, expensive, and may lead to operational downtime, which is usually unacceptable in OT environments. Deploying visibility tools also requires identifying the legacy technologies that tend to run rampant in OT networks and won’t support the changes necessary to feed those tools. These can include unmanaged switches, network devices that don’t support RSPAN, and outdated or oversubscribed cabling infrastructure.
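The first step above — mapping what actually moves and where, rather than trusting outdated design documents — can start from captured connection records. A minimal sketch, with a hard-coded list standing in for exported flow logs or switch-mirror captures (hostnames and ports are hypothetical):

```python
from collections import defaultdict

# Observed (source, destination, port) tuples; in practice these would
# come from flow logs or a packet capture at an inspection point.
flows = [
    ("plc-01", "historian", 44818),
    ("hmi-03", "plc-01", 502),
    ("plc-01", "historian", 44818),  # duplicates collapse below
]

# Aggregate into a current-state data-flow map: who talks to whom,
# on which ports.
flow_map = defaultdict(set)
for src, dst, port in flows:
    flow_map[(src, dst)].add(port)

for (src, dst), ports in sorted(flow_map.items()):
    print(f"{src} -> {dst} on ports {sorted(ports)}")
```

Even a crude map like this reveals the segmentation boundaries that can be drawn without breaking required flows.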


Is The AI Bubble About To Burst?

While it is said that AI could add around $15 trillion to the value of the global economy, recent earnings reports from the likes of Google and Tesla have been less than stellar, leading to the recent dips in share prices. At the same time, there are reports that the general public is becoming more distrustful of AI and that businesses are finding it difficult to make money from it. Does this mean that the AI revolution—touted as holding the solution to problems as diverse as curing cancer and saving the environment—is about to come crashing down around our ears? ... However, it's important to note that even these tech giants aren't immune to external pressures. The ongoing Google antitrust case, for instance, could have far-reaching implications not just for Google, but for other major players in the tech industry as well. Nvidia is already facing two separate antitrust probes from the U.S. Department of Justice, focusing on its acquisition of Run:ai and alleged anti-competitive practices in the AI chip market. These legal and regulatory challenges could potentially reshape the landscape for Big Tech's AI ambitions. It's also worth mentioning that while the established tech companies have diversified revenue streams, there are newer players like OpenAI and Anthropic that are primarily focused on AI. 


Overcoming Human Error in Payment Fraud: Can AI Help?

Scammers usually target accounts payable departments, which process payments to suppliers and vendors. They typically pose as an existing supplier and send fraudulent invoices to an organization or even digitally gain access to a company's AP processes to authorize large payments, said Infosys. ... Accounts payable automation solutions can flag minute discrepancies in invoices, such as a new address or new bank account details, that manual processes might miss. Alerts can prompt companies to follow up with their vendors to verify the legitimacy of invoices before processing payments. ... Businesses see the potential for AI to reduce fraud losses in B2B payments. Companies can use AI to examine historical data to identify patterns, detect anomalies and automate routine tasks such as data entry and calculations. They can use crowdsourced data from vendors to streamline processes and enhance trust. Technologies that provide end-to-end visibility of the entire B2B payment ecosystem offer a comprehensive view, helping detect and prevent issues arising from human errors. Some organizations have launched AI-based initiatives to fight fraud, but it's too soon to see results.
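The discrepancy check described above amounts to comparing each incoming invoice against the vendor master record and alerting on any changed field. A minimal sketch, with hypothetical field names and records:

```python
# Vendor master data on file; records and IDs are illustrative.
VENDOR_MASTER = {
    "ACME-001": {"bank_account": "DE89370400440532013000",
                 "address": "1 Main St"},
}

def flag_discrepancies(invoice: dict) -> list:
    """Return the invoice fields that differ from the vendor master.

    A non-empty result would trigger an alert prompting manual
    verification with the vendor before payment.
    """
    known = VENDOR_MASTER.get(invoice["vendor_id"])
    if known is None:
        return ["unknown vendor"]
    return [
        field for field in ("bank_account", "address")
        if invoice.get(field) != known[field]
    ]

suspicious = flag_discrepancies({
    "vendor_id": "ACME-001",
    "bank_account": "GB29NWBK60161331926819",  # changed account number
    "address": "1 Main St",
})
print(suspicious)  # ['bank_account']
```

Real AP tools layer fuzzy matching and historical baselines on top, but the core control is exactly this field-level comparison.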


Post-quantum encryption: Crypto flexibility will prepare firms for quantum threat, experts say

For enterprises, there are two big challenges that come with quantum computers. First of all, we don’t know when the day will come when a quantum computer breaks classical encryption, making it hard to plan for. It would be tempting to put off solving the problem until the quantum computers are here – and then it will be too late. Second, there is the ‘collect now, decrypt later’ threat. Major intelligence agencies may be – and almost certainly are – collecting any and all data they can get their hands on, planning ahead for a future where they can decrypt it all. “They’ve been doing it forever,” Lyubashevsky says. ... One problem, he says, is that encryption is often buried deep inside code libraries and third-party products and services. Or fourth or fifth party. “You have to get a cryptographic bill of materials to discover the cryptography inside – and that’s not easy,” he says. And that’s just the first challenge. Once all the encryption is identified, it needs to be replaced with a modern, flexible system. And that’s not always possible if parts of the system that are beyond your control have older encryption hard-coded.
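The "flexible system" the experts call for is crypto agility: code calls a named algorithm through one indirection point instead of hard-coding a primitive, so a deprecated scheme can be swapped for a post-quantum replacement in one place. A minimal sketch of the pattern — hash functions stand in for encryption because Python's standard library ships no public-key primitives, and the algorithm labels are hypothetical:

```python
import hashlib

# Registry of named algorithms; migrating means adding an entry and
# flipping the default, not hunting down every call site.
ALGORITHMS = {
    "classical": lambda data: hashlib.sha256(data).hexdigest(),
    "pq-ready": lambda data: hashlib.sha3_256(data).hexdigest(),
}

DEFAULT_ALGORITHM = "classical"  # the single line a migration changes

def protect(data: bytes, algorithm: str = DEFAULT_ALGORITHM) -> str:
    digest = ALGORITHMS[algorithm](data)
    # Tag the output with the algorithm used, so old artifacts remain
    # verifiable after the default changes.
    return f"{algorithm}:{digest}"

token = protect(b"payload")
print(token.split(":")[0])  # classical
```

Tagging every artifact with its algorithm is the other half of agility: it is what makes a later inventory — the cryptographic bill of materials — tractable.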


Study backer: Catastrophic takes on Agile overemphasize new features

"Testing is kind of one of those tools that are there, but in order for testing to actually be able to work at all you need to know what you're testing. So you need good requirements to outline the non-functional requirements that are there." Such as reliability. "The interesting thing is that a lot of people, I think, in the Agile community, a lot of the Agile fundamentalists will argue that user stories are sufficient. These essentially just describe functional behavior, but they lack a generalizable specification or nonfunctional requirements." "And so I think that's one of the key flaws. When you end up looking at the most dogmatic application of Agile, we just have user stories, but you've lacked that generalizable specification." ... For software engineering, however, things are less rosy. He points to an interpretation of DevOps where issues don't really matter as long as the system recovers from them, and velocity and quality are never in conflict. "This has led to absolutely catastrophic outcomes in the past." However, it is organizational transformation, where a methodology and mindset branded as "Agile" is applied across a business, which is where the wheels can really come off. 



Quote for the day:

"Nobody who has ever given his best has regretted it." -- George Halas
