Daily Tech Digest - June 26, 2024

How Developers Can Head Off Open Source Licensing Problems

There are proactive steps developers can take as well. For instance, developers can opt for code that isn’t controlled by a single vendor. “The other side, beyond the licensing, is to look and to understand who’s behind the license, the governance, policy,” he said. Another option that provides some cushion of protection is to use a vendor that specializes in distributing a particular open source solution. A distro vendor can provide indemnification against exposure, he said. They also provide other benefits, such as support and certification to run on specific hardware setups. Developers can also look for open source solutions that are under a foundation, rather than a single company, he suggested, although he cautioned that even that isn’t a failsafe measure. “Even foundations are not bulletproof,” he said. “Foundations provide some oversight, some governance and some other means to reduce the risk. But if ultimately, down the path, it ends up again being backed up by a single vendor, then it’s an issue even under a foundation.”


Line of Thought: A Primer on State-Sponsored Cyberattacks

A cyberattack may be an attractive avenue for a state actor and/or its affiliates since it may give them the ability to disrupt an adversary while maintaining plausible deniability.[15] It may also reduce the risk of a retaliatory military strike by the victim.[16] That’s because actually determining who was behind a cyberattack is notoriously difficult: attacks can be shrouded behind impersonated computers or hijacked devices, and it may take months before an attack is even discovered.[17] Some APTs leverage an approach called “living off the land,” which enables them to disguise an attack as ordinary network or system activity.[18] Living off the land reportedly enabled one APT actor to enter network systems in America’s critical infrastructure and conduct espionage—reportedly with an eye toward developing capabilities to disrupt communications in the event of a crisis.[19] The attack occurred sometime in 2021 but, due to the stealthy nature of living-off-the-land techniques, wasn’t identified until 2023.


Taking a closer look at AI’s supposed energy apocalypse

Determining precisely how much of that data center energy use is taken up specifically by generative AI is a difficult task, but Dutch researcher Alex de Vries found a clever way to get an estimate. In his study "The growing energy footprint of artificial intelligence," de Vries starts with estimates that Nvidia's specialized chips are responsible for about 95 percent of the market for generative AI calculations. He then uses Nvidia's projected production of 1.5 million AI servers in 2027—and the projected power usage for those servers—to estimate that the AI sector as a whole could use anywhere from 85 to 134 TWh of power in just a few years. To be sure, that is an immense amount of power, representing about 0.5 percent of projected electricity demand for the entire world (and an even greater share of the local energy mix in some common data center locations). But measured against other common worldwide uses of electricity, it's not a mind-boggling energy hog. A 2018 study estimated that PC gaming as a whole accounted for 75 TWh of electricity use per year, to pick just one common human activity that's on the same general energy scale.
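The 85–134 TWh range is easy to reproduce as back-of-the-envelope arithmetic. A minimal sketch follows; the per-server power figures of 6.5 kW and 10.2 kW are illustrative assumptions here (roughly the rated draw of Nvidia DGX A100- and DGX H100-class servers), as is the assumption of full-time operation—they are not figures quoted in the article.

```python
# Back-of-the-envelope reproduction of the 85-134 TWh estimate.
# Assumptions (not from the article): per-server draw of 6.5 kW
# (DGX A100-class) to 10.2 kW (DGX H100-class), running around
# the clock. Real-world utilisation would lower these numbers.

SERVERS = 1_500_000          # Nvidia's projected 2027 AI-server output
HOURS_PER_YEAR = 8_760
KWH_PER_TWH = 1e9

def annual_twh(kw_per_server: float) -> float:
    """Fleet-wide annual energy use in terawatt-hours."""
    return SERVERS * kw_per_server * HOURS_PER_YEAR / KWH_PER_TWH

low, high = annual_twh(6.5), annual_twh(10.2)
print(f"{low:.0f}-{high:.0f} TWh/year")   # ~85-134 TWh
```

Under these assumptions the arithmetic lands almost exactly on the study's published range, which suggests the headline numbers are driven largely by the projected server count and rated power.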


Stepping Into the Attacker’s Shoes: The Strategic Power of Red Teaming

Red Teaming service providers spend years preparing their infrastructure to conduct Red Teaming exercises. It is not feasible to quickly build a customized infrastructure for a specific customer; this requires prior development. Tailoring the service to a particular client can take anywhere from one to four months. During this period, preliminary exploration takes place. Red Teams use this time to identify and construct a combination of infrastructure elements that will not raise alarms among SOC defenders. ... The focus has shifted towards building a more layered defense, driven by COVID-era restrictions, remote work and the transition to the cloud. As companies enhance their defensive measures, there is a growing need to conduct Red Teaming projects to evaluate the effectiveness of these new systems and solutions. The risk of increased malicious insider activity has made the hybrid model increasingly relevant for many Red Teaming providers. This approach is neither a complete White Box, where detailed infrastructure information is provided upfront, nor traditional Red Teaming.


Six NFR strategies to improve software performance and security

Based on their analysis and discussions with developers, the researchers identified six key points:

- Prioritization and planning: NFRs should be treated with as much priority as other requirements. They should be planned in advance and reviewed throughout a development project.
- Identification and discussion: NFRs should be identified and discussed early in the development process, ideally in the design phase. As the software evolves, these NFRs should be revisited as necessary.
- Use of technologies allied with testing: The adequacy of an NFR can be verified through technologies already proven in the market, where the NFRs associated with those projects match the project's complexity.
- Benchmarks: Using benchmarks to simulate the behavior of a piece of code or algorithm under different conditions is recommended, since it allows developers to review and refactor code when it does not meet the project-specified NFRs.
- Documentation of best practices: By keeping NFRs well documented, developers will have a starting point for addressing any NFR problem when it appears.
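The benchmarking advice can be sketched in a few lines with Python's standard `timeit` module. The 50 ms latency budget and the sorting workload below are made-up placeholders, not examples from the study; the point is simply to run the same code under different conditions and compare the result against a project-specified NFR.

```python
# Minimal benchmark sketch for checking code against a performance NFR.
# The 50 ms budget and the sorting workload are hypothetical examples;
# swap in your own function and threshold.
import random
import timeit

BUDGET_SECONDS = 0.050   # hypothetical NFR: average latency under 50 ms

def workload(n: int) -> list:
    """Placeholder unit of work: generate and sort n random floats."""
    data = [random.random() for _ in range(n)]
    return sorted(data)

# Exercise the code "under different conditions," as the study advises.
for n in (1_000, 10_000, 100_000):
    avg = timeit.timeit(lambda: workload(n), number=20) / 20
    status = "OK" if avg <= BUDGET_SECONDS else "REFACTOR"
    print(f"n={n:>7}: {avg * 1000:6.2f} ms  {status}")
```

Wiring a loop like this into CI turns the NFR from documentation into a check that flags regressions as soon as code stops meeting the budget.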


Exploring the IT Architecture Profession

In IT architecture, it takes many years to gain the knowledge and skills required to be a professional architect. In my opinion, at the core of our profession are our knowledge and skills in technology. This is what we bring to the table; it is our knowledge and expertise in both business and technology that make the IT architecture profession unique. In addition to business and technology skills, it is essential that the architect possesses soft skills such as leadership, politics, and people management. These are often undervalued. When communicating IT architecture and what an IT architect does, I notice that there are a number of recurring aspects: scope, discipline, domain, and role. ... Perhaps the direction for the profession is to focus on gaining consensus around how we describe scope, domain and discipline rather than worrying too much about titles. An organisation should be able to describe a role from these aspects and describe the required seniority. At the end of the day, this was a thought-provoking exercise and, with regard to my original problem, the categorisation of architecture views, I found that scope was perhaps the simplest way to organise the book.


Why collaboration is vital for achieving ‘automation nirvana’

Beeson says that one of the main challenges of implementing automation is getting different teams to collaborate on creating automation content. He explains that engineers and developers often have their own preferred programming language or tools and can be reluctant to share content or learn something new. “A lack of collaboration prevents the ‘automation nirvana’ of removing humans from complex processes, dramatically reducing automation benefits,” he says. “Individuals tend to be reluctant to contribute if they don’t have confidence in the automation tool or platform. Automation content developers want the automation language to be easy to learn, compatible with their technology choices and provide control to ensure the content they contribute is not misused or modified.” ... When it comes to the future of automation, Beeson has no shortage of thoughts and predictions for the sector, especially relating to the role of automation in defence. “Defence is not immune from the ‘move to cloud’ trend, so hybrid cloud automation is becoming ever more prevalent in the sector,” he says.


Securing the digital frontier: Crucial role of cybersecurity in digital transformation advisory

Advisory services have the expertise to perform in-depth technical security assessments to identify and help prioritize vulnerabilities in an organization’s infrastructure. These assessments combine specialised tools with manual testing for comprehensive coverage. Systems are examined to validate that they follow security best practices and prescribed industry standards. ... Advisors help organisations develop threat models to identify potential attack vectors and assess associated risks. Several methodologies, such as STRIDE, Kill Chain and PASTA, are used to systematically analyse threats and risks. ... An organisation’s security is only as good as its weakest link, and generally, the weakest link is an individual within the organisation. Advisory services deliver regular training to educate employees on security best practices. They can also run simulation exercises, such as phishing simulations, and develop comprehensive security awareness programs that cover topics like secure password practices, data handling, data privacy, and incident reporting.
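To make the STRIDE mention concrete: the methodology walks each system component through six threat categories (Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, Elevation of privilege). The sketch below is a toy illustration of that enumeration; the component name and mitigations are hypothetical examples, not output from any advisory methodology.

```python
# Toy STRIDE-style threat enumeration. The component name and the
# candidate mitigations are hypothetical illustrations only.
STRIDE = {
    "Spoofing": "authenticate callers (e.g. mutual TLS)",
    "Tampering": "integrity-check messages (signatures, checksums)",
    "Repudiation": "keep tamper-evident audit logs",
    "Information disclosure": "encrypt data in transit and at rest",
    "Denial of service": "rate-limit and add capacity headroom",
    "Elevation of privilege": "enforce least-privilege authorisation",
}

def threat_model(component: str) -> list[str]:
    """One candidate mitigation prompt per STRIDE category."""
    return [f"{component} / {threat}: {mitigation}"
            for threat, mitigation in STRIDE.items()]

for line in threat_model("payments API"):
    print(line)
```

The value of the exercise is the forced completeness: every component gets asked all six questions, so gaps show up as categories with no credible answer.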


Delving Into the Risks and Rewards of the Open-Source Ecosystem

While some risk is inevitable, enterprise teams need to have an understanding of that risk and use open-source software accordingly. “As a CISO, the biggest risk I see is for organizations not to be intentional about how they use open-source software,” says Hawkins. “It's extremely valuable to build on top of these great projects, but when we do, we need to make sure we understand our dependencies. Including the evaluation of the open-source components as well as the internally developed components is key to being able to accurately [understand] our security posture.” ... So, it isn’t feasible to ditch open-source software, and risk is part of the deal. For enterprises, that reality necessitates risk management. And that need only grows as reliance on open-source software deepens. “As we move towards cloud and these kind of highly dynamic environments, our dependency on open-source is going up even higher than it ever did in the past,” says Douglas. If enterprise leaders shift how they view open-source software, they may be able to better reap its rewards while mitigating its risks.
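The "understand our dependencies" advice has a cheap starting point in Python: the standard library can enumerate every installed distribution and its declared license. The sketch below is a first-pass inventory only—a real review would also cover transitive dependencies and dedicated SBOM tooling—and the output depends entirely on what happens to be installed in the environment.

```python
# First-pass dependency inventory: installed Python distributions and
# their declared licenses, using only the standard library. A serious
# review would go further (transitive deps, SBOM tools, license text),
# but knowing what you actually depend on is the starting point.
from importlib import metadata

def license_inventory() -> dict[str, str]:
    """Map installed distribution names to their declared license."""
    inv = {}
    for dist in metadata.distributions():
        name = dist.metadata.get("Name", "unknown")
        lic = dist.metadata.get("License") or "UNKNOWN"
        inv[name] = lic.splitlines()[0][:60]   # first line, trimmed
    return inv

for pkg, lic in sorted(license_inventory().items()):
    print(f"{pkg:30} {lic}")
```

Entries that come back as `UNKNOWN` are exactly the ones worth a closer look—packaging metadata is optional, so an empty field says nothing about what the license actually permits.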


Rethinking physical security creates new partner opportunities

Research conducted by Genetec has shown a 275% increase in the number of end users wanting to move more physical security workloads to the cloud. Research also indicates that many organisations aren’t treating SaaS and cloud as an ‘all or nothing’ proposition. However, while a hybrid-cloud infrastructure provides flexibility, it also serves as the gateway to an organisation’s physical security cloud journey. Organisations need to ensure they have tools in place that can protect data regardless of where it resides. ... Organisations that aren’t able to keep up with the upgrade cycle often fall into the consumption gap: the end user can see the platform evolving with new features and functionality but is unable to take advantage of all of it. The bigger the consumption gap, the more likely it is to hold the organisation back from physical security best practices. SaaS promises to close that gap because it keeps organisations on the latest software version. Importantly, the solution is updated in a way that is pre-approved by the organisation and on a timeframe of its choosing.



Quote for the day:

"Without growth, organizations struggle to add talented people. Without talented people, organizations struggle to grow." -- Ray Attiyah
