Daily Tech Digest - June 13, 2023

AI and tech innovation, economic pressures increase identity attack surface

In the new attack observed by Microsoft, the attackers, which the company tracks under the temporary Storm-1167 moniker, used a custom-developed phishing toolkit that relies on an indirect proxy method. This means the phishing page set up by the attackers does not serve any content from the real log-in page but rather mimics it as a stand-alone page fully under the attackers' control. When the victim interacts with the phishing page, the attackers initiate a login session with the real website using the victim-provided credentials and then ask the victim for the MFA code using a fake prompt. If the code is provided, the attackers use it in their own login session and are issued the session cookie directly. The victim is then redirected to a fake page, more in line with traditional phishing attacks. "In this AitM attack with indirect proxy method, since the phishing website is set up by the attackers, they have more control to modify the displayed content according to the scenario," the Microsoft researchers said.


Revolutionizing DevOps With Low-Code/No-Code Platforms

With non-IT professionals developing applications, there is a higher risk of introducing vulnerabilities that could compromise the security of the application and the organization. Additionally, the lack of oversight and governance could lead to poor coding practices and technical debt. For instance, the use of new-generation iPaaS platforms by citizen integrators has made it difficult for security leaders to have full visibility into the organization’s valuable assets. Attackers are aware of this and have already taken advantage of improperly secured app-to-app connections in recent supply chain attacks, such as those experienced by Microsoft and GitHub. ... As organizations try to integrate low-code and no-code applications with legacy systems or other third-party applications, technical challenges can arise. For example, if an organization wants to integrate a low-code application with an existing ERP system, it may face challenges in terms of data mapping and synchronization. Some low-code and no-code applications are built to export data and share it well, but when it comes to integrating event triggers, business logic, or workflows, these software solutions hit their limits.
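To make the integration pain point concrete, here is a minimal sketch of the glue code such projects often end up needing; the exported field names, the target ERP schema, and the sample records are all hypothetical, invented purely for illustration.

```python
# Minimal sketch of mapping a low-code app's exported records onto an ERP schema.
# All field names, values, and the target schema below are hypothetical examples.

from datetime import datetime

# Records as a hypothetical low-code platform might export them
LOWCODE_EXPORT = [
    {"CustomerName": "Acme Corp", "OrderTotal": "1999.50", "CreatedOn": "06/13/2023"},
    {"CustomerName": "Globex", "OrderTotal": "250.00", "CreatedOn": "06/12/2023"},
]

# Field names the (hypothetical) ERP expects
FIELD_MAP = {
    "CustomerName": "customer_name",
    "OrderTotal": "order_total",
    "CreatedOn": "created_at",
}

def to_erp_record(record: dict) -> dict:
    """Rename fields and normalize types so the ERP will accept the record."""
    mapped = {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}
    mapped["order_total"] = float(mapped["order_total"])        # string -> number
    mapped["created_at"] = datetime.strptime(                   # US date -> ISO 8601
        mapped["created_at"], "%m/%d/%Y"
    ).date().isoformat()
    return mapped

if __name__ == "__main__":
    for rec in LOWCODE_EXPORT:
        print(to_erp_record(rec))
```

Even in this toy form, the asymmetry the excerpt describes is visible: getting data out of the low-code platform is the easy part, while type normalization, synchronization timing, and wiring up downstream triggers or workflows remain custom work.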


Rethinking AI benchmarks: A new paper challenges the status quo of evaluating AI

One of the key problems that Burnell and his co-authors point out is the use of aggregate metrics that summarize an AI system’s overall performance on a category of tasks such as math, reasoning or image classification. Aggregate metrics are convenient because of their simplicity. But that convenience comes at the cost of transparency, hiding detail on the nuances of the AI system’s performance on critical tasks. “If you have data from dozens of tasks and maybe thousands of individual instances of each task, it’s not always easy to interpret and communicate those data. Aggregate metrics allow you to communicate the results in a simple, intuitive way that readers, reviewers, or — as we’re seeing now — customers can quickly understand,” Burnell said. “The problem is that this simplification can hide really important patterns in the data that could indicate potential biases, safety concerns, or just help us learn more about how the system works, because we can’t tell where a system is failing.”
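Burnell's point about aggregate metrics is easy to demonstrate with a toy example. In the sketch below, the per-task counts are entirely made up, but they show how a single headline score can bury a task category on which the system fails almost every time.

```python
# Illustration (with made-up numbers) of how an aggregate metric hides failures.
from collections import defaultdict

# Each record: (task category, whether the model got the instance right)
results = (
    [("arithmetic", True)] * 95 + [("arithmetic", False)] * 5 +
    [("word_problems", True)] * 90 + [("word_problems", False)] * 10 +
    [("multi_step_reasoning", True)] * 10 + [("multi_step_reasoning", False)] * 90
)

# Aggregate metric: looks respectable
aggregate = sum(ok for _, ok in results) / len(results)
print(f"aggregate accuracy: {aggregate:.0%}")            # ~65%

# Disaggregated view: exposes where the system actually breaks down
per_task = defaultdict(lambda: [0, 0])
for task, ok in results:
    per_task[task][0] += ok
    per_task[task][1] += 1

for task, (correct, total) in sorted(per_task.items()):
    print(f"{task:22s} {correct / total:.0%}")
```

The aggregate figure (around 65%) looks like a middling but usable system; the disaggregated view reveals that multi-step reasoning fails nine times out of ten, which is exactly the kind of pattern a single number hides.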


A Practical Guide for Container Security

Developers and DevOps teams have embraced the use of containers for application deployment. In a report, Gartner stated, "By 2025, over 85% of organizations worldwide will be running containerized applications in production, a significant increase from less than 35% in 2019." On the flip side, the popularity of containers has also made them a target for cybercriminals, who have successfully exploited them. According to a survey in Red Hat's 2023 State of Kubernetes Security report, 67% of respondents stated that security was their primary concern when adopting containerization. Additionally, 37% reported that they had suffered revenue or customer loss due to a container or Kubernetes security incident. These data points emphasize the significance of container security, making it a critical and pressing topic for organizations that are currently using or planning to adopt containerized applications.


6 finops best practices to reduce cloud costs

Centralizing cloud costs from public clouds and data center infrastructure is a key finops concern. The first thing finops does is to create a single-pane view of consumption, which enables cost forecasting. Finops platforms can also centralize operations like shutting down underutilized resources or predicting when to shift off higher-priced reserved cloud instances. Platforms like Apptio, CloudZero, HCMX FinOps Express, and others can help with shift-left cloud cost optimizations. They also provide tools to catalog and select approved cloud-native stacks for new projects. ... “Today’s developers now have a choice between monolithic cloud infrastructure that locks them in and choosing to assemble cloud infrastructure from modern, modular IaaS and PaaS service providers,” says Kevin Cochrane, chief marketing officer of Vultr. “By choosing the latter, they can speed time to production, streamline operations, and manage cloud costs by only paying for the capacity they need.” As an example, a low-usage application may be less expensive to set up, run, and manage on AWS Lambda with a database on AWS RDS, rather than running it on AWS EC2 reserved instances.
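That serverless-versus-reserved-instance comparison is easy to rough out with arithmetic. The sketch below uses placeholder rates, not current AWS list prices, and an invented traffic profile; the point is the shape of the comparison, not the exact figures.

```python
# Back-of-the-envelope comparison for a low-usage app: serverless vs. always-on compute.
# The rates below are placeholders for illustration, not current AWS list prices.

def lambda_monthly_cost(requests_per_month: int,
                        avg_duration_s: float,
                        memory_gb: float,
                        price_per_gb_second: float = 0.0000167,    # placeholder rate
                        price_per_million_requests: float = 0.20): # placeholder rate
    compute = requests_per_month * avg_duration_s * memory_gb * price_per_gb_second
    requests = requests_per_month / 1_000_000 * price_per_million_requests
    return compute + requests

def ec2_monthly_cost(hourly_reserved_rate: float = 0.05):          # placeholder rate
    return hourly_reserved_rate * 24 * 30                          # always on, ~720 hours

if __name__ == "__main__":
    low_usage = lambda_monthly_cost(requests_per_month=100_000,
                                    avg_duration_s=0.3,
                                    memory_gb=0.5)
    print(f"Lambda (100k short requests/month): ~${low_usage:.2f}")
    print(f"EC2 reserved instance (always on):  ~${ec2_monthly_cost():.2f}")
```

For an app serving 100,000 short requests a month, the pay-per-use model comes out at well under a dollar under these placeholder rates, while an always-on instance costs its hourly rate multiplied across roughly 720 hours, which is why low-usage workloads are the canonical fit for serverless.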


Artificial Intelligence: A Board of Directors Challenge – Part II

It is essential for organizations to dedicate time and effort to considering the potential unintended consequences or “unknown unknowns” of AI deployments. This will prevent adverse outcomes that may arise if AI is deployed without proper consideration. To achieve this, it is necessary to understand the Rumsfeld Knowledge Matrix. The Rumsfeld Knowledge Matrix is a conceptual framework introduced by Donald Rumsfeld, the former United States Secretary of Defense, to categorize and analyze knowledge and information based on different levels of certainty and awareness. The matrix consists of four quadrants. Known knowns: these are things that we know and are aware of. They represent information that is well understood and can be easily articulated. I call these “Facts.” Known unknowns: these are things that we know we don’t know. In other words, there are gaps in our knowledge or information which we are aware of and recognize as areas where further research or investigation is needed. We need to ask these “Questions.”


How to achieve cyber resilience?

Instead of relegating security development to a forgettable annual calendar reminder, organizations need a continuous approach that keeps security front of mind throughout the year. Security threats also need to be brought to life with realistic simulation exercises. This approach provides a much more engaging experience for participants and a far more accurate indication of their abilities. Real-life exercises give far more insight into an individual’s mindset and potential than the often rote, static nature of a certification. Security teams must be ready to respond rapidly and confidently to the latest emerging threats, aligned with industry best practices. They must have the right skills, from closing off newly discovered zero-days to mitigating serious incoming threats like attacks exploiting Log4Shell. But they must also be able to apply those skills calmly and with control, even in the face of a looming crisis. This capability can only be developed through continuous exercise.


The IT talent flight risk is real: Are return-to-office mandates the right solution?

Most workers require location flexibility when considering a job change. In addition, most workers in an IT function would only consider a new job or position that allows them to work from a location of their choosing. Requiring employees to return fully on-site is also a risk to DEI. Underrepresented groups of talent have seen improvements in how they work since being allowed more flexibility. For example, most women who were fully on-site prior to the pandemic but have been remote since report that their expectations for working flexibly have increased since the beginning of the pandemic. Employees with a disability have also found a vast improvement in the quality of their work experience. Gartner research shows that, since the pandemic, knowledge workers with a disability have found that their working environment does more to help them be productive. In a hybrid environment, this population's perceptions of equity have also improved, as they have experienced higher levels of respect and greater access to managers.


Common Cybersecurity Risks to ICS/OT Systems

Protecting ICS/OT systems from cyberthreats is crucial for ensuring the resilience of critical infrastructure. Recent cyberattacks on ICS/OT systems have highlighted the potential impact of these attacks on critical infrastructure and the need for organizations to prioritize cybersecurity for their ICS/OT systems. By being aware of common cybersecurity risks and taking proactive steps to mitigate them, organizations can protect their ICS/OT systems and maintain operational resilience. The above-mentioned incidents demonstrate that cyberattacks on ICS/OT systems can cause physical harm, financial losses and public safety risks. Organizations must take steps to protect their ICS/OT systems from cyberthreats, such as conducting regular vulnerability assessments, implementing network segmentation and providing employee training on cybersecurity best practices. Compliance with relevant regulations and standards, and collaboration between IT and OT teams, can also help mitigate cybersecurity risks to ICS/OT systems.


10 emerging innovations that could redefine IT

The most common paradigm for computation has been digital hardware built of transistors that have two states: on and off. Now some AI architects are eyeing the long-forgotten model of analog computation, where values are expressed as voltages or currents. Instead of just two states, these can take on a nearly infinite number of values, or at least as many as the precision of the system can measure accurately. The fascination with the idea comes from the observation that AI models don’t need the same kind of precision as, say, bank ledgers. If some of the billions of parameters in a model drift by 1%, 10% or even more, the others will compensate and the model will often still be just as accurate overall, as the sketch below illustrates. ... The IT department has a big role in this debate as it tests and deploys the second and third generation of collaboration tools. Basic video chatting is being replaced by more purpose-built tools for enabling standup meetings, casual discussions, and full-blown multi-day conferences. The debate is not just technical. Some of the decisions are being swayed by the investment that companies have made in commercial office space.
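The drift-tolerance claim in the analog-computing excerpt above can be illustrated with a toy experiment. The sketch below perturbs the weights of a small scikit-learn logistic-regression model rather than a large network, and the synthetic data and noise levels are arbitrary, but the qualitative pattern is the one the argument relies on: modest random weight perturbations usually move accuracy only slightly.

```python
# Toy illustration of why analog-style imprecision may be tolerable for AI models:
# randomly perturb a trained model's weights and see how much accuracy moves.
# Data, model, and noise levels are arbitrary; this is a sketch, not a benchmark.
import copy

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=50, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X, y)
print(f"baseline accuracy: {model.score(X, y):.3f}")

for drift in (0.01, 0.10, 0.30):              # 1%, 10%, 30% relative weight noise
    noisy = copy.deepcopy(model)
    # Multiply every learned coefficient by (1 + small random drift)
    noisy.coef_ = model.coef_ * (1 + rng.normal(0, drift, model.coef_.shape))
    print(f"accuracy with {drift:.0%} weight drift: {noisy.score(X, y):.3f}")
```

Small drift typically leaves accuracy nearly unchanged, while larger perturbations eventually degrade it, which is where the measurable precision of an analog system becomes the limiting factor.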



Quote for the day:

"When you accept a leadership role, you take on extra responsibility for your actions toward others." -- Kelley Armstrong
