The unhappy reality of cloud security in 2023
Configuration problems are often the most significant risk to cloud data and the
most often overlooked. Show me a breach, and I’ll show you something stupid that
allowed it to happen. One recent example is a large car manufacturer that had
more than two million customers’ data exposed due to misconfigurations in its
cloud storage systems. Rarely are properly configured security systems bypassed
to gain access to data. More often, storage systems are left exposed or
databases lack encryption. ... APIs are not only provided by the cloud vendors;
they are also built into business applications. They provide “keys to the kingdom”
and are often left as open access points to business data. Other emerging
threats include the use of generative AI systems to automate fakery. As I
covered here, these AI-driven attacks are occurring now. As bad actors get
better at leveraging AI systems (often free cloud services), we’ll see automated
attacks that can work around even the most sophisticated security systems. It
will be tough to keep up with the new and innovative ways attacks can occur.
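Misconfiguration checks of the kind described above are easy to automate. Below is a minimal, hypothetical sketch that flags the two problems the excerpt calls out — publicly exposed storage and missing encryption — given a bucket's settings. The dictionary shape is modeled on what a cloud SDK such as AWS's boto3 returns from its public-access-block call, but the function itself is a generic policy check, not any vendor's API:

```python
def exposure_findings(public_access_cfg, has_default_encryption):
    """Flag the two misconfigurations discussed above: exposed storage
    and missing encryption. `public_access_cfg` is a dict of boolean
    public-access-block settings, or None if none is configured."""
    findings = []
    if public_access_cfg is None:
        findings.append("no public-access block configured")
    elif not all(public_access_cfg.values()):
        findings.append("public-access block not fully enabled")
    if not has_default_encryption:
        findings.append("default encryption not enabled")
    return findings

# Shape modeled on AWS's PublicAccessBlockConfiguration (illustrative only).
risky_bucket = {
    "BlockPublicAcls": True,
    "IgnorePublicAcls": True,
    "BlockPublicPolicy": False,   # a bucket policy could still make data public
    "RestrictPublicBuckets": True,
}
print(exposure_findings(risky_bucket, has_default_encryption=False))
```

Run against every bucket in an account on a schedule, a check like this catches the "left exposed" storage before an attacker does.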
How Automation Enables Remote Work
In a remote work setup, effective communication is paramount. Automation tools
such as Slack and Microsoft Teams facilitate better communication by automating
tasks like scheduling meetings, sending reminders and translating messages.
These tools can also automate the process of organizing and archiving
conversations, making it easier to retrieve information when needed.
Additionally, they can automate the process of updating team members about
project changes or important announcements. These features ensure that all team
members are on the same page, enhancing collaboration, reducing the chances of
miscommunication and ultimately leading to a more cohesive and efficient team.
Automation in human resources (HR) is a game-changer in remote work settings. HR
automation software can streamline recruitment, automating resume sorting,
interview scheduling and follow-up emails. It can also enhance onboarding,
automating welcome emails and account setups. Performance management can be
improved with automated feedback collection and goal tracking.
Self-healing code is the future of software development
It’s easy to imagine a more iterative process that would tap into the power of
multi-step prompting and chain of thought reasoning, techniques that research
has shown can vastly improve the quality and accuracy of an LLM’s output. An AI
system might review a question, suggest tweaks to the title for legibility, and
offer ideas for how to better format code in the body of the question, plus a
few extra tags at the end to improve categorization. Another system, the
reviewer, would take a look at the updated question and assign it a score. If it
passes a certain threshold, it can be returned to the user for review. If it
doesn’t, the system takes another pass, improving on its earlier suggestions and
then resubmitting its output for approval. We are lucky to be able to work with
colleagues at Prosus, many of whom have decades of experience in the field of
machine learning. I chatted recently with Zulkuf Genc, Head of Data Science at
Prosus AI. He has focused on Natural Language Processing (NLP) in the past,
co-developing an LLM-based model to analyze financial sentiment, FinBERT, that
remains one of the most popular models on Hugging Face in its category.
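The editor-and-reviewer loop described above can be sketched in a few lines. Here `edit_fn` and `score_fn` are hypothetical stand-ins for the two LLM calls (title tweaks, code formatting, and tag suggestions on one side; a quality score on the other) — not a real API:

```python
def refine_until_approved(question, edit_fn, score_fn, threshold=0.8, max_rounds=3):
    """Alternate an 'editor' model and a 'reviewer' model until the
    reviewer's score clears the threshold or we run out of rounds."""
    draft = question
    for _ in range(max_rounds):
        draft = edit_fn(draft)            # editor: rewrite title, format code, add tags
        if score_fn(draft) >= threshold:  # reviewer: assign a quality score
            return draft                  # good enough -- hand back to the user
    return draft                          # best effort after max_rounds

# Toy stand-ins so the loop is runnable without any model behind it.
edit = lambda q: q if "[tags" in q else q + " [tags: python, json]"
score = lambda q: 1.0 if "[tags" in q else 0.0

print(refine_until_approved("how do I parse JSON?", edit, score))
```

The threshold, round limit, and scoring rubric would all need tuning in a real system; the point is only that multi-step prompting fits naturally into a propose-and-review loop.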
Why an ecosystem helps IT projects move forward
To support the data strategy set by the company’s chief data officer, the team
needed to specify the capabilities required from a data platform in line with
the company’s cloud-first tech strategy. Stuart Toll, senior
enterprise architect at LGIM, said that time to market, integration time and
skills were among the criteria used to assess the data platform providers. For
Toll, while LGIM could have probably made any data platform work, he said “we
are an asset management firm”. “We buy where we can and only build to
differentiate.” This influenced the company’s data integration strategy. LGIM
did not want to be in the business of stitching lots of tools together, as Matt
Bannock, head of data engineering at LGIM, explained. ... Bannock said that with
some tools, IT departments need to spend time on data integration. “Being able
to just start working with the data, start running the calculation and start
generating the output is much more valuable to us than the potential half a
percent advantage we could achieve if we created our own ecosystem,” he said.
“There’s a lot of benefit in buying into an ecosystem.”
Key Considerations When Hiring a Chief Information Security Officer
Look for candidates who possess a deep understanding of cybersecurity
technologies, risk management frameworks, and regulatory compliance.
Experience in managing security incidents, implementing security controls, and
developing effective security strategies is also crucial. ... A CISO must
understand the business landscape in which the organization operates. They
should align security objectives with overall business goals and demonstrate a
keen understanding of the organization’s risk appetite. A CISO with business
acumen can effectively prioritize security investments, articulate the value
of security measures to executive management, and build a security program
that supports the organization’s strategic objectives. ... The field of
cybersecurity is ever-evolving, with new threats emerging regularly. It is
crucial for a CISO to stay up-to-date with the latest trends, technologies,
and best practices in information security. Look for candidates who
demonstrate a commitment to continuous learning, involvement in industry
forums, and participation in relevant certifications and conferences.
10 things every CISO needs to know about identity and access management (IAM)
CISOs must consider how to move away from passwords and adopt a zero-trust
approach to identity security. Gartner predicts that by 2025, 50% of the
workforce and 20% of customer authentication transactions will be
passwordless. ... Identity threat detection and response (ITDR) tools reduce
risks and can improve and harden security configurations continually. They can
also find and fix configuration vulnerabilities in the IAM infrastructure;
detect attacks; and recommend fixes. By deploying ITDR to protect IAM systems
and repositories, including Active Directory (AD), enterprises are improving
their security postures and reducing the risk of an IAM infrastructure breach.
... Attackers are using generative AI to sharpen their attacks on the gaps
between IAM, PAM and endpoints. CrowdStrike’s Sentonas says his company
continues to focus on this area, seeing it as central to the future of
endpoint security. Ninety-eight percent of enterprises confirmed that the
number of identities they manage is exponentially increasing, and 84% of
enterprises have been victims of an identity-related breach.
Decentralized Storage: The Path to Ultimate Data Privacy and Ownership
A move towards decentralization opens up a significant possibility for
individual users: monetization. Data sovereignty would allow users to monetize
their data and available storage space. Contributing towards the network
storage would allow users to earn passive income purely from allowing other
users to store data on their drives. This could be an alarming concept for
users at first. But the realization that only you could access your data on
the network using your key, regardless of which node it is stored in, should
significantly help overcome this fear. Decentralization also has important
implications for businesses and organizations. For example, companies can
reduce the risks associated with data breaches and protect customer
information more effectively, allowing for more trust with customers in the
long term. Organizations could also contribute to network storage on a larger
scale, allowing for new economic opportunities.
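The claim that only the key holder can read the data, whichever node it lands on, rests on client-side encryption: the blob leaves your machine already encrypted, so storage nodes only ever hold ciphertext. A toy sketch of the idea follows — SHA-256 in counter mode as a stream cipher, for illustration only; a real system should use a vetted library such as libsodium or Python's `cryptography`:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """SHA-256 in counter mode as a toy keystream (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt locally before upload; the node stores only an opaque blob."""
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Only the matching key recovers the plaintext, wherever it was stored."""
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))
```

Decrypting with the wrong key yields only garbage, which is exactly what makes storing the blob on an untrusted node tolerable.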
Too Much JavaScript? Why the Frontend Needs to Build Better
Often, it boils down to one common problem: Too much client-side JavaScript.
This is not a cost-free error. One retailer realized they were losing $700,000
a year per kilobyte of JavaScript, Russell said. “You may be losing all of the
users who don’t have those devices because the experience is so bad,” he said.
That doesn’t mean developers are wrong to ship client-side JavaScript, which
is why Russell hates to be prescriptive about how to handle the problem.
Sometimes, it makes sense depending on the data model and whether it has to
live on the client so that you can access the next email (think Gmail) or
paint the next operation quickly (think Figma). The usage tends to correlate
with very long sessions, he said. But developers should realize, too, that
some frameworks prioritize this approach. “The premise of something like
React, the premise of something like Angular, is that [the] data model is
local,” he said. “So if that premise doesn’t meet the use case, then those
tools just fundamentally don’t make sense,” he said. “You really do have to
shoehorn them in for some kind of an exogenous reason, then you hope that it
plays out.”
The hardest part of building software is not coding, it’s requirements
Is the idea behind using AI to create software to just let those same
stakeholders talk directly to a computer to create an SMS-based survey? Is AI
going to ask probing questions about how to handle all the possible issues of
collecting survey data via SMS? Is it going to account for all the things that
we as human beings might do incorrectly along the way and how to handle those
missteps? In order to produce a functional piece of software from AI, you need
to know what you want and be able to clearly and precisely define it. There
are times when I’m writing software just for myself where I don’t realize some
of the difficulties and challenges until I actually start writing code. Over
the past decade, the software industry has transitioned from the waterfall
methodology to agile. ... So many software projects using waterfall have
failed because the stakeholders thought they knew what they wanted and thought
they could accurately describe it and document it, only to be very
disappointed when the final product was delivered. Agile software development
is supposed to be an antidote to this process.
Beyond Backups: Evolving Strategies for Data Management and Security
As businesses continue to generate more data, the need to revamp data
management services, including the implementation of effective data backup and
recovery strategies, has become central. Comprehensive data backup continues to
evolve, and AI and ML have become potent tools in this field, revolutionizing
the way organizations approach data backup and recovery. In 2023, we will
witness increased adoption of AI/ML technologies for self-monitoring and
management of IT assets, along with automation and orchestration of IT
activities across on-premises and cloud environments. AI will play
an increasingly important role both for malicious purposes and to build more
proactive and pre-emptive strategies. To give all our customers a competitive
advantage, our data protection capabilities are fuelled by AI/ML at their
core. Our self-driving backup uses AI and ML to automate backup and recovery
operations and management, including setup, monitoring, deep visibility,
real-time insights, and service level agreement (SLA) tracking.
Quote for the day:
"The problem with being a leader is that you're never sure if you're being followed or chased." -- Claire A. Murray