Banking Tech Forecast: Cloudy, With a Chance of Cyber Risk
The breakneck pace of adoption has also resulted in a shortage of security
experts who understand the overlapping, yet unique needs of the two industries.
The cybersecurity sector has faced a shortage of skilled security professionals
for most of its existence. Cloud solutions help mitigate this, because security
can be integrated into the infrastructure and managed in a centralized place,
Betz said. "Even then, financial institutions are still expected to conduct due
diligence and oversight of third parties. This ability to evaluate security in a
complex environment requires a high level of skill, which will continue to be
highly sought-after," she said. Hiring and retraining existing staff to meet the
volume of needed workers is also a challenge, as is a regulatory landscape
that pushes institutions toward multi-cloud infrastructure for resiliency, Leach
said. This means that a financial institution may be required to support
multiple cloud service providers that operate differently and take different
approaches to securing assets.
Data Mesh: A Pit Stop on the Road to a Data-Centric Culture
The process behind these benefits is also noteworthy. Most importantly, the
notion of data decentralization is deceptively simple, and potentially
revolutionary. Think of how IT consumerization has upended traditional
technology implementation: Where IT specialists once made all the decisions on
which tools to buy for business professionals and dictated how all that hardware
and software was to be used, those end users now call the shots. They freely buy
the devices they want and download the apps they like, then wait for IT to catch
up. This provides enormous benefits. With data mesh we’re seeing similar
movement toward data democratization. When line-of-business teams and other
constituencies within the enterprise gain unprecedented access to, and even
ownership of, business data that was previously guarded, it accelerates
collaboration and enables custom strategies to solve specific business problems.
Data access also becomes simpler when interfaces and navigation are not just
user-friendly but attuned to the priorities of specific functions, rather than
taking a more generic, enterprise-wide approach.
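As a rough illustration of that shift, a domain team in a data mesh publishes its data as a "data product" with an explicit contract, rather than routing everything through a central group. The sketch below uses entirely hypothetical names and fields:

```typescript
// Hypothetical shape of a domain-owned "data product" in a data mesh:
// the owning team, not central IT, publishes the contract consumers use.
interface DataProduct<T> {
  domain: string;        // the business domain that owns this data
  owner: string;         // an accountable team contact
  schemaVersion: string; // consumers code against a published contract
  read(): Promise<T[]>;  // access shaped by the domain's own priorities
}

// Example: a checkout team exposes its order events directly to the
// rest of the enterprise.
const orderEvents: DataProduct<{ orderId: string; total: number }> = {
  domain: "checkout",
  owner: "checkout-data@example.com",
  schemaVersion: "1.2.0",
  read: async () => [{ orderId: "A-1001", total: 42.5 }],
};

orderEvents.read().then((rows) => console.log(`${rows.length} event(s)`));
```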
5 key mistakes IT leaders make at board meetings
It’s important to avoid technical jargon, but sometimes you’re asked
to define a technical term or explain a technology. One approach both Puglisi
and I recommend is to answer technical questions with analogies from your
industry. We both worked in the construction industry, so, for example, we
might help these executives understand Scrum in software development by
comparing it to design-build and agile construction project methodologies. ...
Sometimes you need a spark to create a sense of urgency, but don’t take this
approach too far. I once heard a CISO say, “If you can’t convince the board,
then scare them,” which might get a CISO a yes to an investment but costs
credibility over time. CISOs who are natural presenters and storytellers can
connect with the board using these skills, but only if given sufficient time
to use this approach. If presenting isn’t your best skill, or you only have a
few minutes to present, storytelling may confuse directors, says Tony
Pietrocola, president and co-founder of AgileBlue.
Beyond Browsers: The Longterm Future of JavaScript Standards
Developers don’t want to write their code multiple times to run on different
server-side runtimes, which they have to do today. It slows down development,
increases the maintenance load and may put developers off supporting more
platforms like Workers, Snell noted. Library and framework creators in
particular are unhappy with that extra work. “They don’t want to go through
all that trouble and deal with these different development life cycles on
these different runtimes with different schedules and having to maintain these
different modules.” That’s a disadvantage for the runtimes and platforms as
well as for individual developers wanting to use tools like database drivers
that are specifically written for one runtime or another, he pointed out. “We
talk to developers who are creating Postgres drivers or MongoDB drivers and
they don’t want to rewrite their code to fit our specific API. It would be
great if it was an API that worked on all the different platforms.” WinterCG
is trying to coordinate what Ehrenberg calls “a version of fetch that makes
sense on servers” as well as supporting Web APIs like TextEncoder and
setTimeout in a compatible way across runtimes.
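To make that concrete, here is a minimal sketch of a handler written only against the Web APIs WinterCG is coordinating (fetch, Request/Response, TextEncoder, setTimeout). The handler name and upstream URL are placeholders, but code in this style can run unchanged on Cloudflare Workers, Deno, or Node.js 18 and later:

```typescript
// A portable handler using only WinterCG-style Web APIs; no
// runtime-specific modules, so it is not tied to one platform.
async function handle(request: Request): Promise<Response> {
  // setTimeout is one of the APIs WinterCG wants to behave
  // consistently across runtimes.
  await new Promise((resolve) => setTimeout(resolve, 10));

  // fetch works the same on Workers, Deno, and Node.js 18+.
  const upstream = await fetch("https://example.com/data.json");

  // TextEncoder is another Web API shared across runtimes.
  const payload = new TextEncoder().encode(await upstream.text());

  return new Response(payload, {
    headers: { "content-type": "application/json" },
  });
}
```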
EU judgment sinks Meta’s argument for targeted ads
Article 6(1)(b) of the GDPR establishes that this practice can be justified
only if the contract between the user and the service operator cannot be
fulfilled without the data processing. This contractual necessity for data
processing is usually understood quite narrowly. For example, it
enables an online retailer to provide a customer’s address to a courier, which
is clearly necessary data processing under the terms of the contract between
the store and the customer. Meta had relied on Article 6(1)(b) as its main
justification for data processing for targeted advertising by claiming that
targeted advertising was part of the service it contractually owes its users –
a clause it introduced as part of a change to its terms of service (ToS) made
at the stroke of midnight on 25 May 2018, the precise moment the GDPR
came into force. Max Schrems, founder of Austria-based data protection
campaign group NOYB, argued that Meta seemed to have taken the view that it
could just “add random elements” to the contract, i.e. its ToS, covering
personalised advertising, to avoid offering users a yes or no consent
option.
Cloud Equity Group’s Sean Frank Talks AI Mingling with the Cloud
What AI is going to allow cloud to do is really kind of predict those changes
that are needed. So instead of reacting to slowdowns because you’re running
out of memory, you’re running out of processing power -- it could predict when
those things are going to happen based on analyzing historical data and then
it can allocate the resources dynamically. For a small organization or a small
environment, it may not sound like a big task but as you’re thinking about
these larger enterprises that might be running hundreds or thousands of
servers that are all running different programs and they interact, if one
thing goes down it can affect the rest of the ecosystem. It really is a very
important aspect in being able to help make sure that everything is up and can
be maintained. From a maintenance standpoint as well, it tends to be a
very significant benefit. Right now, maintenance in general, in IT and
cloud-based infrastructure, is largely reactive. There’s a problem, the user
reports the problem, and someone from the IT department will then investigate
the problem.
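One way to picture the prediction Frank describes is a simple trend extrapolation over historical usage samples, with capacity allocated before the limit is hit. The sketch below is purely illustrative; the sample data, the 85% threshold, and the scale-out step are all hypothetical:

```typescript
// Forecast the next usage sample from a least-squares trend over a
// recent window of historical data, then scale preemptively.
function forecastNext(samples: number[]): number {
  const n = samples.length;
  const meanX = (n - 1) / 2;
  const meanY = samples.reduce((a, b) => a + b, 0) / n;
  let num = 0;
  let den = 0;
  for (let i = 0; i < n; i++) {
    num += (i - meanX) * (samples[i] - meanY);
    den += (i - meanX) ** 2;
  }
  const slope = den === 0 ? 0 : num / den;
  return meanY + slope * (n - meanX); // extrapolate one step ahead
}

// Memory usage as a fraction of capacity (hypothetical samples).
const memoryUsage = [0.55, 0.62, 0.7, 0.78, 0.84];
if (forecastNext(memoryUsage) > 0.85) {
  // In a real system this would call the provider's autoscaling API
  // instead of waiting for an out-of-memory incident.
  console.log("Predicted memory pressure: allocate capacity now");
}
```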
Is Zero Trust Achievable?
Microsegmentation is often the biggest hurdle businesses will face when
implementing Zero Trust, and many decide to forgo it. Because the architecture
is now based on security need, it can be complex to implement, particularly
over on-premises private networks, which tend to be flat and have high levels of
implicit trust. Microsegmentation projects can run on for months and failure
rates are high. Forrester’s Best Practices for Zero Trust Microsegmentation
found that of the 14 vendors that attempted to microsegment their private
networks, 11 failed, and it concluded that success requires senior management
buy-in to oversee the removal of implicit trust between
identities. However, other issues can also see projects flounder.
Many organisations have legacy systems in situ, for instance, that were
designed to function on a perimeterised network and so assume trust.
Realistically, the business may have to build the ZTNA around these until they
are retired or replaced by cloud-based SaaS alternatives, suggests the
National Cyber Security Centre in its Zero Trust Architecture Principles.
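Conceptually, microsegmentation replaces the flat network's implicit trust with explicit, identity-based allow rules, where anything not listed is denied. A minimal sketch, with hypothetical workload identities and rules:

```typescript
// Default-deny policy between workload identities: the implicit trust
// of a flat network is replaced by an explicit allow list.
interface SegmentRule {
  source: string;      // a workload identity, not an IP range
  destination: string;
  port: number;
}

const allowRules: SegmentRule[] = [
  { source: "web-frontend", destination: "orders-api", port: 443 },
  { source: "orders-api", destination: "orders-db", port: 5432 },
];

function isAllowed(source: string, destination: string, port: number): boolean {
  // Anything not explicitly permitted is blocked.
  return allowRules.some(
    (r) => r.source === source && r.destination === destination && r.port === port
  );
}

console.log(isAllowed("orders-api", "orders-db", 5432));   // true: explicit rule
console.log(isAllowed("web-frontend", "orders-db", 5432)); // false: no implicit path
```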
Data Governance: The Oft-Overlooked Pillar of Strong Data Quality
Ungoverned data typically originates because someone in the organization, such
as a data analyst, produces data without establishing strong governance
policies to control how the data will be shared, secured, and maintained over
time. Data producers may also lack an understanding of the underlying data
they are working with and the business rules for aggregating or disseminating
various KPIs related to the data. The decision to share data without first
sorting out these issues is not usually the result of malfeasance. On the
contrary, ungoverned data often emerges because someone in the organization
creates data that other people need, and in the rush to ensure that they can
start using the data, sharing begins before anyone puts a plan in place for
governing the data over the long term. It doesn’t help, either, that it’s
common for businesses to have IT solutions in place that make it easy to share
data, but not to govern it. The typical IT organization implements software
that can store and distribute information across an organization, but the IT
department has little or no knowledge of how different business units will use
the data.
Why cyberpsychology is such an important part of effective cybersecurity
Cyberpsychologists and enterprise cybersecurity practitioners both stress the
need to better understand how people interact with technology to create a
stronger cybersecurity posture. They point to statistics showing that most
breaches involve some sort of human misstep. Verizon's 2023 Data Breach
Investigations Report, for example, found that "74% of all breaches include
the human element, with people being involved either via error, privilege
misuse, use of stolen credentials or social engineering." As Huffman says,
hackers "don't want to go toe-to-toe with your firewall. They don't want to
challenge your antivirus, because that's very difficult, not when they can
exploit the largest vulnerability on every network on the planet right now --
that's us, people. Cybercriminals are not just hacking computers; they are
hacking humans. Because ... unlike computers, we actually respond to
propaganda." Psychology gets at why humans do what they do, says Huffman,
founder of cybersecurity services firm Handshake Leadership.
CISA's New 'CyberSentry' Program to Tighten ICS Security
The CyberSentry program is one of several cybersecurity provisions in the
National Defense Authorization Act signed by U.S. President Joe Biden in
December 2021. The act authorized $768 billion in defense spending, including
the cybersecurity components of various national and federal agencies. The
program aims to support CISA's efforts to defend the U.S. critical
infrastructure network operators that support national critical functions,
such as power and water supply, banks and financial institutions, and
healthcare, by monitoring both known and unknown malicious activity affecting
IT and OT networks. The
CyberSentry program is based on a mutual agreement between CISA and
participating critical infrastructure partners. The program is voluntary and
is provided at no additional fee or equipment cost to the participating
partners. Under the program, CISA harnesses sensitive government information
to provide visibility into, and mitigation of, cyber threats targeting critical
infrastructure.
Quote for the day:
"You can only lead others where you
yourself are willing to go." -- Lachlan McLean