FCA examining big tech disruption of financial services
The UK regulator has warned that competition in the financial services sector
could be harmed as Amazon, Apple, Google and Meta continue to innovate in the
industry, reports the Financial Times. The FCA will ask the corporations for
their perspectives on how Silicon Valley could expand into payments, deposits,
credit and insurance. All four companies hold FCA permissions for payment
processing in the UK, with Amazon and Apple also holding some permissions
covering consumer credit and insurance. While the watchdog acknowledges that
big tech involvement in financial services would bring “increased efficiency”
and “healthy competition” in the short term, it warns that this could lead to
longer-term exploitation of ecosystems and data stores to “lock consumers in”.
The body has also suggested that tech companies should generally share customer
data with traditional financial services institutions.
The ins and outs of migrating SQL Server to the cloud
A homogeneous migration between an on-prem version of SQL Server and the RDS
equivalent can actually be relatively easy, says Ayodele. The only change
needed concerns the system schema: RDS has built-in stored procedures for
management purposes that are not in the on-prem SQL Server engine, so customers
should migrate only the user database itself to avoid corrupting the RDS system
schema. The next step is to use native tools or AWS Database
Migration Service (AWS DMS) to port the data across from the source to the
destination. With AWS DMS, the source database remains operational during this
process to minimize downtime. DMS can use change data capture (CDC) technology
to keep track of ongoing changes in the source database during migration. Once
the migration is done, the final steps will be to run the RDS version as a
replica and then switch over to the RDS primary database instance when ready.
There are some best practices that customers should follow when migrating,
Ayodele says.
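To make the DMS step concrete, here is a minimal Python/boto3 sketch of
creating and starting a full-load-plus-CDC replication task; the task name,
endpoint ARNs, and replication instance ARN are placeholders, and the table
mapping simply includes every table in the dbo schema.

    import json
    import boto3

    dms = boto3.client("dms")

    # Placeholder selection rule: include every table in the source dbo schema.
    table_mappings = {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-dbo",
            "object-locator": {"schema-name": "dbo", "table-name": "%"},
            "rule-action": "include",
        }]
    }

    # Full load plus change data capture keeps the source operational while
    # ongoing changes are tracked and applied to the RDS target.
    task = dms.create_replication_task(
        ReplicationTaskIdentifier="sqlserver-to-rds",           # placeholder name
        SourceEndpointArn="arn:aws:dms:...:endpoint:SOURCE",    # on-prem SQL Server endpoint
        TargetEndpointArn="arn:aws:dms:...:endpoint:TARGET",    # RDS for SQL Server endpoint
        ReplicationInstanceArn="arn:aws:dms:...:rep:INSTANCE",
        MigrationType="full-load-and-cdc",
        TableMappings=json.dumps(table_mappings),
    )

    dms.start_replication_task(
        ReplicationTaskArn=task["ReplicationTask"]["ReplicationTaskArn"],
        StartReplicationTaskType="start-replication",
    )

Once the CDC-driven replica has caught up with the source, the cutover to the
RDS primary instance can be scheduled for a convenient maintenance window.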
Big 3 Public Cloud Providers Highlight Cost Control Capabilities
"The long-term trends that are driving cloud adoption continue to play an even
stronger role during uncertain macroeconomic times," Pichai said. "As companies
globally are looking to drive efficiencies, Google Cloud's open infrastructure
creates a valuable pathway to reduce IT costs and modernize." Using the cloud to
do more with less was also a theme echoed by Microsoft CEO Satya Nadella during
his company's earnings call. Moving to the cloud helps organizations align their
spend with demand and mitigate risk around increasing energy costs and supply
chain constraints, Nadella said. Microsoft is also very optimistic about the
growth of hybrid cloud in addition to public cloud services. Nadella said that
Microsoft now has more than 8,500 customers for its Azure Arc technology, which
is more than double the number a year ago. "We're also seeing more customers
turn to us to build and innovate with infrastructure they already have," he
said. "With Azure Arc, organizations like Wells Fargo can run Azure services,
including containerized applications across on-premises, edge, and multicloud
environments."
Delivering visibility requires a new approach for SecOps
The biggest challenge most organizations face when operationalizing these
frameworks is the fact that the data/information they need is siloed across
multiple data systems and cybersecurity tools. Security data lives in multiple
places, with organizations using a variety of data logging systems like Splunk,
Snowflake, or other data lakes as the foundation for threat hunting and
research. The security operations center (SOC) will then layer platforms for
Security Information and Event Management (SIEM), Extended Detection and
Response (XDR), and other tools on top of these data lakes to help analyze data
and correlate events (e.g., CrowdStrike Falcon Data Replicator or email security
systems such as Proofpoint or Tessian). Security analysts can spend hours
exporting data from these systems, tagging and normalizing the information, and
ingesting that data into their SIEMs and SOARs before they can begin to detect,
hunt, triage, and respond to threats. But next-gen tools are being developed
that address this exact issue with automation and machine learning.
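As a minimal illustration of that normalization work (the field names and
sources below are hypothetical), this Python sketch maps raw records from two
different tools onto one shared schema before SIEM ingestion:

    from datetime import datetime, timezone

    # Hypothetical raw records from two tools, each with its own field names.
    edr_event = {"DeviceName": "host-17", "Timestamp": 1698630000, "Action": "process_blocked"}
    email_event = {"recipient": "user@example.com", "ts": "2023-10-30T01:40:00Z", "verdict": "phish"}

    def normalize_edr(event):
        # Map vendor-specific EDR fields onto the shared schema.
        return {
            "source": "edr",
            "entity": event["DeviceName"],
            "time": datetime.fromtimestamp(event["Timestamp"], tz=timezone.utc).isoformat(),
            "event_type": event["Action"],
        }

    def normalize_email(event):
        # Map email-security fields onto the same schema.
        return {
            "source": "email",
            "entity": event["recipient"],
            "time": event["ts"],
            "event_type": event["verdict"],
        }

    normalized = [normalize_edr(edr_event), normalize_email(email_event)]
    # The unified records can now be ingested into a SIEM/SOAR and correlated on entity and time.

Automating even this simple mapping is the kind of work the next-gen tools
mentioned above aim to take off analysts' plates.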
The growing role of audit in ESG information integrity and assurance
Like audits of financial statements and of internal control over financial
reporting, third-party assurance enhances the reliability of ESG information
and builds confidence among stakeholders. To do this, auditors conduct
attestation engagements to provide assurance that ESG information is presented
in accordance with certain criteria. “We help management and the board feel
confident in the reported ESG information, which is important given the
increased focus and attention from external stakeholders,” explains Whittaker.
More specifically, ESG assurance obtained from a certified public accountant
“involves the evaluation of processes, systems, and data, as appropriate, and
then assessing the findings in order to support an opinion based on an
examination [reasonable assurance] or conclusion based on a review [limited
assurance],” according to the Center for Audit Quality. Because companies are
at different stages of their sustainability journeys, the breadth of ESG
assurance engagements is vast.
If you’re going to build something from scratch, this might be as good a time as in a decade
The environment for launching a start-up was really crazy the past five years.
And the truth is that if you’re going to build something from scratch, this
might be as good a time as you’ve had in a decade. Real estate? You can get
all the real estate you want. People used to fret about lease cost, but that’s
all gone. And while people get caught up on whether the money’s cheap or not,
getting rid of the distraction of all that cheap money may be a good thing.
That whole mentality of, oh, your competitor raised $100 million, now you have
to raise $100 million. All those things have evaporated—for the better, I’d
say. A huge thing is that your access to talent is way better. It was so hard
to get, but now it’s a lot cheaper than it was. There are layoffs happening.
And then hybrid has opened up the people you can get. I’ve heard some pretty
amazing stories. Jennifer Tejada, who runs PagerDuty, says they went into the
pandemic at 85 percent Bay Area employees and came out at 25 percent.
Good Governance: 9 Principles to Set Your Organization up for Success
Good corporate governance requires that records and processes are
transparent and available to shareholders and stakeholders. Financial
records should not be inflated or exaggerated. Reporting should be presented
to shareholders and stakeholders in ways that enable them to understand and
interpret the findings. Transparency means that stakeholders should be
informed of key corporate contacts and told who can answer questions and
explain reports, if necessary. Corporations should provide enough
information in their reports so that readers get a complete view of the
issues. ... All too often, the corporate world’s focus can be taken up by
sudden crises and controversies. A timely response to the unexpected is
crucial, with corporations that practice good governance usually able to
prioritize swift and honest communication with shareholders and
stakeholders. ... Many corporations also consider the environmental impact
as they perform their duties and responsibilities.
How a Marketing Tool is Becoming the Healthcare Industry’s Security Nightmare
“Consumer activity tracking for the purpose of marketing is not a fit for
the health sector,” Mike Hamilton, CISO of cybersecurity firm Critical
Insight and former CISO for the city of Seattle, argues. “Because of
regulatory oversight by the [US] Department of Health and Human Services, as
well as the privacy statutes coming out of states, like the California
Consumer Privacy Act, this is not information that is germane to the health
sector mission, and its possession creates significant liability.” ... “This
could have been prevented by the use of other analytic tools to understand
patient usage rather than a marketing technique that is designed to gather
and share so much information that is outside the scope of the intended
purpose.” “At least dozens of the nation's top hospitals use tracking pixels
for millions of patients. That may be changing fast due to new laws and
lawsuits that will force organizations to change course drastically,” Paul
Innella, CEO of TDI, a global cybersecurity company in the banking and
healthcare spaces, tells InformationWeek.
Cranefly Cyberspy Group Spawns Unique IIS Technique
IIS logs record data such as webpages visited and apps used. The Cranefly
attackers are sending commands to a compromised Web server by disguising
them as Web access requests; IIS logs them as normal traffic, but the
dropper can read them as commands, if they contain the strings Wrde, Exco,
or Cllo, which don't normally appear in IIS log files. "These appear to be
used for malicious HTTP request parsing by Geppei — the presence of these
strings prompts the dropper to carry out activity on a machine," Gorman
notes. "It is a very stealthy way for attackers to send these commands." The
commands contain malicious encoded .ashx files, which are saved to an arbitrary
folder determined by the command parameter and run as backdoors (such as
ReGeorg or Danfuan). Gorman explains that the technique of
reading commands from IIS logs could in theory be used to deliver different
types of malware if leveraged by threat actors with different goals.
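From the defender's side, a minimal hunting sketch (in Python; the log
directory and helper name are illustrative) is simply to scan IIS log files for
those marker strings, since they should not normally appear there:

    from pathlib import Path

    # Marker strings reported as not normally appearing in IIS log files.
    MARKERS = ("Wrde", "Exco", "Cllo")

    def scan_iis_logs(log_dir):
        """Return (file, line number, line) for any IIS log line containing a marker."""
        hits = []
        for log_file in Path(log_dir).rglob("*.log"):
            for line_no, line in enumerate(log_file.read_text(errors="ignore").splitlines(), 1):
                if any(marker in line for marker in MARKERS):
                    hits.append((log_file.name, line_no, line.strip()))
        return hits

    if __name__ == "__main__":
        for name, line_no, line in scan_iis_logs(r"C:\inetpub\logs\LogFiles"):
            print(f"{name}:{line_no}: {line}")

A hit is not proof of compromise on its own, but given how rarely these strings
occur in legitimate requests, it is a cheap signal worth triaging.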
Mitigating the risks of artificial intelligence compromise
The fundamental actions required from any security approach are to protect
against, detect, attest to, and recover from any modification to code, whether
malicious or otherwise. The best way to fully secure an AI system against
compromise is to apply a “trusted computing” model that covers all four AI elements.
Starting with the data set aspect of a system, a component such as a Trusted
Platform Module (TPM) is able to sign and verify that any data provided to
the machine has been communicated from a reliable source. A TPM can ensure
the safeguarding of any algorithms used within an AI system. The TPM
provides hardened storage for platform or software keys. These keys can then
be used to protect and attest the algorithms. Furthermore, deviations of the
model caused by bad or inaccurate data can be prevented by applying trusted
principles focused on cyber resiliency, network security, sensor attestation,
and identity.
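As a simple software-only illustration of the sign-and-verify idea (a real
deployment would anchor the supplier's key in hardware such as a TPM, which
this sketch does not do), the following Python code uses the cryptography
package to check a data set's signature before it is fed to a model:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The data supplier signs the training data; in a trusted-computing deployment
    # this key would live in protected hardware rather than in process memory.
    supplier_key = Ed25519PrivateKey.generate()
    dataset = b"label,feature1,feature2\n1,0.3,0.7\n"   # stand-in for a real data set
    signature = supplier_key.sign(dataset)

    # Before training, the AI system verifies the data came from the expected
    # source and was not modified in transit.
    verifier = supplier_key.public_key()
    try:
        verifier.verify(signature, dataset)
        print("Data set signature valid; safe to use for training.")
    except InvalidSignature:
        print("Data set rejected: signature check failed.")

The same pattern extends to model files and algorithm binaries: keys held in
hardened TPM storage sign the artifacts, and the platform attests them before
they are loaded.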
Quote for the day:
"The very essence of leadership is that you have to have vision. You can't blow an uncertain trumpet." -- Theodore Hesburgh