Ransomware: Attacks could be about to get even more dangerous and disruptive
Ransomware attacks have become more powerful and lucrative than ever before – to such an extent that advanced cyber-criminal groups have switched to ransomware over their traditional forms of crime – and the attacks are very likely to become even more potent in 2021. For example, what if ransomware gangs could hit many different organisations at once in a coordinated attack? This would offer an opportunity to make a large amount of money illicitly in a very short time – and one way malicious hackers could attempt to do this is by compromising cloud services with ransomware.
"The next thing we're going to see is probably more of a focus on cloud.
Because everyone is moving to cloud, COVID-19 has accelerated many organisations' cloud deployments, so most organisations have data stored in the cloud," says Andrew Rose, resident CISO at Proofpoint. We saw a taster of the widespread disruption that can be caused when cyber criminals targeted smartwatch and wearable manufacturer Garmin with ransomware. The attack left users around the world without access to its services for days. If criminals could gain access to cloud services used by multiple organisations and encrypt those, the disruption would hit many organisations at once.
Overcoming Data Scarcity and Privacy Challenges with Synthetic Data
Synthetic data is data that is artificially generated rather than collected from real-world events. It is designed to resemble a real dataset but is entirely fake. Data has a distribution, a shape that
defines the way it looks. Picture a dataset in a tabular format. We have all
these different columns and there are hidden interactions between the columns,
as well as inherent correlations and patterns. If we can build a model to
understand the way the data looks, interacts, and behaves, then we can query
it and generate millions of additional synthetic records that look, act, and
feel like the real thing. Now, synthetic data isn't a magical process. We can't start with just a few poor-quality data points and expect our model to produce a miraculously high-quality synthetic dataset. As the old saying goes, "garbage in, garbage out": to create high-quality synthetic data, we need to start with a dataset that is both high quality and sufficiently large. With that in place, it is possible to expand the current dataset with high-quality synthetic data points.
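To make this concrete, here is a minimal sketch of the idea in Python: fit a generative model to a tabular dataset and then sample new rows from it. The column names, the toy data, and the choice of a Gaussian mixture model are assumptions for illustration only; real synthetic data tools typically use richer models such as copulas or GANs.

```python
# Minimal sketch, assuming a toy tabular dataset: fit a generative model to the
# "shape" of the data and sample new synthetic rows from it. The column names,
# the toy data and the Gaussian mixture model are illustrative assumptions;
# production tools typically use richer models (copulas, GANs, etc.).
import numpy as np
import pandas as pd
from sklearn.mixture import GaussianMixture

# Hypothetical real dataset with two correlated numeric columns.
rng = np.random.default_rng(0)
income = rng.normal(60_000, 15_000, size=5_000)
spend = 0.3 * income + rng.normal(0, 2_000, size=5_000)
real = pd.DataFrame({"income": income, "monthly_spend": spend})

# Model the joint distribution (the hidden interactions between columns).
model = GaussianMixture(n_components=5, random_state=0).fit(real)

# Query the model for as many synthetic records as we like.
rows, _ = model.sample(10_000)
synthetic = pd.DataFrame(rows, columns=real.columns)
print(synthetic.describe())  # summary stats should roughly match real.describe()
```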
How Brexit Could Help London Evolve From A Fintech Center Into A DeFi Hub
The popularity of DeFi—using crypto technology to recreate traditional
financial instruments such as loans and insurance—has exploded over the last
year or so, growing to a $16 billion global market. The price of ethereum,
the world's second largest cryptocurrency by value, has soared this year as
investors pour funds into DeFi projects that are built on top of it.
"There's more and more DeFi innovators in London," said Stani Kulechov, the
founder and chief executive of London-based technology company and DeFi
protocol Aave, speaking over the phone. "Up until recently, fintechs and
banks have been all about innovating on the front-end—the user experience.
Now, DeFi is helping the back-end innovate." Aave, a money market for
lending and borrowing assets, has become one of the top DeFi protocols since
it was created in 2017 and was given an Electronic Money Institution license
in July by the U.K.'s Financial Conduct Authority (FCA). "I think we'll see
London emerge as a hub for DeFi," added Kulechov. The City of London, a
financial powerhouse rivaled only by New York, is currently under threat as
the U.K. prepares to end its transition out of the European Union at the end
of this month.
Outlook 2021: Designing data governance policies to promote domestic startups
With more and more startups relying on data-driven business models and analytics to improve their services and products, and using data for competitive advantage, data governance laws with steep compliance requirements are a cause for worry. The regulations will have a direct effect on how businesses deal with the data available to them and on the market. Regulatory uncertainty around handling data and drawing economic value from it also has an indirect impact on long-term innovation and investment. Investors looking to facilitate growth in the domestic market are also deeply concerned about the current trend of steep compliance burdens, excessive government access to data and regulatory uncertainty. In
this context, the commonalities in the two frameworks are pertinent to note.
Firstly, both the PDP and the NPD framework restrict cross-border data flows, citing the sensitivity of the underlying data. While the concerns regarding harm are valid, the proposed solution may be misplaced. The assumption that security is better served if data is stored within the territorial limits of the country rests on shaky ground.
Why cybersecurity tools fail when it comes to ambiguity
"Cybersecurity is very good at identifying activities that are black or
white--either obviously bad and dangerous or clearly good and safe," writes
Margaret Cunningham, PhD, psychologist and principal research scientist at
Forcepoint's Innovation Lab, in her research paper Exploring the Gray Space
of Cybersecurity with Insights from Cognitive Science. "But, traditional
cybersecurity tools struggle with ambiguity--our algorithms are not always
able to analyze all salient variables and make a confident decision whether
to allow or block risky actions." For example, an employee accessing
sensitive files after company business hours might not be a security
issue--the person could be traveling and in a different time zone. "We don't
want to stop the person from doing work because the access is flagged as an
unapproved intrusion due to the time," says Cunningham. "Building the
capability to reason across multiple factors, or multiple categories, will
help prevent the kinds of concrete reasoning mistakes that result in false
positives and false negatives in traditional cyber toolsets." The success of
cybercriminals, admits Cunningham, is in large part due to their ability to
quickly morph attack tools, and cybersecurity tech cannot keep pace.
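As an illustration of what "reasoning across multiple factors" might look like in practice, here is a hedged sketch that combines several weak signals into a single risk score instead of applying one binary after-hours rule. The factors, weights and threshold are invented for illustration and are not Forcepoint's actual model.

```python
# A hedged sketch of multi-factor reasoning: combine several weak signals into
# one risk score instead of a single binary "after-hours => block" rule.
# Factors, weights and the 0.6 review threshold are invented for illustration.
from dataclasses import dataclass

@dataclass
class AccessEvent:
    after_hours: bool               # access outside local business hours
    location_matches_travel: bool   # login region consistent with known travel
    file_sensitivity: float         # 0.0 (public) .. 1.0 (highly sensitive)
    unusual_volume: bool            # far more files touched than user's baseline

def risk_score(event: AccessEvent) -> float:
    score = 0.0
    score += 0.30 if event.after_hours else 0.0
    score -= 0.25 if event.location_matches_travel else 0.0  # mitigating context
    score += 0.40 * event.file_sensitivity
    score += 0.35 if event.unusual_volume else 0.0
    return max(0.0, min(1.0, score))

# A travelling employee working after hours on a moderately sensitive file
# scores well below the 0.6 review threshold, so the access is allowed rather
# than flagged as an intrusion purely because of the time of day.
event = AccessEvent(after_hours=True, location_matches_travel=True,
                    file_sensitivity=0.5, unusual_volume=False)
print(round(risk_score(event), 2))  # -> 0.25
```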
The Benefits of Automating Data Lineage in the Initial Phases of a Data Governance Initiative
If you are putting in place a data governance framework, you can't put controls and data quality reports on every single piece of data throughout your organisation. But if you have data lineage, it will help you identify the areas where your data is most at risk of something going wrong, enabling you to put
in place appropriate checks, controls and data quality reports. Having data
lineage also allows you to speed up data discovery. So many organisations have
vast quantities of data that would be valuable to them, if only they knew it
existed. Finally, as I mentioned at the start of this article, for many
industries there is a regulatory requirement to have data lineage in place.
It’s clear that having data lineage has lots of benefits, but on so many
occasions data lineage is captured and documented manually. Whether you capture data lineage automatically or manually, you will achieve the benefits mentioned above, but a manual approach requires considerable effort. When I first started capturing data lineage, I began where data first enters the organisation and tried to follow it as it flowed. However, this approach fails because many of the people who produce or capture data have no idea where it goes.
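One reason automation pays off is that, once lineage is captured as a graph, tracing where a dataset flows becomes a query rather than a round of interviews. The dataset names and edges below are hypothetical; this is only a sketch of the traversal idea.

```python
# A minimal sketch: once lineage is captured as a directed graph, finding every
# downstream consumer of a dataset is a traversal, not a round of interviews.
# The dataset names and edges are hypothetical.
from collections import deque

# edges: dataset -> datasets derived from it
lineage = {
    "crm_export": ["customer_staging"],
    "customer_staging": ["customer_master"],
    "customer_master": ["churn_model_features", "regulatory_report"],
    "churn_model_features": ["churn_dashboard"],
}

def downstream(dataset: str) -> list[str]:
    """Every dataset that ultimately depends on `dataset`."""
    seen, queue = [], deque([dataset])
    while queue:
        for child in lineage.get(queue.popleft(), []):
            if child not in seen:
                seen.append(child)
                queue.append(child)
    return seen

# If crm_export has a quality issue, these are the assets at risk and the
# obvious places to add checks, controls and data quality reports.
print(downstream("crm_export"))
```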
Why Credit Karma Crafted a Tool to Automate Its DevOps Cycle
Unruh says part of his challenge when he joined Credit Karma about three
years ago was to increase the efficiency of releasing code across the company.
The engineers there had been using an older Jenkins-style system, he says,
which served as a generic job runner. Developing products on that system
meant clearing a few hurdles along the way, Unruh says, including jumping
through a remote desktop running on a Windows computer. On top of that,
teams building new microservices were required to write custom deployment
code to move production forward, he says. That code would form the basis of the job the system executed for the service, Unruh says. It meant everything was different because every team took its own approach, he says, which slowed them down. "It linearly required 15 steps just to deploy your service
into production,” Unruh says. “It was really cumbersome and there was no way
for us to standardize.” Looking for ways to improve efficiency, he wanted to
eliminate the need to jump to another host just to access the system. Unruh
says he also sought to end the need for custom code for deploying a service.
“I just build a service and I can deploy it,” he says.
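As a rough sketch of the design goal Unruh describes (not Credit Karma's actual tool), the snippet below shows how a shared pipeline can run the same standardized steps for every service, with each team supplying only a small manifest instead of custom deployment code. The manifest fields and the example service are hypothetical.

```python
# A rough sketch (not Credit Karma's actual tool) of the goal described above:
# every team supplies a small manifest, and one shared pipeline runs the same
# standardized steps for every service, so no per-service deployment code is
# needed. Manifest fields and the example service are hypothetical.
from dataclasses import dataclass

@dataclass
class ServiceManifest:
    name: str
    image: str          # container image to deploy
    replicas: int = 2

def deploy(manifest: ServiceManifest) -> None:
    # The same standardized steps run for every service; only the manifest differs.
    print(f"[{manifest.name}] pushing image {manifest.image}")
    print(f"[{manifest.name}] running smoke tests")
    print(f"[{manifest.name}] rolling out {manifest.replicas} replicas")

# "I just build a service and I can deploy it."
deploy(ServiceManifest(name="example-api", image="registry.example/example-api:1.4.2"))
```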
Q&A on the Book Retrospectives Antipatterns
Retrospectives antipatterns are patterns I have seen recurring in many retrospectives. The way I have described each of them in the book is in terms of the context in which you would normally find it, the antipattern "solution" that is often applied for various reasons, such as haste, ignorance, or fear, and the refactored solution to that antipattern. Some of the antipatterns have a
refactored solution that will get you out of the pickle right away, but for
some of the others it is more a warning of things to avoid, because if you
find yourself in that antipattern there is nothing better to do than to
consider other options for the next retrospective. ... The prime directive
was written by Norm Kerth in his book "Project Retrospectives: A Handbook for Team Reviews" and it goes like this: "Regardless of what we discover, we
understand and truly believe that everyone did the best job they could,
given what they knew at the time, their skills and abilities, the resources
available, and the situation at hand." It basically means that when we enter
a retrospective we should strive to be in the mindset that allows us to
think that everybody did the best they could at all times, given the
circumstances.
Here’s How CIOs Can Create More Inclusive Cultures In Their Tech Teams
Often, diversity and inclusion outcomes are directly linked to recruitment and
outreach efforts. But while many people fret about flaws in the education
system that seem to discourage young women from pursuing tech-related
subjects, Barrett has found in her work with Girls Who Code that the problem
lies elsewhere. It’s not a lack of interest amongst female students, she said.
Instead, it’s the culture of the technology industry. Girls who complete the
organization’s program go on to major in computer science at a rate of 15
times the national average. But, Barrett noted, “our girls still don’t feel
welcome in tech.” According to a recent report by Girls Who Code and
consulting firm Accenture, it’s possible to lower the attrition rate for
female employees by 70% over the next decade. The study’s recommendations
include establishing supportive parental leave policies, creating external
goals and targets around diversity, providing workplace support for women and
creating inclusive networking opportunities. Role models are also crucial. “We
know very often that women report that it’s hard to be what they can’t see,”
Barrett said. “It’s hard to feel connected to an organization when they don’t
see women in tech thriving.”
Commonwealth entities left to self-assess security in cloud procurement
Macquarie Government managing director Aidan Tudehope said he was disappointed
by the decision to discontinue the CCSL certification regime. "This is about
more than simply the physical geographic location where data is stored. Data
sovereignty is about the legal authority that can be asserted over data
because it resides in a particular jurisdiction, or is controlled by a cloud
service provider over which another jurisdiction extends," he said. "Data
hosted in globalised cloud environments may be subject to multiple overlapping
or concurrent jurisdictions as the debate about the reach of the US CLOUD Act
demonstrates. As the ACSC points out, globalised clouds are also maintained by
personnel from outside Australia, adding another layer of risk." He believes the only way to guarantee Australian sovereignty is to ensure data is hosted in
an Australian cloud, in an accredited Australian data centre, and is
accessible only by Australian-based staff with appropriate government security
clearances. "Taken alongside Minister Robert's planned sovereign data policy,
this guide opens new opportunities for Australian cloud service providers," he
said.
Quote for the day:
"The most important quality in a leader is that of being acknowledged as such." -- Andre Maurois