Manufacturing is becoming a major target for ransomware attacks
For cybercriminals, manufacturing makes a highly strategic target because these are often operations that can't afford to be out of action for long, so they may be more likely to give in to attackers' demands and pay hundreds of thousands of dollars in bitcoin to get the network back. "Manufacturing requires significant uptime in
order to meet production and any attack that causes downtime can cost a lot of
money. Thus, they may be more inclined to pay attackers," Selena Larson,
intelligence analyst for Dragos, told ZDNet. "Additionally, manufacturing
operations don't necessarily have the most robust cybersecurity operations and
may make interesting targets of opportunity for adversaries," she added. The
nature of manufacturing means industrial and networking assets are often
exposed to the internet, providing avenues for hacking groups and ransomware
gangs to gain access to the network via remote access technology such as
remote desktop protocol (RDP) and VPN services or vulnerabilities in unpatched
systems. As of October 2020, Dragos said there were at least 108 advisories containing 262 vulnerabilities impacting industrial equipment found in manufacturing environments during the course of the year alone.
Humanitarian data collection practices put migrants at risk
“Instead of helping people who face daily threats from unaccountable
surveillance agencies – including activists, journalists and people just
looking for better lives – this ‘aid’ risks doing the very opposite,” said PI
advocacy director Edin Omanovic. To overcome the issues related to
“surveillance humanitarianism”, the report recommends that all UN humanitarian
and related bodies “adopt and implement mechanisms for sustained and
meaningful participation and decision-making of migrants, refugees and
stateless persons in the adoption, use and review of digital border
technologies”. Specifically, it added that migrants, refugees and others
should have access to mechanisms that allow them to hold bodies like the UNHCR
directly accountable for violations of their human rights resulting from the
use of digital technologies, and that technologies should be prohibited if they cannot be shown to meet equality and non-discrimination requirements. It also
recommends that UN member states place “an immediate moratorium on the
procurement, sale, transfer and use of surveillance technology, until robust
human rights safeguards are in place to regulate such practices”. A separate
report on border and migration “management” technologies published by European
Digital Rights (EDRi), which was used to supplement the UN report ...
Machine Learning Testing: A Step to Perfection
Usually, software testing includes unit tests, regression tests and integration tests. Moreover, there are certain rules that people follow: don't merge code before it passes all the tests; always test newly introduced blocks of code; and when fixing a bug, write a test that captures the bug. Machine learning adds more tasks to your to-do list. You still need to follow ML's best practices. Moreover, every ML model needs not only to be tested but also evaluated. Your model should generalize well. This is not what we usually understand by testing, but evaluation is needed to make sure that the performance is satisfactory. ... First of all, you split the dataset into three non-overlapping sets. You use a training set to train the model. Then, to evaluate the performance of the model, you use two sets of data: Validation set - Having only a training set and a test set is not enough if you do many rounds of hyperparameter tuning (which is almost always the case), and that can result in overfitting. To avoid this, you can hold out a small validation set to evaluate the model. Only after you get maximum accuracy on the validation set does the test set come into play; and Test set (or holdout set) - Your model might fit the training dataset perfectly well. ...
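The three-way split described above can be sketched in plain Python. This is a minimal illustration under assumed 70/15/15 proportions; real projects typically reach for a library helper such as scikit-learn's train_test_split, and the function name here is invented for the example:

```python
import random

def train_val_test_split(data, val_frac=0.15, test_frac=0.15, seed=0):
    """Shuffle and partition data into three non-overlapping sets."""
    items = list(data)
    random.Random(seed).shuffle(items)  # fixed seed for reproducibility
    n_test = int(len(items) * test_frac)
    n_val = int(len(items) * val_frac)
    test = items[:n_test]               # held out until the very end
    val = items[n_test:n_test + n_val]  # used during hyperparameter tuning
    train = items[n_test + n_val:]      # used to fit the model
    return train, val, test

train, val, test = train_val_test_split(range(100))
```

Because the three slices never overlap, accuracy measured on the test set reflects generalization rather than memorization of tuning decisions.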
How The Future Of Deep Learning Could Resemble The Human Brain
For deep learning, the model training stage is very similar to the initial
learning stage of humans. During early stages, the model experiences a mass
intake of data, which creates a significant amount of information to mine for
each decision and requires significant processing time and power to determine
the action or answer. But as training occurs, neural connections become
stronger with each learned action and adapt to support continuous learning. As
each connection becomes stronger, redundancies are created and overlapping
connections can be removed. This is why continuously restructuring and
sparsifying deep learning models during training time, and not after training
is complete, is necessary. After the training stage, the model has lost most
of its plasticity and the connections cannot adapt to take over additional
responsibility, so removing connections can result in decreased accuracy.
Current methods, such as the one unveiled in 2020 by MIT researchers, which attempt to make the deep learning model smaller after the training phase, have reportedly seen some success. However, if you prune in the earlier stages of training, when the model is most receptive to restructuring and adapting, you can drastically improve results.
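To make the idea of removing redundant connections concrete, here is a toy magnitude-pruning sketch in plain Python. It is an assumption-level illustration of the basic operation that sparsification applies (zeroing the smallest-magnitude weights), not the MIT method referenced above, and the function name is invented for the example:

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude `sparsity` fraction of weights.

    Ties at the threshold are all pruned, so slightly more than the
    requested fraction can be removed in degenerate cases.
    """
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]  # k-th smallest magnitude
    return [0.0 if abs(w) <= threshold else w for w in weights]

pruned = magnitude_prune([0.9, -0.05, 0.4, 0.01, -0.7, 0.2], sparsity=0.5)
# → [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

Applied repeatedly during training, this kind of step lets the surviving connections strengthen and compensate while the network is still plastic; applied only after training, the same zeroing can cost accuracy, as the article argues.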
Quantum Computing: A Bubble Ready to Burst?
If there is a quantum bubble, it’s inflated both by the new flurry of
Sycamore-type academic work and a simultaneous push from private corporations
to develop real-world quantum applications, like avoiding traffic jams, as a
form of competitive advantage. We’ve known about the advantages that quantum
physics can offer computing since at least the 1980s, when Argonne physicist
Paul Benioff described the first quantum mechanical model of a computer. But
the allure of the technology seems to have just now bitten enterprising
businesspeople from the tiniest of startups to the largest of conglomerates.
“My personal opinion is there’s never been a more exciting time to be in
quantum,” says William Hurley. Strangeworks, the startup he founded in 2018,
serves as a sort of community hub for developers working on quantum
algorithms. Hurley, a software systems analyst who has worked for both Apple
and IBM, says that more than 10,000 developers have signed up to submit their
algorithms and collaborate with others. Among the collaborators—Austin-based
Strangeworks refers to them as “friends and allies”—is Bay Area startup
Rigetti Computing, which supplies one of the three computers that Amazon Web
Services customers can access to test out their quantum algorithms.
C++ programming language: How it became the invisible foundation for everything, and what's next
As of September 2020, C++ is the fourth most popular programming language
globally behind C, Java and Python, and – according to the latest TIOBE index
– is also the fastest growing. C++ is a general-purpose programming language
favored by developers for its power and flexibility, which makes it ideal for
operating systems, web browsers, search engines (including Google's), games,
business applications and more. Stroustrup summarizes: "If you have a
problem that requires efficient use of hardware and also to handle significant
complexity, C++ is an obvious candidate. If you don't have both needs, either
a low-level efficient language or a high-level wasteful language will do." Yet
even with its widespread popularity, Stroustrup notes that it is difficult to
pinpoint exactly where C++ is used, and for what. "A first estimate for both
questions is 'everywhere'," he says. "In any large system, you typically find
C++ in the lower-level and performance-critical parts. Such parts of a system
are often not seen by end-users or even by developers of other parts of the
system, so I sometimes refer to C++ as an invisible foundation of everything."
Cybercrime To Cost The World $10.5 Trillion Annually By 2025
Cybercrime has hit the U.S. so hard that in 2018 a supervisory special agent
with the FBI who investigates cyber intrusions told The Wall Street Journal
that every American citizen should expect that all of their data (personally
identifiable information) has been stolen and is on the dark web — a part of
the deep web — which is intentionally hidden and used to conceal and promote
heinous activities. Some estimates put the size of the deep web (which is not
indexed or accessible by search engines) at as much as 5,000 times larger than
the surface web, and growing at a rate that defies quantification. The dark
web is also where cybercriminals buy and sell malware, exploit kits, and
cyberattack services, which they use to strike victims — including businesses,
governments, utilities, and essential service providers on U.S. soil. A
cyberattack could potentially disable the economy of a city, state or our
entire country. In his 2016 New York Times bestseller — Lights Out: A
Cyberattack, A Nation Unprepared, Surviving the Aftermath — Ted Koppel reveals
that a major cyberattack on America’s power grid is not only possible but
likely, that it would be devastating, and that the U.S. is shockingly
unprepared.
Role of FinTech in the post-COVID-19 world
As the global economy recovers from COVID-19, one particular area of focus for
FinTech is financial inclusion. According to the World Bank, there are
currently around 1.7 billion unbanked individuals worldwide, and FinTechs will
be central to efforts to integrate these people into the global banking
system. Doing so will help to mitigate the economic and social impact of the
pandemic. According to Deloitte, FinTechs, in strategic partnerships with
financial institutions, retailers and government sectors across jurisdictions,
can help democratise financial services by providing basic financial services
in a fair and transparent way to economically vulnerable populations. Digital
finance is also expanding in other areas. Health concerns in the COVID-19 era
have made physical cash payments less practical, opening the door to an
increase in digital payments and e-wallets. Though cash use was predicted to
decline in any case, COVID-19 has hastened it, due to concerns that handing over cash can cause human-to-human transmission of the virus.
According to a Mastercard survey looking at the implications of the
coronavirus pandemic, 82 percent of respondents worldwide viewed contactless
as the cleaner way to pay, and 74 percent said they will continue to use
contactless payment post-pandemic.
DNS cache poisoning poised for a comeback: Sad DNS
Here's how it works: First, DNS is the internet's master address list. With
it, instead of writing out an IPv4 address like "173.245.48.1," or an IPv6
address such as "2400:cb00:2048:1::c629:d7a2," one of Cloudflare's many
addresses, you simply type in "http://www.cloudflare.com," DNS finds the right
IP address for you, and you're on your way. With DNS cache poisoning, however,
your DNS requests are intercepted and redirected to a poisoned DNS cache. This
rogue cache gives your web browser or other internet application a malicious
IP address. Instead of going to where you want to go, you're sent to a fake
site. That forged website can then push ransomware onto your PC or grab your username, password, and account numbers. In a word: Ouch! Modern defense measures -- such as randomizing both the DNS query ID and the DNS request source port, DNS-based Authentication of Named Entities (DANE), and Domain Name System Security Extensions (DNSSEC) -- largely stopped DNS cache poisoning. These DNS security methods, however, have never been deployed widely enough, so DNS-based attacks still happen. Now, though, researchers have found a side-channel attack, SAD DNS, that can be successfully used against the most popular DNS software stacks.
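To make the query-ID randomization defense concrete, here is a minimal sketch, using only the Python standard library, of building a DNS A-record query with a random 16-bit transaction ID -- the value an off-path attacker must guess before a forged reply will be accepted. The wire layout follows the standard DNS message format, but this is an illustration, not a production resolver, and the function name is invented for the example:

```python
import random
import struct

def build_dns_query(hostname):
    """Build a minimal DNS A-record query packet with a random transaction ID."""
    txid = random.getrandbits(16)  # the 16-bit ID a spoofer must match
    # Header: ID, flags (recursion desired), QDCOUNT=1, ANCOUNT/NSCOUNT/ARCOUNT=0
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # Question name: length-prefixed labels terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return txid, header + question

txid, packet = build_dns_query("www.cloudflare.com")
```

Real resolvers also randomize the UDP source port, so an attacker has to guess roughly 65,000 IDs times tens of thousands of ports; SAD DNS matters because its side channel recovers the source port, collapsing the search back down to the 16-bit ID alone.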
CIOs tasked to make healthcare infrastructure composable
The composable healthcare organization is a healthcare organization that can
reconfigure its capabilities -- both its business and operating model -- at the
pace of market change. We have lived in a world and in an industry where there's
been stable business and operational models. If you're a provider organization
or a payer organization or a life sciences company, those heritage business
models have been pretty stable. That's in terms of how organizations think,
their culture, the way their business is architected -- so the organizational
structures, the way they collaborate, all the way down to the way we've
architected technology. They've really done that in service of a relatively
stable business and operating model. What we're making here are three main
points. On a very simple level it's this: Adaptability is more important than
ever, adaptability is more possible than ever, adaptability can be done by the
people who you and I are speaking to -- the people you're reporting for and the
people we work with on the Gartner health team. The idea of adaptability is
nothing new to CIOs, in general. If you go back to when many of today's CIOs
were in high school or even in college, there was reusable code, object-oriented
programming -- we've just gone through a decade-and-a-half of more data services
and agile development.
Quote for the day:
"If you genuinely want something, don't wait for it--teach yourself to be impatient." -- Gurbaksh Chahal