The New Face of Data Management
Despite the data explosion, IT organizations haven’t necessarily changed storage
strategies. They keep buying expensive storage devices because unassailable
performance is required for critical or “hot” data. The reality is that not all
data is diamonds: some of it is emeralds, and some of it is glass. By treating all
data the same way, companies are creating needless cost and complexity. ... Yet
as hot data continues to grow, the backup process becomes sluggish. So, you
purchase expensive, top-of-the-line backup solutions to make this faster, but you
still need ever more storage for all these copies of your data. The ratio of
unique data (created and captured) to replicated data (copied and consumed) is
roughly 1:9. By 2024, IDC expects this ratio to be 1:10. Most organizations are
backing up and replicating data that is in fact rarely accessed and better
suited to low-cost archives such as the cloud. Beyond backup and storage
costs, organizations must also secure all of this data. A one-size-fits-all
strategy means that all data is secured to the level of the most sensitive,
critically important data.
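The tiering this argues for is commonly implemented with object-storage lifecycle rules. As a minimal sketch (the bucket name, prefix, and tier thresholds are hypothetical and purely illustrative), an AWS S3 lifecycle configuration set via boto3 might look like this:

```python
import boto3

# Hypothetical bucket and prefix: objects under "cold/" are rarely accessed,
# so they are moved to progressively cheaper storage classes over time.
s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",  # assumption: illustrative name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-cold-data",
                "Filter": {"Prefix": "cold/"},
                "Status": "Enabled",
                "Transitions": [
                    # After 30 days, move to infrequent-access storage.
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    # After 90 days, archive to Glacier.
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```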
Technology and the future of modern warfare
This digital revolution points to a new kind of hyper-modern warfare. Artificial
intelligence is a good example of this. If AI can read more data in a minute
than a human can read in a year, then its value to militaries is immeasurable.
In a recent interview with The Daily Telegraph, the Chief of the General
Staff, General Sir Mark Carleton-Smith, acknowledged that “we are already
seeing the implications of artificial intelligence, quantum computing and
robotics, and how they might be applied on the battlefield”. Machine learning,
for instance, has already been used to harvest key grains of intelligence from
the chaff of trivial information that usually inundates analysts. All this is
not to say, however, that there will be a complete obsolescence of traditional
equipment and means. The British Army remains an industrial age organisation
with an industrial skill set, but one which is confronted by innovation
challenges. Conventional threats can still materialise at any time. The recent
stationing of Russian troops along the Ukrainian border and within the Crimea –
in addition to the manoeuvring of its naval forces in the Sea of Azov – is a
case in point.
Attacking Natural Language Processing Systems With Adversarial Examples
The attack can potentially be used to cripple machine learning translation
systems by forcing them to either produce nonsense, or actually change the
nature of the translation; to bottleneck training of NLP models; to misclassify
toxic content; to poison search engine results by causing faulty indexing; to
cause search engines to fail to identify malicious or negative content that is
perfectly readable to a person; and even to cause Denial-of-Service (DoS)
attacks on NLP frameworks. Though the authors have disclosed the paper’s
proposed vulnerabilities to various unnamed parties whose products feature in
the research, they consider that the NLP industry has lagged in protecting
itself against adversarial attacks. The paper states: ‘These attacks exploit
language coding features, such as invisible characters and homoglyphs. Although
they have been seen occasionally in the past in spam and phishing scams, the
designers of the many NLP systems that are now being deployed at scale appear to
have ignored them completely.’
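The mechanics are easy to demonstrate. A minimal sketch (the strings are illustrative, not taken from the paper) of how invisible characters and homoglyphs yield text that looks identical to a reader but differs for an NLP pipeline:

```python
# Two of the encoding tricks the paper describes: zero-width characters
# and homoglyphs.

clean = "payment"
zero_width = "pay\u200bment"   # contains ZERO WIDTH SPACE, invisible when rendered
homoglyph = "p\u0430yment"     # Cyrillic 'а' (U+0430) in place of Latin 'a'

for attacked in (zero_width, homoglyph):
    print(repr(attacked))
    print("  equal to clean text?", attacked == clean)    # False for both
    print("  same length?", len(attacked) == len(clean))  # False for zero-width

# A tokenizer, classifier, or search index sees different byte sequences,
# so matching, toxicity detection, or translation can silently break even
# though the rendered text is indistinguishable to a human.
```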
When done right, network segmentation brings rewards
Segmentation is an IT approach that separates critical areas of the network to
control east-west traffic, prevent lateral movement, and ultimately reduce the
attack surface. Traditionally, this is done via an architectural approach –
relying on hardware, firewalls and manual work. This can often prove cumbersome
and labor intensive, which is a contributing factor in 82% of respondents saying
that network segmentation is a “huge task.” ... Modern segmentation uses a
software-based approach that is simpler to use, faster to implement, and able
to secure more critical assets. The research shows that organizations that
leverage the latest approach to segmentation will realize essential security
benefits, like identifying more ransomware attacks and reducing time to mitigate
attacks. “The findings of the report demonstrate just how valuable a strong
segmentation strategy can be for organizations looking to reduce their attack
surface and stop damaging attacks like ransomware,” said Pavel Gurvich, SVP,
Akamai Enterprise Security.
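Conceptually, software-based segmentation reduces to evaluating every east-west flow against a policy instead of rewiring hardware. A vendor-neutral toy sketch of that idea (asset names, segment labels, and ports are all hypothetical):

```python
# Toy model of software-defined segmentation: assets carry segment labels,
# and a default-deny allowlist decides which east-west flows may pass.

ASSET_LABELS = {  # hypothetical inventory
    "web-01": "frontend",
    "app-01": "backend",
    "db-01": "database",
    "hr-01": "hr",
}

# Only these (source segment, destination segment, port) flows are allowed;
# everything else is denied, which is what limits lateral movement.
ALLOWED_FLOWS = {
    ("frontend", "backend", 8443),
    ("backend", "database", 5432),
}

def is_allowed(src_host: str, dst_host: str, port: int) -> bool:
    flow = (ASSET_LABELS[src_host], ASSET_LABELS[dst_host], port)
    return flow in ALLOWED_FLOWS

print(is_allowed("web-01", "app-01", 8443))  # True: sanctioned path
print(is_allowed("web-01", "db-01", 5432))   # False: no direct frontend-to-DB
print(is_allowed("hr-01", "db-01", 5432))    # False: HR segment stays isolated
```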
Neural networks can hide malware, and scientists are worried
As malware scanners can’t detect malicious payloads embedded in deep learning
models, the only countermeasure against EvilModel is to destroy the malware.
The payload only maintains its integrity if its bytes remain intact.
Therefore, if the recipient of an EvilModel retrains the neural network
without freezing the infected layer, its parameter values will change and the
malware data will be destroyed. Even a single epoch of training is probably
enough to destroy any malware embedded in the DL model. However, most
developers use pretrained models as they are, unless they want to fine-tune
them for another application. And some forms of fine-tuning freeze most
existing layers in the network, which might include the infected layers. This
means that alongside adversarial attacks, data poisoning, membership
inference, and other known security issues, malware-infected neural networks
are a real threat to the future of deep learning.
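Why retraining destroys the payload is easy to see in miniature. The sketch below is a simplified stand-in for EvilModel's embedding, not its exact scheme: payload bytes are hidden in the least significant byte of float32 weights, where they barely perturb the values a forward pass uses, but any gradient update rewrites them.

```python
import numpy as np

# Hide payload bytes in the least significant byte of float32 weights
# (a simplified stand-in for EvilModel's embedding, not its exact scheme).
payload = b"MALWARE!"
weights = np.random.randn(len(payload)).astype(np.float32)

raw = weights.view(np.uint8).reshape(-1, 4).copy()
raw[:, 0] = np.frombuffer(payload, dtype=np.uint8)  # little-endian LSB
infected = raw.reshape(-1).view(np.float32)

# The model still "works": each weight moves by only a tiny amount.
print(np.max(np.abs(infected - weights)))  # on the order of 1e-5

# Extraction succeeds while the bytes remain intact...
print(infected.view(np.uint8).reshape(-1, 4)[:, 0].tobytes())  # b'MALWARE!'

# ...but a single gradient-style update rewrites the low-order bytes.
updated = infected - 0.01 * np.random.randn(len(payload)).astype(np.float32)
print(updated.view(np.uint8).reshape(-1, 4)[:, 0].tobytes())   # garbage now
```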
5 Key Skills Needed To Become a Great Data Scientist
Data Scientists should develop the habit of critical thinking. It helps in
better understanding the problem: unless a problem is understood at the most
granular level, the solution can’t be good. Critical thinking helps in
analyzing the different options and choosing the right one. In data science,
decisions are rarely simply good or bad; many options lie in the grey area
between the two. A data science project involves many such decisions:
choosing the right set of attributes, the right methodology, the right
algorithms, the right metrics to measure model performance, and so on. ...
Coding skills are as important to a data scientist as eyes are to an artist.
Almost everything a data scientist does requires code, from reading data
from multiple sources and performing exploratory analysis to building models
and evaluating them. ... Math is another important skill for data
scientists. It is fine not to know some of the math concepts while you are
learning data science, but it is not possible to excel as a data scientist
without eventually understanding them.
DeepMind Now Wants To Study The Behaviour Of Electrons, Launches An AI Tool
Density functional theory (DFT) describes matter at the quantum level, but
popular approximations suffer from systematic errors arising from violations
of the mathematical properties of the exact functional. DeepMind
has overcome this fundamental limitation by training a neural network on
molecular data and on fictitious systems with fractional charge and spin.
The result was the DM21 (DeepMind 21) tool. It correctly describes typical
examples of artificial charge delocalization and strong correlation and
performs better than traditional functionals on thorough benchmarks for
main-group atoms and molecules. The company claims that DM21 accurately
models complex systems such as hydrogen chains, charged DNA base pairs, and
diradical transition states. DM21 is a neural network functional that achieves
state-of-the-art accuracy on large parts of chemistry and is intended to
accelerate scientific progress; the code has been open-sourced.
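The released code integrates with the PySCF quantum-chemistry package. A hedged sketch of running a calculation with it, using the module and class names from DeepMind's release (treat the exact identifiers as assumptions if the repository has since changed):

```python
# Sketch: a DFT calculation with the DM21 functional via PySCF.
from pyscf import gto, dft
import density_functional_approximation_dm21 as dm21  # DeepMind's released package

# A simple closed-shell molecule: H2 near its equilibrium bond length.
mol = gto.M(atom="H 0 0 0; H 0 0 0.74", basis="cc-pVDZ")

mf = dft.RKS(mol)
# Swap PySCF's standard numerical integrator for the DM21 neural functional.
mf._numint = dm21.NeuralNumInt(dm21.Functional.DM21)
energy = mf.kernel()
print("DM21 total energy (Hartree):", energy)
```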
Are venture capitalists misunderstood?
The arrival of growth investors put traditional VCs under further pressure.
Whereas Rock and his successors regularly got involved long before there was a
product to market, others eventually realized there were opportunities further
down the line, and provided vast amounts of capital to established firms that
they believed had the potential to become many times larger. These investors,
who included Yuri Milner and Masayoshi Son, were irresistible to ambitious
tech companies. Unlike VCs, who demanded board seats along with their funding,
Milner and Son did not even want to sit on the board. Mallaby argues that huge
capital injections by growth investors (and the VCs that chose to compete with
them) resulted in greater control for entrepreneurs, but also weaker corporate
governance and ultimately over-reach and ill-discipline. “Precisely at the
point when tech companies achieved escape velocity and founders were apt to
feel too sure of themselves,” Mallaby writes, “the usual forms of private or
public governance would thus be suspended.”
IT security: 4 issues to watch in 2022
If infosec had a greatest hits album, basic security hygiene would be track one.
Year in, year out, the root cause of many security incidents can be traced back
to the fundamentals. A wide range of threats, from ransomware to cloud account
hijacking to data leakage, owe much of their efficacy to surprisingly simple
missteps, from a misconfigured setting (or even a default setting left
unchanged) to an over-privileged user to unpatched software. ... This raises the
question: What are the basics? Things like password hygiene and system patching
apply across the board, but you also need to identify and agree with colleagues
on “the basics” required in your specific organization. That gives you a
collective standard to work toward and measure against. Moreover, the word
“basic” doesn’t do some of the fundamentals justice. “In my world, the basics are
patch management, secure configuration, threat modeling, DAST and SAST scanning,
internal and external vulnerability scanning, penetration testing, defense
against phishing attacks, third-party vulnerability assessments, backup and
disaster recovery, and bespoke security training,” Elston says.
Growing an Experiment-Driven Quality Culture in Software Development
When people share their thoughts and wishes, there might be different needs
underlying them. For example, tech leadership can express their desire for
global quantitative metrics, while what they might actually need is to figure
out what impact they want to have in the first place and what information they
need to have this impact. Remember the teams falling back to everyday business?
The systemic part plays a huge role and is worth considering in your
experiments. For example, if
you’re setting out to improve the quality culture of a team, think about what
kind of behavior contributing to quality gets rewarded and how. If a person does
a great job, yet these contributions and the resulting impact are not valued,
they probably won’t get promoted for it. The main challenges usually come back
to people’s interactions as well as the systems in which we are interacting.
This includes building trustful relationships and shaping a safe, welcoming
space where people can bring their whole authentic selves and have a chance to
thrive.
Quote for the day:
"The strong do what they have to
do and the weak accept what they have to accept." -- Thucydides