Why Culture Is the Greatest Barrier to Data Success
Achieving data success is a journey, not a sprint. Companies desire to
accelerate their efforts to become data-driven, but consistency, patience, and
steadfastness pay off in the long run. Companies that set a clear course, with
reasonable expectations and phased results over a period of time, get to the
destination faster. Develop a plan. Create a data strategy for your company if
you do not already have one. If you do have a data strategy, make sure that it
is updated annually to reflect changes in the business and the ongoing and
rapid evolution of emerging data management capabilities. Define your future
state, and build an execution road map that will take you from your current
state to the target outcome. It is hard to reach any destination without a
good road map. Companies need to maintain a long-term view and stick to it
while making periodic adjustments. Patience, persistence, and commitment are
the ingredients for ensuring a successful long-term outcome. Organizations
must evolve and change the ways in which they structure current business
processes if they expect to become more data-driven. In short, companies must
be prepared to think differently.
Silver Peak SD-WAN Gets Aruba's ClearPass Treatment
According to Lunetta, ClearPass was a natural place to start the integration
efforts. “Security has always been central to Aruba’s network solutions and is
top of mind for every customer these days, especially with the increase of
remote working and proliferation of IoT devices on the network,” he said.
Aruba’s ClearPass offering was announced in April 2019, to help enterprises
cope with the growing number of IoT and connected devices on the network.
ClearPass Device Insights is a tool that employs machine learning to
automate the discovery and fingerprinting of connected devices. When paired
with Aruba’s ClearPass Policy Manager, customers can dynamically segment
security capabilities, making it possible to authenticate and enforce policies
based on device type and the needs of the user. Silver Peak customers will be
able to identify and block unauthorized users from accessing applications or other
services at the WAN edge long before they get to the cloud or private data
center. “I think the biggest benefit will be adding more intelligence to the
segmentation capabilities from Silver Peak,” said John Grady, network security
analyst at ESG, in an email to SDxCentral. “By adding agentless device
visibility and context, as well as the automation and policy control from
ClearPass, Silver Peak becomes that much more attractive, especially relative
to IoT.”
Ransomware Alert: Pay2Key
Over the past week, an exceptional number of Israeli companies reported
ransomware attacks. While some of the attacks were carried out by known
ransomware strains like REvil and Ryuk, several large corporations experienced a full-blown attack with a previously unknown ransomware variant named
Pay2Key. As days go by, more of the reported ransomware attacks turn out to be
related to the new Pay2Key ransomware. The attacker followed the same
procedure to gain a foothold, propagate and remotely control the infection
within the compromised companies. The investigation so far indicates the
attacker may have gained access to the organizations’ networks some time
before the attack, but demonstrated the ability to spread the ransomware across the entire network within an hour. After completing the
infection phase, the victims received a customized ransom note, with a
relatively low demand of 7-9 bitcoins (~$110K-$140K). The full scope of these
attacks is still unfolding and under investigation, but we at Check Point
Research would like to offer our initial analysis of this new ransomware
variant, as well as to provide relevant IOCs to help mitigate possible
ongoing attacks. ... Analyzing the Pay2Key ransomware operation, we were unable to correlate it to any other existing ransomware strain, and it appears to have been developed from scratch.
Blazor: Full stack C# and Microsoft's pitch for ASP.NET Web Forms diehards
Blazor is not much like Web Forms, but it has some things in common. One is that
developers can write C# everywhere, both on the server and for the browser
client. Microsoft calls this “full stack C#”. “Blazor shares many
commonalities with ASP.NET Web Forms, like having a reusable component model
and a simple way to handle user events,” wrote the authors. The Blazor
framework comes in several guises. The initial concept, and one of the
options, is Blazor WebAssembly (Wasm). The .NET runtime is compiled to Wasm,
the application is compiled to a .NET DLL, and runs in the browser,
supplemented by JavaScript interop. ... Blazor is designed for single-page applications and is reminiscent of Silverlight, Microsoft's browser plugin that ran .NET code in the browser, but with an HTML/CSS user interface.
There are two other Blazor application models. Blazor Server runs on the
server and supports a thin browser client communicating over WebSockets (ASP.NET SignalR). The programming model is the same, but it is a thin-client approach, which means faster loading and no WebAssembly required; it can even
be persuaded to run in IE11.
What is data architecture? A framework for managing data
According to the Data Management Body of Knowledge (DMBOK 2), data architecture
defines the blueprint for managing data assets by aligning with organizational
strategy to establish strategic data requirements and designs to meet those
requirements. On the other hand, DMBOK 2 defines data modeling as, "the
process of discovering, analyzing, representing, and communicating data
requirements in a precise form called the data model." While both data
architecture and data modeling seek to bridge the gap between business goals
and technology, data architecture is about the macro view that seeks to
understand and support the relationships between an organization's functions,
technology, and data types. Data modeling takes a more focused view of
specific systems or business cases. There are several enterprise architecture
frameworks that commonly serve as the foundation for building an
organization's data architecture framework. DAMA International's Data
Management Body of Knowledge is a framework specifically for data management.
It provides standard definitions for data management functions, deliverables,
roles, and other terminology, and presents guiding principles for data
management.
Using machine learning to track the pandemic’s impact on mental health
Using several types of natural language processing algorithms, the researchers
measured the frequency of words associated with topics such as anxiety, death,
isolation, and substance abuse, and grouped posts together based on
similarities in the language used. These approaches allowed the researchers to
identify similarities between each group’s posts after the onset of the
pandemic, as well as distinctive differences between groups. The researchers
found that while people in most of the support groups began posting about
Covid-19 in March, the group devoted to health anxiety started much earlier,
in January. However, as the pandemic progressed, the other mental health
groups began to closely resemble the health anxiety group, in terms of the
language that was most often used. At the same time, the group devoted to
personal finance showed the most negative semantic change from January to
April 2020, and significantly increased the use of words related to economic
stress and negative sentiment. They also discovered that the mental health groups most negatively affected early in the pandemic were those related to ADHD and eating disorders.
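As a rough illustration of the two techniques the article describes, counting topic-related words and grouping posts by similarity of language, the sketch below uses scikit-learn's TF-IDF vectors and cosine similarity. It is not the researchers' actual pipeline; the topic word lists and example posts are invented for illustration.

```python
# Minimal sketch (not the study's pipeline) of: (1) counting topic-related
# words in posts, and (2) grouping posts by similarity of language.
# The topic lexicons and posts below are invented examples.
import re
from collections import Counter

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical mini-lexicons for a few of the topics mentioned in the article.
TOPICS = {
    "anxiety": {"anxious", "worry", "panic", "fear"},
    "isolation": {"alone", "lonely", "isolated", "quarantine"},
}

posts = [
    "I feel so anxious and alone since quarantine started",
    "Constant worry and panic about the news every day",
    "Budgeting is hard, I lost my job and rent is due",
]

def topic_frequencies(text: str) -> dict:
    """Count how often each topic's words appear in a single post."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    return {topic: sum(counts[w] for w in words) for topic, words in TOPICS.items()}

for post in posts:
    print(topic_frequencies(post))

# Group posts by the language used: TF-IDF vectors plus pairwise cosine similarity.
vectors = TfidfVectorizer().fit_transform(posts)
print(cosine_similarity(vectors))  # similarity matrix between posts
```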
‘Digital Mercenaries’: Why Blockchain Analytics Firms Have Privacy Advocates Worried
Gladstein and other advocates see this sort of blockchain analysis as an
extension of governmental surveillance, along the lines of when the National
Security Agency (NSA) was secretly gathering extensive metadata on the
American public, not to mention the agency’s work abroad. Gladstein argues that payment processors like Square, and even exchanges, can make a case that they work hard to protect customer privacy. But for a blockchain surveillance company (such as Chainalysis, CipherTrace, or Elliptic), that defense does not hold, because the explicit purpose of the company is to participate in the de-anonymization process.
De-anonymization is a process that has different components, one being the use
of the blockchain to trace where funds go. “Natively speaking, Bitcoin
is very privacy-protecting because it’s not linked to your identity or your
home address or your credit card history,” said Gladstein. “It’s just a
freaking random address, right? And the coins are moved from one address to
another. To pair these to a person and destroy their privacy requires
intentional or unintentional doxxing.”
Kubernetes Security Best Practices to Keep You out of the News
Building secure containers requires scanning them for vulnerabilities —
including Linux system packages, as well as application packages for dynamic
languages like Python or Ruby. App developers might be accustomed to scanning
application dependencies, but now that they are shipping an entire operating
system with their app, they have to be supported in securing the OS as well.
To support this effort at scale, consider using a tool like Cloud Native
Buildpacks, which allows a platform or ops team to make standardized container
builds that developers can use to drop their application into — completely
replacing the Dockerfile for a project. These centralized builds can be kept
up-to-date so that developers can focus on what they’re good at rather than
having to be jacks-of-all-DevOps-trades. Container image scanning tools scan
the layers of a built image for known vulnerabilities, and are indispensable
in keeping your builds and dependencies up-to-date. They can be run during
development and in CI pipelines to shift security practices left, giving
developers the earliest notice of a vulnerability. The best practice is to
strip your container down to the minimum needed to run the application. A
great way to ruin an attacker’s day is to have a container with no shell!
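Conceptually, the scanning step described above boils down to comparing the packages found in an image's layers against a feed of known-vulnerable versions and failing the build on any match. The sketch below illustrates only that comparison; the vulnerability entries and package inventory are made up, and real scanners work from actual image layers and CVE databases rather than hard-coded lists.

```python
# Purely illustrative sketch of what an image scanner does conceptually:
# compare installed package versions (OS and app packages) against a list
# of known-vulnerable versions. All data below is invented.
import sys

# Hypothetical vulnerability "database": package -> {vulnerable version: advisory id}.
KNOWN_VULNS = {
    "openssl": {"1.1.1g": "EXAMPLE-2020-0001"},
    "pyyaml": {"5.3": "EXAMPLE-2020-0002"},
}

# Hypothetical inventory extracted from a built image's layers.
installed = [("openssl", "1.1.1g"), ("pyyaml", "5.4"), ("flask", "1.1.2")]

def scan(packages):
    """Return (name, version, advisory) for each package at a known-vulnerable version."""
    findings = []
    for name, version in packages:
        advisory = KNOWN_VULNS.get(name, {}).get(version)
        if advisory:
            findings.append((name, version, advisory))
    return findings

findings = scan(installed)
for name, version, advisory in findings:
    print(f"{name} {version}: {advisory}")

# In a CI pipeline, a non-zero exit fails the build, giving developers
# the earliest possible notice of a vulnerable dependency ("shifting left").
sys.exit(1 if findings else 0)
```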
Gitpaste-12 Worm Targets Linux Servers, IoT Devices
This script sets up a cron job it downloads from Pastebin. A cron job is a
time-based job scheduler in Unix-like computer operating systems. The cron job
calls a script and executes it again each minute; researchers believe this is one mechanism by which updates can be pushed to the botnet. It then downloads a script from GitHub
(https://raw[.]githubusercontent[.]com/cnmnmsl-001/-/master/shadu1) and
executes it. The script contains comments in the Chinese language and has
multiple commands available to attackers to disable different security
capabilities. These include stripping the system’s defenses, including
firewall rules, SELinux (a security architecture for Linux systems), AppArmor (a Linux kernel security module that allows the system administrator to restrict programs’ capabilities), as well as common attack prevention and
monitoring software. The malware also has some commands that disable cloud
security agents, “which clearly indicates the threat actor intends to target
public cloud computing infrastructure provided by Alibaba Cloud and Tencent,”
said researchers. Gitpaste-12 also features commands allowing it to run a
cryptominer that targets the Monero cryptocurrency.
Data Strategies for Efficient and Secure Edge Computing Services
There is a long list of design questions that comes with running an IoT
network: where does computation happen? Where and how do you store and encrypt
data? Do you require encryption for data in motion or just at rest? How do you
coordinate workflows across devices? And finally, how much does this cost?
While this is an intimidating list, we can build on good practices that have evolved both prior to the advent of IoT and more recently with the increasing
use of edge computing. First, let’s take a look at computation and data
storage. When possible, computation should happen close to the data. By
minimizing transmission time, you reduce the overall latency for receiving
results. Remember that distributing computation can increase overall system
complexity, creating new vulnerabilities in various endpoints, so it’s
important to keep it simple. One approach is to do minimal processing on
IoT devices themselves. A data collection device may just need to package a
payload of data, add routing and authentication to the payload, then send it
to another device for further processing. There are some instances, however,
where computing close to the collection site is necessary.
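To make the "minimal processing on the device" pattern concrete, here is a small sketch of a device wrapping a reading with routing metadata and an authentication tag before handing it to a gateway for further processing. The device name, route, key handling, and transport are assumptions for illustration; a real deployment would use proper key management and encrypt data in motion (for example, with TLS).

```python
# Minimal sketch, under assumptions not in the article: the device only
# packages a reading with routing info and an HMAC; heavier processing
# happens on the next hop. Names and the shared key are hypothetical.
import hashlib
import hmac
import json
import time

DEVICE_ID = "sensor-042"                 # hypothetical device identity
ROUTE_TO = "edge-gateway-eu-1"           # hypothetical next hop for processing
SHARED_KEY = b"example-shared-secret"    # illustration only; never hard-code keys

def package_payload(reading: dict) -> bytes:
    """Wrap a raw reading with routing metadata and an HMAC so the gateway
    can verify integrity and origin before doing any further processing."""
    envelope = {
        "device_id": DEVICE_ID,
        "route_to": ROUTE_TO,
        "timestamp": time.time(),
        "reading": reading,
    }
    body = json.dumps(envelope, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return json.dumps({"body": body.decode(), "hmac": tag}).encode()

def verify_payload(packet: bytes) -> dict:
    """Gateway-side check: recompute the HMAC before trusting the payload."""
    outer = json.loads(packet)
    body = outer["body"].encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, outer["hmac"]):
        raise ValueError("authentication failed")
    return json.loads(body)

packet = package_payload({"temperature_c": 21.7})
print(verify_payload(packet))
```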
Quote for the day:
"Superlative leaders are fully equipped to deliver in destiny; they locate eternally assigned destines." -- Anyaele Sam Chiyson