The Next Evolutions of DevOps
The old adage is that complexity is like an abacus: you can shift complexity
around, but it never really goes away. As responsibility shifts left to
development teams, the associated complexity shifts to them as well. Modern
platform engineering teams provide the infrastructure (for example, compliant
Kubernetes clusters), and any workload run on those clusters is up to the
development team that owns it. Typically,
development teams then focus on features and functionality. ... If you are a
DevOps or platform engineer, making your internal customers—your development
teams—successful is a great goal to work toward. Crucial to this is
disseminating expertise, whether through automation or education. A common
practice in the DevSecOps movement is to have a scanning step as part of the
build or deployment process, and to disseminate its internals: how the scan is
performed, what happens if something is found, and so on.
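As a hedged illustration of that scanning step, here is a minimal Python sketch of a pipeline policy gate. The report filename, JSON shape, and severity names are hypothetical stand-ins for whatever your scanner actually emits; this sketches the pattern, not any specific tool's format.

```python
import json
import sys

# Hypothetical scanner output: a JSON report with a list of findings,
# each carrying a "severity" and an "id". Adjust to your scanner's format.
BLOCKING_SEVERITIES = {"HIGH", "CRITICAL"}

def gate(report_path: str) -> int:
    """Fail the build (non-zero exit) if the scan found blocking issues."""
    with open(report_path) as f:
        report = json.load(f)

    blocking = [
        finding for finding in report.get("findings", [])
        if finding.get("severity", "").upper() in BLOCKING_SEVERITIES
    ]

    for finding in blocking:
        print(f"BLOCKED: {finding.get('id', 'unknown')} "
              f"({finding['severity']})")

    # The exit code drives the CI step: 0 passes, 1 stops the deployment.
    return 1 if blocking else 0

if __name__ == "__main__":
    sys.exit(gate(sys.argv[1] if len(sys.argv) > 1 else "scan-report.json"))
```

In a CI system, a script like this would run right after the scanner, so a failing exit code stops the deployment while the printed findings help educate the owning team.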
Fast-paced dash to digital leaves many public services exposed
When organisations introduce new solutions to their technology stack, protection
capabilities need to be extended to cover them. But faced with a global pandemic
that no one could’ve seen coming, businesses needed to innovate fast, and their
security measures failed to keep pace. This created a vulnerability lag, where
systems and data have been left unprotected and open to attack. Veritas’
Vulnerability Lag Report explores how this gap between innovation and protection
is affecting a variety of organisations, public and private. Only three-fifths
(61%) believe their organisation’s security measures have fully kept up since
the implementation of COVID-led digital transformation initiatives, meaning 39%
are experiencing some form of security deficit. While such swift digital
transformation has delivered a wealth of benefits for public sector
organisations, there is a dark side to this accelerated innovation. In the rush
to digitally transform, security has taken a back seat. As a result, there may
be significant gaps just waiting for cyber criminals to exploit for their own
gain.
Towards Better Data Engineering: Mostly People, But Also Process and Technology
Traditional software engineering practices involve designing, programming, and
developing software that is largely stateless. On the other hand, data
engineering practices focus on scaling stateful data systems and dealing with
different levels of complexity. ... Setting up a data engineering culture is
therefore crucial for companies aiming for long-term success. “At Sigmoid,
these are the problems that we’re trying to tackle with our expertise in data
engineering and help companies build a strong data culture,” said Mayur. With
expertise in tools such as Spark, Kafka, Hive, Presto, MLflow, visualization
tools, SQL, and open source technologies, the data engineering team at Sigmoid
helps companies build scalable data pipelines and data platforms. This enables
customers to build data lakes and cloud data warehouses, and to set up DataOps
and MLOps practices that operationalize data pipelines and analytical model
management. Transitioning from a software engineering environment to data
engineering is a significant ‘cultural change’ for most companies.
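To make the stateless-versus-stateful contrast above concrete, here is a minimal Python sketch; the event shape and names are illustrative assumptions, not from the article. The stateless transform depends only on its input, while the stateful aggregation must carry accumulated state across all events, which is where much of data engineering's scaling complexity lives.

```python
from collections import defaultdict

# Stateless: output depends only on the single input event.
def normalize(event: dict) -> dict:
    return {**event, "amount": round(float(event["amount"]), 2)}

# Stateful: a running aggregate must be kept (and, at scale,
# partitioned, checkpointed, and recovered) across all events seen.
class RevenueByCustomer:
    def __init__(self):
        self.totals = defaultdict(float)  # the state to manage

    def update(self, event: dict) -> None:
        self.totals[event["customer_id"]] += float(event["amount"])

events = [
    {"customer_id": "c1", "amount": "20.00"},
    {"customer_id": "c2", "amount": "5.25"},
    {"customer_id": "c1", "amount": "4.00"},
]

agg = RevenueByCustomer()
for e in events:
    agg.update(normalize(e))

print(dict(agg.totals))  # {'c1': 24.0, 'c2': 5.25}
```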
Performing Under Pressure
Regardless of the task, pressure ruthlessly diminishes our judgment,
decision-making, focus, and performance. Pressure moments can disrupt our
thoughts, prevent us from thinking clearly, leave us frustrated, and make us act
in undesirable ways. The adverse impact of pressure on our cognitive skills
can degrade our performance, making us perform below our capability, commit
more errors, and fail more often. Pressure can even leave us embarrassed and
ashamed when we do fail, because under pressure we act in ways we otherwise
would not and say or do unusual things. Consider these pressure moments:
stepping out of an important client meeting and wondering,
“Why did I make that joke? I was so stupid,” or failing to share your opinion
while participating in a critical decision meeting and thinking afterward,
“Why didn’t I speak up? We could have made a better decision.” Pressure can
result in either wrongful action or inaction. Such events make it much more
difficult to deal with pressure the next time. But there are things you can do
to diminish the effects of pressure on your performance.
Behavioral biometrics: A promising tool for enhancing public safety
There are several promising applications in the field of behavioral
biometrics. For computer-based identity verification, there are solutions that
allow identification based on keystrokes—the frequency and patterns of which
prove to be individual enough to recognize identity. Because typing happens
continuously, the models can also keep improving as they monitor and analyze
fresh keystroke data. Software developers also tend to customize confidence
thresholds depending on the use case; a sketch of this follows at the end of
this excerpt. However, the reliability of this behavioral biometric factor is
limited by circumstances: on a different keyboard, individual patterns may
differ, and physical conditions like carpal tunnel syndrome or arthritis may
alter a person’s distinctive typing. The lack of benchmarks makes it difficult
to compare different providers’ trained algorithms in these cases, leaving room
for false marketing claims. Image analysis can provide more data
for behavioral research. Gait and posture biometrics are rapidly becoming
useful tools, even if they do not yet match the accuracy and robustness of
traditional biometric approaches.
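As a hedged illustration of keystroke dynamics with a tunable confidence threshold, here is a minimal Python sketch. The feature choice (inter-key timing intervals), the similarity measure, and the threshold values are illustrative assumptions, not how any particular vendor's product works.

```python
from statistics import mean

def timing_profile(keystroke_times: list[float]) -> list[float]:
    """Inter-key intervals (seconds) from a sequence of key-press timestamps."""
    return [b - a for a, b in zip(keystroke_times, keystroke_times[1:])]

def similarity(enrolled: list[float], sample: list[float]) -> float:
    """Crude score in [0, 1]: 1.0 means an identical average rhythm."""
    diffs = [abs(e - s) for e, s in zip(enrolled, sample)]
    return max(0.0, 1.0 - mean(diffs) / max(mean(enrolled), 1e-9))

# The confidence threshold is tuned per use case: a banking login might
# demand 0.95, while low-risk continuous monitoring might accept 0.80.
def verify(enrolled, sample, threshold=0.90) -> bool:
    return similarity(enrolled, sample) >= threshold

enrolled = timing_profile([0.00, 0.18, 0.35, 0.61, 0.80])
attempt  = timing_profile([0.00, 0.17, 0.36, 0.58, 0.83])
print(verify(enrolled, attempt, threshold=0.80))  # True: rhythms are close
```

A real system would use richer features (dwell times, per-key statistics) and a trained model rather than a single averaged distance, which is exactly why the benchmarking gap the excerpt mentions matters.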
Privacy in Decentralized Finance: Should We Be Concerned?
It is alarming how fast DeFi’s influence is growing, because many of the issues
it presents have not been addressed or solved in any depth. People are investing
in all sorts of cryptocurrency before they even educate themselves on how to
manage private keys properly. Coupled with the lag in robust protective
regulation, the general lack of awareness of DeFi’s threats to privacy
inevitably results in large populations of users
that are vulnerable to attack. Though some progress has been made at the
state level to set standards for blockchain, there is a greater need for
industry standardization at the international level. Additionally, the rapid
expansion of blockchain technology in many industries is not met with
sufficient safety protocols. As such, cybercriminals are aggressively taking
action to target both users and exchanges of cryptocurrency in its
under-secured state. On the flip side, some aspects of DeFi directly benefit
user privacy. Compared with a centralized network, DeFi’s “peer-to-peer” model
is preferable because it avoids a single point of failure.
Hackers Exploit MS Browser Engine Flaw Where Unpatched
The modus operandi of these attackers parallels that of the Iranian
attackers: it follows the same execution steps. But the researchers
did not specify whether the intent of this campaign appeared to be data
exfiltration. AhnLab did not respond to Information Security Media Group's
request for additional information. With multiple attackers actively
exploiting CVE-2021-40444, firms using Microsoft Office should immediately
update their software to the latest version as a prevention measure, say
researchers from EST Security, which discovered yet another campaign
targeting the vulnerability. In this case, the campaign used communications
that attempted to impersonate the president of North Korea's Pyongyang
University of Science and Technology. "The North Korean cyberthreat
organization identified as the perpetrator behind this campaign is actively
introducing document-based security vulnerabilities such as PDF and DOC
files to customized targeted attacks such as CVE-2020-9715 and
CVE-2021-40444," the EST Security researchers say. CVE-2020-9715 is a
vulnerability that allows remote attackers to execute arbitrary code on
affected installations of Adobe Acrobat Reader DC.
Data Mesh: an Architectural Deep Dive
Data mesh is a paradigm shift in managing and accessing analytical data at
scale. Some of the words I highlighted here are really important. First of
all is the shift, and I will justify why that's the case. Second is that it is
an analytical data solution. The word scale really matters here, too. What do we
mean by analytical data? Analytical data is an aggregation of the data that
gets generated running the business. It's the data that fuels our machine
learning models. It's the data that fuels our reports, and the data that
gives us a historical perspective. We can look backward and see how our
business, services, or products have been performing, and then look forward
and predict the next thing a customer wants: make recommendations and
personalizations. All of those machine
learning models can be fueled by analytical data. What does it look like?
Today we are in this world with a great divide of data. The operational data
is the data that sits in the databases of your applications, your legacy
systems, and your microservices, and it keeps the current state.
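To make the operational/analytical divide concrete, here is a minimal Python sketch; the record shapes are illustrative assumptions, not from the talk. Operational data holds the current state per entity, while analytical data aggregates the history of events the business generates, so we can look backward and fuel forward-looking models.

```python
from collections import Counter

# Operational data: current state, keyed by entity, overwritten in place.
orders_db = {
    "order-1": {"customer": "c1", "status": "shipped", "total": 40.0},
    "order-2": {"customer": "c2", "status": "pending", "total": 15.0},
}

# Analytical data: an aggregation over the events generated by running
# the business, kept as history rather than overwritten.
order_events = [
    {"order": "order-1", "event": "created"},
    {"order": "order-1", "event": "shipped"},
    {"order": "order-2", "event": "created"},
]

# A simple backward-looking view: how many of each event have we seen?
event_counts = Counter(e["event"] for e in order_events)
print(event_counts)  # Counter({'created': 2, 'shipped': 1})

# The same history could fuel a report or a recommendation model.
```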
Google Data Studio Vs Tableau: A Comparison Of Data Visualization Tools
Business analysts and data scientists rely on numerous tools like PowerBI,
Google Data Studio, Tableau, and SAP BI, among others, to decipher
information from data and make business decisions. Coming from one of the
best companies in the world, Google Data Studio, launched in 2016, is a data
visualisation platform for creating reports using charts and dashboards.
Tableau, on the other hand, was founded in 2003, more than a decade before
Google Data Studio, by Chris Stolte, Pat Hanrahan, and Christian Chabot.
Tableau Software is one of the most popular visual analytics platforms with
very strong business intelligence capabilities. Google Data Studio, by
contrast, is free, and users can log in with their Google credentials. Over
the years, it
has become a popular tool to visualise trends in businesses, keep track of
client metrics, compare time-based performance of teams, etc. It is a part
of the Google Marketing Platform and downloads data from Google’s marketing
tools to create reports and charts. Recently, Google announced that users
can now include Google Maps in embedded reports in Google Data Studio.
5 Trends Increasing the Pressure on Test Data Provisioning
Not only is the pace of system change growing; the magnitude of changes
being made to complex systems today can be greater than ever. This presents
a challenge to slow and overly manual data provisioning, as a substantial
chunk of data might need updating or replacing based on rapid system
changes. A range of development practices has increased the rate and
scale of system change. The adoption of containerization, source control,
and easily reusable code libraries allows developers working in parallel to
rip and replace code at lightning speed. They can easily deploy new tools and
technologies, developing systems that are now intricately woven webs of
fast-shifting components. A test data solution today must be capable of
providing consistent test data “journeys” that reflect the sizeable impact of
these changes across interrelated system components; a minimal sketch follows
this excerpt. Data allocation must
occur at the pace with which developers chop-and-change reusable and
containerised components.
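As a hedged sketch of what a consistent test data "journey" might mean in practice, here is a minimal Python example. The entities (customer, account, order) and the referential rule are illustrative assumptions; the point is that data provisioned for one component must stay consistent with the data handed to the interrelated ones, and must be cheap to regenerate as those components change.

```python
import itertools
import random

_ids = itertools.count(1)

def provision_journey(seed: int) -> dict:
    """Generate one consistent test-data journey spanning three
    interrelated components: the same customer id must thread through
    the account service and the order service, or downstream tests
    fail in confusing ways."""
    rng = random.Random(seed)          # deterministic per journey
    customer_id = f"cust-{next(_ids)}"
    account = {
        "account_id": f"acct-{customer_id}",
        "owner": customer_id,          # referential consistency
        "balance": rng.randint(0, 10_000),
    }
    order = {
        "order_id": f"ord-{customer_id}",
        "customer": customer_id,       # same key again
        "account": account["account_id"],
    }
    return {"customer": customer_id, "account": account, "order": order}

# Re-provision at the pace of system change: regenerate journeys on
# demand instead of maintaining one slow, hand-curated data set.
journeys = [provision_journey(seed) for seed in range(3)]
for j in journeys:
    assert j["order"]["customer"] == j["account"]["owner"]
print(f"provisioned {len(journeys)} consistent journeys")
```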
Quote for the day:
"One must be convinced to convince,
to have enthusiasm to stimulate the others." -- Stefan Zweig