NIST Updates Cybersecurity Engineering Guidelines
NIST’s publication is a resource for computer engineers and other professionals
on the programming side of cybersecurity efforts. “This publication addresses
the engineering-driven perspective and actions necessary to develop more
defensible and survivable systems, inclusive of the machine, physical, and human
components that compose those systems and the capabilities and services
delivered by those systems,” the document reads. Spanning over 200 pages, the
publication takes a holistic approach to systems engineering. NIST researchers
give an overview of the objectives and concepts of modern security systems,
primarily regarding the protection of a system's digital assets. One of the key
updates NIST authors made in the latest version of the publication was a fresh
emphasis on security assurances. In software systems engineering, assurance is
represented by the evidence that a given system’s security procedures are robust
enough to mitigate asset loss and prevent cyber attacks. Ron Ross, a NIST
fellow and one of the authors of the document, told Nextgov that system
assurances act as justifications that a security system can operate effectively.
9 ways that cybersecurity may change in 2022
On the plus side, digital wallets can ensure the identity of the user in
business or financial transactions, reduce fraud and identity theft, and shrink
the cost and overhead for organizations that typically create physical methods
of authentication. On the minus side, a person can be at risk if their mobile
device is lost or stolen, a device with a dead battery is of little use when
they need to present their digital ID, and any digital verification that
requires connectivity will fail if there's no cellular or Wi-Fi available. ...
Shadow or zombie APIs pose a security risk, as they're typically hidden,
unknown and unprotected by traditional security measures (one way to surface
them is sketched below). More than 90% of attacks in 2022 will focus on APIs,
according to Durand. And for organizations without the right type of API
controls and security practices, these shadow APIs will become the weak
link. ... Information technology and
operational technology will collide as IT teams assume responsibility for the
security of physical devices. This trend will require interoperability between
IT and OT, leading to a convergence of technology to determine who can
physically get in a building and who can access key applications.
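As a rough illustration of how shadow APIs might be surfaced, the sketch below (in Python, with an assumed access-log format and an OpenAPI-style catalog file, both hypothetical) diffs the endpoints actually being served against the endpoints an organization has documented; anything in the first set but not the second is a candidate shadow API.

```python
# Minimal sketch: surface potential "shadow" APIs by comparing endpoints seen
# in access logs against a documented API catalog. The log format, file names,
# and catalog structure are illustrative assumptions, not a specific product.
import json
import re

LOG_LINE = re.compile(r'"(?:GET|POST|PUT|PATCH|DELETE) (/api/[^\s?"]+)')

def documented_paths(catalog_file: str) -> set[str]:
    # The catalog is assumed to be an OpenAPI-style JSON document with a "paths" map.
    with open(catalog_file) as f:
        return set(json.load(f).get("paths", {}))

def observed_paths(log_file: str) -> set[str]:
    with open(log_file) as f:
        return {m.group(1) for line in f if (m := LOG_LINE.search(line))}

def shadow_apis(log_file: str, catalog_file: str) -> set[str]:
    # Anything served in production but absent from the catalog is a candidate
    # shadow API and deserves review.
    return observed_paths(log_file) - documented_paths(catalog_file)

if __name__ == "__main__":
    for path in sorted(shadow_apis("access.log", "openapi.json")):
        print("undocumented endpoint:", path)
```

Real traffic contains concrete paths (/api/users/123) rather than the templated paths in a spec (/api/users/{id}), so a production inventory tool would normalize paths before comparing; this sketch skips that step.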
First for software, agile is a boon to manufacturing
Overall, applying agile methodologies should be a priority for every
manufacturer. For aerospace and defense companies, whose complex projects have
typically followed the long time horizons of waterfall development, agile design
and development are needed to propel the industry into the age of urban air
mobility and the future of space exploration. ... Over the past decade, agile
software development has focused on DevOps ("development and operations"), which
creates the interdisciplinary teams and culture for application development.
Likewise, design companies and product manufacturers have taken the lessons of
agile and reintegrated them into the manufacturing life cycle. As a result,
manufacturing now consists of small teams iterating on products, feeding
real-world lessons back into the supply chain, and using software tools to speed
collaboration. In the aerospace and defense industry, well known for the
complexity of its products and systems, agile is delivering benefits.
Observability, AI And Context: Protecting APIs From Today's (And Tomorrow's) Attacks
Today's digital economy is built on a foundation of APIs that enable critical
communications, making it possible to deliver a richer set of services faster to
users. Unfortunately, today's security solutions are built on an outmoded way of
thinking. Most organizations still deploy security solutions and practices that
revolve around network security, intrusion detection and mitigating application
vulnerabilities. However, for the modern API-driven applications that have
become the de facto deployment model in the cloud, these traditional security
practices simply do not scale to meet the challenges of today's organizations.
Given the incredible complexity of APIs, as well as the breadth and depth of
their deployment across organizations, security and IT teams need to tackle
this problem with a structured process: one that applies API security best
practices, constantly evaluates an organization's APIs and the state of their
security posture, and can automatically remediate when those APIs are attacked.
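As a minimal sketch of what constantly evaluating API posture could look like, the snippet below (assuming Python with the requests library, and a placeholder endpoint list) probes each catalogued endpoint without credentials and flags any that answer successfully, which is often a sign of a missing access control. The URLs are illustrative, not a recommended target list; a real process would feed this from the API inventory and trigger automated remediation such as a ticket or a gateway policy change.

```python
# Minimal sketch of a recurring API posture check: probe each catalogued
# endpoint without credentials and flag any that respond successfully.
import requests

ENDPOINTS = [
    "https://api.example.com/v1/orders",      # placeholder inventory entries
    "https://api.example.com/v1/customers",
]

def check_posture(endpoints: list[str]) -> list[str]:
    findings = []
    for url in endpoints:
        try:
            resp = requests.get(url, timeout=5)  # deliberately unauthenticated
        except requests.RequestException as exc:
            findings.append(f"{url}: unreachable ({exc})")
            continue
        if resp.ok:
            findings.append(f"{url}: responds {resp.status_code} without auth")
    return findings

if __name__ == "__main__":
    for finding in check_posture(ENDPOINTS):
        print(finding)
```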
2022 will be the year we all start talking about online data collection
From uncovering trends to conducting market research, there are countless
reasons why businesses collect publicly available web data from their
competitors. Though the competitors in question often also engage in data
collection themselves, most will regularly block access attempts and make site
changes to prevent their public data from being accessed, even though the
information targeted is on public display. All this could be about to change.
While it may seem counterintuitive – after all, why would you want to give away
information to your competitors – some businesses are beginning to realise that
it’s in their best interests to allow their public data to be collected by
responsible, well-defined, and compliant data practitioners. Firstly, preventing
data collection is like a game of whack-a-mole: When you block one tactic, smart
practitioners will simply find another. Secondly, accepting some forms of data
collection will enable businesses to accurately distinguish between organic user
traffic and collector traffic, giving them a clearer insight into what data is
being collected and by whom.
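One hedged sketch of how that distinction might work in practice: if approved collectors identify themselves (here via a registered User-Agent string, a purely illustrative convention), traffic can be split into organic and collector buckets for analysis.

```python
# Minimal sketch: separate organic user traffic from declared collector traffic
# using a registry of self-identifying User-Agent strings. The registry and the
# request records are illustrative; real setups might also use API keys or a
# dedicated data endpoint for approved collectors.
KNOWN_COLLECTORS = {"AcmeDataBot/1.0", "PricewatchCrawler/2.3"}  # hypothetical

def classify(requests_log: list[dict]) -> dict:
    summary = {"organic": 0, "collector": 0}
    for record in requests_log:
        kind = "collector" if record.get("user_agent") in KNOWN_COLLECTORS else "organic"
        summary[kind] += 1
    return summary

sample = [
    {"path": "/products/42", "user_agent": "Mozilla/5.0"},
    {"path": "/products/42", "user_agent": "AcmeDataBot/1.0"},
]
print(classify(sample))  # {'organic': 1, 'collector': 1}
```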
Omnichannel E-commerce Growth Increases API Security Risk
API-led connectivity overcomes the obstacles retailers face when gathering data
from disparate systems and consolidating it into monolithic data warehouses.
Since each individual system updates separately, information may be out of date
by the time it hits the database. APIs enable retailers to build an application
network that serves as a connectivity layer for data stores and assets in the
cloud, on-premises or in hybrid environments. As a result, mobile applications,
websites, IoT devices, CRM and ERP systems (order management, point of sale,
inventory management and warehouse management) can all work as one coherent
system that connects and shares data in real time. ... The downside to this
rapid growth and development in e-commerce has been a concerning rise in API
security attacks. Here, threat actors have executed numerous high-profile
breaches against public-facing applications. For example, developers use APIs to
connect resources like web registration forms to various backend systems. This
flexibility, however, also creates an entry point for automated attacks.
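A minimal sketch of that pattern, assuming Flask (the endpoint name, limits, and backend call are illustrative): a public registration API fronting a backend system, with a naive per-IP rate limit as one basic guard against the automated abuse such entry points attract.

```python
# Sketch of a public registration API with a naive in-memory per-IP rate limit.
import time
from collections import defaultdict, deque

from flask import Flask, jsonify, request

app = Flask(__name__)
WINDOW_SECONDS, MAX_REQUESTS = 60, 5          # illustrative limit
recent: dict[str, deque] = defaultdict(deque)

@app.post("/api/register")                    # hypothetical endpoint name
def register():
    ip, now = request.remote_addr, time.time()
    hits = recent[ip]
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()                        # drop requests outside the window
    if len(hits) >= MAX_REQUESTS:
        return jsonify(error="too many requests"), 429
    hits.append(now)
    email = (request.get_json(silent=True) or {}).get("email")
    # A real deployment would now call the backend CRM/order system via its API.
    return jsonify(status="registered", email=email), 201

if __name__ == "__main__":
    app.run()
```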
Collaborative Governance Will Be The Driver of The API Economy
Most companies with API programs don’t have advanced API management tools, and
they can only do a couple of releases a year from inception to production.
Collaborative governance, backed by an automated platform, is the way to plug
that gap from a business standpoint and help these companies get to market
faster. A whole team would understand how APIs mature and be ready to respond
to varying requirements. ... Collaborative governance democratizes the API building
process as anybody in a team should be able to build, manage, and maintain APIs.
Add a low-code, results-driven platform or AI-assisted development tools to the
mix, and developers won’t always need to learn about new tools and technologies
from scratch or interact with multiple parties. By centralizing ownership
through version-controlled configuration, enterprises can avoid the disruption
caused by manual errors or configuration changes and enable reusability. Time to
production is also reduced due to continuous integration and delivery
(CI/CD).
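A rough sketch of the kind of automated gate collaborative governance implies: a CI step that validates the version-controlled API definition against shared rules before release. The file name (api-spec.json) and the rules themselves are assumptions for illustration, not any specific platform's checks.

```python
# Minimal sketch of a governance gate run in CI: validate a version-controlled
# API definition (assumed to be an OpenAPI JSON file) against a few shared
# rules before it can be released.
import json
import sys

REQUIRED_TOP_LEVEL = ("info", "paths", "components")

def validate(spec_file: str) -> list[str]:
    with open(spec_file) as f:
        spec = json.load(f)
    errors = [f"missing section: {key}" for key in REQUIRED_TOP_LEVEL if key not in spec]
    if not spec.get("info", {}).get("version"):
        errors.append("info.version must be set so releases are traceable")
    for path in spec.get("paths", {}):
        if not path.startswith("/v"):
            errors.append(f"{path}: paths should carry an explicit version prefix")
    return errors

if __name__ == "__main__":
    problems = validate("api-spec.json")
    for p in problems:
        print("governance check failed:", p)
    sys.exit(1 if problems else 0)
```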
How AI helps essential businesses respond to climate change
Underpinning the AI-based forecast platform is a convolutional neural network
(CNN) model, which extracts features from radar reflectivity and meteorological
satellite images. It is supported by a trained machine-learning model capable
of performing highly accurate, close-to-real-time local weather forecasting in
minutes. Meanwhile, a generative adversarial network (GAN) generates forecast
images with exceptional clarity and detail. One of the
benefits of this AI-based prediction model is that it outperforms the
traditional physics-based model; for example, the Global/Regional Assimilation
and PrEdiction System (GRAPES) requires hours to generate forecasting data,
which is far behind the pace needed for organisations that need to make near
real-time decisions based on anticipated weather events. Some of the data is
conveyed via high-resolution imagery with one-kilometre grid spacing, with
updates every 10 minutes providing fresh insights, enabling real-time decisions
to be made to plans or arrangements based on unfolding or predicted weather
events.
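For a sense of the CNN component described above, here is a minimal sketch in PyTorch (an assumption; the article does not name a framework) of a small convolutional network that takes a few past radar reflectivity frames and predicts the next one. Layer sizes and input dimensions are illustrative, not the platform's actual architecture.

```python
# Minimal sketch of a CNN feature extractor for radar reflectivity frames,
# taking (batch, past_frames, height, width) tensors and predicting the next frame.
import torch
import torch.nn as nn

class NowcastCNN(nn.Module):
    def __init__(self, in_frames: int = 4, features: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_frames, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, features, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Project extracted features back to a single predicted reflectivity frame.
        self.head = nn.Conv2d(features, 1, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(x))

# Example: four past radar frames on a 128x128 grid -> one predicted frame.
frames = torch.randn(1, 4, 128, 128)
print(NowcastCNN()(frames).shape)  # torch.Size([1, 1, 128, 128])
```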
Stargate gRPC: The Better Way to CQL
In 2008, Google developed, open-sourced, and released Protocol Buffers — a
language-neutral mechanism for serializing structured data. In 2015, Google
released gRPC (also open source), which builds on Protocol Buffers to
modernize Remote Procedure Calls (RPC). gRPC has a couple of important
performance characteristics. One is the improved data serialization, making data
transit over the network much more efficient. The other is the use of HTTP/2,
which enables bidirectional communication. As a result, there are four call
types supported in gRPC: Unary calls; Client-side streaming
calls; Server-side streaming calls; and Bidirectional calls, which are a
composite of client-side and server-side streaming. Put all this together and
you have a mechanism that is fast — very fast when compared to other HTTP-based
APIs. gRPC message transmission can be 7x to 10x faster than traditional REST
APIs. In other words, a solution based on gRPC could offer performance
comparable to native drivers.
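The four call types map directly onto servicer method shapes in gRPC's Python API. The sketch below assumes a hypothetical demo.proto compiled with grpcio-tools into demo_pb2 / demo_pb2_grpc; the service and message names are placeholders, not Stargate's actual interface.

```python
# Sketch of the four gRPC call types as Python servicer methods.
from concurrent import futures

import grpc
import demo_pb2        # hypothetical generated messages
import demo_pb2_grpc   # hypothetical generated service stubs

class DemoService(demo_pb2_grpc.DemoServicer):
    def Unary(self, request, context):
        # One request in, one response out.
        return demo_pb2.Reply(text=f"echo: {request.text}")

    def ClientStream(self, request_iterator, context):
        # Many requests in, one response out.
        count = sum(1 for _ in request_iterator)
        return demo_pb2.Reply(text=f"received {count} messages")

    def ServerStream(self, request, context):
        # One request in, many responses out.
        for i in range(3):
            yield demo_pb2.Reply(text=f"chunk {i}")

    def BidiStream(self, request_iterator, context):
        # Requests and responses flow independently in both directions.
        for request in request_iterator:
            yield demo_pb2.Reply(text=f"echo: {request.text}")

def serve() -> None:
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    demo_pb2_grpc.add_DemoServicer_to_server(DemoService(), server)
    server.add_insecure_port("[::]:50051")
    server.start()
    server.wait_for_termination()

if __name__ == "__main__":
    serve()
```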
2022 promises to be a challenging year for cybersecurity professionals
One thing the pandemic has demonstrated is an unprecedented shift in endpoints,
workloads, and where data and applications reside. Today, the Federal workforce
remains mostly remote and telework is being conducted over modern endpoints such
as mobile devices and tablets, and the applications and productivity tools are
now cloud-hosted solutions. To be effective, those additional endpoints and
mobile devices need to be included in the Agency’s asset inventory, the devices
need to be managed and validated for conformance with the Agency’s security
policies, and the identities of the user and their device must be known and
validated. Additionally, cloud-hosted applications must be included in the
zero-trust framework, including protection by strong conditional access
controls, effective vulnerability management and automated patch
management processes. I am optimistic that we can make great strides towards
improving cybersecurity in 2022, if we are smart and pragmatic about
prioritization, risk management, and leveraging automation to help us work
smarter not harder.
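As a minimal sketch of the conditional-access logic described here, the snippet below grants access only when the user's identity is verified, the device appears in the asset inventory, and the device conforms to policy. The data structures are illustrative assumptions, not any agency's real policy engine.

```python
# Minimal sketch of a zero-trust conditional-access decision.
from dataclasses import dataclass

ASSET_INVENTORY = {"laptop-001", "tablet-017"}  # managed endpoints (hypothetical)

@dataclass
class AccessRequest:
    user_verified: bool     # e.g. MFA completed
    device_id: str
    device_compliant: bool  # patched, encrypted, policy-conformant

def allow_access(req: AccessRequest) -> bool:
    # Grant access only when identity, inventory membership, and compliance all hold.
    return (
        req.user_verified
        and req.device_id in ASSET_INVENTORY
        and req.device_compliant
    )

print(allow_access(AccessRequest(True, "laptop-001", True)))    # True
print(allow_access(AccessRequest(True, "byod-phone-9", True)))  # False: not in inventory
```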
Quote for the day:
"Making those around you feel invisible
is the opposite of leadership." -- Margaret Heffernan