Introduction to SPIFFE/SPIRE
The Secure Production Identity Framework For Everyone (SPIFFE) is a
specification for workload identity. According to Gilman, the easiest way to
think about SPIFFE is as a passport. Similar to how people are issued passports
in a common shape with a barcode and standard information, SPIFFE dictates the
standard methods to prove and validate the identity of a service. It’s like
bringing the “Sign in with Google” experience to the software services
themselves, he adds. There are three key components in SPIFFE. First, SPIFFE
specifies that services shall identify themselves with what’s called a SPIFFE
ID, which is defined as a URI in the format of spiffe://trust-domain-name/path.
These IDs are then encoded into a SPIFFE Verifiable Identity Document or SVID.
SVIDs aren’t so much a document type themselves — instead, they support either
X.509 or JWT document types. Last but not least, SPIFFE specifies a workload API
that issues and rotates these SVIDs, along with the keys needed to validate
them. SPIRE is the code that implements the SPIFFE specification—you can think
of it as a production-ready SPIFFE runtime environment.
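To make the workload API concrete, here is a minimal sketch in Go using the
open source go-spiffe v2 library. It assumes a running SPIRE agent, and the
socket path shown is an assumed default that varies by deployment.

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/spiffe/go-spiffe/v2/workloadapi"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Fetch this workload's X.509-SVID over the SPIFFE Workload API.
	// The agent socket address is an assumption; adjust it to match
	// your SPIRE deployment.
	svid, err := workloadapi.FetchX509SVID(ctx,
		workloadapi.WithAddr("unix:///tmp/spire-agent/public/api.sock"))
	if err != nil {
		log.Fatalf("unable to fetch X.509-SVID: %v", err)
	}

	// The SPIFFE ID inside the SVID is a URI of the form
	// spiffe://trust-domain-name/path, as described above.
	fmt.Println("SPIFFE ID:", svid.ID)
}

For long-running services, the same library offers workloadapi.NewX509Source,
which keeps the SVID refreshed in the background as the agent rotates it.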
The C-suite is placing a high priority on cybersecurity
Cybersecurity is now a frequent topic in boardroom discussions.
IT and business leaders have historically had difficulty cooperating on cyber
risk management, but this disagreement seems to be worse than ever right now.
According to a study, over 90% of IT decision-makers think their organisation
would be willing to forego cybersecurity in favour of other objectives. Such a
strategy may deliver short-term gains, but it is not worth the cybersecurity
risk, which includes monetary losses and reputational harm. An organisation must
resolve this business-IT conflict and come to a consensus on cyber risk as a
crucial component of business risk in order to succeed in the post-pandemic
era of hybrid or remote workforces. Reaching that consensus will help
organisations better identify, communicate, and mitigate cyber risk across the
workplace, maximising their commercial opportunities and preventing costly
breaches.
Additionally, research shows that 38% of business decision-makers and 50% of
IT leaders believe the C-suite fully comprehends cyber dangers.
Disadvantages of industrial IoT
Using IIoT creates massive amounts of data. That wouldn’t matter, were it not
for the fact that this information needs to be processed quickly in order to
be of any use. Especially when applied to digital operations, data processing
is key to success. Additionally, all this generated information brings matters
of privacy and security into question. IoT itself is a relatively new concept,
and protecting the data that it collects will require companies to find
different and more efficient ways to sort through digital assets. At the very
least, businesses operating with IIoT technology should be sure to invest in
secure cloud computing infrastructure. Without strong digital assets, IIoT
implementation will become even more complicated and risky than it already is.
... Transitioning to IIoT is costly. Even setting aside the need for new
systems mentioned above, current IIoT expenses are already high. This is because
IIoT uses sophisticated software to analyze productivity and predict future
trends and issues. It is also capable of deploying smart-sensing software for
use in technology and agricultural businesses. Combined with the network that
IIoT provides to companies, the expense of developing a digital strategy can
be hefty.
10 future trends for working with business leaders
CEOs of the world’s largest companies tell IDC that they already make around
30% of their revenue from digital products, and they expect that proportion to
grow in the years to come. IDC identifies three dimensions along which
enterprises can achieve this growth. First, they can exploit new channels:
e-commerce, mobile apps, or the creation of new distribution paths such as
enabling the circular economy. Second, they can adopt additional revenue
models: pay-per-use, subscriptions, dynamic pricing, transaction fees, or
payment for outcomes. And third, they can seek to monetize new digital assets:
data, intellectual property, or virtual objects. Developing such new revenue
streams requires that CIOs keep pressing ahead with digital spending. “If you
pause, you’re already behind,” says Powers. Building new products may involve
skills that CIOs don’t yet have on their roster. “You have to have the right
mix of in-house and partners that can enable quicker development,” says
Powers.
How to prepare for a SOC 2 audit – it’s a big deal, so you’d better get ready
“Companies tend to write their controls down and never look at them again, so
preparing for the audit is an appropriate time to look at and update them if
they don’t reflect what you’re doing,” says Paul Perry, a member of the
Emerging Trends Working Group with the governance group ISACA and the
Security, Risk and Controls Practice Leader with accounting and advisory firm
Warren Averett. Auditors want to see well-documented policies, but they also
want to see them in action to verify that organizations are doing in
day-to-day practice what those policies say they should be doing. For example,
software engineers may be testing code, but they need to do so in a manner
that follows the process and documentation requirements outlined in the
organization’s policies. That’s the kind of action auditors will want to see,
Yawn says. Review security and privacy controls to ensure they’re aligned with
the organization’s own security and privacy policies as well as regulatory
requirements and industry best practices.
Does data need to come from a single source of truth?
Using potentially flawed data in the decision-making process not only leads to
incorrect decisions, but can also have a negative impact on future data
operations. If there isn’t real clarity about where the data comes from, what
its quality is, and what it really means, how can employees really trust that
data? And if they can’t trust it, the consequences can be serious, with
executives developing a negative view of data-driven decision making and
underinvesting in future data projects. It’s a vicious data circle that can
end in a business not fully realising the true value of arguably its most
important asset. It is crucial, therefore, that data is trusted and accurate,
but ensuring data is reliable across multiple different sources is another
challenge entirely. The key is giving employees a single pane of glass through
which to see all of the available data. This not only provides a single point
of reference that allows employees to search for data on a reliable platform,
but also gives them access to data from a wide range of different sources,
such as CRM or ERP systems.
How Chipmakers Are Implementing Confidential Computing
"Everybody wants to continue to reduce the attack surface of data," says
Justin Boitano, vice president and general manager of Nvidia's enterprise and
edge computing operations. "Up to this point, it is obviously encrypted in
transit and at rest. Confidential computing solves the encrypted in-use at the
infrastructure level." Nvidia is taking a divergent approach to confidential
computing with Morpheus, which uses artificial intelligence (AI) to keep
computer systems secure. For example, Morpheus identifies suspicious user
behavior by using AI techniques to inspect network packets for sensitive data.
"Security analysts can go and fix the security policies before it becomes a
problem," Boitano says. "From there, we also realize the big challenges — you
have to kind of assume that people are already in your network, so you have
also got to look at the behavior of users and machines on the network." Nvidia
is also using Morpheus to establish security priorities for analysts tracking
system threats.
Memory-Based Cyberattacks Become More Complex, Difficult To Detect
There are two broad classifications of memory attacks. The first involves
attacks on storage devices that are used to boot or load an operating system
or software for a machine. Greenberg said that often, but not always, these
require physical access to the machine to mount an effective attack on the
storage, although an already compromised machine may further corrupt the
storage such that the machine remains permanently compromised until it is
completely erased and restarted. Encryption can help protect these storage
devices. The second involves RAM devices that store temporary data. These
devices are more likely to be attacked through the machine itself, including
through internet-connected attacks. Physical attacks on RAM are also a
possibility. Most systems’ security comes from physical security combined with
built-in memory protection and run-time security provided through the system.
“But as new ways of exploiting cybersecurity weaknesses are discovered over
time, more advanced memory types tend to contain mitigating features for those
methods,” Greenberg said.
The new CIO security priority: Your software supply chain
Whether it’s components, APIs, or serverless functions, most organizations
underestimate what they’re using by an order of magnitude unless they run
routine inventories, Worthington points out. “They find out that some of these
APIs aren’t using proper authentication methods or are maybe not written in a
way that they expected them to be and maybe some of them are even deprecated,”
she says. Beyond vulnerabilities, evaluating the community support behind a
package is as important as understanding what the code does because not all
maintainers want the burden of having their code treated as a critical
resource. “Not all open source is made the same,” she warns. “Open source
may be free to download but certainly the use of it is not free. Your use of
it means that you are responsible for understanding the security posture
behind it, because it’s in your supply chain. You need to contribute back to
it. Your developers need to participate in fixing vulnerabilities,” says
Worthington, who suggests organizations should also be prepared to contribute
monetarily, either directly to open-source projects or to initiatives that
support them with resources and funds.
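To make the inventory point concrete, here is a hedged Go sketch that asks
the public OSV.dev vulnerability database whether one dependency has known
advisories; the module name and version are hypothetical placeholders, not a
real finding.

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"net/http"
)

// osvQuery mirrors the request shape of the OSV.dev /v1/query endpoint.
type osvQuery struct {
	Version string `json:"version"`
	Package struct {
		Name      string `json:"name"`
		Ecosystem string `json:"ecosystem"`
	} `json:"package"`
}

// osvResponse captures just the fields this sketch needs.
type osvResponse struct {
	Vulns []struct {
		ID      string `json:"id"`
		Summary string `json:"summary"`
	} `json:"vulns"`
}

func main() {
	var q osvQuery
	q.Version = "1.2.3"                // hypothetical version
	q.Package.Name = "example.com/pkg" // hypothetical Go module
	q.Package.Ecosystem = "Go"

	body, _ := json.Marshal(q)
	resp, err := http.Post("https://api.osv.dev/v1/query",
		"application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	var result osvResponse
	if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
		log.Fatal(err)
	}
	// An empty list means OSV.dev has no known advisories for this version.
	for _, v := range result.Vulns {
		fmt.Printf("%s: %s\n", v.ID, v.Summary)
	}
}

Checks like this, run per dependency, are what tools such as osv-scanner
automate across whole lockfiles, which is one practical way to keep the
routine inventories Worthington recommends.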
The Future of DevOps Is No-Code
For DevOps, the starting point for upskilling is to train non-DevOps personnel
to become effective members of the DevOps team. And this is where no-code and
low-code DevOps tools come in. With no-code and low-code tools, even complete
development novices can learn to build websites and applications. If someone
has enough computer knowledge to drag and drop, they can probably learn
no-code tools. And those with a little more computer savvy can put low-code
tools to good use. As their name suggests, no-code and low-code tools
facilitate software and application development with minimal need for writing
or understanding code. Instead of building code, developers rely on visual,
drag-and-drop processes to piece together pre-made functionality. So instead
of needing to understand the intricacies of specific programming languages,
developers need only have a good feel for the business’s needs, the overall
application architecture, and the application’s workflows.
Quote for the day:
"You don't have to hold a position in
order to be a leader." -- Anthony J. D'Angelo