Universal Stablecoins, the End of Cash and CBDCs: 5 Predictions for the Future of Money
Many of the features that decentralized finance, or DeFi, brings to the table
will be copied by regular finance in the future. For instance, there’s no reason
that regular finance can’t copy the automaticity and programmability that DeFi
offers, without bothering with the blockchain part. Even as regular finance
copies the useful bits from DeFi, DeFi will emulate regular finance by pulling
itself into the same regulatory framework. That is, DeFi tools will become
compliant with anti-money laundering/know-your-customer (AML/KYC) rules,
registered with the Securities and Exchange Commission (SEC), or licensed with
the Office of the Comptroller of the Currency (OCC). And not necessarily
because they are forced
to do so. (It’s hard to force a truly decentralized protocol to do anything.)
Tools will comply voluntarily. Most of the world’s capital is licit capital.
Licit capital wants to be on regulated venues, not illegal ones. To capture this
capital, DeFi has no choice but to get compliant. The upshot is that over time
DeFi and traditional finance (TradFi) will blur together.
10 Rules for Better Cloud Security
Security in the cloud follows a pattern known as the shared responsibility
model, which states that the provider is responsible only for security ‘of’ the
cloud, while customers are responsible for security ‘in’ the cloud. In practice,
this means that to operate in the cloud you still need to do your share of the
work on secure configuration and management. The scope of your
commitment can vary widely because it depends on the services you are using: if
you’ve subscribed to an Infrastructure as a Service (IaaS) product, you are
responsible for OS patches and updates. If you only require object storage, your
responsibility scope will be limited to data loss prevention. Despite this great
diversity, some guidelines apply no matter what your situation is. The reason
is simple: essentially all cloud vulnerabilities come down to one thing,
misconfiguration. Cloud providers have put powerful security tools at your
disposal, yet we know they will fail at some point: people make mistakes, and
misconfigurations are easy.
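As a minimal sketch of how such misconfiguration checks can be automated, the
snippet below audits a hypothetical storage-bucket configuration object for a
few common weak settings; the BucketConfig shape and its field names are
assumptions for this example, not any provider's actual API.

// Minimal sketch: flag common storage-bucket misconfigurations.
// The BucketConfig shape and field names are illustrative assumptions,
// not a real cloud provider's API.
interface BucketConfig {
  name: string;
  publicReadAccess: boolean;
  encryptionAtRest: boolean;
  versioningEnabled: boolean;
}

function auditBucket(config: BucketConfig): string[] {
  const findings: string[] = [];
  if (config.publicReadAccess) {
    findings.push(`${config.name}: bucket is publicly readable`);
  }
  if (!config.encryptionAtRest) {
    findings.push(`${config.name}: encryption at rest is disabled`);
  }
  if (!config.versioningEnabled) {
    findings.push(`${config.name}: versioning is disabled (no protection against accidental deletes)`);
  }
  return findings;
}

// Example usage with made-up values.
const report = auditBucket({
  name: "customer-exports",
  publicReadAccess: true,
  encryptionAtRest: false,
  versioningEnabled: true,
});
report.forEach((finding) => console.warn(finding));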
Unit testing vs integration testing
Tests need to run to be effective. One of the great advantages of automated
tests is that they can run unattended. Automating tests in CI/CD pipelines is
considered a best practice, if not mandatory, according to most DevOps
principles. There are multiple stages when the system can and should trigger
tests. First, tests should run when someone pushes code to one of the main
branches. That push may be part of a pull request. In any case, you need to
protect merges into main branches so that code lands only after all tests pass.
Set up CD tooling so code changes deploy only
when all tests have passed. This setup can apply to any environment or just to
the production environment. This failsafe is crucial to avoid shipping quick
fixes for issues without properly checking for side effects. While the
additional check may slow you down a bit, it is usually worth the extra time.
You may also want to run tests periodically against resources in production, or
some other environment. This practice lets you know that everything is still up
and running. Service monitoring is even more important to guard your production
environment against unwanted disruptions.
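To make the distinction in the title concrete, here is a minimal sketch using
Node's built-in test runner; the priceWithTax function and the staging URL are
invented for this example, and the second test stands in for the kind of
slower, environment-dependent check you would gate to a later pipeline stage or
a periodic run.

// Minimal sketch contrasting a unit test with an integration test,
// using Node's built-in test runner (Node 18+). The function and
// URL below are illustrative only.
import { test } from "node:test";
import assert from "node:assert/strict";

function priceWithTax(price: number, taxRate: number): number {
  return Math.round(price * (1 + taxRate) * 100) / 100;
}

// Unit test: pure logic, no I/O, cheap enough to run on every push.
test("priceWithTax applies the rate", () => {
  assert.equal(priceWithTax(100, 0.2), 120);
});

// Integration test: talks to a real (here, hypothetical) service, so it is
// slower and usually gated to later pipeline stages or periodic runs.
test("health endpoint responds", async () => {
  const res = await fetch("https://staging.example.com/health");
  assert.equal(res.ok, true);
});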
Vulnerability Management | A Complete Guide and Best Practices
Managing vulnerabilities helps organizations avoid unauthorized access, illicit
credential usage, and data breaches. This ongoing process starts with a
vulnerability assessment. A vulnerability assessment identifies, classifies, and
prioritizes flaws in an organization's digital assets, network infrastructure,
and technology systems. Assessments are typically recurring and rely on scanners
to identify vulnerabilities. Vulnerability scanners look for security weaknesses
in an organization's network and systems. Vulnerability scanning can also
identify issues such as system misconfigurations, improper file sharing, and
outdated software. Most organizations first use vulnerability scanners to
capture known flaws. Then, for more comprehensive vulnerability discovery, they
use ethical hackers to find new, often high-risk or critical vulnerabilities.
Organizations have access to several vulnerability management tools to help look
for security gaps in their systems and networks.
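As a simple illustration of the identify, classify, and prioritize step, the
sketch below assigns severity bands to scanner findings using the common CVSS
v3 score ranges and sorts them for remediation; the Finding shape and the
sample entries are assumptions made for this example.

// Sketch: classify and prioritize scanner findings by CVSS v3 base score.
// The Finding shape and sample data are illustrative assumptions.
interface Finding {
  id: string;      // e.g. a CVE identifier
  asset: string;   // affected host or service
  cvss: number;    // base score, 0.0 - 10.0
}

type Severity = "critical" | "high" | "medium" | "low";

// Common CVSS v3 severity bands.
function severity(score: number): Severity {
  if (score >= 9.0) return "critical";
  if (score >= 7.0) return "high";
  if (score >= 4.0) return "medium";
  return "low";
}

// Highest-risk findings first, so remediation work starts at the top.
function prioritize(findings: Finding[]): Array<Finding & { severity: Severity }> {
  return findings
    .map((f) => ({ ...f, severity: severity(f.cvss) }))
    .sort((a, b) => b.cvss - a.cvss);
}

console.table(
  prioritize([
    { id: "CVE-2021-44228", asset: "payments-api", cvss: 10.0 },
    { id: "OUTDATED-TLS", asset: "legacy-portal", cvss: 5.3 },
  ])
);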
How Web 3.0 is Going to Impact the Digital World?
The concept of a trustless network is not new. The exclusion of any so-called
“trusted” third parties from any sort of virtual transactions or interactions
has long been a sought-after ideal. Considering how data theft is a prominent
concern among internet users worldwide, trusting third parties with our data
doesn’t seem right. Trustless networks ensure that no intermediaries interfere
in any online transactions or interactions. A close example of trustlessness is
the uber-popular blockchain technology. Blockchain is mostly used in
transactions involving cryptocurrencies. It defines a protocol under which only
the individuals participating in a transaction are connected in a peer-to-peer
manner. No intermediary is involved. Social media enjoys immense popularity
today. And understandably so, for it allows us to connect and interact with
people we know and strangers alike, without any geographical limits. But the
firms that own social media platforms are few, and these few firms hold the
information of millions of people. Sounds scary, right?
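To make the earlier point about blockchain's peer-to-peer protocol a little
more concrete, here is a toy sketch of how blocks are chained by hashes so that
tampering with any earlier record is detectable by every participant; it is a
simplified illustration only and omits consensus, signatures, and networking.

// Toy sketch of how blocks are chained by hashes. This shows the
// tamper-evidence that underpins a trustless ledger; it omits consensus,
// signatures, and networking entirely.
import { createHash } from "node:crypto";

interface Block {
  index: number;
  data: string;       // e.g. a transaction between two peers
  prevHash: string;   // hash of the previous block
  hash: string;       // hash of this block's contents
}

function hashBlock(index: number, data: string, prevHash: string): string {
  return createHash("sha256").update(`${index}|${data}|${prevHash}`).digest("hex");
}

function appendBlock(chain: Block[], data: string): Block[] {
  const prev = chain[chain.length - 1];
  const index = prev ? prev.index + 1 : 0;
  const prevHash = prev ? prev.hash : "0".repeat(64);
  return [...chain, { index, data, prevHash, hash: hashBlock(index, data, prevHash) }];
}

// Any change to an earlier block changes its hash, which no longer matches the
// prevHash recorded in the next block, so tampering is detectable by anyone.
function verify(chain: Block[]): boolean {
  return chain.every((b, i) => {
    const expectedPrev = i === 0 ? "0".repeat(64) : chain[i - 1].hash;
    return b.prevHash === expectedPrev && b.hash === hashBlock(b.index, b.data, b.prevHash);
  });
}

let chain: Block[] = [];
chain = appendBlock(chain, "alice pays bob 5");
chain = appendBlock(chain, "bob pays carol 2");
console.log(verify(chain)); // true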
Is TypeScript the New JavaScript?
As a statically typed language, TypeScript performs type checks at compile
time, flagging type errors and helping developers spot mistakes early in
development.
Reducing errors when working with large codebases can save hours of development
time. Clear and readable code is easy to maintain, even for newly onboarded
developers. Because TypeScript calls for assigning types, the code instantly
becomes easier to work with and understand. In essence, TypeScript code is
self-documenting, allowing distributed teams to work much more efficiently.
Teams don’t have to spend inordinate amounts of time familiarizing themselves
with a project. TypeScript’s integration with editors also makes it much easier
to validate the code thanks to context-aware suggestions. TypeScript can
determine what methods and properties can be assigned to specific objects, and
these suggestions tend to increase developer productivity. TypeScript is widely
used to automate the deployment of infrastructure and CI/CD pipelines for
backend and web applications. Moreover, the client and the backend can be
written in the same language: TypeScript.
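A small sketch of the compile-time checking and self-documenting types
described above; the User interface, the fetchUser function, and the
commented-out mistakes are invented for this example.

// The explicit types double as documentation: a new team member can see
// at a glance what a User contains and what fetchUser returns.
interface User {
  id: number;
  name: string;
  email?: string; // optional: may be absent
}

async function fetchUser(id: number): Promise<User> {
  // Stubbed for the example; a real implementation would call an API.
  return { id, name: "Ada Lovelace" };
}

async function greet(id: number): Promise<string> {
  const user = await fetchUser(id);
  // The compiler flags mistakes before the code ever runs:
  // return "Hello, " + user.fullName;    // error: 'fullName' does not exist on type 'User'
  // const other = await fetchUser("42"); // error: 'string' is not assignable to 'number'
  return `Hello, ${user.name}`;
}

greet(1).then(console.log);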
4 signs you’re experiencing burnout, according to a cognitive scientist
One key sign of burnout is that you don’t have motivation to get any work done.
You might not even have the motivation to want to come to work at all. Instead,
you dread the thought of the work you have to do. You find yourself hating both
the specific tasks you have to do at work, as well as the mission of the
organization you’re working for. You just can’t generate enthusiasm about work
at all. A second symptom is a lack of resilience. Resilience is your ability to
get over a setback and get yourself back on course. It’s natural for a failure,
bad news, or criticism to make you feel down temporarily. But, if you find
yourself sad or angry for a few days because of something that happened at work,
your level of resilience is low. When you’re feeling burned out, you also tend
to have bad interactions with your colleagues. You find it hard to resist
saying something negative or mean. You can’t hide your negative feelings about
things or people, which can upset others. In this way, your negative
feelings about work become self-fulfilling, because they actually create more
unpleasant situations.
Spotting a Modern Business Crisis — Before It Strikes
Modern technologies such as more-efficient supply chain operations, the
internet, and social media have not only increased the pace of change in
business but have also drawn more attention to its impact on society. Fifty
years ago, oversight of companies was largely the domain of regulatory
agencies and specialized consumer groups. What the public knew was largely
defined by what businesses were required to disclose. Today, however, public
perception of businesses is affected by a diverse range of stakeholders —
consumers, activists, local or national governments, nongovernmental
organizations, international agencies, and religious, cultural, or scientific
groups, among others. ... There are a few ways businesses can identify risks.
One, externalize expertise through insurance and consulting companies that
identify sociopolitical or climate risks. Two, hire the right talent for risk
assessment. Three, rely on government agencies, media, industry-specific
institutions, or business leaders’ own experience of risk perception. A
fail-safe approach is to use all three mechanisms in tandem, if possible.
Today’s Most Vital Question: What is the Value of Your Data?
Data has latent value; that is, data has potential value that has not yet been
realized. The possession of data in and of itself provides zero economic value;
in fact, possessing data carries storage, management, security, and backup
costs, as well as potential regulatory and compliance liabilities.
... Data must be “activated” or put into use in order to convert that latent
(potential) value of data into kinetic (realized) value. The key is getting
business stakeholders to envision where and how to apply data (and
analytics) to create new sources of customer, product, service, and
operational value. The good news is that most organizations are very clear as to
where and how they create value. ... The value of the organization’s data is
tied directly to its ability to support quantifiable business outcomes or Use
Cases. ... Many data management and data governance projects stall out because
organizations lack a business-centric methodology for determining which of their
data sources are the most valuable.
Federal watchdog warns security of US infrastructure 'in jeopardy' without action
The report was released in conjunction with a hearing on securing the nation’s
infrastructure held by the House Transportation and Infrastructure Committee on
Thursday. Nick Marinos, the director of Information Technology and Cybersecurity
at GAO, raised concerns in his testimony that the U.S. is “constantly operating
behind the eight ball” on addressing cyber threats. “The reality is that it just
takes one successful cyberattack to take down an organization, and each federal
agency, as well as owners and operators of critical infrastructure, have to
protect themselves against countless numbers of attacks, and so in order to do
that, we need our federal government to be operating in the most strategic way
possible,” Marinos testified to the committee. According to the report, GAO has
made over 3,700 recommendations related to cybersecurity at the federal level
since 2010, and around 900 of those recommendations have not been addressed.
Marinos noted that 50 of the unaddressed concerns are related to critical
infrastructure cybersecurity.
Quote for the day:
"Self-control is a critical leadership
skill. Leaders generally are able to plan and work at a task over a longer
time span than those they lead." -- Gerald Faust