Customer.io Email Data Breach Larger Than Just OpenSea
The company is not revealing how many emails are now at heightened risk of
phishing attempts as a result of the "deliberate actions" of the former
employee. Non-fungible token marketplace platform OpenSea partially divulged the
incident late last month when it warned anyone who had ever shared an email
address with it about the unauthorized transfer of contact information.
Approximately 1.9 million users have made at least one transaction on the
platform, shows data from blockchain market firm Dune Analytics. Customer.io did
not identify the other affected companies to Information Security Media Group or
specify the sectors in which they operate. The affected parties have been
alerted, the company says. The incident underscores the continuing threat posed
by insiders, who account for 20% of all security incidents, according to the
most recent Verizon Data Breach Investigations Report. The costs of insider breaches,
whether caused by human error or bad actors, are rising: the Ponemon Institute found
a 47% increase over the past two years.
Making the DevOps Pipeline Transparent and Governable
When DevOps was an egg, it really was an approach that was radically different
from the norm. And what I mean, obviously for people who remember it back then,
is that it was the continuous... It had nothing to do with Agile. It was really about
continuous delivery of software into the environment in small chunks,
microservices coming up. It was delivering very specific pieces of code into the
infrastructure, continuously, evaluating the impact of that release and then
making adjustments and changes in response to the feedback that gave you. So the
fail-forward thing was very much an accepted behavior. What it glossed over a bit
at the time was that it removed a lot of the compliance and regulatory type of
mandatory steps that people would use in the more traditional ways of developing
and delivering code, but it was a fledgling practice. And from that base form, it
became a much, much bigger one. So really what that culturally meant was that
initially it was many, many small teams working in combination toward a bigger
outcome, whether it was stories in support of epics or whatever the response was.
SQL injection, XSS vulnerabilities continue to plague organizations
Critical and high findings were low in mobile apps, just over 7% for Android
apps and close to 5% for iOS programs. Among the most common high and critical
errors in mobile apps identified in the report were credentials hard-coded into
the apps themselves. Using these credentials, attackers can gain access to sensitive
information, the report explained. More than 75% of the errors found in APIs
were in the low category. However, the report warns that low risk doesn’t equate
to no risk. Threat actors don’t consider the severity of the findings before
they exploit a vulnerability, it warned. Among the highest critical risks found
in APIs were missing function-level access controls (47.55%) and Log4Shell
vulnerabilities (17.48%). Of all high and critical findings across companies,
the report noted, 87% were found in organizations with fewer than 200 employees.
The report identified several reasons for that, including cybersecurity being an
afterthought in relatively small organizations; a dearth of bandwidth, security
know-how, and staffing; a lack of security leadership and budget; and the speed
of business overpowering the need to do business securely.
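To make the report's recurring findings concrete, here is a minimal Python sketch, purely illustrative and not taken from the report, contrasting an injectable query with a parameterized one, and a hard-coded credential with one read from the environment. The table, column, and variable names are assumptions.

```python
import os
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # SQL injection risk: user input is concatenated straight into the query text.
    return conn.execute(
        "SELECT id, email FROM users WHERE name = '" + username + "'"
    ).fetchall()

def find_user_safe(conn: sqlite3.Connection, username: str):
    # Parameterized query: the driver binds the value, closing the injection path.
    return conn.execute(
        "SELECT id, email FROM users WHERE name = ?", (username,)
    ).fetchall()

# Hard-coded credential, the pattern the report flags in mobile apps:
API_KEY = "sk_live_123456"               # readable by anyone who can open the binary

# Safer pattern: pull the secret from the environment or a secrets manager at runtime.
API_KEY = os.environ.get("PAYMENTS_API_KEY")
```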
Three golden rules for building the business case for Kubernetes
Cost, customer service and efficiency are the three typical considerations any
business weighs up when it comes to making new investments. Whether a new
initiative will reduce costs in the long run, and be worth the initial expense, is
a question decision makers weigh up all the time. Kubernetes meets this first test
because it addresses the challenge of managing the potentially thousands or
tens of thousands of containers a large enterprise might have deployed. ...
The second consideration is whether the investment will mitigate the risk of
losing a customer. Is the ability to serve their needs improved as a result of
the changes? Again, Kubernetes meets the criteria here. By taking a
microservices approach to applications, it allows them and the underlying
resources they need to be scaled up or down, based on the current needs of the
organization. ... The third and final consideration is whether the new
technology or initiative will improve the ways the business operates. What
might it achieve that a business couldn’t do before?
Infrastructure-as-Code Goes Low Code/No Code
The cross-disciplinary skill set required by IaC — someone with security,
operations and coding experience — is a niche, Thiruvengadam told The New
Stack. San Jose, Calif.-based DuploCloud targets that need with a
low-code/no-code solution. “The general idea with DuploCloud is that you can
use infrastructure-as-code, but you just have to write a lot less lines of
code,” he said. “A lot of people who don’t have all the three skill sets still
can operate at the same scale and efficiency, using this technology — that’s
fundamentally the core advantage.” Unlike some solutions, which rely on
ready-made modules or libraries, Thiruvengadam said that DuploCloud uses a low
code interface to put together the rules for its rules-based engine, which
then runs through the rules to produce the output. The self-hosted
single-tenant solution is deployed within the customer’s cloud account.
Currently, it supports deployment on Amazon Web Services, Microsoft Azure and
Google Cloud, and it can run on-premises as well.
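As a rough illustration of the rules-engine idea, not of DuploCloud's actual product or API, a rules-based generator can expand a short declarative spec into a fuller infrastructure definition. Everything in this Python sketch is hypothetical.

```python
def apply_rules(spec: dict) -> dict:
    """Expand a short, low-code style spec into a fuller infrastructure declaration."""
    resource = {"name": spec["name"], "image": spec["image"]}
    # Rule: internet-facing services get a load balancer and TLS by default.
    if spec.get("public", False):
        resource["load_balancer"] = {"type": "application", "tls": True}
    # Rule: production workloads get multi-AZ placement and autoscaling bounds.
    if spec.get("env") == "prod":
        resource["replicas"] = {"min": 3, "max": 10}
        resource["multi_az"] = True
    return resource

# A few lines of intent in, a detailed (here, simplified) resource definition out.
print(apply_rules({"name": "orders-api", "image": "orders:1.4", "public": True, "env": "prod"}))
```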
The Compelling Implications of Using a Blockchain to Record and Verify Patent Assignments
Smart contracts could be used to put various types of conditions and
obligations on a patent asset. For example, companies might incentivize their
inventors to disclose more inventions by placing an obligation on all future
owners of an asset to pay the inventors some percentage of future licensing,
sales, settlements, or judgments involving that asset (e.g., the inventors
get 10% of the total value of such transactions). This would allow inventors
of commercially-valuable patents to enjoy the financial benefits of their
inventions in a fashion that is more equitable than, say, a one-time nominal
payout upon filing or grant. Since patents can only be asserted when all
owners agree to do so, such contracts would have to clearly separate ownership
of a patent asset from an obligation of the owner to compensate a previous
owner for the asset's future revenue. Another potential use of smart contracts
would be for ownership of an issued patent to revert to its previous owner
should the current owner fail to pay maintenance fees on time.
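The two conditions described above can be modeled in ordinary code. The following Python sketch is a toy model, not an actual on-chain smart contract, and all names and numbers are illustrative.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PatentAsset:
    patent_number: str
    owner: str
    previous_owner: Optional[str] = None
    inventors: list = field(default_factory=list)
    inventor_share: float = 0.10   # inventors' 10% of licensing/sale/settlement value

    def record_revenue_event(self, amount: float) -> dict:
        """Split a licensing, sale, or settlement payment per the standing obligation."""
        royalty = amount * self.inventor_share
        per_inventor = royalty / len(self.inventors)
        payouts = {inv: per_inventor for inv in self.inventors}
        payouts[self.owner] = amount - royalty
        return payouts

    def pay_maintenance_fee(self, paid_on_time: bool) -> None:
        """If the fee is missed, ownership reverts to the previous owner."""
        if not paid_on_time and self.previous_owner is not None:
            self.owner, self.previous_owner = self.previous_owner, self.owner

# Example: the inventors keep receiving 10% of a later $1M license fee.
asset = PatentAsset("US1234567", owner="AcmeCo", previous_owner="FounderLabs",
                    inventors=["Alice", "Bob"])
print(asset.record_revenue_event(1_000_000))
# {'Alice': 50000.0, 'Bob': 50000.0, 'AcmeCo': 900000.0}
```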
Breaking down the crypto ecosystem
According to several Indian and global reports, a reduction in transaction costs
is expected to further propel this market's growth in the next few years. In
line with global trends, increasing adoption of digital currency by businesses,
coupled with talk of a government-backed digital currency in the country, is
anticipated to further bolster the growth of the cryptocurrency market. In the
present Web 2.0 environment, establishing trust and creating
social identities of the network participants has been an uphill task that the
ecosystem is unable to overcome. And since almost all economic value is traded
based on human relationships, it is a fundamental roadblock to innovation and
growth in Web 2.0. However, the boom in cryptocurrency and blockchain has
fuelled a rapid transition towards Web 3.0, where we have witnessed exponential
growth, especially in enablers like NFTs, which have made it possible to acquire,
store, and distribute economic value among users. In fact, the
introduction of SBTs (SoulBound Tokens) could be the final piece in the puzzle
for the Web 3.0 ecosystem.
What is observability? A beginner's guide
For decades, businesses that control and depend on complex distributed systems
have struggled to deal with problems whose symptoms are often buried in floods
of irrelevant data or those that show high-level symptoms of underlying
issues. The science of root cause analysis grew out of this problem, as did
the current focus on observability. By focusing on the states of a system
rather than on the state of the elements of the system, observability provides
a better view of the system's functionality and ability to serve its mission.
It also helps ensure an optimal user and customer experience. Observability is
proactive where necessary, meaning it includes techniques to add visibility to
areas where it might be lacking. In addition, it is reactive in that it
prioritizes existing critical data. Observability can also tie raw data back
to more useful "state of IT" measures, such as key performance indicators
(KPIs), which are effectively a summation of conditions to represent broad
user experience and satisfaction.
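As a small illustration of rolling raw telemetry up into a KPI, the following Python sketch summarizes per-request latency samples into two dashboard-style measures. The threshold and sample values are assumptions, not from the article.

```python
from statistics import quantiles

raw_latencies_ms = [42, 51, 38, 47, 950, 44, 61, 39, 1200, 55]   # e.g. pulled from traces or logs
SLO_THRESHOLD_MS = 500                                            # assumed service-level target

# KPI: share of requests meeting the latency target, plus a p95 summary.
within_slo = sum(1 for ms in raw_latencies_ms if ms <= SLO_THRESHOLD_MS)
kpi_pct_within_slo = 100.0 * within_slo / len(raw_latencies_ms)
p95_latency = quantiles(raw_latencies_ms, n=20)[18]               # 95th percentile

print(f"Requests within SLO: {kpi_pct_within_slo:.1f}%  |  p95 latency: {p95_latency:.0f} ms")
```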
10 trends shaping the chief data officer role
CDOs may need to rethink cybersecurity in response to the growth in data
sources and data volume, said Christopher Scheefer, vice president of
intelligent industry at Capgemini Americas. These new and nontraditional data
streams require additional methods of securing and managing access to data.
"The importance of cybersecurity in a pervasively connected world is a trend
many CDOs cannot ignore due to the growing threats of IP infringement,
regulatory risks and exposure to a potentially damaging event," Scheefer said.
Rethinking and reimagining cybersecurity is no small feat. The level of
complexity of integrating connected products and operations into the business
presents an incredible amount of risk. Establishing proper governance and tools,
and working with cybersecurity leadership, is critical. It is the CDO's job to
ensure the business does not constrain itself by limiting external connections
and services that could bring competitive advantage and paths to growth,
Scheefer said.
Streamlining Unstructured Data Migration for M&A and Divestitures
It’s common to take all or most of the data from the original entity and dump
it onto storage infrastructure at the new company. While this may seem like
the simplest way to handle a data migration, it’s problematic for several
reasons. First, it’s highly inefficient. You end up transferring lots of data
that the new business may not actually need or records for which the mandatory
retention period may have expired. A blind data dump from one business to
another also increases the risk that you’ll run afoul of compliance or
security requirements that apply to the new business entity but not the
original one. For instance, the new business may be subject to GDPR data
privacy mandates because of its location in Europe. But if you simply move
data between businesses without knowing what’s in the data or which mandates
it needs to meet, you’re unlikely to satisfy those requirements following the
transfer. Last but not least, blindly moving and storing data deprives you of
the ability to trace the origins of data after the fact.
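By contrast with a blind dump, a migration can filter on what the new entity actually needs and on retention status before anything moves. The following Python sketch is hypothetical; the field names and the seven-year retention window are assumptions for illustration.

```python
from datetime import date, timedelta

RETENTION = timedelta(days=7 * 365)   # assumed mandatory retention period

records = [
    {"id": 1, "owner_unit": "divested_unit", "created": date(2012, 3, 1)},
    {"id": 2, "owner_unit": "divested_unit", "created": date(2021, 6, 15)},
    {"id": 3, "owner_unit": "retained_unit", "created": date(2020, 1, 9)},
]

to_migrate = [
    r for r in records
    if r["owner_unit"] == "divested_unit"            # needed by the new business entity
    and date.today() - r["created"] <= RETENTION     # still inside its retention period
]

print([r["id"] for r in to_migrate])   # e.g. [2]: expired and out-of-scope records stay behind
```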
Quote for the day:
"Leaders must be good listeners. It's
rule number one, and it's the most powerful thing they can do to build
trusted relationships." -- Lee Ellis