Cloud may be overpriced compared to on-premises systems
Public cloud computing prices have been creeping up because they are offered by
for-profit companies that must generate a profit. Running a public cloud service
is costly, and the billions invested over the past 12 years must show investors
a return. That’s why prices have been increasing, not to mention the additional
value cloud providers can offer, such as integrated AI, finops, operations,
etc. At the same time, the cost of producing hardware, such as traditional
HDD storage, has dropped to new lows, making it a viable alternative to
cloud-based storage systems. Choosing cloud computing over traditional
hardware is therefore no longer a quick decision. ... Again, this is about
being entirely objective when looking at all potential solutions, including
cloud and on-premises. Costs being equal, cloud computing will be the better
choice nine times out of 10, but now that the prices diverge significantly,
that may not be the case. If you're the person making these calls, you must
consider all aspects of these solutions, including future requirements.
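As a back-of-the-envelope illustration of that comparison, here is a minimal Python sketch; every figure in it (per-GB cloud rate, HDD cost per terabyte, replication factor, operations overhead) is a hypothetical placeholder, not a quote from any provider:

    # Back-of-the-envelope storage cost comparison.
    # All figures below are hypothetical assumptions, not real price quotes.

    TB = 1024  # GB per TB

    def cloud_cost(tb, months, price_per_gb_month=0.02):
        """Cumulative cost of cloud object storage (pay-as-you-go)."""
        return tb * TB * price_per_gb_month * months

    def on_prem_cost(tb, months, hdd_price_per_tb=20.0,
                     replication=2, ops_per_tb_month=0.5):
        """Up-front HDD purchase (with replication) plus ongoing power/ops."""
        capex = tb * replication * hdd_price_per_tb
        opex = tb * ops_per_tb_month * months
        return capex + opex

    for months in (12, 36, 60):
        print(f"{months:>2} months: cloud ${cloud_cost(100, months):>9,.0f} "
              f"vs on-prem ${on_prem_cost(100, months):>9,.0f}")

The point of the sketch is the shape of the curves, not the numbers: cloud costs scale linearly with time, while on-premises is capex-heavy up front, so the crossover point depends entirely on the prices you plug in.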
Leveraging Cloud DevOps to drive digital optimization and maximize cloud benefits
Organizations need to consider several factors to ensure a successful
implementation that delivers desired business value. They must begin by
identifying the business drivers and putting in place a change management
program. They must train or upskill their employees and focus on the structural
and process changes required to foster collaboration between different
functions. Companies must define measurable goals and KPIs and establish
governance after considering the prevailing technology landscape and the future
roadmap. They must know and assess their existing infrastructure and
portfolio, their requirements, the current challenges they face, the right cloud
mix that’s appropriate to their needs, and so on. They must plan their resources
and identify their security needs to ensure Cloud DevOps can meet these
requirements. ... The success of Cloud DevOps can be measured along four primary
dimensions – efficiency, agility, reliability, and quality.
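As a sketch of what "measurable goals and KPIs" can look like in practice, the following Python snippet maps the four dimensions to DORA-style delivery metrics; the field names and thresholds are illustrative assumptions, not prescriptions from the article:

    from dataclasses import dataclass
    from datetime import timedelta

    # Hypothetical KPI record; DORA-style metrics are one common way to
    # make the four dimensions concrete (names/thresholds are assumptions).
    @dataclass
    class DevOpsKPIs:
        deploys_per_week: float    # agility: deployment frequency
        lead_time: timedelta       # efficiency: commit-to-production time
        change_failure_rate: float # quality: share of deploys causing incidents
        mttr: timedelta            # reliability: mean time to restore service

    def meets_targets(k: DevOpsKPIs) -> bool:
        """Example governance gate with illustrative, not prescriptive, targets."""
        return (k.deploys_per_week >= 1
                and k.lead_time <= timedelta(days=7)
                and k.change_failure_rate <= 0.15
                and k.mttr <= timedelta(hours=24))

    print(meets_targets(DevOpsKPIs(3, timedelta(days=2), 0.10, timedelta(hours=6))))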
Spatial Data Science: The Basics You Need to Know
“Spatial data science is data science on geospatial data -- location data,
navigation data, GPS data, any data that is geocoded,” Kobielus explained.
“Geospatial data science builds on and extends the capabilities of geographic
information systems.” ... When asked about principal use cases for spatial data
science, Kobielus suggested several possibilities. “A core and mainstream
enterprise application for spatial data science has been address management.
Customer information management needs to be integrated with permanent addressing,
which is then geocoded, so that as your customers move around you always know
what their actual address is.” Other possible uses include determining optimal
locations for things such as retail outlets or manufacturing facilities,
optimizing supply chain logistics, tracking inventory, personalizing user
experiences on mobile devices, allowing businesses to provide targeted content,
and indoor applications to help organizations optimally arrange things within
warehouses or other indoor spaces.
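As a minimal illustration of the geocoding step behind address management, here is a short Python sketch using the open source geopy library against the public Nominatim service (it assumes geopy is installed and the service is reachable; the sample address is arbitrary):

    # Minimal geocoding sketch: turn a street address into coordinates.
    # Assumes `pip install geopy` and network access to public Nominatim.
    from geopy.geocoders import Nominatim

    geolocator = Nominatim(user_agent="address-management-demo")
    location = geolocator.geocode("1600 Pennsylvania Ave NW, Washington, DC")
    if location is not None:
        # A geocoded customer record can now be joined with other
        # location data for the use cases described above.
        print(location.address)
        print((location.latitude, location.longitude))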
Forecasting Team Performance
A useful technique is systems mapping. To do it, you first identify qualitative
factors and situations affecting your organization. By qualitative, I’m
referring to what you can't see on a neat dashboard or chart — the things that
only come up in casual 1:1 calls with people, or in water cooler conversations
and healthy retrospectives. Next, think about second-order effects. These are
the consequences of those qualitative factors, which have further implications
and so on. Researchers have shown that second-order effects are a major blind
spot for the human brain; we think in terms of linear cause-and-effect and so often
miss significant domino-effect repercussions. One way of making sure you’re
sensitive to these second-order effects is to talk to people who have been in
the organization for a long time. They're often the people who are most eager to
share the types of stories that will help you draw connections between seemingly
unrelated situations.
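One way to make a systems map concrete is to treat factors as nodes in a small directed graph and walk it to surface second-order effects. A minimal Python sketch, with purely illustrative factors:

    # Toy systems map: qualitative factors as nodes, "leads to" edges.
    # The factors below are illustrative, not from any real retrospective.
    effects = {
        "key engineer disengaged": ["reviews slow down"],
        "reviews slow down": ["deploys batch up", "juniors blocked"],
        "deploys batch up": ["riskier releases"],
        "juniors blocked": ["morale dips"],
    }

    def consequences(factor, depth=2):
        """Walk the map to surface second-order (and deeper) effects."""
        found, frontier = [], [factor]
        for level in range(1, depth + 1):
            frontier = [e for f in frontier for e in effects.get(f, [])]
            found.extend((level, e) for e in frontier)
        return found

    for level, effect in consequences("key engineer disengaged"):
        print(f"order {level}: {effect}")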
When Do Agile Teams Make Time for Innovation?
The unintended side effects of shorter sprints and sprint commitments can be
devastating for creativity and breakthroughs. Teams that feel pressured by
time or fear of failure aren't going to feel safe to experiment. In the
absence of psychological safety, innovation recedes. It's critical that agile
teams push back against this pressure to deliver and never fail. A recent
Harvard Business Review article on psychological safety said, "In essence,
agile’s core technology isn’t technical or mechanical. It’s cultural." Or as
Entrepreneur.com put it, “Your company needs an innovation culture, not an
innovation team. You don't become an innovative company by hiring a few people
to work on it while everybody else goes through the motions.” As agilists, we
must fight to establish experimentation as part of our company culture.
Companies that value innovation empower self-organizing teams to try new
things, encourage (and fund) continuous learning and improvement, solicit and
act on feedback and ideas, and emphasize collaboration and communication.
Microsoft attacked over ‘grossly irresponsible’ security practice
Yoran said the so-called shared responsibility model of cyber security
espoused by public cloud providers, including Microsoft, was irretrievably
broken if a provider fails to notify users of issues as they arise and apply
fixes openly. He argued that Microsoft was quick to ask for its users’ trust
and confidence, but in return they get “very little transparency and a culture
of toxic obfuscation”. “How can a CISO, board of directors or executive team
believe that Microsoft will do the right thing given the fact patterns and
current behaviours? Microsoft’s track record puts us all at risk. And it’s
even worse than we thought,” said Yoran. “Microsoft’s lack of transparency
applies to breaches, irresponsible security practices and to vulnerabilities,
all of which expose their customers to risks they are deliberately kept in the
dark about,” he added. A Microsoft spokesperson said: “We appreciate the
collaboration with the security community to responsibly disclose product
issues.”
Managing Partnership Misfits
To get the right stakeholders on board and collaborating, project initiators
must combine engagement and containment strategies. And to do this, they need
more practical and nuanced guidance along with a new set of lenses through
which to assess the suitability of potential partners and to identify,
motivate, or control misfits. In other words, they need a tool to identify
potential fault lines in future partnerships and to help iron out or contain
misalignments. Based on in-depth studies of successful and unsuccessful
partnerships, we propose a framework that tests partner fit across three
dimensions: task-fit (what each party needs); goal-fit (what each party aims
to achieve); and relationship-fit (how each party works). How potential
partners measure up on these dimensions flags likely misalignments with a
prospective partner and allows project initiators to design ways to overcome
them. ... You are looking for a partner with the required capabilities or
resources who values the expected gains, which are not just financial rewards,
but could relate to learning, inspiration, or reputation.
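As an illustration only, the framework's three dimensions could be captured in a simple scoring sketch; the 1-to-5 scale and the flagging threshold below are assumptions, not part of the authors' framework:

    from dataclasses import dataclass

    # Illustrative scoring sketch for the three fit dimensions; the 1-5
    # scale and the flagging threshold are assumptions for this example.
    @dataclass
    class PartnerFit:
        task_fit: int          # what each party needs
        goal_fit: int          # what each party aims to achieve
        relationship_fit: int  # how each party works

    def misalignments(fit: PartnerFit, threshold: int = 3):
        """Flag dimensions scoring below the threshold as likely fault lines."""
        return [name for name, score in vars(fit).items() if score < threshold]

    print(misalignments(PartnerFit(task_fit=5, goal_fit=2, relationship_fit=4)))
    # -> ['goal_fit']: probe early what this partner actually hopes to gain.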
Comparing Different Vector Embeddings
In the simplest terms, vector embeddings are numerical representations of
data. They are primarily used to represent unstructured data. Unstructured
data are images, videos, audio, text, molecular images and other kinds of data
that don’t have a formal structure. Vector embeddings are generated by running
input data through a pretrained neural network and taking the output of the
second-to-last layer. Neural networks have different architectures and are
trained on different data sets, making each model’s vector embedding unique.
That’s why working with unstructured data and vector embeddings is
challenging. Later, we’ll see how models with the same base fine-tuned on
different data sets can yield different vector embeddings. The differences in
neural networks also mean that we must use distinct models to process diverse
forms of unstructured data and generate their embeddings. For example, you
can’t use a sentence transformer model to generate embeddings for an image. On
the other hand, you wouldn’t want to use ResNet50, an image model, to generate
embeddings for sentences.
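A minimal sketch of both cases, assuming the sentence-transformers, torch, and torchvision packages and two commonly used public checkpoints (all-MiniLM-L6-v2 and ResNet50):

    # Modality-specific embedding models: one for text, one for images.
    # Assumes `pip install sentence-transformers torch torchvision`.
    import torch
    from torchvision import models
    from sentence_transformers import SentenceTransformer

    # Text: a sentence transformer maps sentences to fixed-size vectors.
    text_model = SentenceTransformer("all-MiniLM-L6-v2")
    text_emb = text_model.encode("Vector embeddings represent unstructured data.")
    print(text_emb.shape)  # (384,) for this checkpoint

    # Images: drop ResNet50's final classification layer and use the
    # penultimate (pooled) activations as a 2048-d embedding.
    resnet = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
    backbone = torch.nn.Sequential(*list(resnet.children())[:-1])
    backbone.eval()
    with torch.no_grad():
        image = torch.rand(1, 3, 224, 224)  # stand-in for a preprocessed image
        image_emb = backbone(image).flatten(1)
    print(image_emb.shape)  # torch.Size([1, 2048])

Note that the two vectors have different dimensions and live in unrelated spaces, which is exactly why a text model cannot embed images and vice versa.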
Could C2PA Cryptography be the Key to Fighting AI-Driven Misinformation?
The C2PA specification is an open source internet protocol that outlines how
to add provenance statements, also known as assertions, to a piece of content.
Provenance statements might appear as buttons viewers could click to see
whether the piece of media was created partially or totally with AI. Simply
put, provenance data is cryptographically bound to the piece of media, meaning
any alteration to either one signals that the media can no longer be
authenticated. You can learn more about how this cryptography
works by reading the C2PA technical specifications. This protocol was created
by the Coalition for Content Provenance and Authenticity, also known as C2PA.
Adobe, Arm, Intel, Microsoft and Truepic all support C2PA, which is a joint
project that brings together the Content Authenticity Initiative and Project
Origin. The Content Authenticity Initiative is an organization founded by
Adobe to encourage providing provenance and context information for digital
media.
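To make the binding idea concrete, here is a deliberately simplified Python sketch. It is not the C2PA wire format (the actual specification defines signed manifests; see the technical specifications), only an illustration of why altering either the media or the provenance breaks verification:

    # Simplified illustration of cryptographically binding provenance to
    # media. NOT the C2PA format; it only shows why tampering with either
    # part breaks verification.
    import hashlib, hmac, json

    SIGNING_KEY = b"demo-key"  # stand-in for a real signing credential

    def bind(media: bytes, provenance: dict) -> bytes:
        """Sign the media hash together with its provenance statements."""
        payload = json.dumps(
            {"media_sha256": hashlib.sha256(media).hexdigest(),
             "assertions": provenance},
            sort_keys=True).encode()
        return hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()

    media = b"...image bytes..."
    prov = {"generator": "ExampleCam", "ai_generated": False}
    tag = bind(media, prov)

    # Altering the media or the provenance invalidates the binding.
    print(hmac.compare_digest(tag, bind(media, prov)))         # True
    print(hmac.compare_digest(tag, bind(media + b"x", prov)))  # False
    print(hmac.compare_digest(tag, bind(media, {**prov, "ai_generated": True})))  # False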
Multi-modal data protection with AI’s help
First, there is a malicious mind behind the scenes thinking and scheming on
how to change a given message for exfiltration. That string for exfil is not
intrinsically tied to a medium: it could go out over Wi-Fi, mobile, browser,
print, FTP, SSH, AirDrop, steganography, screenshot, Bluetooth, PowerShell,
buried in a file, over a messaging app, in a conferencing app, through SaaS, in
a storage service, and so on. A mind must consciously seek a method and morph
the message to a new medium with an adversary and their toolkit in mind to
succeed and, in this case, to get points in the hackathon. Second, a mind is
required to recognize the string in its multiple forms or modes. Classic data
loss prevention (DLP) and data protection work with blades that are
disconnected from one another: each data type is searched for with its own
search criteria and an expected sample format. These checks can be simple, such
as credit card numbers or social security numbers in HTTP, or complex, like
looking for content that resembles a contract in email attachments.
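As a minimal illustration of one such disconnected blade, here is a Python sketch that scans a text payload (e.g., an HTTP body) for credit card numbers using a regex plus the standard Luhn checksum; the pattern and sample payload are illustrative:

    # Classic single-blade DLP check: regex plus Luhn validation for
    # candidate credit card numbers in a text payload.
    import re

    CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

    def luhn_ok(digits: str) -> bool:
        """Standard Luhn checksum, used to filter out random digit runs."""
        total, parity = 0, len(digits) % 2
        for i, ch in enumerate(digits):
            d = int(ch)
            if i % 2 == parity:
                d *= 2
                if d > 9:
                    d -= 9
            total += d
        return total % 10 == 0

    def find_cards(payload: str):
        for match in CARD_RE.finditer(payload):
            digits = re.sub(r"[ -]", "", match.group())
            if luhn_ok(digits):
                yield digits

    print(list(find_cards("order ref 4111 1111 1111 1111, qty 12345678901234")))
    # -> ['4111111111111111']; the second digit run fails the Luhn check.

A blade like this knows nothing about the same number exfiltrated as a screenshot or buried in a file, which is the gap the multi-modal approach described above is meant to close.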
Quote for the day:
"Coaching isn't an addition to a leader's job, it's an integral part of it." -- George S. Odiorne