5 best practices for digital twin implementation
Rather than wait until post-build, consider initiating digital twins during
the planning, design, and construction phases of your projects. At the
planning stage, this can enable plan simulation and various what-if scenario
testing before committing to real-world investment. Part of the benefit of digital twins is that they can address the full lifecycle, from construction twins to operational twins. The twins therefore know far more than
after-the-fact asset management systems, and the learnings and insights
captured by the twin during design and build can improve operations and
maintenance. According to Rapos, early incorporation allows for better
data collection, more accurate modeling, and immediate feedback during the
construction or development phase. It’s crucial to understand that digital twins aren’t just a final product but a dynamic tool that evolves and adds value throughout the project’s life. Delaying their development can result in missed opportunities for optimization and innovation.
Why exit the cloud? 37signals explains
37signals was a significant cloud user with a $3.2 million cloud budget for
2022. The company committed $600,000 to procuring Dell servers, anticipating significant savings over the next five years. Of course, there were
questions, and Hansson did an excellent job of addressing them one by one in
the FAQ, such as the additional staffing costs of running on-premises systems, how optimization only took them so far in the cloud, and
how they handled security requirements. Hansson also explained the limited
abilities of cloud-native applications to reduce costs and highlighted the
need for a world-class team to address security concerns, which the company
has. Notably, privacy regulations and GDPR compliance were underscored as
reasons for European companies to opt for self-owned hardware as opposed to
relying on the cloud. Of course, this is not the case for everyone. ...
Everyone is looking for a single answer, and it doesn’t exist. The
requirements of your systems will dictate what platform you should use—not
whatever seems trendy. Sometimes the cloud provides the most value, but not
always.
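For a back-of-the-envelope sense of the trade-off, here is a quick sketch using the figures quoted above; the annual on-premises operating cost is an assumed placeholder, not a number 37signals has published.

```python
# Rough cloud vs. on-premises cumulative-spend comparison.
# Known figures from the article: ~$3.2M/year cloud budget, $600K in servers.
# ASSUMPTION: $1.0M/year for colocation, power, and staff is a made-up placeholder.
CLOUD_ANNUAL = 3_200_000
HARDWARE_ONE_TIME = 600_000
ONPREM_ANNUAL = 1_000_000  # hypothetical operating cost

for year in range(1, 6):
    cloud_total = CLOUD_ANNUAL * year
    onprem_total = HARDWARE_ONE_TIME + ONPREM_ANNUAL * year
    print(f"Year {year}: cloud ${cloud_total:,} vs on-prem ${onprem_total:,} "
          f"(delta ${cloud_total - onprem_total:,})")
```

Under these assumed numbers the hardware pays for itself within the first year; the real calculation hinges on the operating-cost line, which is exactly where the FAQ's staffing and security questions come in.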
Size doesn’t matter!
Small enterprises are less likely to have dedicated IT staff, let alone afford
cyber security specialists. Security solutions are usually considered too expensive (Chidukwani, 2022), and their technical features come across as too complex to handle in-house. As a consequence, there is a tendency to rely heavily on external IT vendors that provide sub-optimal support without customized care (Benz, 2020). Driven by fear, some business owners take the reactive route: instead of a unified threat solution, they continue to buy off-the-shelf security products in response to each newly emerging threat, leaving many gaps unplugged and protection ineffective. These human, financial, and technical resource constraints create a puzzling gap between small-business leaders’ cyber security awareness and their commitment to addressing the risk. Alongside the well-known construct of the ‘digital divide’, academic literature now also acknowledges a ‘security divide’: lagging investment in cybersecurity solutions coupled with increasing cyber incidents at SMEs (Heidt et al., 2019).
Cybersecurity challenges emerge in the wake of API expansion
APIs are already the fundamental building blocks of the modern organization, and that will only become more evident going forward. As organizations
look to transform their digital business and enter the era of the API economy,
we expect that we will be building and using more and more APIs. That’s
especially true if we take a look at some of the trends that are happening in
technology nowadays. Things like VR/AR glasses, wearable devices, and
voice-controlled devices all require APIs to work. APIs will play a more
critical role as the world transitions to more browserless devices. All this
growth and expansion means more APIs, more requests, and more security challenges. The toughest thing about API security is that, in most cases, organizations don’t know that hackers are exploiting their APIs because they lack real-time access to API data. That’s why tooling that provides this visibility will become even more critical.
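As a minimal sketch of what real-time API visibility can start from, the snippet below logs every request in a Flask service; the endpoint and logger names are illustrative, and a production setup would stream these events to an analysis pipeline rather than a local log.

```python
import logging
import time

from flask import Flask, request

app = Flask(__name__)
audit_log = logging.getLogger("api.audit")  # illustrative logger name
logging.basicConfig(level=logging.INFO)

@app.before_request
def _start_timer():
    # Stash the start time so the after_request hook can compute latency.
    request.environ["audit.start"] = time.monotonic()

@app.after_request
def _log_request(response):
    # Record who called what, the outcome, and how long it took.
    elapsed = time.monotonic() - request.environ.get("audit.start", time.monotonic())
    audit_log.info("%s %s -> %s in %.3fs from %s",
                   request.method, request.path,
                   response.status_code, elapsed, request.remote_addr)
    return response

@app.get("/orders")  # hypothetical endpoint for demonstration
def orders():
    return {"orders": []}
```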
Attackers Abuse Google OAuth Endpoint to Hijack User Sessions
OAuth enables applications to get access to data and resources from other trusted online services and sites, based on permissions set by a user, and it is the mechanism responsible for the authentication handoff between the sites.
While the standard is certainly useful, it also presents risk to organizations
if it's not implemented correctly, and there are a number of ways attackers
can abuse vulnerable instances and the standard itself. For example, security
researchers have found flaws in its implementation that have exposed key
online services platforms such as Booking.com and others to attack. Meanwhile,
others have used malicious OAuth apps of their own creation to compromise Microsoft Exchange servers. In the case of the Google endpoint, the OAuth
exploit discovered by Prisma targets Google Chrome's token_service table to
extract tokens and account IDs of logged-in Chrome profiles, according to
CloudSEK. That table contains two "crucial" columns, titled "service (GAIA
ID)" and "encrypted_token," Karthick M explained.
Observability in 2024: More OpenTelemetry, Less Confusion
Observability has transcended its traditional association with monitoring, finding bugs, and resolving outages; it now extends across different interfaces and tools, demonstrates greater openness and compatibility, and is increasingly used to make forecasts. These forecasts can involve predicting outages before they happen, cost shifts, resource usage, and other variables that previously would have been much harder to anticipate and largely a matter of trial and error. ... “This means that organizations can now use a single agent to collect observability data across their increasingly distributed and therefore complex universe of microservices applications,” Volk said. “This could significantly simplify one of today’s most significant pain points in observability: instrumentation. Developers can now benefit from the continuously increasing auto-instrumentation capabilities of OpenTelemetry and no longer have to worry about instrumenting their code for specific observability platforms.” However, this freedom of choice, amid a proliferation of tools, has created challenges of its own.
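As a minimal sketch of that vendor-neutral instrumentation, the snippet below emits a span through the OpenTelemetry Python SDK; the service and span names are illustrative, and the console exporter stands in for an OTLP exporter feeding whichever backend you choose.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Wire up a tracer provider; the console exporter just prints spans locally.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("checkout-service")  # illustrative service name

# Any OTLP-compatible backend can consume this span; no vendor-specific SDK needed.
with tracer.start_as_current_span("process-order") as span:
    span.set_attribute("order.items", 3)  # illustrative attribute
```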
IT’s Key Role in Planting ESG Effort
The one thing we know about all compliance measures is that they require new
levels of integration that the company usually lacks. If you can focus on integration work now, you will be more agile, and better prepared for ESG regulations when they hit. Keep your ears to the ground: you can learn a lot about the directions ESG is taking from your outside audit firms, regulators, and your internal legal or regulatory department. These entities have advance information on future ESG directions and the laws or regulations likely to be forthcoming. Do your part internally: several years ago, I
was visiting with the CIO of a large healthcare company in the Northeast. He
told me that the company wanted to trim its carbon footprint and that the
first place the company looked for tangible results was in the data center.
“This prompted us to move more IT to the cloud, and even to build a new,
eco-friendly data center,” he said. “We virtualized servers as much as
possible, reduced energy consumption, mandated that all new equipment we
purchased used less power, and even redid the HVAC unit airflows.”
Why 2024 will be the year of ‘augmented mentality’
With this AI technology now available for consumer use, companies are rushing to build it into systems that can guide you through your daily interactions.
This means putting a camera, microphone and motion sensors on your body in a
way that can feed the AI model and allow it to provide context-aware
assistance throughout your life. The most natural place to put these sensors
is in glasses, because that ensures cameras are looking in the direction of a
person’s gaze. Stereo microphones on eyewear (or earbuds) can also capture the
soundscape with spatial fidelity, allowing the AI to know the direction that
sounds are coming from — like barking dogs, honking cars and crying kids. In
my opinion, the company that is currently leading the way to products in this
space is Meta. Two months ago they began selling a new version of their
Ray-Ban smart glasses that was configured to support advanced AI models. The
big question I’ve been tracking is when they would roll out the software
needed to provide context-aware AI assistance.
Google flaunts concurrency, optimization as cloud rivals overhaul platforms
Kazmaier explains that Google’s approach to concurrency avoids spinning up more virtual machines and instead improves performance at the level of sub-CPU capacity units. “It moves these capacity units seamlessly around, so you may have a
query which is finishing and freeing up resources, which can be moved
immediately to another query which can benefit from acceleration. All of
that micro-optimization takes place without the system sizing up. It's
constantly giving you the ideal projection of the capacity you use on the
workloads you run,” he says. A Gartner paper from last year endorsed the approach. "A mix of on-demand and flat-rate pricing slot reservation
models provides the means to allocate capacity across the organization.
Based on the model used, slot resources are allocated to submitted queries.
Where slot demand exceeds current availability, additional slots are queued
and held for processing once capacity is available. This processing model
allows for continued processing of concurrent large query workloads," it
says.
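To make the slot model concrete, here is a toy sketch of the allocate-or-queue behavior the Gartner paper describes; the slot counts and query names are invented, and a real scheduler adds fairness, preemption, and the sub-CPU rebalancing Kazmaier describes.

```python
from collections import deque

# Toy model of slot-based query scheduling (all numbers are invented).
TOTAL_SLOTS = 100
available = TOTAL_SLOTS
pending = deque()   # queries waiting for slots
running = {}        # query name -> slots held

def submit(name, slots_needed):
    """Run the query if slots are free; otherwise queue it."""
    global available
    if slots_needed <= available:
        available -= slots_needed
        running[name] = slots_needed
        print(f"{name}: running with {slots_needed} slots")
    else:
        pending.append((name, slots_needed))
        print(f"{name}: queued, waiting for {slots_needed} slots")

def finish(name):
    """Free a finished query's slots and hand them to queued queries."""
    global available
    available += running.pop(name)
    while pending and pending[0][1] <= available:
        submit(*pending.popleft())

submit("daily_report", 60)
submit("ad_hoc_scan", 70)   # exceeds free capacity, so it queues
finish("daily_report")      # freed slots go straight to the queued query
```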
As AI Advances, Who Is Looking to Its Architecture?
There is a case to be made, though, that enterprise architects have a much
more fundamental role to play in our current phase of technological
evolution than simply implementing its advancements into our workflows. AI solutions must seek to enhance the role of enterprise architects and their productivity, not attempt to supplant them. Standards are important not just because they enable collaboration, but because they build consensus. A successful standard draws on the insights and expertise of the whole community of practitioners that needs to use it. In that process, many conversations, occasionally quite fraught ones, are had in the interest of finding a common understanding of what a good, mature, responsible, successful approach looks like: one that puts the human at the center of the decision loop. The point of listing so many of AI’s potential
positive outcomes earlier in this article was not just to emphasize how
dramatic and wide-ranging its impact could be.
Quote for the day:
"People often say that motivation
doesn't last. Well, neither does bathing - that's why we recommend it
daily." -- Zig Ziglar