How to minimise technology risk and ensure that AI projects succeed
Organisations are using many different technologies and multiple processes to try to manage all of this, and that is what is delaying getting models into production and into the hands of the business. If we can have one platform that addresses all of those key areas, then the speed at which an organisation gains value from that platform increases massively. To do that, you need an environment in which to develop applications to the highest level of quality and internal customer satisfaction, and an environment in which the business can easily consume those applications. Sounds like the cloud, right? Well, not always. When you look at aligning AI, you also have to think about how AI is consumed across an organisation; you need a method to move it from R&D into production, but once it is deployed, how do we actually use it? What we are hearing from organisations is that they actually want a hybrid development and provisioning environment, where this combination of technologies can run without issues regardless of the development or target environment: cloud, on-premises, or a combination of the two.
Getting a grip on basic cyber hygiene
Get started with MLOps
ESB vs Kafka
The appropriate answer to both questions is: “Yes, but….” In spite of their similarities, ESBs and stream-processing technologies such as Kafka are not so much designed for different use cases as for wholly different worlds. True, a flow of message traffic is potentially “unbounded” – e.g., an ESB might transmit messages that encapsulate the ever-changing history of an application’s state – but each of these messages is, in effect, an artifact of a world of discrete, partitioned – i.e., atomic – moments. “Message queues are always dealing in the discrete, but they also work very hard to not lose messages, not to lose data, to guarantee delivery, and to guarantee sequence and ordering in message transmits,” said Mark Madsen, an engineering fellow with Teradata. Stream processing, by contrast, corresponds to a world that is in a constant state of becoming; a world in which – as the pre-Socratic philosopher Heraclitus famously put it – “everything flows.” In other words, says Madsen, using an ESB to support stream processing is roughly analogous to filling a swimming pool with a Rube Goldberg-like assembly line of buckets, as distinct from a high-pressure feed from a hose.
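To make the buckets-versus-hose contrast concrete, here is a toy, in-process sketch using the built-in Channels of Julia (the language covered later in this digest). It is an illustration of the discrete-versus-flowing distinction only, not an ESB or a Kafka client:

```julia
# Queue-style: each message is an atomic unit, delivered once and in order.
queue = Channel{String}(8)
@async begin
    for m in ("order placed", "order paid", "order shipped")
        put!(queue, m)
    end
    close(queue)
end
for msg in queue
    println("handled: ", msg)       # one discrete moment at a time
end

# Stream-style: state evolves continuously over a potentially unbounded flow.
stream = Channel{Int}(8)
@async begin
    for reading in (3, 1, 4, 1, 5)
        put!(stream, reading)
    end
    close(stream)
end
total = 0
for x in stream
    global total += x               # "everything flows": a running state
    println("running total: ", total)
end
```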
A quick rundown of multi-runtime microservices architecture
A multi-runtime microservices architecture represents a two-component model that closely resembles the classic client-server relationship. However, the components that define multi-runtime microservices -- the micrologic and the mecha -- reside on the same host. Despite this, the micrologic and mecha components still operate on their own, independent runtimes (hence the term "multi-runtime" microservices). The micrologic is not, strictly speaking, a component that lives among the various microservices in your environment. Instead, it contains the underlying business logic needed to facilitate communication using predefined APIs and protocols. It is responsible only for this core business logic, not for any logic contained within the individual microservices. The only thing it needs to interact with is the second multi-runtime component -- the mecha. The mecha is a distributed, reusable and configurable component that provides off-the-shelf primitives geared toward distributed services. The mecha uses declarative configuration to determine the desired application states and manage them, often relying on plain-text formats such as JSON and YAML.
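As a rough illustration of that split, here is a minimal sketch in Julia, assuming a hypothetical retry policy. The Dict stands in for the JSON or YAML a real mecha would parse, and its keys are illustrative, not any product's schema:

```julia
# Declarative policy the mecha-like wrapper reads (stand-in for JSON/YAML).
config = Dict("retries" => 3, "backoff_seconds" => 0.1)

# Micrologic: core business logic only, no infrastructure concerns.
process_order(id) = startswith(id, "order-") ? "accepted: $id" : error("bad id: $id")

# Mecha-like wrapper: applies the declared retry/backoff policy around any
# call, so the business logic never has to know about retries at all.
function with_policy(f, cfg, args...)
    for attempt in 1:cfg["retries"]
        try
            return f(args...)
        catch
            attempt == cfg["retries"] && rethrow()
            sleep(cfg["backoff_seconds"])
        end
    end
end

println(with_policy(process_order, config, "order-7"))  # accepted: order-7
```

The point of the design is the same as in the paragraph above: the policy lives in configuration, while the business logic stays pure.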
Basics Of Julia Programming Language For Data Scientists
Julia is a relatively new, fast, high-level dynamic programming language. Although it is a general-purpose language and can be used to write all kinds of applications, much of its package ecosystem and features are designed for high-level numerical computing. Julia draws from various languages, from low-level systems programming languages like C to high-level, dynamically typed languages such as Python, R and MATLAB, and this is reflected in its optional typing, its syntax and its features. Julia doesn’t have classes; instead, it supports the quick creation of custom types and of methods for those types. These functions are not limited to the types they are created for and can have many versions, a feature called multiple dispatch. Julia supports direct calls to C functions without any wrapper API, and custom types are defined with the struct keyword. And instead of defining scope based on indentation like Python, Julia uses the keyword end, much like MATLAB. It would be impractical to summarize all its features and idiosyncrasies here; you can refer to the wiki or the docs welcome page for a more comprehensive description of Julia.
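A few of those features in action, as a minimal sketch (the final line assumes a Unix-like system, since getpid is a POSIX call):

```julia
# Custom types via `struct` (Julia has no classes).
struct Point2D
    x::Float64
    y::Float64
end

struct Point3D
    x::Float64
    y::Float64
    z::Float64
end

# One function name, one method per type signature; Julia selects the
# method from the argument types at call time (multiple dispatch).
magnitude(p::Point2D) = sqrt(p.x^2 + p.y^2)
magnitude(p::Point3D) = sqrt(p.x^2 + p.y^2 + p.z^2)

# Blocks close with the `end` keyword, as in MATLAB, not by indentation.
for p in (Point2D(3.0, 4.0), Point3D(1.0, 2.0, 2.0))
    println(magnitude(p))   # 5.0, then 3.0
end

# A direct C call with no wrapper API.
println(ccall(:getpid, Cint, ()))
```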
NCSC publishes smart city security guidelines
Mark Jackson, Cisco’s national cyber security advisor for the UK and Ireland,
said: “The complexity of the smart cities marketplace, with multiple device
manufacturers and IT providers in play, could quite easily present cyber
security issues that undermine these efforts. The NCSC’s principles are one of
the most sophisticated pieces of government-led guidance published in Europe to
date. “The guidance set out for connected places generally aligns to cyber
security best practice for enterprise environments, but also accounts for the
challenges of connecting up different systems within our national critical
infrastructure. “With DCMS [the Department for Digital, Culture, Media and
Sport] also planning to implement legislation around smart device security, this
is indicative of a broader government strategy to level up IoT security across
the board. “This will enable new initiatives in the field of connected places
and smart cities to gather momentum across the UK – with cyber security baked
into the design and build phase. As lockdown restrictions ease and people return
to workplaces and town centres, they need assurance that their digital
identities and data are protected as the world around them becomes more connected.”
What if the hybrid office isn’t real?
“A shift to hybrid work means that people will be returning to the office both
with varying frequencies and for a new set of reasons,” says Brian Stromquist,
co-leader of the technology workplace team at the San Francisco–based
architecture and design firm Gensler. “What people are missing right now are
in-person collaborations and a sense of cultural connection, so the workplace of
the future — one that supports hybrid work — will be weighted toward these
functions.” Offices will need a way to preserve a level playing field for those
working from home and those on-site. One option is to make all meetings “remote”
if not everyone is physically in the same space. That’s a possibility Steve
Hare, CEO of Sage Group, a large U.K. software company, suggested to
strategy+business last year. According to Stromquist, maintaining the right
dynamic will require investing in technologies that create and foster
connections between all employees, regardless of physical location. “We’re
looking at tools like virtual portals that allow remote participants to feel
like they’re there in the room, privy to the interactions and side conversations
that you’d experience if you were there in person,” he says.
Real-time data movement is no longer a “nice to have”
Applications and systems can “publish” events to the mesh, while others can “subscribe” to whatever they are interested in, irrespective of whether they are deployed in the factory, the data centre or the cloud. This is essential for the critical industries we rely on, such as capital markets, Industry 4.0 and functional supply chains. Indeed, there are few industries today that can do without as-it-happens updates on their systems. Businesses and consumers demand extreme responsiveness as a key part of a good customer experience, and many technologies depend on real-time updates to changes in the system. However, many existing methods for ensuring absolute control and precision of such time-sensitive logistics do not operate holistically in real time, at scale and without data loss, and therefore leave room for fatal errors. From retail, which relies on the online store being in constant communication with the warehouse and the dispatching team, to aviation, where pilots depend on real-time weather updates to carry their passengers to safety, today’s industries cannot afford anything other than real-time data movement. Overall, when data is enabled to move in this way, businesses can make better decisions.
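As a toy illustration of the publish/subscribe pattern at the heart of an event mesh, here is a minimal in-process sketch in Julia; a real mesh would span hosts, sites and clouds, but the decoupling is the same -- publishers and subscribers share only a topic name, never each other's location:

```julia
# Topic name -> list of subscriber callbacks.
const SUBSCRIBERS = Dict{String, Vector{Function}}()

subscribe!(topic::String, handler::Function) =
    push!(get!(SUBSCRIBERS, topic, Function[]), handler)

function publish(topic::String, event)
    for handler in get(SUBSCRIBERS, topic, Function[])
        handler(event)          # every interested party reacts as it happens
    end
end

# The warehouse and the dispatch team both react to the same store event.
subscribe!("orders", ev -> println("warehouse: pick stock for ", ev))
subscribe!("orders", ev -> println("dispatch: schedule delivery for ", ev))
publish("orders", "order-1042")
```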
The Cloud Comes of Age Amid Unprecedented Change
Just look at how businesses compete. The influx of cloud technologies during the
pandemic has underlined that the technology stack is a core mode of
differentiation. Industry competition is now frequently a battle between
technology stacks, and the decisions leaders make around their cloud foundation,
cloud services and cloud-based AI and edge applications will define their
success. Look at manufacturing, where companies are using predictive analytics
and robotics to inch ever closer to delivering highly customized on-demand
products. The pandemic has forced even the most complex supply chain operations
from manufacturers to operate at the whim of changing government requirements,
consumer needs and other uncontrollable factors, such as daily pandemic
fluctuations. Pivot quickly and you’ll not only emerge as a leader of your industry; you may even gain immeasurable consumer intimacy. A true cloud
transformation should start with a plan to shift significant capabilities to
cloud. It is more than just migrating a few enterprise applications.
Implementing a “cloud first” strategy requires companies to completely reinvent
their business for cloud by reimagining their products or services, workforce,
and customer experiences.
Quote for the day:
"Don't try to be the "next". Instead, try to be the other, the changer, the new." -- Seth Godin