What’s New In Gartner’s Hype Cycle For Emerging Technologies, 2020
Gartner believes that Composite AI will be an enabling technology for
organizations that don’t have access to large historical data sets or the
in-house AI expertise to complete complex analyses. Second, Gartner believes that
Composite AI will help expand the scope and quality of AI applications. Early
leaders in this area include ACTICO, Beyond Limits, BlackSwan Technologies,
Cognite, Exponential AI, FICO, IBM, Indico, Petuum and ReactiveCore. ... The
goal of Responsible AI is to streamline how organizations put responsible
practices in place to ensure positive AI development and use. One of the most
urgent use cases of Responsible AI is identifying and stopping the production
of “deep fakes” globally. Gartner defines the category through use cases that
involve improving business and societal value, reducing risk, increasing trust
and transparency, and mitigating bias with AI. Of the new AI-based
additions to the Hype Cycle this year, this is one that leads all others on
its potential to use AI for good. Gartner believes responsible AI also needs
to increase the explainability, accountability, safety, privacy and regulatory
compliance of organizations as well.
How to ensure CIO and CMO alignment when making technology investment decisions
Often, total cost of ownership (TCO) for handling the complexity, maintenance
and technical debt with new platforms can turn out to be a real burden for
organisations. In fact, according to Gartner, more than three-quarters of
organisations found the technology buying process complex or difficult. But is
this really surprising? Implementing the right technology solution for the
business is often challenging due to the different priorities that CMOs and
CIOs have. While for the CMO the priority is to adopt the latest innovations
as soon as possible in order to stay ahead of the competition, this need has
to fit the CIO’s focus on long-term TCO. These competing priorities are what
drive a wedge between those key decision makers, creating a need to
find common ground sooner. Being aligned is essential so that they can choose
the right options which will allow marketing to execute on strategy and hit
company targets on the one hand, and meet operational requirements for
maintenance, governance and risk avoidance on the other, which are top of mind
for the CIO. To ensure that the best options are selected for the business,
the CMO’s priorities need to meet those of the CIO and vice versa.
Save-to-transform as a catalyst for embracing digital disruption
In this approach, businesses evolve through infrastructure investments in
digital technologies. In turn, these technologies can deliver dramatic
improvements in competitiveness, performance and operating efficiency. In
response to the pandemic, the survey shows that organizations are evolving
into a “Save-to-Thrive” mindset, in which they are accelerating strategic
transformation actions specifically in response to challenges posed by
COVID-19 to make shifts to their operating models, products and services and
customer engagement capabilities. “The Save-to-Thrive framework will be
essential to success in the next normal as companies rely on technology and
digital enablement — with a renewed emphasis on talent — to improve their
plans for strategic cost transformation and overall enterprise performance
improvement,” said Omar Aguilar, principal and global strategic cost
transformation leader, Deloitte Consulting. “Companies that react quickly and
invest in technology and digital capabilities as they pursue the strategic
levers of cost, growth, liquidity and talent will be best-positioned to
succeed.”
How big data is solving future health challenges
Unlike many other data warehousing projects, Stringer said the focus is not
just on collecting and using data that meets a specific quality level. Instead,
when data is added to LifeCourse, its quality level is noted so researchers
can decide for themselves if the data should or should not be used in their
research. The GenV initiative relies on different technologies, but the two
core pieces are the Informatica big data management platform and Zetaris.
Informatica is used where traditional extract, transform and load (ETL)
processes are needed because of its strong focus on usability. Stringer said
this criterion was heavily weighted in the product selection process.
Usability, he said, is a strong analogue for productivity. But with a
dependence on external data sources and a need to integrate more data sources
over the coming decades, Stringer said there needed to be a way to use new
datasets wherever they resided. That was why Zetaris was chosen. Rather than
rely on ETL processes, Stringer said the Zetaris platform lets GenV integrate
data from sources where ETL is not viable.
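The quality-tagging approach described above can be sketched in a few lines of Python. This is a minimal illustration, not GenV's actual data model: the dataset names, the tier labels, and the `usable_for` helper are all hypothetical, standing in for whatever metadata scheme LifeCourse actually records.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    source: str
    quality_level: str  # hypothetical tiers: "high", "medium", "low"

def usable_for(catalog, accepted_levels):
    # A researcher decides for themselves which quality levels to accept,
    # rather than the warehouse rejecting data below a fixed threshold.
    return [d for d in catalog if d.quality_level in accepted_levels]

# Illustrative catalog entries -- not real GenV datasets.
catalog = [
    Dataset("perinatal_registry", "state health department", "high"),
    Dataset("parent_survey", "self-reported questionnaire", "medium"),
    Dataset("legacy_records", "scanned paper archive", "low"),
]

selected = usable_for(catalog, {"high", "medium"})
```

The point of the design is that the quality judgement is deferred: the catalog notes the level once, and each research project applies its own acceptance filter.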
5 Key Capabilities of a Next-Gen Enterprise Architecture
Many enterprise architects look to rationalize and centralize emerging
technologies, processes, and best practices, making them available to all
business units in a self-service mode to accelerate digital transformation and
modernization initiatives across the enterprise. By defining enterprise-wide
technology standards and tools, enterprise architects strive to plan for
reusability, reduce costs, and future-proof the architecture as technology
changes, while enforcing data governance and privacy policies to democratize
data so that trusted data travels securely throughout the enterprise in a
frictionless, self-serve fashion. Traditional data management solutions for
supporting next-gen architectures are expensive, manual, and rely on
time-consuming processes, while newer niche vendor solutions are
fragmented. As such, they require extensive integration to stitch together
end-to-end workstreams, requiring data consumers to wait months to get useful
data. Therefore, a next-gen enterprise architecture must support the entire
data pipeline, which includes the ability to ingest, stream, integrate, and
cleanse data.
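The ingest–integrate–cleanse pipeline named above can be sketched as a toy example. This is a hedged illustration of the stages, not any vendor's implementation: the record shapes, the `id` join key, and the in-memory sources are all assumptions made for the sake of a runnable sketch.

```python
def ingest(raw_rows):
    # Pull raw records from a source system (an in-memory stand-in here).
    return list(raw_rows)

def cleanse(rows):
    # Drop rows missing the join key and strip stray whitespace.
    return [
        {k: v.strip() if isinstance(v, str) else v for k, v in r.items()}
        for r in rows
        if r.get("id") is not None
    ]

def integrate(*sources):
    # Merge records from multiple sources on a shared key.
    merged = {}
    for rows in sources:
        for r in rows:
            merged.setdefault(r["id"], {}).update(r)
    return list(merged.values())

# Two hypothetical source systems feeding the pipeline.
crm = ingest([{"id": 1, "name": " Ada "}, {"id": None, "name": "orphan"}])
erp = ingest([{"id": 1, "region": "EU"}])
pipeline_output = integrate(cleanse(crm), cleanse(erp))
```

The end-to-end point is that consumers receive one integrated, cleansed record set rather than stitching the stages together themselves.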
India's National Digital Health Mission: A New Model To Enhance Health Outcomes
As a digital health platform, the NDHM is guided by an architectural
blueprint called the National Digital Health Blueprint (NDHB), developed a few
months earlier. The NDHB gives structure to the thinking and approach: it
establishes the vision and principles, architecture requirements
and specifications, applicable standards and regulations, high-priority
services, and institutional mechanisms needed to realize the mission of
digital health. The NDHB is crafted to unlock enormous benefits for citizens,
create new opportunities and financial, productivity, and transparency gains
and make a positive contribution to growth, innovation, and knowledge sharing.
A digital platform with a national footprint evokes immediate pushback as it
is generally seen to steer the narrative towards centralization. The
architecture deliberately and explicitly addresses this ‘concern’ to ensure
that India’s overall federated structure of governance is reflected in the
architecture as well. In a large country like India, where there are multiple
layers of government – national (central), state, local (urban), and local
(rural) – the responsibilities are distributed and this is guaranteed by the
constitution.
Data Governance Should Not Threaten Work Culture
The discipline of data governance must focus on knowing who these people are,
helping them to make more actionable decisions, and empowering them to become
better stewards. People who define data must know what it means to define data
better, and that includes providing meaningful business definitions for data
and managing how often data is replicated across the organization. People who
produce the data must know what quality data looks like, and they must be
evaluated on the quality of the data they produce. And then the no-brainer:
people in the organization who use the data must understand how to use it and
follow the rules associated with using it appropriately. That means data
consumers must follow the protection and privacy rules, the business rules,
and use the data in the ethical manner spelled out by the organization. While
people already define, produce, and use data, data governance requires that
these people consistently follow the rules and standards for the action they
take with that data. The rules and the standards are important metadata, data
about the data, that must be recorded and made available to the people across
the organization to assist in the discipline of data governance.
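Recording rules and standards as metadata, as described above, might look something like the sketch below. The glossary structure, the element name, and the role names are hypothetical, illustrating only the idea that definitions, quality rules, and usage rules are stored as data about the data and surfaced to consumers.

```python
# A hypothetical metadata glossary: rules and standards recorded per data element.
glossary = {
    "customer_email": {
        "definition": "Primary contact email supplied by the customer at sign-up.",
        "defined_by": "data_definition_team",  # hypothetical stewardship role
        "quality_rules": ["must contain '@'", "must be unique per customer"],
        "usage_rules": ["PII: mask in analytics extracts", "no export outside EU"],
    },
}

def rules_for(element):
    # Surface the recorded usage rules so data consumers can follow them.
    return glossary.get(element, {}).get("usage_rules", [])
```

The discipline comes from the lookup being routine: a consumer checks `rules_for` before using an element, rather than guessing at the rules.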
Defining a Data Governor
Without oversight, employees will misinterpret data, sensitive data may be
shared inappropriately, employees will lack access to necessary data, and
employees’ analysis will often be incorrect. A Data Governor will maintain and
improve the quality of data and ensure your company is compliant with any
regulations. It is a vital role to have for any informed company. With the
exploding volume of data within companies, it has become extremely difficult
for a small technical team to govern an entire organization’s data. As this
trend continues, these Data Scientists and Analysts should transition
themselves from their traditional reporting responsibilities to those of Data
Governors. In a traditional reporting role, their day was filled with
answering questions for various business groups around their needed metrics.
The shift to Data Governors finds them instead creating cleaned, documented
data products for those end business groups to explore themselves. This is
called Democratized Data Governance, where the technical team (traditionally
the data gatekeepers) handles the technical aspects of governance and shares
the responsibilities of analytics with the end business groups.
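A "cleaned, documented data product" of the kind described above might be bundled as in the following sketch. The field names, the cleaning rule, and the `build_data_product` helper are assumptions chosen for illustration, not a standard interface.

```python
def build_data_product(rows, description, owner):
    # A hypothetical "data product": cleaned rows bundled with documentation,
    # so business groups can explore it without going back to the technical team.
    cleaned = [r for r in rows if r.get("revenue") is not None]
    return {
        "description": description,
        "owner": owner,
        "columns": sorted({k for r in cleaned for k in r}),
        "rows": cleaned,
    }

raw = [{"region": "EU", "revenue": 120}, {"region": "US", "revenue": None}]
product = build_data_product(raw, "Quarterly revenue by region", "analytics_team")
```

The shift in role is visible here: the technical team's output is the documented product itself, and the exploration of it moves to the business group.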
Blockchain for Applications ~ A Multi-Industry Solution
The workings of blockchain are fairly common knowledge now: a decentralized
network of nodes that shares all data among its peers, keeping a
chronological log of each transaction. Simply put, “everything that happens in
the blockchain network is shared by all members of the network and everyone
has a record of it on their individual device.” The blocks thus form a binding
chain with each other, and this decentralized model of information storage
frees participants from the risks and inefficiencies of having all data stored
in one place. ... DApps, or decentralized applications, function without any
central server mediating the interaction between two parties. Blockchain users
operate on mini-servers that work simultaneously to verify and exchange data.
There are two kinds of blockchains,
segregated on the basis of access and permissions – “Permissionless
blockchain” & “permissioned blockchain”. A permissionless network grants
full transparency and allows each member to verify transaction details,
interact with others while staying completely anonymous. Bitcoin works on a
permissionless blockchain.
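The chronological, tamper-evident log described above rests on each block committing to the hash of its predecessor. The sketch below is a minimal toy illustration of that chaining, not a real blockchain: it omits consensus, signatures, and networking, and the block fields are simplified assumptions.

```python
import hashlib
import json

def make_block(index, data, prev_hash):
    # Each block commits to its predecessor's hash, forming the chain.
    header = {"index": index, "data": data, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(header, sort_keys=True).encode()).hexdigest()
    return {**header, "hash": digest}

def verify(chain):
    # Any peer holding a full copy can re-derive every hash and check the links,
    # which is what lets each member verify transaction details independently.
    for i, block in enumerate(chain):
        header = {k: block[k] for k in ("index", "data", "prev_hash")}
        expected = hashlib.sha256(json.dumps(header, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block(0, "genesis", "0" * 64)]
chain.append(make_block(1, "alice pays bob", chain[-1]["hash"]))
```

Altering any earlier block changes its hash, breaking every later `prev_hash` link, which is why the shared log is effectively append-only.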
How to manage your edge infrastructure and devices
Another aspect to consider when managing edge infrastructure and devices is to
invest in discovery processes. “Edge by nature creates a distributed approach –
accelerated by the current global pandemic – that needs a more flexible style of
management,” said David Shepherd, area vice-president, pre-sales EMEA at Ivanti.
“But ultimately, if we don’t know what we are managing then it becomes difficult
to even start managing in a comprehensive manner. “Effective discovery processes
allow an organisation to apply the right management policies at the right time.
As more devices start to appear at the edge, the context of the device plays a
crucial role. “This includes the type of device and the interaction it has with
the infrastructure, plus its location (often remote). Understanding what a
device is and how it interacts is again crucial to applying a comprehensive
management approach. ... “Zero-touch provisioning, for example, enables easier
onboarding of IoT devices onto an IoT cloud platform, e.g. AWS, as it enables
automatic provisioning and configuration. This prevents developer error during
the provisioning and configuration process, as well as providing a more secure
interaction between the device and platform, as the security framework has
already been established on both ends during the pre-production stage.”
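Shepherd's point that discovery lets an organisation "apply the right management policies at the right time" based on device context can be sketched as a simple rule table. The device types, locations, and policy names below are hypothetical examples, not Ivanti features.

```python
def policy_for(device):
    # Hypothetical context rules: policy depends on device type and location.
    if device["type"] == "iot_sensor":
        return "lightweight-agentless"
    if device.get("location") == "remote":
        return "vpn-managed"
    return "standard"

# Output of an imagined discovery scan across the edge estate.
discovered = [
    {"id": "cam-01", "type": "iot_sensor", "location": "remote"},
    {"id": "laptop-7", "type": "endpoint", "location": "remote"},
    {"id": "srv-2", "type": "server", "location": "datacentre"},
]

assignments = {d["id"]: policy_for(d) for d in discovered}
```

The dependency runs one way: without the discovered inventory and its context attributes, there is nothing for the policy rules to act on, which is Shepherd's argument for investing in discovery first.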
Quote for the day:
"The hard part isn't making the decision. It's living with it." -- Jonas Cantrell