Despite recent progress, AI-powered chatbots still have a long way to go

Even state-of-the-art systems clearly struggle to hold a human-like conversation
without tripping up. But as these systems improve, questions are arising about
what the experience should ultimately look like. Values, dialects, and social
norms vary across cultures, ethnicities, races, and even sexual identities,
presenting a major challenge in designing a chatbot that works well for all
potential users. An ostensibly “safe,” polite, and agreeable chatbot might be
perceived as overly accommodating to one person but exclusionary to another.
Another unsolved problem is how chatbots should treat controversial topics like
religion, illegal activities, conspiracy theories, and politics — or whether
they should opine about these at all. A recent paper coauthored by researchers
at Meta explores the potential harm that might arise from chatbots that give
poor advice, particularly in the medical or psychological realms. In a prime
example, OpenAI’s GPT-3 language model can be prompted to tell a person to
commit suicide.
Four emerging data integration trends to assess

Modern data integration technologies focus on advanced automation, connected
data intelligence and persona-based interactive tooling, helping organisations
accelerate a broad range of data integration use cases and requirements.
Distributed hybrid and multicloud data is creating new integration challenges.
Data lives everywhere, so centralising it into data lakes or data hubs to
support business insights is no longer practical, especially with the explosion
of data at the edge. Forrester expects data integration systems to proliferate
in the coming years as organisations look to support insights across multicloud,
hybrid cloud and edge environments. Artificial
intelligence (AI) is driving the next level of data integration solutions. New
and innovative AI features are helping enterprises to automate data integration
functions, including data ingestion, classification, processing, security and
transformation. Although AI capabilities within data integration are still
emerging, areas that technology architecture and delivery leaders can leverage
today include the ability to discover connected data, classify and categorise
sensitive data, identify duplicates and orchestrate silos.
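
The automation areas above are easier to picture with a small, simplified
sketch. The Python below is purely illustrative (hypothetical field names, and
crude rule-based patterns where commercial tools apply trained models), but it
shows the flavour of classifying sensitive fields and flagging duplicate
records across silos.

    import hashlib
    import re

    # Rule-based stand-ins for AI-assisted classification and deduplication.
    SENSITIVE_PATTERNS = {
        "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
        "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    }

    def classify_sensitive(record: dict) -> dict:
        """Map each field to the sensitive-data categories detected in it."""
        findings = {}
        for field, value in record.items():
            hits = [name for name, pattern in SENSITIVE_PATTERNS.items()
                    if isinstance(value, str) and pattern.search(value)]
            if hits:
                findings[field] = hits
        return findings

    def fingerprint(record: dict, keys: list[str]) -> str:
        """Hash normalised key fields so duplicates across silos collide."""
        canonical = "|".join(str(record.get(k, "")).strip().lower() for k in keys)
        return hashlib.sha256(canonical.encode()).hexdigest()

    def find_duplicates(records: list[dict], keys: list[str]) -> list[tuple[int, int]]:
        """Return index pairs of records sharing the same fingerprint."""
        seen, dupes = {}, []
        for i, rec in enumerate(records):
            fp = fingerprint(rec, keys)
            if fp in seen:
                dupes.append((seen[fp], i))
            else:
                seen[fp] = i
        return dupes

    customers = [
        {"name": "Ada Lovelace", "email": "ada@example.com"},
        {"name": "ada lovelace ", "email": "ada@example.com"},
    ]
    print(classify_sensitive(customers[0]))               # {'email': ['email']}
    print(find_duplicates(customers, ["name", "email"]))  # [(0, 1)]
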
Engineering EDA and microservices applications for performance

Microservices application architecture is taking root across the enterprise
  ecosystem. Organizing and efficiently operating microservices in multicloud
  environments and making their data available in near-real time are some of the
  key challenges enterprise architects confront with this design. Thanks to
  developments in event-driven architecture (EDA) platforms (such as Apache
  Kafka) and data-management techniques (such as data mesh and data fabrics),
  designing microservices-based applications has become much easier. However, to
  ensure that microservices-based applications perform at requisite levels, you
  must consider critical non-functional requirements (NFRs) in the design. NFRs
  address how a system operates, rather than how the system functions (the
  functional requirements). The most important NFRs involve performance,
  resiliency, availability, scalability, and security. This article describes
  designing for performance, which entails low-latency processing of events and
  high throughput. Future articles will address the other NFRs.
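
As one concrete illustration of designing for throughput and latency in an
event-driven setup, the sketch below configures a Kafka producer via the
kafka-python client. The broker address, topic name and specific settings are
assumptions for illustration rather than tuned recommendations: batching and
compression raise throughput, a small linger keeps added latency to a few
milliseconds, and leader-only acknowledgements trade some durability for
faster produces.

    import json
    from kafka import KafkaProducer  # pip install kafka-python

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",  # assumed local broker
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
        linger_ms=5,            # wait up to 5 ms so batches can fill
        batch_size=64 * 1024,   # 64 KiB batches amortise per-request overhead
        compression_type="gzip",
        acks=1,                 # leader-only ack; use "all" when durability matters
    )

    # Hypothetical domain event from an order-processing microservice.
    producer.send("orders.created", {"order_id": 123, "total": 49.90})
    producer.flush()
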
12 CISO resolutions for 2022

“It’s important for our security teams to have visibility into all aspects
    of cloud applications, on-prem applications, network, services, systems,
    databases, accounts, third-party providers, etc. to help fortify our
    cybersecurity defenses,” Karki explains. “Having a complete, accurate and
    appropriately prioritized inventory of all our hardware, software, and
    supply chain assets enables our security teams to take a systematic approach
    to knowing what needs to be safeguarded, what controls to implement to
    protect, defend, and respond against any adverse events, and how to identify
    and produce metrics that tell the full story about our current security
    posture.” ... Although the complexity of that mesh has been growing for
    years, Van Horenbeeck says events during the past two years such as
    SolarWinds and Log4j have reinforced for him the criticality of
    understanding all the moving parts that make up his company’s technology
    ecosystem. To that end, Van Horenbeeck has invested in technology to gain a
    fuller understanding of his own company’s IT environment. 
New kids on the blockchain - or more of the same old?

Blockchain and Distributed Ledger Technology (DLT) have been on a downward
    swing in the hype cycle. The lack of clarity about why some data needs to be
    on a decentralised network at all remains, as does the suspicion that other
    ventures may just be offloading energy costs onto customers – no minor
    concern as prices soar. Meanwhile some recent NFT releases have made
    non-fungible tokens seem like a satire on the digital economy – a
    Situationist joke. But one area where blockchain may have useful
    applications is establishing a secure digital identity, according to a
    techUK seminar this week. The Zoom event brought together four DLT
    luminaries, from finance, government, agriculture, and digital identity
    itself. The intention was to challenge misconceptions and set out a viable
    route to market, according to Laura Foster, techUK’s Programme Manager for
    Technology and Innovation. However, she then passed the chair to perhaps
    the most interesting speaker: Genevieve Leveille, founder and CEO of
    AgriLedger, a distributed-ledger app for the farming supply chain – and she
    did a good job.
Enterprise architecture in the agile era: Less policing, more coaching

One principle of agile EA is not to boil the ocean by collecting every bit
    of information about an organisation before providing insights or
    recommendations, says Gordon Barnett, principal analyst at Forrester. To
    speed the process, agile EA practitioners refer to a “minimum viable
    architecture” or “just enough architecture” to meet an urgent business
    problem, making frequent changes to the EA process as needed. But, Barnett
    warns, the key is to choose the right elements to include to ensure that
    such a minimal architecture doesn’t limit its future usefulness. For
    organisations that are heavily reliant on SaaS applications and the cloud,
    “a minimum viable architecture helps hold together that distributed
    ecosystem” with technology standards and more collaborative governance
    models, even if it doesn’t provide a central repository of the distributed
    assets that now support the business, Gartner’s Blosch adds. At SYNLABS,
    Jones began by concentrating on “the key pieces of information we needed to
    understand the business in terms of the application portfolio” and narrowed
    his search to, at most, “20 pieces of information about an application.”
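
A thin application record of this kind might look like the sketch below. The
attributes are hypothetical (the article does not list the roughly 20 items
Jones settled on), but they convey the idea of capturing just enough to reason
about the portfolio rather than everything an EA repository could hold.

    from dataclasses import dataclass, field

    @dataclass
    class ApplicationRecord:
        """A deliberately minimal, 'just enough' view of one application."""
        name: str
        owner_team: str
        business_capability: str
        hosting: str               # e.g. "SaaS", "public cloud", "on-prem"
        criticality: str           # e.g. "tier 1" to "tier 3"
        data_classification: str   # e.g. "public", "internal", "confidential"
        integrations: list[str] = field(default_factory=list)

    portfolio = [
        ApplicationRecord("LIMS", "Lab IT", "Sample tracking", "on-prem",
                          "tier 1", "confidential", integrations=["ERP"]),
    ]

    # Even this slice answers useful questions, e.g. which tier-1,
    # confidential systems still run on-premises?
    print([a.name for a in portfolio
           if a.criticality == "tier 1" and a.hosting == "on-prem"])
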
Four Principles Every Organization Implementing Intelligent Automation Must Live By

Intelligent automation is a subset of artificial intelligence (AI). It is
    the computerization of processes traditionally carried out by people. Moving
    beyond current automation technologies (such as robotic process automation),
    intelligent automation replicates more complex processes — especially those
    that involve human decision-making. It gives organizations the opportunity
    to increase efficiency, improve customer experience and generate new
    revenues through automated digital products and services. But organizations
    that take to the sky with intelligent automation programs — without properly
    understanding the success factors — risk dropping quickly back down to
    Earth. As François Candelon, Rodolphe Charme di Carlo, Midas De Bondt and
    Theodoros Evgeniou wrote in Harvard Business Review, "For most of the past
    decade, public concerns about digital technology have focused on the
    potential abuse of personal data." But now, "attention is shifting to how
    data is used by the software."
Architecting for Resilience Panel

When it's about starting out new with the technology, there's obviously a
    strong pull towards the pre, and like, how do we connect with our supply
    chain, and CI/CDs, and what gets deployed there? Where is the source of
    truth of configurations of resiliency? Is it in my Git and my Git stuff? Is
    it in a separate system? Where should I change what? How should it change? A
    lot of the challenges are around setting up those organizational processes
    in terms of who changes what, where, and how does that get approved?
    Ultimately, then it gets to Nora's world, which is, if things do go wrong,
    who's accountable? How do I recover? Who's alerted? How quickly? One simple
    thing to nail home is something we do: every service should have a team that
    owns it. It should have an owner. It's a very simple concept, and you would
    be surprised how often it isn't implemented. I've heard stories where
    something went down, we went tracking it, and we ended up at a service
    wondering, who wrote this? This dude left three years ago.
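
That ownership rule lends itself to a simple automated check. The sketch below
is a minimal example of enforcing "every service has an owning team" in a CI
pipeline; the services.json registry and its format are assumptions for
illustration, not an established standard.

    import json
    import sys

    def check_ownership(registry_path: str = "services.json") -> int:
        """Fail (non-zero) if any registered service lacks an owning team."""
        with open(registry_path) as f:
            services = json.load(f)  # e.g. [{"name": "billing", "owner": "payments-team"}]

        unowned = [s["name"] for s in services if not s.get("owner")]
        if unowned:
            print("Services with no owning team: " + ", ".join(unowned))
            return 1  # break the build so ownership gaps surface early
        print(f"All {len(services)} services have an owning team.")
        return 0

    if __name__ == "__main__":
        sys.exit(check_ownership())
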
Network from home: how data privacy and security responsibilities must be shared

Having entire organisations working remotely is a relatively new phenomenon,
    but traditional security advice remains very relevant. Remote workers should
    consider the technologies available to support their remote security needs,
    such as a password manager, which lets you use a different password for each
    account and rotate them often without having to remember them all – unlike
    relying on a single, oft-reused password that, once compromised, gives
    hackers access to all of your different accounts.
    Equally, employees can upgrade their applications and tools to improve their
    privacy posture. Search engines and browsers such as DuckDuckGo, Brave
    Browser and Ecosia that give you more control over your privacy exposure can
    help minimise the risk of attack and personal information loss. Network
    firewalls are another tool to consider, which can help upgrade a home
    network into an environment more consistent with that of the office by
    monitoring network traffic, blocking malicious websites and allowing you to
    moderate how others access resources.
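
For illustration only, the snippet below shows the kind of unique, random
password a manager generates and stores for each account, so that no two sites
share a credential and none has to be memorised (use a real password manager
rather than rolling your own storage).

    import secrets
    import string

    ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*-_"

    def generate_password(length: int = 20) -> str:
        """Return a cryptographically random password."""
        return "".join(secrets.choice(ALPHABET) for _ in range(length))

    for account in ("email", "banking", "vpn"):
        print(account, generate_password())
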
The new rules of succession planning

The problem with identifying top candidates often lies in how a short list
    is generated. Traditionally, the focus is on who the leader is without
    significant weight put on what skills he or she needs to deliver on the
    company’s strategy. If succession discussions are to be transformed into
    more of an upstream process for the board—and members are to have a clear
    understanding of what the company needs before discussing the best
    candidates—then the process must account for three distinct and entirely
    predictable challenges. Because they are predictable, these challenges can
    be anticipated and overcome. First, start with the what and not the who.
    Doing so will lay out a more realistic and substantive framework. Second,
    from this vantage point, try to explicitly minimize the noise in the
    boardroom. Ensure that the directors are using shared, contextual
    definitions of core jargon, such as strategy, agility, transformation, and
    execution. Third, root the follow-on analyses of the candidates in that
    shared understanding, and base any assessments on a factual evaluation of
    their track records and demonstrated potential in order to minimize the bias
    of the decision-makers themselves.
Quote for the day:
"Leadership is the art of giving
      people a platform for spreading ideas that work" --
      Seth Godin