Making the Leap From Data Governance to AI Governance
One of the AI governance challenges Regensburger is researching revolves around
ensuring the veracity of outcomes, of the content that’s generated by GenAI.
“It’s sort of the unknown question right now,” he says. “There’s a liability
question on how you use…AI as a decision support tool. We’re seeing it in some
regulations like the AI Act and President Biden’s proposed AI Bill of Rights, where
outcomes become really important, and it moves that into the governance sphere.”
LLMs have a tendency to make things up out of whole cloth, which poses a risk
to anyone who uses them. For instance, Regensburger recently asked an LLM to
generate an abstract on a topic he researched in graduate school. “My background
is in high energy physics,” he says. “The text it generated seemed perfectly
reasonable, and it generated a series of citations. So I just decided to look at
the citations. It’s been a while since I’ve been in graduate school. Maybe
something had come up since then?" "And the citations were completely
fictitious,” he continues. “Completely. They look perfectly reasonable. They had
Physics Review Letters. It had all the right formats. And at your first casual
inspection it looked reasonable."
Architecting for Industrial IoT Workloads: A Blueprint
The first step in an IIoT-enabled environment is to establish communication
interfaces with the machinery. In this step, there are two primary goals: read
data from machines (telemetry) and write data to machines. Machines in a
manufacturing plant can have legacy/proprietary communication interfaces and
modern IoT sensors. Most industrial machines today are operated by
programmable logic controllers (PLC). A PLC is an industrial computer
ruggedized and adapted to control manufacturing processes (such as assembly
lines, machines, and robotic devices) or any activity requiring high
reliability, ease of programming, and process fault diagnosis. However, PLCs
provide limited connectivity interfaces with the external world over protocols
like HTTP and MQTT, restricting external data reads (for telemetry) and writes
(for control and automation). Apache PLC4X bridges this gap by providing a set
of API abstractions over legacy and proprietary PLC protocols. PLC4X is an
open-source universal protocol adapter for IIoT appliances that enables
communication over protocols including, but not limited to, Siemens S7,
Modbus, Allen Bradley, Beckhoff ADS, OPC-UA, Emerson, Profinet, BACnet and
Ethernet.
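As a rough illustration of that abstraction, here is a minimal sketch of reading one telemetry value through the PLC4X Java API. The connection string, tag address, and item name are hypothetical, and the exact builder method names vary somewhat between PLC4X releases, so treat this as a sketch rather than copy-paste code.

```java
import org.apache.plc4x.java.PlcDriverManager;
import org.apache.plc4x.java.api.PlcConnection;
import org.apache.plc4x.java.api.messages.PlcReadRequest;
import org.apache.plc4x.java.api.messages.PlcReadResponse;
import org.apache.plc4x.java.api.types.PlcResponseCode;

public class PlcTelemetrySketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical Siemens S7 endpoint; swapping the scheme (modbus://, ads://, ...)
        // targets a different protocol through the same API.
        try (PlcConnection connection =
                     new PlcDriverManager().getConnection("s7://192.168.0.10/0/1")) {

            // Build a read request for one tag; the address syntax is protocol-specific.
            PlcReadRequest request = connection.readRequestBuilder()
                    .addItem("motor-temp", "%DB1:DBD0:REAL") // assumed S7 data-block address
                    .build();

            // Execute asynchronously and block for the result.
            PlcReadResponse response = request.execute().get();

            if (response.getResponseCode("motor-temp") == PlcResponseCode.OK) {
                System.out.println("Motor temperature: " + response.getFloat("motor-temp"));
            }
            // Writes (for control and automation) follow the same pattern via
            // connection.writeRequestBuilder().
        }
    }
}
```

The point of the adapter is that this request-builder pattern stays the same across the listed protocols; only the connection string and address syntax change.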
6 user experience mistakes made for security and how to fix them
The challenge here is to communicate effectively with your non-experts in a
way that they understand the “what” and “why” of cybersecurity. “The goal is
to make it practical rather than condescending, manipulative, or punitive,”
Sunshine says. “You need to take down that fear factor.” So long as people
have the assurance that they can come clean and not be fired for that kind of
mistake, they can help strengthen security by coming forward about problems
instead of trying to cover them up. ... To achieve optimal results, you have
to strike the right balance between the level of security required and the
convenience of users. Much depends on the context. The bar is much higher for
those who work with government entities, for example, than for a food truck
business, Sunshine says. Putting all the safeguards required for the most
regulated industries into effect for businesses that don’t require that level
of security introduces unnecessary friction. Failing to differentiate among
different users and needs is the fundamental flaw of many security protocols
that require everyone to use every security measure for everything.
5 New Ways Cyberthreats Target Your Bank Account
Deepfake technology, initially designed for entertainment, has evolved into a
potent tool for cybercriminals. Through artificial intelligence and machine
learning, these technologies fuel intricate social engineering attacks,
enabling attackers to mimic trusted individuals with astonishing precision.
This proficiency grants them access to critical data like banking credentials,
resulting in significant financial repercussions. ... Modern phishing tactics
now harness artificial intelligence to meticulously analyse extensive data
pools, encompassing social media activities and corporate communications. This
in-depth analysis enables the creation of highly personalised and contextually
relevant messages, mimicking trusted sources like banks or financial
institutions. This heightened level of customisation significantly enhances
the credibility of these communications, amplifying the risk of recipients
disclosing sensitive information, engaging with malicious links, or
unwittingly authorising fraudulent transactions. ... Credential stuffing is a
prevalent and dangerous method cybercriminals use to breach bank accounts.
This attack method exploits the widespread practice of password reuse across
multiple sites and services.
Italian Businesses Hit by Weaponized USBs Spreading Cryptojacking Malware
A financially motivated threat actor known as UNC4990 is leveraging weaponized
USB devices as an initial infection vector to target organizations in Italy.
Google-owned Mandiant said the attacks single out multiple industries,
including health, transportation, construction, and logistics. "UNC4990
operations generally involve widespread USB infection followed by the
deployment of the EMPTYSPACE downloader," the company said in a Tuesday
report. "During these operations, the cluster relies on third-party websites
such as GitHub, Vimeo, and Ars Technica to host encoded additional stages,
which it downloads and decodes via PowerShell early in the execution chain."
... Details of the campaign were previously documented by Fortgale and Yoroi
in early December 2023, with the former tracking the adversary under the name
Nebula Broker. The infection begins when a victim double-clicks on a malicious
LNK shortcut file on a removable USB device, leading to the execution of a
PowerShell script that's responsible for downloading EMPTYSPACE (aka
BrokerLoader or Vetta Loader) from a remote server via another intermediate
PowerShell script hosted on Vimeo.
Understanding Architectures for Multi-Region Data Residency
A critical principle in the context of multi-region deployments is
establishing clarity on truth and trust. While knowing the source of truth for
a piece of data is universally important, it becomes especially crucial in
multi-region scenarios. Begin by identifying a fundamental unit, an "atom,"
within which all related data resides in one region. This could be an
organizational entity like a company, a team, or an organization, depending on
your business structure. Any operation that involves crossing these atomic
boundaries inherently becomes a cross-region scenario. Therefore, defining
this atomic unit is essential in determining the source of truth for your
multi-region deployment. In terms of trust, as different regions hold distinct
data, communication between them becomes necessary. This could involve
scenarios like sharing authentication tokens across regions. The level of
trust between regions is a decision rooted in the specific needs and context
of your business. Consider the geopolitical landscape if governments are
involved, especially if cells are placed in regions with potentially
conflicting interests.
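To make the "atom" idea concrete, here is a minimal sketch (all names and the region mapping are hypothetical) of a routing helper that resolves an organization's home region and flags any operation touching two different atoms' regions as a cross-region scenario.

```java
import java.util.Map;

/** Minimal sketch: an organization is the atomic unit pinned to one region. */
public class ResidencyRouter {

    // Hypothetical source of truth: which region owns each organization's data.
    private final Map<String, String> homeRegionByOrg = Map.of(
            "org-acme", "eu-central",
            "org-globex", "us-east");

    /** Region that holds the authoritative copy of this organization's data. */
    public String homeRegion(String orgId) {
        String region = homeRegionByOrg.get(orgId);
        if (region == null) {
            throw new IllegalArgumentException("Unknown organization: " + orgId);
        }
        return region;
    }

    /** Any operation spanning two atoms in different regions is cross-region by definition. */
    public boolean isCrossRegion(String callerOrgId, String targetOrgId) {
        return !homeRegion(callerOrgId).equals(homeRegion(targetOrgId));
    }
}
```

Anything the helper flags as cross-region would then go through an explicit, audited channel (for example, a scoped token exchange) rather than a direct data read, which is where the trust decisions above come into play.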
Developing a Data Literacy Program for Your Organization
Before developing a data literacy program for an organization, it is crucial
to conduct a comprehensive training needs assessment. This assessment helps in
understanding the current level of data literacy within the organization and
identifying areas that require improvement. It involves gathering information
about employees’ existing knowledge, skills, and attitudes toward data
analysis and interpretation. To conduct the needs assessment, different
methods can be employed. Surveys, interviews, focus groups, or even analyzing
existing data can provide valuable insights into employees’ proficiency levels
and their specific learning needs. By involving various stakeholders, such as
managers, department heads, and employees themselves, in this process, a
holistic understanding of the organization’s requirements can be achieved. ...
It is also beneficial to compare the program’s outcomes against predefined
benchmarks or industry standards. This allows organizations to benchmark their
progress against other similar initiatives and identify areas where further
improvements are necessary. Overall, continuously evaluating the effectiveness
of a data literacy program helps organizations understand its impact on
individuals’ capabilities and organizational performance.
Women In Architecture: Early Insights and Reflections
The question of why there are so few women in architecture is a key one in our
minds. Rather than dwelling on the negative, the conversations focus on
identifying the root causes to help us move into action effectively. I have
learned that the answer to this question is incredibly nuanced and layered,
with many interrelated factors. Some root causes for fewer women in
architecture draw from the macro level context, including a similar set of
challenges experienced by women in technology. However, one of the biggest
contributors is the architecture profession itself and how it is presented.
This has been a hard truth that has asserted itself as a common thread
throughout the conversations. For example, the lack of clarity regarding the
role and value proposition of architecture, often perceived as abstract,
technical, and unattainable, poses a substantial barrier. ... However, there
is a powerful correspondence between the momentum for more diversity in
architecture and exactly what the profession needs most now. For architects of
the future to thrive, it is not enough to excel at cognitive, architectural,
and technical competencies; it is just as important to master human
competencies such as communication, influence, leadership, and emotional
intelligence.
New York Times Versus Microsoft: The Legal Status of Your AI Training Set
One of the problems the tech industry has had from the start is product
contamination using intellectual property from a competitor. The tech industry
is not alone, and the problem of one company illicitly acquiring the
intellectual property of another and then getting caught goes back decades. If
an engineer uses generative AI that has a training set contaminated by a
competitor’s intellectual property, there is a decent chance, should that
competitor find out, that the resulting product will be found to be infringing
and be blocked from sale -- with the company that had made use of that AI
potentially facing severe fines and sanctions, depending on the court’s
ruling. ... Ensuring any AI solution from any vendor contains indemnification
for the use of their training set or is constrained to only use data sets that
have been vetted as fully under your or your vendor’s legal control should be
a primary requirement for use. (Be aware that if you provide AI capabilities
to others, you will find an increasing number of customers will demand
indemnification.) You’ll need to ensure that the indemnification is adequate
to your needs and that the data sets won’t compromise your products or
services under development or in market so your revenue stream isn’t put at
risk.
How to calculate TCO for enterprise software
It’s obvious that hardware, once it has reached end-of-life, needs to be
disposed of properly. With software, there are costs as well, primarily
associated with data export. First, data needs to be migrated from the old
software to the new, which can be complex given all the dependencies and
database calls that might be required for even a single business process. Then
there are backups and disaster recovery to consider. The new software might
require that data be formatted in a different way. And you still might need to keep
archived copies of certain data stores from the old system for regulatory or
compliance reasons. Another wrinkle in the TCO calculation is estimating how
long you plan to use the software. Are you an organization that doesn’t change
tech stacks if it doesn’t have to and therefore will probably run the software
for as long as it still does the job? In that case, it might make sense to do
a five-year TCO analysis as well as a 10-year version. On the other hand, what
if your company has an aggressive sustainability strategy that calls for
eliminating all of its data centers within three years, and moving as many
apps as possible to SaaS alternatives?
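One way to keep those variables straight is to model them explicitly. The sketch below uses entirely made-up figures to show how the same inputs yield both a five-year and a ten-year TCO, including the one-time migration cost up front and the end-of-life data-export cost discussed above.

```java
/** Minimal TCO sketch with illustrative, made-up numbers. */
public class TcoCalculator {

    static double totalCostOfOwnership(int years,
                                       double oneTimeMigration,   // initial data migration / implementation
                                       double annualSubscription, // licenses or SaaS fees
                                       double annualSupport,      // support, training, administration
                                       double endOfLifeExport) {  // data export and archiving at decommission
        return oneTimeMigration
                + years * (annualSubscription + annualSupport)
                + endOfLifeExport;
    }

    public static void main(String[] args) {
        // Hypothetical figures in dollars.
        double migration = 120_000, subscription = 60_000, support = 25_000, export = 40_000;

        System.out.printf("5-year TCO:  $%,.0f%n",
                totalCostOfOwnership(5, migration, subscription, support, export));
        System.out.printf("10-year TCO: $%,.0f%n",
                totalCostOfOwnership(10, migration, subscription, support, export));
    }
}
```

With these assumed figures, the one-time migration and export costs weigh far more heavily on the five-year horizon than on the ten-year one, which is exactly the trade-off the planning-horizon question is probing.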
Quote for the day:
"One advantage of talking to yourself is that you know at least somebody's listening." -- Franklin P. Jones