Data should be a first-class citizen in the cloud
A close cousin of the interoperability problem, data access and control are
limited in many cloud environments if not designed properly and can prevent
organizations from truly harnessing their business data. There doesn’t seem to
be a middle ground here; either data is entirely accessible or not at all.
More often than not, access is switched off entirely, so valuable data goes
unleveraged and systems stay underoptimized. You only need to look at the rise of generative AI
systems to understand how this limitation affects the value of these systems. If
the data is not accessible, then the knowledge engines can’t be appropriately
trained. You’ll have dumb AI. This lack of control stems from opaque data
ownership models and limited say over how data is processed and stored. The solution
is for organizations to create greater transparency and control over their data.
This includes defining access privileges, managing encryption, and deciding how
and where data is stored. This would ensure that data owners retain sovereignty
while information remains available.
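As a concrete illustration of that kind of control, here is a minimal sketch
using the AWS SDK for Python (boto3): it pins the storage region, enforces
encryption with an organization-managed key, and blocks public access so
privileges must be granted explicitly. The bucket name and key alias are
hypothetical, and equivalent controls exist on other clouds.

    import boto3  # AWS SDK for Python

    s3 = boto3.client("s3", region_name="eu-central-1")

    # Decide where the data physically lives (data residency).
    s3.create_bucket(
        Bucket="example-analytics-data",  # hypothetical name
        CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
    )

    # Encrypt at rest with a customer-managed KMS key, so the
    # organization, not the provider default, controls the key.
    s3.put_bucket_encryption(
        Bucket="example-analytics-data",
        ServerSideEncryptionConfiguration={
            "Rules": [{
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-data-key",  # hypothetical alias
                }
            }]
        },
    )

    # Block all public access; privileges are then granted explicitly.
    s3.put_public_access_block(
        Bucket="example-analytics-data",
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )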
Where Data Governance Must Focus in AI Era
In recent years, the ethical implications of AI have come to the forefront of
public discussion. Data governance reinforces the importance of adhering to
ethical practices in the development and deployment of AI systems. Transparency
and accountability should be the pillars upon which AI technologies are built.
Generative AI and large language models have the ability to create and
manipulate human-like content. This power must be wielded responsibly. Data
governance requires developers and organizations to embed ethical guidelines
within the AI systems themselves, ensuring that these technologies align with
society’s values and do not amplify bias or spread
misinformation. ... Data governance recognizes the importance of individual
autonomy in an AI-driven world. It seeks to empower individuals with the ability
to exercise control over their own data and determine how it is utilized. By
placing decision-making power in the hands of data subjects, we uphold the
fundamental principles of self-determination and personal agency.
The Need for Risk-Based Vulnerability Management to Combat Threats
In comparison to traditional and outdated approaches to vulnerability
management, a risk-based strategy enables organizations to assess the level of
risk posed by vulnerabilities. This approach allows teams to prioritize
vulnerabilities based on their assessed risk levels and remediate those with
higher risks, minimizing potential attacks in a way that is hassle-free,
continuous, and automated. Over 90% of successful cyberattacks involve the
exploitation of unpatched vulnerabilities; as a result, demand for automated
patch management solutions is rising as organizations seek a smarter, more
efficient vulnerability remediation strategy than those employed in the
past. ... In the face of today’s
threats, it is crucial to have actionable insights based on risk that can
drive security remediation efforts forward. By continuously assessing your
entire attack surface, Outscan NX tools can pinpoint the most
pressing threats, saving your security team valuable time and
resources. Outscan NX is a comprehensive suite of internal and
external network scanning and cloud security tools customized to suit the
unique needs of your organization.
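To make the prioritization idea concrete, here is a small illustrative Python
sketch; the CVE identifiers, weights, and scoring formula are assumptions for
demonstration, not Outscan NX's actual model.

    from dataclasses import dataclass

    @dataclass
    class Vulnerability:
        cve_id: str              # identifiers below are hypothetical
        cvss_score: float        # base severity, 0-10
        exploit_available: bool  # is a public exploit circulating?
        asset_criticality: int   # 1 (lab box) to 5 (crown jewels)

    def risk_score(v: Vulnerability) -> float:
        # Severity weighted by asset value, boosted when an exploit
        # already exists; real products use richer threat intelligence.
        exploit_factor = 1.5 if v.exploit_available else 1.0
        return v.cvss_score * v.asset_criticality * exploit_factor

    findings = [
        Vulnerability("CVE-2024-11111", 9.8, False, 2),
        Vulnerability("CVE-2023-22222", 7.5, True, 5),
        Vulnerability("CVE-2022-33333", 5.3, False, 1),
    ]

    # Remediate the highest-risk findings first, not simply the
    # highest CVSS scores.
    for v in sorted(findings, key=risk_score, reverse=True):
        print(f"{v.cve_id}: risk={risk_score(v):.1f}")

Note how the 7.5-severity flaw with a live exploit on a critical asset (risk
56.2) outranks the 9.8 on a low-value host (19.6), which is exactly the
reordering a purely severity-driven approach misses.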
13 go-to podcasts that inspire IT industry leaders today
Risky Business is a weekly cybersecurity news and current events podcast
hosted by Patrick Gray and Adam Boileau. I listen to it because they do an
excellent job curating the most relevant news and events in cybersecurity
that occurred in the previous week. Gray is a journalist with deep
cybersecurity knowledge and Boileau is an executive director at a
cybersecurity firm, so the presentation is professional and includes
insights on threat actors and motivations. ... I find Gartner’s CIO Mind
podcast to be especially insightful and relevant to the work I’m doing. It
covers a wide range of topics that CIOs are grappling with, from the
recession and cost-cutting, to staffing specialized IT roles and employee
retention. It keeps me tuned in to what others in the industry care about
and what keeps them up at night, and it gets me thinking about ways I can
improve my own organization so we can better support our clients. The
podcast also shares advice from Gartner analysts and other experts that I
can apply to my own organization and leverage to prepare for what’s coming,
such as generative AI, workforce trends, research and development investment
trends, and more.
IoT brings resource gains, sustainability to agriculture
Long-range, low-power wireless solutions equip farmers with the data they
need in order to achieve their goals of increasing yield and minimizing
environmental impact. Lacuna Space is expanding long-range wide area network
(LoRaWAN) coverage with satellites and LoRa technology to increase
connectivity in low-coverage areas. With reliable connectivity regardless of
location, more farmers around the world can gather data that enables them to
make informed decisions about irrigation, fertilization and more to improve
crop yield and monitor water usage. Farmers in areas without cellular or
Wi-Fi signals can now benefit from the same technological advancements as those
in more connected areas. This supports smarter agricultural practices
throughout the world, bringing access to tools that improve operations and
crop yield to more individuals in the industry. WaterBit, a precision
agriculture irrigation company, gives farmers the ability to have real-time,
low-cost IoT sensing systems that improve crop quality and yield through
optimized resource use.
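As a toy example of the kind of decision this sensor data enables, here is a
short Python sketch of threshold-based irrigation; the payload shape, field
names, and moisture thresholds are invented for illustration and are not
WaterBit's or Lacuna Space's API.

    # Hypothetical decoded uplink from a LoRaWAN soil-moisture node.
    reading = {"device": "field-7-node-3", "moisture_pct": 18.4, "temp_c": 24.1}

    MOISTURE_MIN = 22.0  # below this the crop is water-stressed (assumed)
    MOISTURE_MAX = 35.0  # above this irrigation just wastes water (assumed)

    def irrigation_action(moisture_pct: float) -> str:
        if moisture_pct < MOISTURE_MIN:
            return "open valve"   # irrigate until back in the target band
        if moisture_pct > MOISTURE_MAX:
            return "close valve"  # soil already saturated
        return "hold"             # within range, do nothing

    print(reading["device"], "->", irrigation_action(reading["moisture_pct"]))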
Risk Assessment Using Blockchain
Blockchain technology promises new ways to conduct risk assessments; it
helps to create a distributed, transparent, and tamper-proof system for
assessing risks. Not only can this standardize and streamline the process,
but it can also improve the accuracy and reliability of results. A point to
note is that blockchain can only increase accuracy and make the process more
efficient; it cannot replace human judgment and auditing expertise. It can
enhance the auditing process by ensuring the integrity of records of
transactions and events. ... Decentralized data storage removes the single
point of failure and reduces the risk of data loss or
corruption. One of the key advantages of using blockchain technology is that
it allows for decentralized data storage. During risk assessments,
information collected can be stored on the blockchain, making it more secure
and less vulnerable to attack. Additionally, the distributed nature of
blockchain technology means that multiple stakeholders can access and update
the data, improving collaboration and ensuring that everyone is working from
the same information.
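A tamper-evident record is straightforward to sketch. The following minimal
Python hash chain is illustrative only; a real deployment adds consensus and
distribution across stakeholders, which this toy omits.

    import hashlib, json, time

    def _digest(body: dict) -> str:
        # Hash a canonical JSON encoding so any edit is detectable.
        return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

    def append_record(chain: list, finding: dict) -> None:
        prev = chain[-1]["hash"] if chain else "0" * 64
        body = {"prev": prev, "ts": time.time(), "finding": finding}
        chain.append({**body, "hash": _digest(body)})

    def verify(chain: list) -> bool:
        # Recompute every hash; one edited record breaks the chain.
        for i, block in enumerate(chain):
            body = {k: block[k] for k in ("prev", "ts", "finding")}
            if block["hash"] != _digest(body):
                return False
            if i > 0 and block["prev"] != chain[i - 1]["hash"]:
                return False
        return True

    chain = []
    append_record(chain, {"risk": "vendor SLA breach", "severity": "high"})
    append_record(chain, {"risk": "stale DR plan", "severity": "medium"})
    print(verify(chain))                       # True
    chain[0]["finding"]["severity"] = "low"    # tamper with a finding...
    print(verify(chain))                       # ...and verification fails: False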
How can organizations maintain data governance when using generative AI?
The key to the reliability and trust of generative AI responses is combining
them with cognitive enterprise search technology. As mentioned, this
combination generates responses from enterprise data, and users can validate
the information source. Each answer is provided in the user’s context,
always accounting for data permissions from the data source with full
compliance. In addition, these tools ensure data is consistently up-to-date
by delta crawling. Integrating generative AI tools into a trusted knowledge
management solution allows employees to see which documents their
information came from and even provide further explanations. ...
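Here is a minimal Python sketch of what permission-aware retrieval can look
like; the document schema and scoring are invented stand-ins (a real system
would use vector search and ACLs synced from the source systems), but the
ordering matters: filter by permission first, then rank, then cite.

    from dataclasses import dataclass

    @dataclass
    class Document:
        doc_id: str          # doubles as the citation shown to the user
        text: str
        allowed_groups: set  # ACL inherited from the source system

    def retrieve(query: str, docs: list, user_groups: set, k: int = 3) -> list:
        # Permission filter FIRST: the model never sees passages the
        # user could not open in the source system.
        visible = [d for d in docs if d.allowed_groups & user_groups]
        # Toy relevance: query-term overlap stands in for vector search.
        terms = set(query.lower().split())
        scored = [(len(terms & set(d.text.lower().split())), d) for d in visible]
        ranked = [d for s, d in sorted(scored, key=lambda p: p[0], reverse=True) if s > 0]
        return ranked[:k]

    docs = [
        Document("hr-001", "parental leave policy details", {"hr", "all-staff"}),
        Document("fin-042", "unreleased quarterly revenue figures", {"finance"}),
    ]

    # A general employee asking about revenue gets nothing back, so the
    # model cannot leak the finance-only document; a finance user does.
    print(retrieve("quarterly revenue", docs, {"all-staff"}))                    # []
    print([d.doc_id for d in retrieve("quarterly revenue", docs, {"finance"})])  # ['fin-042']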
First, leadership must evaluate the potential impact of the generated
content on the organization’s reputation and brand image, as well as its
effectiveness for the specific business unit. Legal and ethical implications
and ensuring compliance with regulations and guidelines are necessary
considerations, just like any other deployed technology.
Responsible tech ecosystems: pragmatic precursors to responsible regulation
Regulatory technology (regtech) is typically two-fold: compliance tech when
regulated firms use it and supervisory tech when regulators use it. As
regulators monitor and enforce compliance, regtech presents new
opportunities to formulate frameworks. For instance, today, AI activities of
market firms are governed under disparate regulations such as data
protection, consumer protection, and financial services regulations.
However, concerns such as unfairness, lack of explainability, and weak
accountability are yet to be addressed. Regulatory gaps expose unmitigated
risks that supervisory
technology can resolve. In a perfect world, even without prompts from
regtech, organizations should adopt measures to address these gaps and work
towards diversity and transparency, which have a direct impact on their AI
models. Not every innovation or its ensuing disruption needs to be welcomed.
We see this in the raging debate over AI and art. Any regulator
has the moral obligation to react to emerging technology, even if
post-facto.
Crossing the Data Divide: Closing Gaps in Perception of Data as Corporate Asset
What I am suggesting is that our data leaders need to elevate their vision
and messaging to describe a new type of system that is the authoritative
reference for all enterprise data assets. This new type of system needs to
take its place next to the ERP, CRM, and HRM systems within the enterprise.
This means it must provide value for everyone, both technical and
non-technical, and also provide context for data assets that include its
trustworthiness, source, owner, experts, reviews, and much more, all wrapped
in a consumer-grade user interface experience. What is this system? I call
it a social data fabric (SDF). That term has been used lightly in the social
media world, but I am commandeering it for our purposes. I define an SDF
system as a combination of an enterprise data catalog and an internal
marketplace where employees can explore and ‘shop’ for data. The catalog
portion of the system should ingest and manage a broad range of data,
business intelligence, and data-related assets such as term glossaries,
KPIs, analytic models, and business processes.
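To ground the idea, here is one possible shape for a catalog entry in such an
SDF, sketched as a Python dataclass; the field names are my assumptions about
what trustworthiness, source, owner, experts, and reviews might look like in
practice, not a product schema.

    from dataclasses import dataclass, field

    @dataclass
    class CatalogAsset:
        name: str
        asset_type: str      # table, KPI, glossary term, analytic model, ...
        source_system: str   # e.g. the ERP or CRM the asset originates from
        owner: str
        experts: list = field(default_factory=list)
        trust_score: float = 0.0  # derived from lineage, freshness, reviews
        reviews: list = field(default_factory=list)  # the "social" layer

    churn_kpi = CatalogAsset(
        name="monthly_churn_rate",
        asset_type="KPI",
        source_system="CRM",
        owner="analytics-team",
        experts=["data-steward@example.com"],  # hypothetical contact
    )
    churn_kpi.reviews.append("Matches finance's churn definition as of Q2.")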
Executive Q&A: Controlling Cloud Egress Costs
For smaller enterprises, egress charges are fairly minimal as most data
resides in a single cloud region and is accessed within that region. For
larger enterprises, the number of scenarios which incur egress fees is higher.
One such scenario is implementing a hybrid cloud for cost management or a
multicloud to make use of the latest optimized computing hardware that might
not be available in the primary cloud. For these scenarios, egress fees might
be as high as a third of the cloud service expense with naive implementations.
More optimal implementations can bring down the egress cost but still fall
short as more management complexity is introduced and operations staff needs
to be hired to compensate. The reason such fees come as a surprise is that
it's hard to predict how much data is going to be accessed across regions, and
usually this number only increases with time. ... Moving raw data across
network boundaries is infeasible. Building a federation layer to query across
all curated data is key.
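Some back-of-the-envelope arithmetic shows why federation pays off. The rate,
volumes, and bill below are assumptions for illustration; actual egress
pricing varies by provider, region pair, and tier.

    EGRESS_RATE_PER_GB = 0.09    # assumed inter-region rate, USD/GB
    monthly_cloud_bill = 40_000  # hypothetical total cloud spend, USD

    # Naive: 150 TB of raw data crosses the boundary each month.
    naive = 150 * 1024 * EGRESS_RATE_PER_GB
    print(f"raw data egress: ${naive:,.0f}/mo "
          f"({naive / monthly_cloud_bill:.0%} of the bill)")  # ~35%, i.e. a third

    # Federated: only ~5 TB of query results move instead.
    federated = 5 * 1024 * EGRESS_RATE_PER_GB
    print(f"federated egress: ${federated:,.0f}/mo")  # ~$461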
Quote for the day:
"Power should be reserved for
weightlifting and boats, and leadership really involves responsibility."
-- Herb Kelleher