Technically skilled managers can be great leaders, if they do this
Outlier leaders often struggle with delegation because they hold themselves—and
their work—to impossibly high standards. To ensure that everything is perfect,
you may take on too much, which leads to burnout and stifles your team’s
potential. This approach drains your energy and can cause your team to feel
undervalued or micromanaged. Building trust in your team is crucial for
leveraging their strengths and empowering them to contribute to the
organization’s success. Trust isn’t just about assuming your team will do their
job; it’s about giving them the freedom to innovate, make mistakes, and grow.
When your team feels trusted, they are more likely to take ownership of their
work and deliver results that align with your vision. Start by delegating
smaller tasks and gradually increase the responsibility you give to team
members. ... Fostering a culture where team members feel comfortable sharing
their thoughts, ideas, and concerns is key to maintaining strong team cohesion.
It also helps outlier leaders stay connected with their teams, giving them a
better understanding of what’s working and what’s not. Make communication a
priority by holding regular team meetings focused not just on project updates
but on sharing feedback, discussing challenges, and exploring new ideas.
6 biggest healthcare security threats
Traffic from bad bots — such as those that attempt to scrape data from websites,
send spam, or download unwanted software — presents another major challenge for healthcare organizations. The problem became especially pressing when governments around the world began setting up new websites and other digital
infrastructure to support COVID vaccine registrations and appointments. Bad
actors bombarded these new, hastily established and largely untested sites with
a huge volume of bad-bot traffic. Imperva says it observed a 372% increase in bad-bot traffic on healthcare websites during the first year of the pandemic.
“Increased levels of traffic result in downtime and disruption for legitimate
human users who are trying to access critical services on their healthcare
providers’ site,” Ray says. “It might also result in increased infrastructure
costs for the organization as it tries to sustain uptime from the persistent,
burdensome level of elevated traffic.” ... Wearable and implantable smart
medical devices are a proven cybersecurity risk. These technologies certainly
offer better analysis, assisting diagnosis of medical conditions while aiding
independent living, but mistakes made in securing such medtech have exposed
vulnerable users to potential attack.
Cybercriminals Are Targeting AI Conversational Platforms
Besides the issue of retained PII stored in communications between the AI agent and end users, bad actors were also able to target access tokens that enterprises use to integrate the service with the APIs of external services and applications. According to Resecurity, due to the significant
penetration of external AI systems into enterprise infrastructure and the
processing of massive volumes of data, their implementation without proper risk
assessment should be considered an emerging IT supply chain cybersecurity risk.
The experts from Resecurity outlined the need for AI trust, risk, and security
management (TRiSM) and Privacy Impact Assessments (PIAs) to identify and mitigate potential or known impacts that an AI system may have on privacy, along with increased attention to supply chain cybersecurity. Conversational AI
platforms have already become a critical element of the modern IT supply chain
for major enterprises and government agencies. Their protection will require a
balance between traditional cybersecurity measures relevant to SaaS
(Software-as-a-Service) and those specialized and tailored to the specifics of
AI, highlighted the threat research team at Resecurity.
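Resecurity's findings point to one practical control: scrubbing sensitive values from conversation transcripts before they are retained. The snippet below is a minimal, hypothetical sketch of such a redaction step; the regex patterns and the redact helper are illustrative assumptions, not part of any vendor's API.

```python
import re

# Illustrative patterns only: real deployments would use vetted PII detectors
# and provider-specific token formats.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "bearer_token": re.compile(r"Bearer\s+[A-Za-z0-9\-._~+/]+=*"),
    "api_key": re.compile(r"\b(sk|pk)[-_][A-Za-z0-9]{16,}\b"),
}

def redact(message: str) -> str:
    """Replace likely PII and access tokens before a transcript is stored."""
    for label, pattern in PATTERNS.items():
        message = pattern.sub(f"[REDACTED_{label.upper()}]", message)
    return message

if __name__ == "__main__":
    raw = "Contact me at jane@example.com, auth header was Bearer abc123DEF456ghi789"
    print(redact(raw))
```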
Leveraging digital technologies for intralogistics optimisation
With rapid advances in digital technologies such as robotics, artificial
intelligence (AI) and the Internet of Things (IoT), companies can now optimise
their intralogistics processes more easily and achieve better results. The
following sections focus on a range of key aspects that can be significantly
improved by leveraging digital technologies in support of intralogistics. ...
Wearables are often used in warehouses to optimise worker movements, improve picking and packing efficiency, and ensure worker safety. Wearables equipped with augmented reality technology can be used for navigation and to guide employees through picking and packing operations. These techniques yield benefits
not only by speeding up processes and reducing errors but also by reducing
employee training time. ... The overall benefits of intralogistics optimisation
are significant in terms of process efficiency, but securing the benefits
depends on a range of sub-domains as described above. These technologies enhance
visibility of operations by enabling real-time monitoring at every stage of production, from raw material supply through to final delivery of the manufactured goods, which in turn can improve efficiency and reduce downtime and production costs.
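As an illustration of the stage-by-stage, real-time monitoring described above, the following minimal sketch aggregates hypothetical IoT tracking events into a per-item view of where goods sit in the flow; the event fields and stage names are assumptions for illustration only.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

# Hypothetical stages along the intralogistics flow.
STAGES = ["raw_material_receipt", "production", "picking", "packing", "delivery"]

@dataclass
class TrackingEvent:
    item_id: str
    stage: str          # one of STAGES
    timestamp: datetime

def latest_stage(events: List[TrackingEvent]) -> Dict[str, str]:
    """Return the most recent stage observed for each tracked item."""
    latest: Dict[str, TrackingEvent] = {}
    for event in events:
        if event.stage not in STAGES:
            continue  # ignore malformed sensor readings
        current = latest.get(event.item_id)
        if current is None or event.timestamp > current.timestamp:
            latest[event.item_id] = event
    return {item: ev.stage for item, ev in latest.items()}
```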
Do we even need Product Managers?
The brand manager of the past was in many ways the product manager of the
present. And not long after we began our discussion, Shreyas brought up the
rather famous example of Procter and Gamble’s ‘Brand Man’ memo. ... And then, as
tech gave businesses the ability to slice and dice consumer categories and
cohorts and track multiple data points, the more ‘artistic’ PMs—those who made
decisions based on intuition and taste—had to find patterns that would align
with what the data suggested. But consumer data isn’t absolute truth. Humans are
irrational. They might do one thing, and say something completely different in a
survey. Or a feedback form. And as Chandra outlines, most companies are drowning
in data now, with no clue what to do with all the metrics they track. ... When
do you hire your first Product Manager? This is a question that an increasing
number of CEOs and founders are asking in the post-ZIRP era—where efficiency is
key and AI is pushing into more and more functions. So what happens when you sit
down and ask yourself, “Do I need to hire product managers?” ... The one
certainty that emerged from our discussion was that the role of a Product
Manager has to evolve and break out of the boundaries it is now enclosed in,
although some skills and characteristics will remain constant.
Scaling Uber’s Batch Data Platform: A Journey to the Cloud with Data Mesh Principles
One significant challenge that Uber has faced during the migration process is
the need to accommodate changes in data ownership and the limits set by GCS.
Data ownership changes can occur due to team reorganizations or users
reassigning assets. To address this, Uber implemented an automated process to
monitor and reassign ownership when necessary, ensuring data remains securely
stored and managed. Additionally, Uber optimized its data distribution to avoid
hitting GCS storage limits, ensuring that heavily used tables are separated into their own buckets to improve performance and make monitoring easier. ... Looking to
the future, Uber aims to further expand on its use of data mesh principles by
building a platform that allows for self-governed data domains. This will
simplify infrastructure management and enhance data governance, ultimately
creating a more agile, secure, and cost-efficient data ecosystem. The cloud
migration of Uber’s batch data platform is a significant undertaking, but
through careful planning and the development of innovative tools like DataMesh,
Uber is positioning itself for greater scalability, security, and operational
efficiency in the cloud.
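The bucket-separation approach described earlier, dedicating buckets to heavily used tables to stay under GCS limits and simplify monitoring, can be illustrated with a small sketch. The mapping function, bucket names, and table list below are hypothetical assumptions, not Uber's actual implementation.

```python
# Hypothetical illustration of routing heavily used tables to dedicated
# GCS buckets while everything else shares a common bucket.
HEAVILY_USED_TABLES = {"trips", "payments", "driver_events"}  # assumed names

def bucket_for_table(table: str, shared_bucket: str = "warehouse-shared") -> str:
    """Pick a GCS bucket for a table's data files."""
    if table in HEAVILY_USED_TABLES:
        # A dedicated bucket isolates hot tables from shared storage limits
        # and makes per-table monitoring straightforward.
        return f"warehouse-hot-{table.replace('_', '-')}"
    return shared_bucket

print(bucket_for_table("trips"))        # warehouse-hot-trips
print(bucket_for_table("audit_logs"))   # warehouse-shared
```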
ShadowLogic Attack Targets AI Model Graphs to Create Codeless Backdoors
By using the ShadowLogic method, HiddenLayer says, threat actors can implant
codeless backdoors in ML models that will persist across fine-tuning and can be used in highly targeted attacks. Starting from previous research that
demonstrated how backdoors can be implemented during the model’s training phase
by setting specific triggers to activate hidden behavior, HiddenLayer
investigated how a backdoor could be injected in a neural network’s
computational graph without the training phase. “A computational graph is a
mathematical representation of the various computational operations in a neural
network during both the forward and backward propagation stages. In simple
terms, it is the topological control flow that a model will follow in its
typical operation,” HiddenLayer explains. Describing the data flow through the
neural network, these graphs contain nodes representing data inputs, the
performed mathematical operations, and learning parameters. “Much like code in a
compiled executable, we can specify a set of instructions for the machine (or,
in this case, the model) to execute,” the security company notes.
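To make "computational graph" concrete: a serialized model format such as ONNX exposes exactly the elements HiddenLayer describes — graph inputs, the mathematical operations (nodes), and the learned parameters (initializers). The short sketch below, which assumes the onnx package and a local model.onnx file, simply enumerates those elements; it is an inspection aid, not HiddenLayer's detection method.

```python
import onnx

# Load a serialized model and walk its computational graph.
model = onnx.load("model.onnx")  # path is a placeholder
graph = model.graph

print("Inputs:", [i.name for i in graph.input])                   # data inputs
print("Parameters:", [init.name for init in graph.initializer])   # learned weights
for node in graph.node:                                            # operations
    print(node.op_type, node.name, "->", list(node.output))
```

Auditing a model this way lets unexpected or out-of-place operations in the control flow be spotted before the model is deployed.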
What Goes Into AI? Exploring the GenAI Technology Stack
Compiling the training datasets involves crawling, collecting, and processing all
text (or audio or visual) data available on the internet and other sources
(e.g., digitized libraries). After compiling these raw datasets, engineers layer
in relevant metadata (e.g., tagging categories), tokenize data into chunks for
model processing, format data into efficient training file formats, and impose
quality control measures. While the market for AI model-powered products and
services may be worth trillions within a decade, many barriers to entry prevent
all but the most well-resourced companies from building cutting-edge models. The
highest barrier to entry is the millions to billions of capital investment
required for model training. To train the latest models, companies must either
construct their own data centers or make significant purchases from cloud
service providers to leverage their data centers. While Moore’s law continues to
rapidly lower the price of computing power, this is more than offset by the
rapid scale-up in model sizes and computation requirements. Training the latest
cutting-edge models requires billions in data center investment.
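The dataset-preparation steps described above (tokenize, chunk, filter for quality) can be sketched in a few lines. This is a deliberately naive illustration: it uses whitespace tokens as a stand-in for a real subword tokenizer, and the chunk size and quality threshold are arbitrary assumptions.

```python
from typing import Iterable, List

CHUNK_SIZE = 512       # tokens per training example (assumed)
MIN_DOC_TOKENS = 32    # crude quality gate: drop very short documents (assumed)

def tokenize(text: str) -> List[str]:
    """Stand-in tokenizer; production pipelines use subword tokenizers."""
    return text.split()

def chunk(tokens: List[str], size: int = CHUNK_SIZE) -> Iterable[List[str]]:
    """Split a token stream into fixed-size chunks for model training."""
    for start in range(0, len(tokens), size):
        yield tokens[start:start + size]

def prepare(documents: Iterable[str]) -> Iterable[List[str]]:
    """Tokenize, filter low-quality documents, and emit training chunks."""
    for doc in documents:
        tokens = tokenize(doc)
        if len(tokens) < MIN_DOC_TOKENS:
            continue  # quality control: skip documents that are too short
        yield from chunk(tokens)
```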
Navigating the Challenges of Hybrid IT Environments in the Age of Cloud Repatriation
Cloud repatriation can often create challenges of its own. The costs associated
with moving services back on-prem can be significant: New hardware, increased
maintenance, and energy expenses should all be factored in. Yet, for some, the
financial trade-off for repatriation is worth it, especially if cloud expenses
become unsustainable or if significant savings can be achieved by managing
resources partially on-prem. Cloud repatriation is a calculated risk that, if
done for the right reasons and executed successfully, can lead to efficiency and
peace of mind for many companies. ... Hybrid cloud observability tools empower
organizations to track performance, boost productivity, ensure system health,
and swiftly resolve issues, leading to reduced downtime, fewer outages, and
enhanced service availability for both employees and customers. By enhancing
transparency and intelligence, observability tools ultimately strengthen the
resilience of the entire IT infrastructure — no matter what path a company takes
regarding the cloud. When deciding which workloads to move back on-prem versus
which to keep in the cloud, companies should carefully consider their specific
needs, such as cost constraints, performance requirements, and compliance
obligations.
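The financial trade-off mentioned above ultimately comes down to arithmetic. A minimal, hypothetical comparison of annualized cloud spend against on-prem hardware, maintenance, and energy costs might look like the sketch below; all figures and the amortization period are assumptions.

```python
def annual_on_prem_cost(hardware: float, years_amortized: int,
                        maintenance_per_year: float, energy_per_year: float) -> float:
    """Annualize on-prem costs: amortized hardware plus recurring expenses."""
    return hardware / years_amortized + maintenance_per_year + energy_per_year

# Hypothetical figures, in USD per year.
cloud_spend = 480_000
on_prem = annual_on_prem_cost(hardware=600_000, years_amortized=4,
                              maintenance_per_year=90_000, energy_per_year=40_000)

print(f"Cloud: ${cloud_spend:,.0f}/yr  On-prem: ${on_prem:,.0f}/yr")
print("Repatriation saves money" if on_prem < cloud_spend else "Cloud remains cheaper")
```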
Solve People Silo Problems to Solve Data Silo Problems
According to Farooq, a critical aspect of any Chief Data Officer’s role is
ensuring that data is both accessible and fit for purpose, supporting everything
from revenue generation to regulatory compliance and risk management. Achieving
these goals requires a robust data strategy that is closely aligned with the
overall business strategy. Ultimately, it's all about the data. Reflecting on
the challenges of integrating GenAI into institutions, particularly in managing
unstructured data, Farooq likens the AI journey to Gartner's hype cycle,
highlighting how innovations initially peak with inflated expectations before
experiencing a period of disillusionment. However, unlike trends such as
blockchain, Farooq believes AI is here to stay and will follow a different
trajectory, leading to lasting productivity gains. From a data governance
perspective, Farooq sees immediate practical applications for GenAI, such as
creating synthetic test data for market scenarios and writing data quality
rules. He emphasizes the importance of democratizing AI across all levels of an
organization, similar to how data literacy became crucial for everyone—from CEOs
to marketers.
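As an illustration of the kind of data quality rule Farooq mentions, the sort of check GenAI might be asked to draft, here is a minimal, hypothetical example; the field names and allowed values are assumptions for illustration only.

```python
from typing import Any, Dict, List

def check_record(record: Dict[str, Any]) -> List[str]:
    """Apply simple data quality rules to one record, returning any violations."""
    violations = []
    if not record.get("customer_id"):                        # completeness rule
        violations.append("customer_id is missing")
    amount = record.get("trade_amount")
    if amount is not None and amount < 0:                    # validity rule
        violations.append("trade_amount must be non-negative")
    if record.get("currency") not in {"USD", "EUR", "GBP"}:  # domain rule
        violations.append("currency outside allowed set")
    return violations

print(check_record({"customer_id": "", "trade_amount": -5, "currency": "JPY"}))
```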
Quote for the day:
"The whole point of getting things done is knowing what to leave undone."
-- Lady Stella Reading