Crafting Leaders: The finishing touches
The process of narrowing the funnel for identifying future leaders must commence
soon after fresh talent is inducted into the organization, and certainly long
before organizational knocks have bled the spirit, energy and
desire-to-be-different from these young men and women. An earlier column
explained how alternative fast-track schemes function and how to choose and
groom future leaders from the early stages. More recently, I have added two
codas to the exposition. When choosing leaders to face the uncertainties of
tomorrow, it is not enough to capture their capabilities at the time of
selection; we must also take into account the steepness of the slope they have
traversed to get there. That is the best guarantee of future resilience and
continued development in spite of handicaps. Moreover, constraints of time and a
shortage of the right kind of teachers prevent those racing to the top of the
pyramid from formally refreshing their knowledge and capabilities as frequently
as they should. ... The grooming of Fast-Trackers (FTers) must vary
substantially from company to company and from individual to individual.
The undeniable benefits of making cyber resiliency the new standard
"It's about practicing due care and due diligence from a cybersecurity
standpoint and having a layered defense with a layered
people-process-and-technology-driven program with the right governance and
services and tools to enable the mission of the organization so that if there's
an event, you can recover and adapt to keep business running," he adds. To do
that, CISOs and their executive colleagues must have their cybersecurity basics
well established -- basics such as knowing their tolerance for risk,
understanding their IT environment, their security controls, their
vulnerabilities, and how those all could impact the organization's operations.
CISOs aren't limited to these frameworks or the assessment tools created
specifically to measure cyber resiliency, say Tenreiro de Magalhaes and others.
CISOs can also run tabletop drills and red-team exercises to test, measure and
report on resiliency. Repeating such drills and exercises over time can then
show whether the organization's cybersecurity program, as well as specific
additions to it, is improving resiliency, experts say.
Hybrid work is in trouble. Here are 4 ways to make it work in the longer term
"We're all humans and we work with each other," he says. "To make hybrid working
effective, there must be an element of interaction. There must be a
connectivity, both to the business and your team." Warne says balance is
essential, so find the right reasons for bringing people together in the office.
"At River Island, it's about making sure that people are in for a purpose and
not just presenteeism, and making sure that the people who need to work together
are able to work together," he says. "If you work with a colleague, it's crucial
you don't have a situation where one of you comes into the office and the other
one works from home." Warne says his team doesn't have mandated days in the
office. Instead, his organization's hybrid-working strategy is all about
collaboration. ... However, hybrid working has allowed for an even higher
level of flexibility in her organization -- and the key to success has been
constant communication. Cousineau continues to listen to feedback from her team.
One staff member suggested that hybrid all-team meetings were creating a big
divide between those who were present and those who weren't.
Evolution of stronger cyber threat actors: The flip side of the Gen AI story
Deepfake technology, a subset of Generative AI, allows threat actors to create
convincing video and audio forgeries. This presents a substantial threat to
organisations as deepfake attacks can tarnish reputations, manipulate public
opinion, and even influence financial markets. Imagine a scenario where a CEO’s
voice is convincingly mimicked, disseminating false information that impacts
stock prices; or consider a deepfake video of a prominent figure endorsing a
product or idea they never actually supported. Such manipulations can lead to
severe consequences for businesses and society at large. Generative AI is
revolutionising the way malware is created. Threat actors can use AI algorithms
to generate highly evasive and adaptable malware variants that can easily evade
traditional signature-based antivirus solutions. These AI-generated malware
strains constantly evolve, making detection and containment a significant
challenge for cybersecurity professionals. Moreover, Generative AI allows for
the customisation of malware based on the target environment.
The CIO’s primary job: Developing future IT leaders
The challenge for IT management is to find people who are good at their current
job but are also interested in the management side that is necessary for
departmental success. In my opinion, the reason many IT departments have decided
to go outside IT to bring in CIOs is because IT has not fostered the kind of
environment that develops these types of professionals. IT has not traditionally
tried very hard to develop strong managers from within. Most people learn to
manage by watching what their managers do. And if people have bad managers, the
results can be less than optimum. So how do we change that conundrum? First, we
must commit our current managers and supervisors to a strong management training
program. Once they have been trained in the subtleties of management, then we
hopefully will begin to see new managers with skills developed from within.
Effective management training can, and should, be structured around techniques
that successful current managers use. Delegating effectively and encouraging
career growth among staff are two examples.
Evolution of Data Partitioning: Traditional vs. Modern Data Lakes
In modern data lakes, data is organized into logical partitions based on
specific attributes or criteria, such as day, hour, year, or region. Each
partition acts as a subset of the data, making it easier to manage, query, and
optimize data retrieval. Partitioning enhances both data organization and
query performance. Instead of relying solely on directory-based partitioning
or basic column-based partitioning, these systems provide support for complex,
nested, and multi-level partitioning structures. This means that data can be
partitioned using multiple attributes simultaneously, allowing for highly
efficient data pruning during queries. ... Snapshots are a fundamental concept
used to capture and manage different versions or states of a table at specific
points in time. Snapshots are a key feature enabling time travel, data
auditing, schema evolution, and query consistency within modern data lakes
built on table formats like Apache Iceberg. Key features of snapshots include
the following: each snapshot represents a specific version of the data table,
and creating a snapshot essentially freezes the state of the table at the
moment it is taken.
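The multi-attribute partition pruning described above can be illustrated with a minimal Python sketch. This is a toy model, not the API of any real data lake engine: rows are bucketed by a hypothetical (year, region) partition key, and a query with partition predicates skips whole partitions that cannot match, which is the essence of pruning.

```python
# Toy multi-level partitioning: rows are bucketed by (year, region).
# A query with partition predicates touches only matching partitions,
# never scanning the rest -- the essence of partition pruning.
from collections import defaultdict

partitions = defaultdict(list)  # (year, region) -> list of rows

def write(row):
    partitions[(row["year"], row["region"])].append(row)

def query(year=None, region=None):
    hits = []
    for (p_year, p_region), rows in partitions.items():
        # prune: skip entire partitions that cannot match the predicates
        if year is not None and p_year != year:
            continue
        if region is not None and p_region != region:
            continue
        hits.extend(rows)
    return hits

write({"year": 2023, "region": "eu", "amount": 10})
write({"year": 2023, "region": "us", "amount": 20})
write({"year": 2024, "region": "eu", "amount": 30})
print(len(query(year=2023, region="eu")))  # 1
print(len(query(year=2023)))               # 2
```

A real engine applies the same idea at file or directory granularity, using partition metadata to decide which files never need to be opened.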
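The snapshot idea can likewise be sketched in a few lines of Python. This is an illustration of the concept, not Iceberg's actual implementation (which tracks snapshots as metadata over immutable data files rather than copying rows): every commit freezes an immutable version of the table, and reads can "time travel" to any earlier snapshot.

```python
from copy import deepcopy

class SnapshotTable:
    """Toy table that freezes an immutable snapshot on every commit,
    so earlier versions can still be queried ('time travel')."""

    def __init__(self):
        self.rows = []
        self.snapshots = []  # list of (snapshot_id, frozen copy of rows)

    def commit(self, new_rows):
        self.rows.extend(new_rows)
        snapshot_id = len(self.snapshots)
        # deepcopy freezes the table state at the moment of the commit
        self.snapshots.append((snapshot_id, deepcopy(self.rows)))
        return snapshot_id

    def read(self, snapshot_id=None):
        if snapshot_id is None:
            return self.rows                     # current state
        return self.snapshots[snapshot_id][1]    # time travel

table = SnapshotTable()
s0 = table.commit([{"id": 1, "region": "eu"}])
s1 = table.commit([{"id": 2, "region": "us"}])
print(len(table.read(s0)))  # 1 row as of the first snapshot
print(len(table.read(s1)))  # 2 rows as of the second
```

Because each snapshot is immutable, the same mechanism supports consistent reads (a query pinned to one snapshot never sees concurrent writes) and auditing (every historical state remains inspectable).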
Will Quantum Computers Become the Next Cyber-Attack Platform?
A quantum cyberattack would likely be similar to today’s identity theft and
data breaches. “The only difference is that the damage would be more
widespread, since quantum computers could attack a broad class of encryption
algorithms rather than just the particular way that a company or data center
implements the algorithm, which is how attacks are currently done,” explains
Eric Chitambar, associate professor of electrical and computer engineering at
the Grainger College of Engineering at the University of Illinois
Urbana-Champaign. Chitambar also leads the college’s Quantum Information
Group. ... Conducting an enterprise-wide quantum risk assessment to help
identify systems that might be most vulnerable to a quantum attack would be a
good place to start, Staab says. He also recommends deploying enterprise-wide
Quantum Random Number Generator (QRNG) technology to generate
quantum-resistant encryption keys. This approach supports crypto agility, the
implementation of Quantum Key Distribution (QKD), and the development of
quantum-resistant algorithms. “As we head toward a quantum computing era,
adopting a zero-trust architecture will become more important than ever,”
Staab states.
6 Reasons Private LLMs Are Key for Enterprises
Private LLMs can be used with sensitive data — such as hospital patient
records or financial data — letting you apply the power of generative AI to
achieve groundbreaking results in these fields. With the LLM running on your
private infrastructure and only exposed to the people who should have access
to it, you can build powerful customer-focused applications, chatbots or just
provide an easier way for your employees to interact with your company data —
without the risk of sending the data to a third party. ... With private LLMs,
you can tailor the model and response to your company, industry or customers’
needs. Such specific information is not likely to be included in general or
public LLMs. You can feed your LLM with customer support cases, internal
knowledge-base articles, sales data, application usage data and so much more,
ensuring that the responses you receive are what you’re looking for. ...
Controlling the versioning of the model you’re using is extremely important,
because if you change the model that you use to create embeddings, you will
need to re-create (or version) all the embeddings you store.
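That versioning requirement can be made concrete with a small Python sketch. The store, the `fake_embed` stand-in, and the method names are all hypothetical, not any real vector database's API; the point is that vectors from different model versions live in different spaces, so each stored vector is tagged with the model version that produced it, and a model change forces re-embedding.

```python
# Toy embedding store that records which model version produced each
# vector. A query embedded with model v2 must never be compared against
# vectors stored under v1 -- they live in different vector spaces.

def fake_embed(text, model_version):
    # stand-in for a real embedding model call
    return [float(len(text)), float(model_version)]

class EmbeddingStore:
    def __init__(self):
        self.items = []  # (model_version, text, vector)

    def add(self, text, model_version):
        self.items.append((model_version, text,
                           fake_embed(text, model_version)))

    def reembed_all(self, new_version):
        # the model changed: every stored vector must be re-created
        self.items = [(new_version, t, fake_embed(t, new_version))
                      for (_, t, _) in self.items]

    def search_space(self, model_version):
        # only vectors from the matching model version are comparable
        return [(t, v) for (mv, t, v) in self.items if mv == model_version]

store = EmbeddingStore()
store.add("refund policy", model_version=1)
store.add("shipping times", model_version=1)
print(len(store.search_space(1)))  # 2
store.reembed_all(new_version=2)
print(len(store.search_space(1)))  # 0 -- old-space vectors are gone
print(len(store.search_space(2)))  # 2
```

An alternative to re-embedding in place is to keep both versions side by side and cut queries over to the new space only once it is fully populated, which is what the "(or version)" option in the text refers to.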
Tech Revolution: The Rise of Automation and Its Impact on Society
To offset potential adverse effects, it is imperative for companies and
governments to enact policies and initiatives supporting workers susceptible
to automation’s impact. This may encompass training programs designed to
furnish workers with the requisite skills to excel in the evolving job market,
along with social support programs to aid those grappling with employment
challenges. Public policy will emerge as a pivotal determinant of
technological evolution’s trajectory and consequences. Economic incentives,
education reforms, and immigration policies will directly influence
productivity, employment levels, and enhanced economic mobility. ... Central
and state government agencies ought to collaborate with industry partners and
educational institutions to craft programs that equip new workers with the
skills needed to thrive in an automation-driven world. These programs have the
potential to combat emerging inequality by advancing education and training
initiatives that foster success for all.
When open source cloud development doesn't play nice
Remember that the cloud provider is merely “providing” the open source
software. They are not typically supporting it beyond that. For more, you’ll
need to look internally or in other places. Open source users, whether in the
cloud or not, often have to rely on community resources, typically provided
through forums or message boards, which takes time. This can impede cloud
development progress in urgent, time-sensitive scenarios or when issues are
complex. A
developer told me once that she needed to attend a meeting of the open source
community before she could have a resolution to a specific problem—a meeting
that was five weeks out. That won’t work. From a security standpoint, open
source software can pose specific challenges. Although a community of
developers regularly reviews such software, it can still harbor undetected
vulnerabilities, primarily because its code is openly accessible. For
instance, some open source supply chain issues arose a few years ago. These
vulnerabilities can become severe security threats without stringent security
measures and frequent updates.
Quote for the day:
"Sometimes it takes a good fall to
really know where you stand." -- Hayley Williams