'Leadership? No, Thank You': Navigating A New Organizational Environment Model
The culture that pushes people to be leaders frequently sugarcoats a position
like that by showing all the advantages, juicy challenges, fancy bonuses and
sparkly cars. The reality is that the responsibility is heavy and being a
leader is a lot closer to being a psychologist/coach/rescuer/mom/dad than a
hands-on worker. A leadership position calls for emotional intelligence
growth, great adaptability and, believe it or not, ego detachment. A great
leader is one who does not hoard talent but lets people fly, knows that the
best team is made of people who are different from and better than they are,
learns how to hold pressure and remain calm and, above all, can be trusted.
Show workers this truth. ... Sometimes, declining a leadership position may
simply mean that one is afraid of it, not that one truly doesn't want it. We
all know that. Companies can and must push people out of their comfort zones,
but they also need to balance that push with respect for individual
preferences. How? By training people before they take on a leadership role.
Yes, before: most companies train leaders only after they have assumed the role.
Modern Application Management Requires Deeper Internet Visibility
Unfortunately, most IT teams today have limited ability to discern how the
performance of Internet services is impacting their applications. There are,
of course, Internet performance management (IPM) tools capable of surfacing
network performance metrics. The challenge and opportunity now is to surface
those metrics in context with all the other telemetry data that DevOps teams
collect from the various application performance management (APM) and
observability platforms they rely on to monitor and troubleshoot application
environments. ... Broadly, there are three major classes of blind spots that
impact distributed application performance. The first and arguably most opaque
are the services provided by third-party vendors. These range from content
delivery networks (CDNs) to software-as-a-service (SaaS) applications, and each
is controlled by an external service provider that typically doesn't allow a
DevOps team to collect telemetry data by deploying agent software in its IT
environment. At best, they may expose an application
programming interface (API) to enable an agentless approach to collecting
data, but that method doesn’t typically provide the level of control required
to optimize application performance.
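The agentless, API-based collection the passage describes can be sketched as a simple poller that reshapes a provider's metrics to sit alongside other APM telemetry. The endpoint, token, and metric field names below are hypothetical stand-ins; real CDN and SaaS vendors each expose their own API shape and auth scheme.

```python
import json
import urllib.request

# Hypothetical provider metrics endpoint and token (illustration only).
METRICS_URL = "https://api.example-cdn.test/v1/metrics"
API_TOKEN = "changeme"

def fetch_provider_metrics(url: str = METRICS_URL) -> dict:
    """Poll a third-party service's metrics API (agentless collection)."""
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {API_TOKEN}"}
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

def tag_for_observability(metrics: dict, service: str) -> dict:
    """Reshape provider metrics so they land next to agent-based telemetry."""
    return {
        "source": service,
        "collection": "agentless-api",  # vs. agent-deployed collection
        "latency_ms": metrics.get("edge_latency_ms"),
        "error_rate": metrics.get("error_rate"),
    }
```

As the article notes, such polling gives far coarser control than an in-environment agent: you only see the fields the provider chooses to publish, at the granularity it chooses to publish them.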
Ruby on Rails Is Not Dead and May Even Be AI Panacea for Devs
Ruby on Rails has always been promoted as a tool that a single person can use
to create a web application — that’s why it was so popular with Web 2.0
entrepreneurs. The Rails website in April 2005 described the framework as “a
full-stack, open-source web framework in Ruby for writing real-world
applications with joy and less code than most frameworks spend doing XML
sit-ups.” While XML is no longer a factor in 2024, DHH continues to do
interviews espousing the “joy and less code” philosophy. In an interview with
the devtools.fm podcast last month, he even suggested this approach will help
developers adapt in the current generative AI era. “As we are now facing
perhaps an existential tussle with AI,” he said, “I think it’s never been more
important that the way we design programming languages is designed for the
human first. The human needs all the help the human can get, if we’re going to
have any chance to remain not only just valuable, but relevant as a
programmer. And maybe that’s a lost cause anyway, but at least in the last 20
years that I’ve been working with Ruby on Rails, I’ve seen that bet just pay
[off] over and over again.”
Business leaders can no longer afford to wait until disruptions occur to measure their financial impact. They need insights to protect the customer and their financial bottom line as quickly and seamlessly as possible. AI and ML provide the means to achieve such agility, offering “quick wins” in the form of immediate financial value. By harnessing accelerators to automate data capture and deliver intelligent insights, businesses can cut the lead time to capture data from several weeks to near real time, obtaining optimized recommendations at the point of disruption across the value network and protecting both the customer experience and the financial bottom line in near real time. ... AI contributes to decision intelligence in supply chains. A good example of decision-making processes that have been enhanced by AI is the Amazon Scan, Label, Apply & Manifest (SLAM) process. When a customer places an order, multiple microservices and intelligent algorithms run to find the optimal way to fulfill it, based on the customer promise and the best financial business outcome.
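The SLAM-style decision the excerpt describes, weighing the customer promise against the financial outcome, can be sketched as a toy scoring function. The option fields and the selection rule here are invented for illustration and are not Amazon's actual logic.

```python
from dataclasses import dataclass

@dataclass
class FulfillmentOption:
    warehouse: str
    cost: float           # shipping + handling cost, USD
    delivery_days: int    # estimated days to the customer

def best_option(options, promised_days):
    """Pick the cheapest option that still meets the customer promise;
    if no option can meet it, fall back to the fastest one."""
    feasible = [o for o in options if o.delivery_days <= promised_days]
    if feasible:
        return min(feasible, key=lambda o: o.cost)
    return min(options, key=lambda o: o.delivery_days)
```

In a real system each candidate would come from its own microservice (inventory, carrier rates, promise engine), but the trade-off being optimized is the same shape: honor the promise first, then minimize cost.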
Is AI driving tech layoffs?
GenAI simply isn’t ready yet. Just like the internet of 1999, the genAI tools
of 2024 will eventually get there. But in the meantime, I predict, as Gartner
would put it, we’re heading quickly to the “Trough of Disillusionment.” That’s
where the initial burst of excitement over a new technology runs out and
everyone realizes the reality isn’t close to what we all dreamed it would be.
I’ve seen too many of these bubbles over the years and still we fall for it
every time. What’s different now, and why the coming fall will hurt so much,
is that almost every company has fallen under the genAI spell. Not only are
businesses planning to move to it, they’re already replacing the people they
need to get their work done with half-baked AI models. This is going to
greatly accelerate the coming crash. Don’t get me wrong. GenAI will eventually
replace some jobs. But former U.S. Treasury Secretary and current OpenAI board
member Larry Summers gets it right. He recently said, “If one takes a view
over the next generation, this could be the biggest thing that has happened in
economic history since the Industrial Revolution.”
Unlocking the Power of Generative AI in Banking: Insights from Microsoft’s Daragh Morrissey
The first use case I would start with is your developers. It’s the most mature generative AI scenario. And as you build new applications for this, why not build them with generative AI? Then I would think about the out-of-the-box gen AI that you’ll get from us if you start introducing it to Teams and Office. You’ll hit a ton of use cases there that are horizontal across the whole business. Then, you’ll be left with a set of custom use cases. These could be things like a contact center; you could start by just enhancing what you currently have there. You don’t have to rip out your contact center, either. It’s just about adding the capabilities on top. Building a knowledge base is also a great way of learning how to use this inside the organization. ... One of the things we did as well was create this concept of a citizen developer or citizen data scientist. You could just take a set of data, and we can prompt you to say, “It looks like you need one of these models; that could be sentiment analysis or something.” Then, it will build a model with the data.
Critical Bugs Put Hugging Face AI Platform in a 'Pickle'
In examining Hugging Face's infrastructure and ways to weaponize the bugs they
discovered, Wiz researchers found that anyone could easily upload an AI/ML
model to the platform, including those based on the Pickle format. Pickle is a
widely used module for storing Python objects in a file. Though even the
Python Software Foundation itself has deemed Pickle insecure, it remains
popular because of its ease of use and people's familiarity with it.
"It is relatively straightforward to craft a PyTorch (Pickle) model that will
execute arbitrary code upon loading," according to Wiz. Wiz researchers took
advantage of the ability to upload a private Pickle-based model to Hugging
Face that would run a reverse shell upon loading. They then interacted with it
using the Inference API to achieve shell-like functionality, which the
researchers used to explore their environment on Hugging Face's
infrastructure. That exercise quickly showed the researchers their model was
running in a pod in a cluster on Amazon Elastic Kubernetes Service (EKS). ...
With Hugging Face Spaces, Wiz found an attacker could execute arbitrary code
during application build time that would let them examine network connections
from their machine.
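The Pickle risk Wiz describes comes from the `__reduce__` protocol: when an object is unpickled, pickle calls whatever callable the class returned there. A minimal, harmless sketch, using `os.getcwd` as a stand-in for the reverse-shell command in the research:

```python
import os
import pickle

class Payload:
    # pickle serializes this object as "call os.getcwd() on load";
    # an attacker would substitute e.g. os.system with a shell command
    def __reduce__(self):
        return (os.getcwd, ())

blob = pickle.dumps(Payload())
result = pickle.loads(blob)  # the callable runs here, at load time
```

Loading an untrusted Pickle-based checkpoint ultimately deserializes objects like this, which is why the researchers' uploaded model could open a reverse shell the moment the Inference API loaded it.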
Sophisticated Latrodectus Malware Linked to 2017 Strain
While initial analysis suggested Latrodectus is a new variant of IcedID,
subsequent research found that it is a new malware most likely named
Latrodectus because of a string identified in the code. Latrodectus employs
infrastructure used in historic IcedID operations, indicating potential ties
to the same threat actors. IcedID, first discovered in 2017, has been
described as a banking Trojan and remote access Trojan. Researchers gained
insights into the activities of threat actors TA577 and TA578, the primary
distributors of Latrodectus, that illustrate how these actors' tactics have
evolved over time. TA577, previously known for its distribution of Qbot,
used Latrodectus in three campaigns in November 2023 before switching back to
Pikabot. In contrast, TA578 has been predominantly distributing Latrodectus
since mid-January 2024, using contact forms and impersonation techniques to
deliver the malware to targets. Latrodectus functions as a downloader, and its
primary objective is to download payloads and execute arbitrary commands. Its
sandbox evasion techniques are noteworthy, and it shares similarities with the
IcedID malware.
Deceptive AI: The Human-Machine Romance
Like God, AI bots assure us they are omnipresent and omniscient and can be a
panacea for all our emotional needs, a claim that is too good to be true. All
of us, at different points in our lives, have witnessed bots fail miserably
while responding to well-scoped, structured and sequenced business processes.
How on earth, then, can we believe a neural network can handle complex,
unstructured human emotions? The outcomes would be insanely unpredictable. That
is exactly what happened with 21-year-old Jaswant Singh Chail, who was
coerced by a romantic chatbot to break into Windsor Castle to kill the Queen
of England. He is now serving a prison sentence, still firmly believing the AI
bot is an incarnation of the angel who will eventually reunite with him. Don’t
view this scenario in isolation: such AI bots in the hands of extremists could
be a game changer for recruiting and radicalizing young minds to carry out
unspeakable crimes (remember the gory effects of the “Blue Whale” suicide
game?), and unethical businesses could leverage such channels to boost their
product sales.
Six reasons to go colo
Historically, enterprises have built, equipped, and operated their own data
centers according to need – both in capacity and across geographical
locations. While this approach has the advantage of being tailor-made for your
specific operations, the infrastructure lacks the scalability, flexibility and
sometimes even the cloud connectivity required to gain and keep a competitive
advantage in today’s fast-paced markets. Furthermore, the
investments related to constructing your own data center are highly
capital-intensive, which makes it difficult for some companies to pursue such
a strategy. Against this background, the reasons for the fast-paced growth of
the data center colocation industry become easy to grasp. Data center
colocation is a considerably more accessible, scalable, and cost-efficient
solution for your facility. When using data center colocation, you consume the
physical data center ‘as a service’. The often-used expression ‘let your
business focus on what it does best’ applies well here. You get the peace of
mind of knowing that your physical infrastructure has guaranteed uptime, and
you leave the matters of cooling, electricity supply and physical security
to a partner who is an expert in exactly that.
Quote for the day:
"You may only succeed if you desire succeeding; you may only fail if you do not mind failing." -- Philippos