Humanoid robots are a bad idea
Humanoid robots that talk, perceive social and emotional cues, elicit empathy
and trust, trigger psychological responses through eye contact, and trick
us into the false belief that they have inner thoughts, intentions and even
emotions create for humanity what I consider a real problem. Our response to
humanoid robots is based on delusion. Machines — tools, really — are being
deliberately designed to hack our human hardwiring and deceive us into
treating them as something they’re not: people. In other words, the whole
point of humanoid robots is to dupe the human mind, to mislead us into having
the kind of connection with these machines formerly reserved exclusively for
other human beings. Why are some robot makers so fixated on this outcome? Why
isn’t the goal instead to create robots that are perfectly designed for their
function, rather than perfectly designed to trick the human mind? Why isn’t
there a movement to make sure robots do not elicit false emotions and beliefs?
What’s the harm in preserving our intuition that a robot is just a machine,
just a tool? Why try to route around that intuition with machines that trick
our minds, co-opting or hijacking our human empathy?
11 Irritating Data Quality Issues
Organizations need to put data quality first and AI second. When leaders ignore
this sequence, fear of missing out (FOMO) takes over: they grasp at AI-driven
cures for competitive or budget pressures and jump straight into AI adoption
before conducting any honest self-assessment
as to the health and readiness of their data estate, according to Ricardo
Madan, senior vice president at global technology, business and talent
solutions provider TEKsystems. “This phenomenon is not unlike the cloud
migration craze of about seven years ago, when we saw many organizations
jumping straight to cloud-native services, after hasty lifts-and-shifts, all
prior to assessing or refactoring any of the target workloads. This sequential
dysfunction results in poor downstream app performance since architectural
flaws in the legacy on-prem state are repeated in the cloud,” says Madan in an
email interview. “Fast forward to today, AI is a great ‘truth serum’ informing
us of the quality, maturity, and stability of a given organization’s existing
data estate -- but instead of facing unflattering truths, invest in holistic
AI data readiness first, before AI tools.”
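As a rough illustration of what that self-assessment can look like in practice, here is a minimal sketch of a data-estate health check in Python with pandas; the file name, columns, and thresholds are illustrative assumptions rather than anything TEKsystems prescribes.

```python
# Minimal sketch of a data-estate health check, assuming a tabular dataset
# loadable with pandas; the dataset path and the 20% null-rate threshold are
# illustrative assumptions, not part of the original article.
import pandas as pd

def profile_data_quality(df: pd.DataFrame) -> dict:
    """Return a few basic readiness signals for a single table."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        # Share of missing values per column, sorted worst-first.
        "null_rate_by_column": df.isna().mean().sort_values(ascending=False).to_dict(),
        # Columns with a single distinct value carry no signal for ML.
        "constant_columns": [c for c in df.columns if df[c].nunique(dropna=True) <= 1],
    }

if __name__ == "__main__":
    df = pd.read_csv("customer_records.csv")  # hypothetical file name
    report = profile_data_quality(df)
    # A crude readiness gate: fail loudly before any AI work starts.
    worst_null_rate = max(report["null_rate_by_column"].values(), default=0.0)
    if report["duplicate_rows"] > 0 or worst_null_rate > 0.2:
        raise SystemExit(f"Data estate not AI-ready: {report}")
    print("Basic quality checks passed:", report)
```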
CISOs urged to prepare now for post-quantum cryptography
Post-quantum algorithms often require larger key sizes and more computational
resources compared to classical cryptographic algorithms, a challenge for
embedded systems, in particular. During the transition period, systems will
need to support both classical and post-quantum algorithms to maintain
interoperability with legacy systems. Deirdre Connolly, cryptography
standardization research engineer at SandboxAQ, explained: “New cryptography
generally takes time to deploy and get right, so we want to have enough lead
time before quantum threats are here to have protection in place.” Connolly
added: “Particularly for encrypted communications and storage, that material
can be collected now and stored for a future date when a sufficient quantum
attack is feasible, known as a ‘Store Now, Decrypt Later’ attack: upgrading
our systems with quantum-resistant key establishment protects our present-day
data against upcoming quantum attackers.” Standards bodies, hardware and
software manufacturers, and ultimately businesses across the globe will have
to implement new cryptography across all aspects of their computing systems.
Work is already under way, with vendors such as BT, Google, and Cloudflare
among the early adopters.
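To make the transition-period requirement concrete, here is a minimal sketch of hybrid key establishment in Python: the classical half uses X25519 from the cryptography package, while the post-quantum KEM is left as a placeholder, since real deployments would plug in an ML-KEM implementation whose API this sketch only assumes. The two shared secrets are bound together with HKDF so the derived key holds up as long as either algorithm does.

```python
# Sketch of hybrid key establishment: combine a classical X25519 shared secret
# with a post-quantum KEM shared secret so the session key stays safe if either
# algorithm is broken. The PQC KEM call is a placeholder -- a real deployment
# would use an ML-KEM library; its API here is an assumption.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def pqc_encapsulate(peer_pqc_public_key: bytes) -> tuple[bytes, bytes]:
    """Placeholder for ML-KEM encapsulation returning (ciphertext, shared_secret)."""
    raise NotImplementedError("plug in a real post-quantum KEM library here")

def hybrid_session_key(peer_x25519_public_key, peer_pqc_public_key: bytes):
    # Classical half: ephemeral X25519 Diffie-Hellman.
    eph = X25519PrivateKey.generate()
    classical_secret = eph.exchange(peer_x25519_public_key)

    # Post-quantum half: KEM encapsulation against the peer's PQC public key.
    ciphertext, pq_secret = pqc_encapsulate(peer_pqc_public_key)

    # Bind both secrets together; the derived key is only weak if BOTH halves fail.
    key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid-x25519+pqc-kem",
    ).derive(classical_secret + pq_secret)

    # The ephemeral X25519 public key and the KEM ciphertext would be sent to
    # the peer so it can derive the same key on its side.
    return eph.public_key(), ciphertext, key
```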
AI for application security: Balancing automation with human oversight
Security testing should be integrated throughout Application Delivery
Pipelines, from design to deployment. Techniques such as automated
vulnerability scanning, penetration testing, continuous monitoring, and many
others are essential. By embedding compliance and risk assessment tasks into
underlying change management processes, IT professionals can ensure that
security testing is at the core of everything they do. Incorporating these
strategies at the application component level ensures alignment with business
needs to effectively prioritize results, identify attacks, and mitigate risks
before they impact the network and infrastructure. ... To build a
security-first mindset, organizations must embed security best practices into
their culture and workflows. If new IT professionals coming into an
organization are taught that security-first isn’t a buzzword, but instead the
way the organization operates, it becomes company culture. Making security an
integral part of the application delivery pipelines ensures that security
policies and processes align with business goals. Education and communication
are key—security teams must work closely with developers to ensure that
security requirements are understood and valued.
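As one concrete way to embed automated vulnerability scanning into a delivery pipeline, the sketch below runs a dependency scanner as a blocking gate. The choice of pip-audit and the handling of its JSON output are assumptions about the toolchain; the point is the pattern of scanning, parsing, and failing the build on findings.

```python
# Minimal sketch of an automated vulnerability-scan gate for a pipeline stage.
# The scanner (pip-audit) and its output handling are toolchain assumptions.
import json
import subprocess
import sys

def run_dependency_scan() -> list[dict]:
    """Run the scanner and return a list of dependency records."""
    proc = subprocess.run(
        ["pip-audit", "--format", "json"],
        capture_output=True,
        text=True,
    )
    # pip-audit typically exits non-zero when it finds vulnerabilities, so we
    # parse stdout regardless of the return code and decide for ourselves.
    try:
        data = json.loads(proc.stdout or "{}")
    except json.JSONDecodeError:
        sys.exit(f"scanner produced unreadable output: {proc.stderr}")
    # Depending on the pip-audit version, the JSON root is either a list of
    # dependencies or an object with a "dependencies" key.
    return data.get("dependencies", []) if isinstance(data, dict) else data

if __name__ == "__main__":
    vulnerable = [d for d in run_dependency_scan() if d.get("vulns")]
    if vulnerable:
        print(f"Blocking deployment: {len(vulnerable)} vulnerable dependencies")
        for dep in vulnerable:
            print(f"  {dep.get('name')} {dep.get('version')}")
        sys.exit(1)
    print("Dependency scan clean; pipeline stage may proceed.")
```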
TSA biometrics program is evolving faster than critics’ perceptions
Privacy impact assessments (PIAs) are not only carried out for each new or
changed process, but also published and enforced. The images of U.S. citizens
captured by the TSA may be evaluated and used for testing, but they are
deleted within 12 hours. Travelers have the choice of opting out of biometric
identity verification, in which case they go through a manual ID check, just
like decades ago. As happened previously with body scanners, TSA has adapted
the signage it uses to notify the public about its use of biometrics. Airports
where TSA uses biometrics now have signs that state in bold letters that
participation is optional, explain how it works and include QR codes for
additional information. The technology is also highly accurate, with tests
showing 99.97% accurate verifications. When a face does not match, the
traveler goes through the same manual ID check used before biometrics and
still used for travelers who opt out. TSA does not use biometrics to match
people against mugshots from local police departments, for deportations or
surveillance. In contrast, the proliferation of CCTV cameras observing people
on their way to the airport and back home is not mentioned by Senator Merkley.
Blockchain: Redefining property transactions and ownership
Blockchain’s core strength lies in its ability to create a secure, immutable
ledger of transactions. In the real estate context, this means that all
details related to a property transaction— from the initial agreement to the
final transfer of ownership—are recorded in a way that cannot be altered or
tampered with. Blockchain technology empowers brokers to streamline
transactions and enhance transparency, allowing them to focus on offering
personalised insights and strategic advice. This shift enables brokers to
provide a more efficient and cost-effective service while maintaining their
advisory role in the real estate process. Another innovative application of
blockchain in real estate is through smart contracts. These are digital
contracts that automatically execute when certain conditions are met, ensuring
that the terms of an agreement are fulfilled without the need for manual
oversight. In real estate, smart contracts can be used to automate everything
from title transfers to escrow arrangements. This automation not only speeds
up the process but also reduces the chances of disputes, as all terms are
clearly defined and executed by the technology itself. Beyond improving the
efficiency of transactions, blockchain also has the potential to change how we
think about property ownership.
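For a sense of what "automatic execution when conditions are met" means, here is a conceptual sketch of escrow logic as a plain-Python state machine. Real smart contracts run on-chain in languages such as Solidity; the class, condition names, and values here are illustrative only.

```python
# Conceptual sketch of the escrow logic a real-estate smart contract would
# encode: funds are released only once every agreed condition is met. This is
# not on-chain code; it only models the decision logic with illustrative names.
from dataclasses import dataclass, field

@dataclass
class EscrowContract:
    buyer: str
    seller: str
    price: int                       # purchase price held in escrow
    conditions: dict[str, bool] = field(default_factory=lambda: {
        "title_verified": False,     # clean title confirmed on the ledger
        "inspection_passed": False,  # property inspection signed off
        "funds_deposited": False,    # buyer's funds received in escrow
    })
    settled: bool = False

    def mark_condition(self, name: str) -> None:
        if name not in self.conditions:
            raise KeyError(f"unknown condition: {name}")
        self.conditions[name] = True
        self._try_settle()

    def _try_settle(self) -> None:
        # "Automatic execution": once every condition holds, the transfer
        # happens without manual oversight.
        if not self.settled and all(self.conditions.values()):
            self.settled = True
            print(f"Releasing {self.price} to {self.seller}; "
                  f"title transfers to {self.buyer}.")

deal = EscrowContract(buyer="0xBuyer", seller="0xSeller", price=350_000)
for step in ("funds_deposited", "inspection_passed", "title_verified"):
    deal.mark_condition(step)   # settlement fires automatically on the last one
```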
Agile Reinvented: A Look Into the Future
There’s no denying that agile is poised at a pivotal juncture, especially
given the advent of AI. While no one knows how AI will influence agile in the
long term, it is already shaping how agile teams are structured and how their
members approach their work, including using AI tools to code or write user
stories and jobs to be done. To remain relevant and impactful, agile must be
responsive to the evolving needs of the workforce. Younger developers, in
particular, seek more room for creativity. New approaches to agile team
formation—including Team and Org Topologies or FaST, which relies on elements
of dynamic reteaming instead of fixed team structures to tackle complex
work—are emerging to create space for innovation. Since agile was built upon
the values of putting people first and adapting to change, it can, and should,
continue to empower teams to drive innovation within their organizations. This
is the heart of modern agile: not blindly adhering to a set of rules but
embracing and adapting its principles to your team’s unique circumstances. As
agile continues to evolve, we can expect to see it applied in even more varied
and innovative ways. For example, it already intersects with other
methodologies like DevSecOps and Lean to form more comprehensive
frameworks.
Breaking Free from Ransomware: Securing Your CI/CD Against RaaS
By embracing a proactive DevSecOps mindset, we can repel RaaS attacks and
safeguard our code. Here’s your toolkit: ... Don’t wait until deployment to
tighten the screws. Integrate security throughout the software development
life cycle (SDLC). Leverage software composition analysis (SCA) and software
bill of materials (SBOM) creation, helping you scrutinize dependencies for
vulnerabilities and maintain a transparent record of every software component
in your pipeline. ... Your pipelines aren’t static entities; they are living
ecosystems demanding constant vigilance. Leverage tools to implement continuous
monitoring and logging of pipeline activity. Look for anomalies, suspicious
behaviors and unauthorized access attempts. Think of it as having a
cybersecurity hawk perpetually circling your pipelines, detecting threats
before they take root. ... Minimize unnecessary access to your CI/CD
environment. Enforce strict role-based access controls and least privilege.
Utilize access control tools to manage user roles and permissions tightly,
ensuring only authorized users can interact with sensitive resources.
Remember, the 2022 GitHub vulnerability exposed the dangers of lax access
control in CI/CD environments.
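As a sketch of the SCA/SBOM step above, the snippet below parses a CycloneDX-style SBOM and flags components whose versions appear in an advisory list. The field names follow CycloneDX conventions, while the advisory data is a stand-in for a real vulnerability feed such as OSV or the NVD.

```python
# Sketch of the SCA/SBOM step: read a CycloneDX-style SBOM and flag components
# whose versions appear in an advisory list. The advisory entries below are
# illustrative stand-ins for a real vulnerability database.
import json

# Hypothetical advisories: component name -> set of known-vulnerable versions.
ADVISORIES = {
    "log4j-core": {"2.14.1", "2.15.0"},
    "minimist": {"1.2.5"},
}

def vulnerable_components(sbom_path: str) -> list[tuple[str, str]]:
    with open(sbom_path) as fh:
        sbom = json.load(fh)
    hits = []
    for component in sbom.get("components", []):
        name, version = component.get("name"), component.get("version")
        if version in ADVISORIES.get(name, set()):
            hits.append((name, version))
    return hits

if __name__ == "__main__":
    findings = vulnerable_components("sbom.cyclonedx.json")  # hypothetical path
    if findings:
        raise SystemExit(f"Pipeline blocked; vulnerable components: {findings}")
    print("SBOM check passed: no known-vulnerable components.")
```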
Achieving cloudops excellence
Although there are no hard-and-fast rules regarding how much to spend on
cloudops as a proportion of the cost of building or migrating applications, I
have a few rules of thumb. Typically, enterprises should spend 30% to 40% of
their total cloud computing budget on cloud operations and management. This
covers monitoring, security, optimization, and ongoing management of cloud
resources. ... Cloudops requires a new skill set. Continuous training and
development programs that focus on operational best practices are vital. This
transforms the IT workforce from traditional system administrators to cloud
operations specialists who are adept at leveraging cloud environments’ nuances
for efficiency. Beyond technical implementations, enterprise leaders must
cultivate a culture that prioritizes operational readiness as much as
innovation. The essential components are clear communication channels,
cross-departmental collaboration, and well-defined roles. Organizational
coherence enables firms to pivot and adapt swiftly to the changing tides of
technology and market demands. It’s also crucial to measure success by
deployment achievements and ongoing performance metrics. By setting clear
operational KPIs from the outset, companies ensure that cloud environments are
continuously aligned with business objectives.
What high-performance IT teams look like today — and how to build one
“Today’s high-performing teams are hybrid, dynamic, and autonomous,” says Ross
Meyercord, CEO of Propel Software. “CIOs need to create a clear vision and
articulate and model the organization’s values to drive alignment and
culture.” High-performance teams are self-organizing and want significant
autonomy in prioritizing work, solving problems, and leveraging technology
platforms. But most enterprises can’t operate like young startups with
complete autonomy handed over to devops and data science teams. CIOs should
articulate a technology vision that includes agile principles around
self-organization and other non-negotiables around security, data governance,
reporting, deployment readiness, and other compliance areas. ...
High-performance teams are often involved in leading digital transformation
initiatives where conflicts around priorities and solutions among team members
and stakeholders can arise. These conflicts can turn into heated debates, and
CIOs sometimes have to step in to help manage challenging people issues. “When
a CIO observes misaligned goals or intra-IT conflict, they need to step in
immediately to prevent organizational scar tissue from forming,” says
Meyercord of Propel Software.
Quote for the day:
"Don't necessarily and sharp edges. Occasionally they are necessary to
leadership." -- Donald Rumsfeld