W3C issues new technical draft for verifiable credentials standards
Part of the promise of the W3C standards is the ability to share only the data
that’s necessary for completing a secure digital transaction, Goodwin
explained, noting that DHS’s Privacy Office is charged with “embedding and
enforcing privacy protections and transparency in all DHS activities.” DHS was
brought into the process to review the W3C Verifiable Credentials Data Model and
Decentralized Identifiers framework and to advise on potential issues. DHS
S&T said in a statement last month that “part of the promise of the W3C
standards is the ability to share only the data required for a transaction,”
which it sees as “an important step towards putting privacy back in the hands of
the people.” “Beyond ensuring global interoperability, standards developed by
the W3C undergo wide reviews that ensure that they incorporate security,
privacy, accessibility, and internationalization,” said DHS Silicon Valley
Innovation Program Managing Director Melissa Oh. “By helping implement these
standards in our digital credentialing efforts, S&T, through SVIP, is
helping to ensure that the technologies we use make a difference for people in
how they secure their digital transactions and protect their privacy.”
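As a rough illustration of that "share only what's required" idea, the sketch below (Python, with illustrative values; the AgeVerificationCredential type, the DIDs, and the proof details are assumptions, not DHS's or the W3C's actual examples) shows a W3C-style verifiable credential whose credentialSubject discloses an over-18 claim instead of a full date of birth:

    # Minimal sketch of a W3C-style verifiable credential (illustrative values only).
    # The holder presents an "over 18" claim rather than a full date of birth, so the
    # verifier receives only what the transaction requires.
    minimal_credential = {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential", "AgeVerificationCredential"],  # hypothetical type
        "issuer": "did:example:issuer-123",            # decentralized identifier (illustrative)
        "issuanceDate": "2024-08-01T00:00:00Z",
        "credentialSubject": {
            "id": "did:example:holder-456",
            "ageOver": 18,                             # the only personal attribute disclosed
        },
        "proof": {"type": "Ed25519Signature2020"},     # real credentials carry a full issuer signature
    }

    def fields_disclosed(credential: dict) -> list[str]:
        """List the personal attributes a verifier actually receives."""
        return [key for key in credential["credentialSubject"] if key != "id"]

    print(fields_disclosed(minimal_credential))        # ['ageOver']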
Managing Technical Debt in the Midst of Modernization
Rather than delivering a product and then worrying about technical debt, it is
more prudent to measure and address it continuously from the early stages of a
project, including requirement and design, not just the coding phase. Project
teams should be incentivized to identify improvement areas as part of their
day-to-day work and implement the fixes as and when possible. Early detection
and remediation can help streamline IT operations, improve efficiencies, and
optimize cost. ... Inadequate technical knowledge or limited experience with
current skills is itself a source of technical debt. Enterprises must invest in
and prioritize continuous learning to keep their talent pool up to date with the
latest technologies. A skill-gap analysis helps forecast the need for skills for
future initiatives. Teams should be encouraged to upskill in AI, cloud, and
other emerging technologies, as well as modern design and security standards.
This will help enterprises address the technical-debt skill gap effectively.
Enterprises can also employ a hub and spoke model, where a central team offers
automation and expert guidance while each development team maintains its own
applications, systems, and related technical debt.
Generative AI Adoption: What’s Fueling the Growth?
The banking, financial services, and insurance (BFSI) sector is another area
where generative AI is making a significant impact. In this industry, generative
AI enhances customer service, risk management, fraud detection, and regulatory
compliance. By automating routine tasks and providing more accurate and timely
insights, generative AI helps financial institutions improve efficiency and
deliver better services to their customers. For instance, generative AI can be
used to create personalized customer experiences by analyzing customer data and
predicting their needs. This capability allows banks to offer tailored products
and services, improving customer satisfaction and loyalty. ... The life sciences
sector stands to benefit enormously from the adoption of generative AI. In this
industry, generative AI is used to accelerate drug discovery, facilitate
personalized medicine, ensure quality management, and aid in regulatory
compliance. By automating and optimizing various processes, generative AI helps
life sciences companies bring new treatments to market more quickly and
efficiently. For instance, generative AI can draw on vast amounts of biological
data to identify promising drug candidates far faster than conventional methods.
Overcoming Software Testing ‘Alert Fatigue’
Before “shift left” became the norm, developers would write code that quality
assurance (QA) testing teams would then comb through to identify the initial
bugs in the product. Developers were then only tasked with reviewing the proofed
end product to ensure it functioned as they initially envisioned. But now, the
testing and quality control onus has been put on developers earlier and
earlier. An outcome of this dynamic is that developers are becoming
increasingly numb to the high volume of bugs they are coming across in the
process, and as a result, they are pushing bad code to production. ...
Organizations must ensure that vital testing phases are robust and
well-defined to mitigate these adverse outcomes. These phases should include
comprehensive automated testing, continuous integration (CI) practices, and
rigorous manual testing by dedicated QA teams. Developers should focus on unit
and integration tests, while QA teams handle system, regression, acceptance,
and exploratory testing. This division of labor enables developers to
concentrate on writing and refining code while QA specialists ensure the
software meets the highest quality standards before production.
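As a small illustration of that split, a developer-side check might look like the minimal pytest-style sketch below (discount_price and its rules are hypothetical, not from the article); system, regression, acceptance, and exploratory passes would then be run by QA against the assembled product, and CI would execute tests like these on every commit:

    # Minimal developer-side unit test sketch (pytest style); the function and
    # its rules are hypothetical examples.
    import pytest

    def discount_price(price: float, percent: float) -> float:
        """Apply a percentage discount, rejecting out-of-range percentages."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    def test_discount_applies_percentage():
        assert discount_price(100.0, 25) == 75.0

    def test_discount_rejects_invalid_percent():
        with pytest.raises(ValueError):
            discount_price(100.0, 150)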
SSD capacities set to surge as industry eyes 128 TB drives
Maximum SSD capacity is expected to double from the current 61.44 TB by
mid-2025, giving us 122 TB and even 128 TB drives, with the prospect of
exabyte-capacity racks. Five suppliers have discussed and/or demonstrated
prototypes of 100-plus TB capacity SSDs recently. ... Systems with enclosures
full of high-capacity SSDs will need to cope with drive failure and that means
RAID or erasure coding schemes. SSD rebuilds take less time than HDD rebuilds
but higher-capacity SSDs take longer. For example, rebuilding a 61.44 TB
Solidigm D5-P5336 drive, with its maximum sequential write bandwidth of 3 GBps,
would take approximately 5.7 hours; a 128 TB drive would take 11.85 hours at the
same 3 GBps write rate. These are not insubstantial periods. Kioxia has
devised an SSD RAID parity compute offload scheme with a parity compute block
in the SSD controller and direct memory access to neighboring SSDs to get the
rebuild data. This avoids the host server’s processor getting involved in RAID
parity compute IO and could accelerate SSD rebuild speed.
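For reference, the rebuild-time figures above reduce to capacity divided by sustained write bandwidth; here is a minimal sketch of that arithmetic, assuming a constant 3 GBps write rate in decimal units and ignoring parity computation and other overhead that would lengthen a real rebuild:

    # Rough rebuild-time estimate: capacity / sustained write bandwidth.
    # Assumes a constant 3 GB/s write rate (decimal units) and ignores parity
    # computation and I/O contention, which would lengthen a real rebuild.
    def rebuild_hours(capacity_tb: float, write_gbps: float = 3.0) -> float:
        capacity_gb = capacity_tb * 1000           # TB -> GB
        return capacity_gb / write_gbps / 3600     # seconds -> hours

    print(f"{rebuild_hours(61.44):.1f} h")         # ~5.7 h for a 61.44 TB drive
    print(f"{rebuild_hours(128.0):.2f} h")         # ~11.85 h for a 128 TB drive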
Putting Individuals Back In Charge Of Their Own Identities
Digital identity comprises many signals to ensure it can accurately reflect
the real identity of the relevant individual. It includes biometric data, ID
data, phone data, and much more. In shareable IDs, these unique features are
captured through a combination of AI and biometrics, which provides robust
protection against forgery and replication and gives high assurance that a
person is who they say they are. Importantly, these technologies offer an easy
and seamless alternative to other verification processes. For
most people, visiting a bank branch to prove their identity with paper
documents is no longer convenient, while knowledge-based authentication, like
entering your mother’s maiden name, is not viable because data breaches make
this information readily available for sale to nefarious actors. It’s no wonder
that 76% of consumers find biometrics more convenient, while 80% find them more
secure than other options. ... A shareable identity is a user-controlled
identity credential that can be stored on a device and used remotely.
Individuals can then simply re-use the same digital ID to gain access to
services without waiting in line, offering time-saving convenience for all.
Revolutionizing cloud security with AI
Generative AI can analyze data from various sources, including social media,
forums, and the dark web. AI models use this data to predict threat vectors
and offer actionable insights. Enhanced threat intelligence systems can help
organizations better understand the evolving threat landscape and prepare for
potential attacks. Moreover, machine learning algorithms can automate threat
detection across cloud environments and shorten incident response times. ...
AI-driven automation is proving helpful for handling
repetitive security tasks, allowing human security professionals to focus on
more complex challenges. Automation helps streamline alert triage,
incident response, and vulnerability management. AI algorithms can process
incident data faster than human operators, enabling quicker resolution and
minimizing potential damage. ... AI models can enforce privacy policies by
monitoring data access while ensuring compliance with regulations such as the
General Data Protection Regulation in the U.K., or the California Consumer
Privacy Act. When bolstered by AI, homomorphic encryption and differential
privacy techniques offer ways to analyze data while keeping sensitive
information secure and anonymous.
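As a rough sketch of the differential-privacy idea mentioned above, the snippet below releases a noisy aggregate count instead of the exact one, so the published figure remains useful while any single individual's contribution is obscured (the epsilon value and the query are illustrative assumptions, not a production configuration):

    import random

    # Minimal differential-privacy sketch: publish a noisy count rather than the
    # exact one. A counting query has sensitivity 1; epsilon sets the
    # privacy/accuracy trade-off (values here are illustrative).
    def noisy_count(true_count: int, epsilon: float = 0.5, sensitivity: float = 1.0) -> float:
        scale = sensitivity / epsilon
        # A Laplace(0, scale) draw as the difference of two exponential draws.
        noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
        return true_count + noise

    records_accessed_this_week = 1_204             # hypothetical aggregate
    print(round(noisy_count(records_accessed_this_week), 1))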
Are CIOs at the Helm of Leading Generative AI Agenda?
The growing integration of generative AI into corporate technology and
information infrastructures is likely to bring a notable shift to the role of
CIOs. While many technology leaders are already spearheading gen AI adoption,
their role goes beyond technology management. It now includes driving
strategic growth and maintaining a competitive edge in an AI-driven landscape.
... The CIO role has evolved significantly over recent decades. Once focused
primarily on maintaining system uptime and availability, CIOs now serve as key
business enablers. As technology advances rapidly and organizations
increasingly rely on IT, the CIO's influence on enterprise success continues
to grow. According to the EY survey, CIOs who report directly to the CEO and
co-lead the AI agenda are the most effective in driving strategic change.
Sixty-three percent of CIOs are leading the gen AI agenda in their
organizations, with CEOs close behind at 55%. Eighty-four percent of
organizations where the gen AI agenda is co-led by the CIO and CEO achieve or
anticipate achieving a 2x return on investment from gen AI, compared to only
56% of organizations where the agenda is led solely by CIOs.
Intel and Karma partner to develop software-defined car architecture
Instead of all those individual black boxes, each with a single job, the new
approach is to consolidate the car's various functions into domains, with each
domain being controlled by a relatively powerful car computer. These will be
linked via Ethernet, usually with a master domain controller overseeing the
entire network. We're already starting to see vehicles designed with this
approach; the McLaren Artura, Audi Q6 e-tron, and Porsche Macan are all recent
examples of software-defined vehicles. Volkswagen Group—which owns Audi and
Porsche—is also investing $5 billion in Rivian specifically to develop a new
software-defined vehicle architecture for future electric vehicles. In
addition to advantages in processing power and weight savings,
software-defined vehicles are easier to update over-the-air, a must-have
feature since Tesla changed that paradigm. Karma and Intel say their
architecture should also have other efficiency benefits. ... Intel is also
contributing its power management SoC to get the most out of inverters, DC-DC
converters, and chargers. As you might expect, the domain controllers use
Intel silicon as well, apparently with some flavor of AI enabled.
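As a toy illustration of the domain idea (not Karma's or Intel's actual design), the sketch below registers a few consolidated domain controllers with a master controller that routes messages across a shared network, in place of dozens of single-function black boxes:

    # Toy sketch of a domain-based, software-defined vehicle architecture: a master
    # domain controller routes messages to a few consolidated domains instead of
    # dozens of single-function ECUs. Domain names and message formats are
    # illustrative, not Karma's or Intel's actual design.
    class DomainController:
        def __init__(self, name: str, functions: list[str]):
            self.name = name
            self.functions = functions             # functions consolidated into this domain

        def handle(self, message: dict) -> str:
            return f"[{self.name}] handled {message['topic']}"

    class MasterDomainController:
        """Oversees the Ethernet-like network and routes messages by domain."""
        def __init__(self):
            self.domains: dict[str, DomainController] = {}

        def register(self, domain: DomainController) -> None:
            self.domains[domain.name] = domain

        def route(self, message: dict) -> str:
            return self.domains[message["domain"]].handle(message)

    master = MasterDomainController()
    master.register(DomainController("powertrain", ["inverter", "dc-dc", "charging"]))
    master.register(DomainController("chassis", ["braking", "steering", "suspension"]))
    master.register(DomainController("cockpit", ["infotainment", "instrument cluster"]))
    print(master.route({"domain": "powertrain", "topic": "charge-rate update"}))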
Why the next Ashley Madison is just around the corner
Unfortunately, it’s not a matter of ‘if’ another huge data breach will occur –
it’s simply a matter of when. Today organisations of all sizes, not just the
big players, have a ticking time bomb on their hands with the potential to
detonate their brand reputation and destroy customer loyalty. ... Due to a
lack of dedicated cybersecurity teams and finite financial resources to
allocate to protective measures, small organisations will often prove easier
to successfully infiltrate when compared to the average big player. The
potential reward from a single attack may be smaller, but hackers can combine
successful attacks against multiple SMEs to match the financial gain of
successfully hacking a large organisation, and with far less effort. SMEs are
therefore increasingly likely to fall victim to financially crippling attacks,
with 46% of all cyber breaches now impacting businesses with fewer than 1,000
employees. ... The very first step in any attack chain is always the use of
tools to gather intelligence about the victims' systems: version numbers of
unpatched software in use, and insecure configurations or programming. Any
hacker, whether professional or amateur, uses scanning bots or sites like
Shodan.io to generate an attack list of victims running
vulnerable software.
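To make that reconnaissance step concrete from the defender's side, the short sketch below checks whether one of your own endpoints advertises a version string in its Server header, which is exactly the kind of banner detail scanning bots and services like Shodan.io index (the URL is a placeholder):

    import urllib.request

    # Defensive sketch: check whether your own endpoint leaks software/version
    # details in its HTTP Server header -- the kind of banner data that scanning
    # bots and services like Shodan.io index. The URL below is a placeholder.
    def server_banner(url: str) -> str | None:
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request, timeout=5) as response:
            return response.headers.get("Server")

    banner = server_banner("https://example.com/")
    if banner and any(ch.isdigit() for ch in banner):
        print(f"Server header exposes a version string: {banner}")
    else:
        print(f"Server header: {banner!r}")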
Quote for the day:
“No one knows how hard you had to
fight to become who you are today.” -- Unknown