Quantum computing attacks on cloud-based systems
Enterprises should indeed be concerned about the advancements in quantum
computing. Quantum computers have the potential to break widely used encryption
protocols, posing risks to financial data, intellectual property, personal
information, and even national security. Preparing for this threat, however, goes well beyond NIST releasing quantum-resistant algorithms; it’s also crucial for enterprises to start transitioning today to new forms of encryption to future-proof their data security. As other technology advancements arise and
enterprises run from one protection to another, work will begin to resemble
Whac-A-Mole. I suspect many enterprises will be unable to whack that mole in
time, will lose the battle, and be forced to absorb a breach. ... Although
quantum computing represents a groundbreaking shift in computational
capabilities, the way we address its challenges transcends this singular
technology. It’s obvious we need a multidisciplinary approach to managing and
leveraging all new advancements. Organizations must be able to anticipate
technological disruptions like quantum computing and also become adaptable
enough to implement solutions rapidly.
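To make “transitioning today” concrete, here is a minimal key-establishment sketch assuming the liboqs-python bindings (`oqs`) are installed; the algorithm name varies by liboqs version, and everything below is illustrative rather than a recommendation:

```python
# Minimal post-quantum key-encapsulation sketch using liboqs-python.
# Assumption: `pip install liboqs-python` and a liboqs build that enables
# this KEM (older builds call it "Kyber512", newer ones "ML-KEM-512").
import oqs

KEM_ALG = "Kyber512"  # assumption: enabled in the installed liboqs build

# Server side: generate a keypair and publish the public key.
with oqs.KeyEncapsulation(KEM_ALG) as server:
    public_key = server.generate_keypair()

    # Client side: encapsulate a fresh shared secret against that key.
    with oqs.KeyEncapsulation(KEM_ALG) as client:
        ciphertext, client_secret = client.encap_secret(public_key)

    # Server recovers the same secret from the ciphertext.
    server_secret = server.decap_secret(ciphertext)
    assert client_secret == server_secret  # both sides now share a key
```

In practice, most migration guidance pairs such a post-quantum KEM with a classical exchange in a hybrid mode, so security never falls below the pre-quantum status quo.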
QA and Testing: The New Cybersecurity Frontline
The convergence of Security, QA, and DevOps is pivotal in the evolution of
software security. These teams, often interdependent, share the common objective
of minimizing software defects. While security teams may not possess deep QA
expertise and QA professionals might lack cybersecurity specialization, their
collaborative efforts are essential for an airtight security approach. ...
Automated testing tools can quickly identify common vulnerabilities and ensure
that security measures are consistently applied across all code changes.
Meanwhile, manual testing allows for more nuanced assessments, particularly in
identifying complex issues that automated tools might miss. The best QA
processes rely on both methods working in concert to ensure consistent and
comprehensive testing coverage for all releases. While QA focuses on identifying
and rectifying functional bugs, cybersecurity experts concentrate on
vulnerabilities and weaknesses that could be exploited. By incorporating
security testing, such as Mobile Application Security Testing (MAST), into the
QA process, teams can proactively address security risks, recognize the
importance of security, and prioritize threat prevention alongside quality
improvements, enhancing the overall quality and reliability of the software.
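As a sketch of what folding security checks into routine QA automation can look like, the pytest-style tests below probe a hypothetical staging endpoint for common hardening gaps; the URL and header policy are assumptions, not the article’s method:

```python
# Sketch: automated security checks that can run alongside functional QA
# tests in CI. The staging URL and header policy are hypothetical examples.
import requests

STAGING_URL = "https://staging.example.com"  # hypothetical endpoint

REQUIRED_HEADERS = [
    "Strict-Transport-Security",  # enforce HTTPS
    "X-Content-Type-Options",     # block MIME sniffing
    "Content-Security-Policy",    # restrict script sources
]

def test_security_headers_present():
    response = requests.get(STAGING_URL, timeout=10)
    # Membership checks on requests' headers are case-insensitive.
    missing = [h for h in REQUIRED_HEADERS if h not in response.headers]
    assert not missing, f"missing security headers: {missing}"

def test_no_server_version_leak():
    # Server banners that reveal exact versions aid attacker reconnaissance.
    response = requests.get(STAGING_URL, timeout=10)
    server = response.headers.get("Server", "")
    assert not any(ch.isdigit() for ch in server), f"version leaked: {server}"
```

Checks like these catch regressions on every release, while manual testing stays focused on the nuanced issues automation misses.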
Bridging the Divide: How to Foster the CIO-CFO Partnership
Considering today’s evolving business and regulatory landscape, such as the SEC
Cybersecurity Ruling and internal focus on finance transformation, a strong
CIO-CFO relationship is especially critical. For cybersecurity, the CIO
historically focused on managing the organization's technological infrastructure
and developing robust security measures, while the CFO concentrated on financial
oversight and regulatory compliance. However, the SEC's ruling mandates the
timely disclosure of material cybersecurity incidents, requiring a bridge between the two roles and closer collaboration. The new regulation demands
a seamless integration of the CIO’s expertise in identifying and assessing cyber
threats with the CFO’s experience in understanding financial implications and
regulatory requirements. This means cybersecurity is no longer seen as solely a
technology issue but as a critical part of financial risk management and
corporate governance. By working closely together, the CIO and CFO can create
clear communication channels, shared responsibilities, and joint accountability
for incident response and disclosure processes.
Rising cloud costs leave CIOs seeking ways to cope
Cloud costs have risen for many of CGI’s customers in the past year. Sunrise
Banks, which operates community banks and a fintech service, has also seen cloud
costs increase recently, says CIO Jon Sandoval. The company is a recent convert
to cloud computing; it replaced its own data centers with the cloud just over a
year ago, he says. Cloud providers aren’t the only culprits, he says. “I’ve seen
increases from all of our applications and services that we procure, and a
lot of that’s just dealing with the high levels of inflation that we’ve
experienced over the past couple years,” he adds. “Labor, cost of goods —
everything has gotten more expensive.” ... Cloud cost containment requires
“assertive and sometimes aggressive” measures, adds Trude Van Horn, CIO and
executive vice president at Rimini Street, an IT and security strategy
consulting firm. Van Horn recommends that organizations name a cloud controller,
whose job is to contain cloud costs. “The notion of a cloud controller requires
a savvy and assertive individual — one who knows a lot about cloud usage and
your particular cloud landscape and is responsible to monitor trends, look for
overages, manage against the budget,” she says.
Zero-Touch Provisioning for Power Management Deployments
At the heart of ZTP lies Dynamic Host Configuration Protocol (DHCP), the foundational network protocol that dynamically assigns IP addresses and other network configuration parameters to devices (clients) on an IP network, facilitating their communication within the network and with external systems while simplifying network administration. DHCP's capabilities extend beyond basic IP address assignment by providing various configuration details to devices via DHCP options. These options are instrumental in ZTP, allowing
devices to automatically receive critical configuration information, including
network settings, server addresses, and paths to configuration files. By
utilizing DHCP options, devices can self-configure and integrate into the
network seamlessly with "zero touch." With DHCP functionalities, ZTP can be
utilized to automate the commissioning and configuration of critical power
devices such as uninterruptible power systems (UPSs) and power distribution
units (PDUs). Network interfaces can be leveraged in conjunction with ZTP for
advanced connectivity and management features.
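To ground this, here is a minimal sketch of the client side of that handoff: decoding the type-length-value options field of a DHCP reply (per RFC 2132) to recover the server and configuration path a ZTP agent would fetch from. The sample bytes and file name are fabricated for illustration:

```python
# Sketch: decode the RFC 2132 type-length-value options of a DHCP reply
# to extract what a ZTP agent needs. Options 66/67 (TFTP server name and
# bootfile name) are commonly used to point a device at its config file;
# the raw bytes below are a fabricated example.

def parse_dhcp_options(raw: bytes) -> dict[int, bytes]:
    """Return {option_code: value} from a DHCP options field."""
    options, i = {}, 0
    while i < len(raw):
        code = raw[i]
        if code == 0:          # pad option: single byte, no length field
            i += 1
            continue
        if code == 255:        # end option: stop parsing
            break
        length = raw[i + 1]
        options[code] = raw[i + 2 : i + 2 + length]
        i += 2 + length
    return options

# Illustrative blob: option 66 = "10.0.0.5", option 67 = "ups-cfg.xml"
raw = (bytes([66, 8]) + b"10.0.0.5"
       + bytes([67, 11]) + b"ups-cfg.xml"
       + bytes([255]))

opts = parse_dhcp_options(raw)
tftp_server = opts[66].decode()   # where the device fetches its config
bootfile = opts[67].decode()      # path to the device's config file
print(f"fetch {bootfile} from {tftp_server}")
```

With those two options in hand, a UPS or PDU can pull its full configuration on first boot with no manual intervention, which is the "zero touch" in ZTP.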
Exclusive: Gigamon CEO highlights importance of deep observability
The importance of deep observability is heightened as companies undergo digital
transformation, often moving workloads into virtualized environments or public
clouds. This shift can increase risks related to compliance and security.
Gigamon's deep observability helps CIOs move application workloads without
compromising security. "You can maintain your security posture regardless of
where the workload moves," Buckley said. "That's a really powerful capability
for organizations today." Overall, the deep observability market grew 61 percent
in 2023 and continued to expand as organizations increasingly embrace hybrid
cloud infrastructure, with a forecasted CAGR of 40 percent and projected revenue
of nearly $2B in 2028, according to research firm 650 Group. "CIOs are moving
workloads to wherever it makes the organization more effective and efficient,
whether that's public cloud, on-premises, or a hybrid approach," Buckley
explained. "The key is to ensure there's no increased risk to the organization,
and the security profile remains constant."
Prioritizing your developer experience roadmap
Finding the biggest points of friction will be an ongoing process, but, as he said, “A
lot of times, engineers have been at a place for long enough where they’ve
developed workarounds or become used to problems. It’s become a known
experience. So we have to look at their workflow to see what the pebbles are and
then remove them.” Successful platform teams pair program with their customers
regularly. It’s an effective way to build empathy. Another thing to prioritize
is asking: Is this affecting just one or two really vocal teams or is it
something systemic across the organization? ... Another way that platform
engineering differs from the behemoth legacy platforms is that it’s not a giant
one-off implementation. In fact, Team Topologies has the concept of Thinnest
Viable Platform. You start with something small but sturdy that you can build
your platform strategy on top of. For most companies, the biggest time-waster is
finding things. Your first TVP is often either a directory of who owns what or
better documentation. But don’t trust that instinct — ask first. Running a
developer productivity survey will let you know what the biggest frustrations
are for your developers. Ask targeted questions, not open-ended ones.
How to prioritize data privacy in core customer-facing systems
Before creating a data-sharing agreement with a third party, review the
organization’s data storage, collection and transfer safeguards. Verify that the
organization’s data protection policies are as robust as yours. Further, when
drafting an eventual agreement, ensure that contract terms dictate a superior
level of protection, delineating the responsibilities and expectations of each
party in terms of compliance and cybersecurity. Due diligence at the front end of a relationship is necessary. However, it’s also essential to maintain an open line of communication after the partnership commences. Organizations should regularly reassess their partners’ commitments to data privacy by inquiring about their ongoing data protection policies, including data storage timelines and the intended use of said data. ... Most customers can opt out of data
collection and tracking at any time. This preference is known as “consent” — and
enabling its collection is only half the journey. Organizations must also
proactively enforce consent to ensure that downstream data routing doesn’t
jeopardize or invalidate a customer’s express preferences.
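What enforcing consent downstream can look like in practice: a minimal sketch with a hypothetical in-memory consent registry that gates each partner export on the customer’s current preference (a production system would back this with an audited consent store):

```python
# Sketch of downstream consent enforcement. The registry, purposes, and
# customer records are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class ConsentRegistry:
    # customer_id -> set of purposes the customer has consented to
    grants: dict[str, set[str]] = field(default_factory=dict)

    def opt_out(self, customer_id: str, purpose: str) -> None:
        self.grants.get(customer_id, set()).discard(purpose)

    def allows(self, customer_id: str, purpose: str) -> bool:
        return purpose in self.grants.get(customer_id, set())

def export_to_partner(registry: ConsentRegistry, customer_id: str,
                      record: dict, purpose: str = "analytics"):
    """Route data downstream only if consent for this purpose still holds."""
    if not registry.allows(customer_id, purpose):
        return None  # consent withdrawn or never granted: drop, don't route
    return record

registry = ConsentRegistry({"cust-42": {"analytics", "marketing"}})
registry.opt_out("cust-42", "analytics")
assert export_to_partner(registry, "cust-42", {"email": "a@b.c"}) is None
```

The key point the excerpt makes is the gate itself: every downstream routing path consults the recorded preference at send time, so a later opt-out cannot be silently overridden by data already in flight.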
Choosing a Data Quality Tool: What, Why, How
Data quality describes a characteristic or attribute of the data itself, but
equally important for achieving and maintaining the quality of data is the
ability to monitor and troubleshoot the systems and processes that affect data
quality. Data observability is most important in complex, distributed data
systems such as data lakes, data warehouses, and cloud data platforms. It allows
companies to monitor and respond in real time to problems related to data flows
and the data elements themselves. Data observability tools provide visibility
into data as it traverses a network by tracking data lineages, dependencies, and
transformations. The products send alerts when anomalies are detected, and apply
metadata about data sources, schemas, and other attributes to provide a clearer
understanding and more efficient management of data resources. ... A company’s
data quality efforts are designed to achieve three core goals: Promote
collaboration between IT and business departments; Allow IT staff to manage
and troubleshoot all data pipelines and data systems, whether they’re completely
internal or extend outside the organization; Help business managers manipulate
the data in support of their work toward achieving business goals.
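As a small illustration of the observability checks described above, the sketch below computes per-column null rates for a batch of records and raises alerts when they drift past a baseline; the thresholds and sample data are invented:

```python
# Sketch of a data-observability check: watch a data flow for anomalies
# (here, per-column null rates drifting past a baseline) and raise alerts.
# Baseline figures and the sample batch are illustrative.

def null_rates(rows: list[dict]) -> dict[str, float]:
    """Fraction of None values per column across a batch of records."""
    counts: dict[str, int] = {}
    for row in rows:
        for col, val in row.items():
            counts[col] = counts.get(col, 0) + (val is None)
    return {col: n / len(rows) for col, n in counts.items()}

def check_batch(rows, baseline, tolerance=0.05):
    """Yield alert messages for columns whose null rate drifts too far."""
    for col, rate in null_rates(rows).items():
        if rate > baseline.get(col, 0.0) + tolerance:
            yield f"ALERT: {col} null rate {rate:.0%} exceeds baseline"

batch = [{"id": 1, "email": "a@b.c"}, {"id": 2, "email": None},
         {"id": 3, "email": None}]
for alert in check_batch(batch, baseline={"id": 0.0, "email": 0.10}):
    print(alert)  # -> ALERT: email null rate 67% exceeds baseline
```

Commercial data observability tools layer lineage tracking and metadata on top of checks like this one, but the core loop is the same: measure, compare to baseline, alert.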
Businesses increasingly turn to biometrics for physical access and visitor management
Experts suggest that to address these concerns, employers need to be more
transparent about their use of biometric technologies and implement robust
safeguards to protect employees’ data. This includes informing employees about
how their data will be used, stored, and protected from potential breaches.
Employers should also offer alternatives for those who are uncomfortable with
biometric systems to ensure no employee feels coerced. Companies that prioritize
transparency, consent, and data protection are more likely to gain employee
trust and avoid backlash. However, without clear guidelines and protections,
resistance to workplace biometrics is likely to grow. “Education needs to be laid out very clearly and regularly that, ‘Look, biometrics is not an invasion of privacy,’” adds Murad. “‘It’s providing an envelope of security for your privacy, it’s protecting it.’ I think that message is getting there, but it’s
taking time.” Several companies have recently introduced new physical access
security technologies. Nabla Works has launched advanced facial recognition
tools with anti-spoofing features for secure access across various
applications.
Quote for the day:
"It is not the strongest of the species that survive, nor the most intelligent, but the one most responsive to change." -- Charles Darwin