How confidential computing will shape the next phase of cybersecurity
At its core, confidential computing encrypts data at the hardware level. It’s a
way of “protecting data and applications by running them in a secure, trusted
environment,” explains Noam Dror—SVP of solution engineering at HUB Security, a
Tel Aviv, Israel-based cybersecurity company that specializes in confidential
computing. In other words, confidential computing is like running your data and
code in an isolated, secure black box, known as an “enclave” or trusted
execution environment (TEE), that’s inaccessible to unauthorized systems. The
enclave also encrypts all the data it holds, so you can keep processing your data
even if attackers breach the surrounding infrastructure. Encryption makes the
information invisible to human users, cloud providers, and other computing
resources. Encryption is the best way to secure data in the cloud, says
Kurt Rohloff, cofounder and CTO at Duality, a cybersecurity firm based in New
Jersey. Confidential computing, he says, allows multiple sources to analyze and
upload data to shared environments, such as a commercial third-party cloud
environment, without worrying about data leakage.
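To make the enclave idea more concrete, here is a minimal, purely illustrative Python sketch, not real TEE code: actual confidential computing relies on hardware features such as Intel SGX or AMD SEV, and the ToyEnclave class below is a made-up stand-in. It shows the core property described above: plaintext exists only inside the trusted boundary, while the host and cloud provider handle nothing but ciphertext. It assumes the third-party cryptography package is installed.

```python
from cryptography.fernet import Fernet


class ToyEnclave:
    """Stands in for a hardware TEE: the key never leaves this object."""

    def __init__(self) -> None:
        self._fernet = Fernet(Fernet.generate_key())

    def seal(self, plaintext: bytes) -> bytes:
        """Encrypt data so the untrusted host only ever handles ciphertext."""
        return self._fernet.encrypt(plaintext)

    def process(self, ciphertext: bytes) -> bytes:
        """Decrypt, compute, and re-encrypt entirely inside the boundary."""
        data = self._fernet.decrypt(ciphertext)   # plaintext exists only here
        result = data.upper()                     # stand-in for the real workload
        return self._fernet.encrypt(result)


enclave = ToyEnclave()
sealed = enclave.seal(b"customer record")
# The host, cloud provider, or any other observer only ever sees ciphertext:
sealed_result = enclave.process(sealed)
print(sealed_result != b"CUSTOMER RECORD")  # True: the result stays encrypted outside
```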
Not All Multi-Factor Authentication Is Created Equal
Many legacy MFA platforms rely on easily phishable factors like passwords, push
notifications, one-time codes, or magic links delivered via email or SMS. In
addition to the complicated and often frustrating user experience they create,
phishable factors such as these open organizations up to cyber threats. Through
social engineering attacks, employees can be easily manipulated into providing
these authentication factors to a cyber criminal. And when authentication relies
on these factors, the burden of protecting digital identities lies squarely on
the end user, meaning an organization’s cybersecurity strategy can hinge
entirely on a moment of human error. Beyond social engineering,
man-in-the-middle attacks and readily
available toolkits make bypassing existing MFA a trivial exercise. Where there
is a password and other weak and phishable factors, there is an attack vector
for hackers, leaving organizations to suffer the consequences of account
takeovers, ransomware attacks, data leakage, and more. A phishing-resistant MFA
solution removes these factors entirely, making it impossible for an end user to
be tricked into handing them over, even by accident, and leaving automated
phishing tools nothing to collect.
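The sketch below, a hypothetical illustration rather than a real WebAuthn/FIDO2 implementation, shows why removing shareable factors matters: with device-bound public-key challenge-response there is no password, code, or link for a user to hand over, and a signature produced for a look-alike origin simply fails to verify. The authenticate function and the origins are invented for the example, and it assumes the cryptography package.

```python
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Enrollment: the authenticator keeps the private key; the server stores only
# the public key, bound to the legitimate origin.
device_key = ed25519.Ed25519PrivateKey.generate()
server_public_key = device_key.public_key()
REGISTERED_ORIGIN = "https://login.example.com"   # hypothetical origin


def authenticate(origin_seen_by_user: str) -> bool:
    """Server issues a fresh challenge; the device signs it bound to the origin."""
    challenge = os.urandom(32)
    # The device signs what it actually sees, so a signature produced on a
    # look-alike phishing domain will not verify against the registered origin.
    signature = device_key.sign(challenge + origin_seen_by_user.encode())
    try:
        server_public_key.verify(signature, challenge + REGISTERED_ORIGIN.encode())
        return True
    except InvalidSignature:
        return False


print(authenticate("https://login.example.com"))  # True
print(authenticate("https://login.examp1e.com"))  # False: nothing to phish, origin mismatch
```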
Europe’s cyber security strategy must be clear about open source
While the UK government has tried to recognise the importance of digital supply
chain security, current policy doesn’t consider open source as part of that
supply chain. Instead, regulation or proposed policies focus only on third-party
software vendors in the traditional sense but fail to recognise the building
blocks of all software today and the supply chain behind it. To hammer the point
home, the UK’s 11,000+ word National Cyber Security Strategy does not include a
single reference to open source. GCHQ guidance meanwhile remains limited, with
little detailed direction beyond ‘pull together a list of your software’s open
source components or ask your suppliers.’ ... In this sense, the EU has
certainly been listening. The recently released Cyber Resilience Act (CRA) is
its proposed regulation to combat threats affecting any digital entity and
‘bolster cyber security rules to ensure more secure hardware and software
products’. First, the encouraging bits: the CRA doesn’t just call for vendors
and producers of software to have (among other things) a Software Bill of
Materials (SBoM) - it demands companies have the ability to recall
components.
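As a rough illustration of what that 'list of your software's open source components' looks like in practice, here is a hand-written, minimal SBoM sketch whose field names loosely follow the CycloneDX format; the components are placeholders, and in reality such a document would be generated from the actual dependency tree by SBoM tooling rather than typed out by hand.

```python
import json

# Placeholder components for illustration only; real SBoM tooling would fill
# this in automatically from the build's dependency graph.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "components": [
        {
            "type": "library",
            "name": "openssl",
            "version": "3.0.8",
            "purl": "pkg:generic/openssl@3.0.8",
        },
        {
            "type": "library",
            "name": "log4j-core",
            "version": "2.17.1",
            "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.17.1",
        },
    ],
}

# A machine-readable inventory like this is part of what makes recalling a
# vulnerable component practical: you can query which products ship it.
print(json.dumps(sbom, indent=2))
```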
Eight Common Data Strategy Pitfalls
Lack of data culture: Data hidden within silos, with little communication
between business units, leads to a lack of data culture. Data Literacy and
enterprise-wide data training are required to allow business staff to read,
analyze, and discuss data. Data culture is the starting point for developing an
effective Data Strategy.

The Data Strategy is too focused on data and not on the business side of things:
When businesses focus too much on just data, the Data Strategy may end up
serving the needs of analytics without any focus on business needs. An ideal
Data Strategy enlists human capabilities and provides opportunities for training
staff to carry out the strategy to meet business goals. This approach works
better if citizen data scientists are included in strategy teams to bridge the
gap between the data scientist and the business analyst.

Investing in data technology before democratizing data: In many cases, Data
Strategy initiatives focus on quick investment in technology without first
addressing data access issues. If data access is not considered first, costly
technology investments will go to waste.
Here's Why Your Data Science Project Failed (and How to Succeed Next Time)
Every data science project needs to start with an evaluation of your primary
goals. What opportunities are there to improve your core competency? Are there
any specific questions you have about your products, services, customers, or
operations? And is there a small and easy proof of concept you can launch to
gain traction and master the technology? The above use case from GE is a prime
example of having a clear goal in mind. The multinational company was in the
middle of restructuring, re-emphasizing its focus on aero engines and power
equipment. With the goal of shortening its six- to 12-month design process, it
decided to pursue a machine learning project capable of increasing the
efficiency of product design within its core verticals. As a result, the project
promises to decrease design time and the budget allocated to R&D. Organizations
that emulate GE's strategy will face fewer false starts with their data science
projects. For those still unsure how to adapt data-driven thinking to their
business, an outsourced partner can simplify the selection process and optimize
outcomes.
5 Skills That Make a Successful Data Manager
The role of a data manager in an organization is tricky. This person is often
neither an IT guy who implements databases on his/her own, nor a business guy
who is actually responsible for data or processes (that’s rather a Data
Steward’s area of responsibility). So what’s the real value-add of a data
manager (or even a data management department)? In my opinion, you need someone
who is building bridges between the different data stakeholders on a methodical
level. It’s rather easy to find people who consider themselves experts in a
particular business area, data analysis method, or IT tool, but it is rather
complicated to find one person who is willing to connect all these people and to
organize their competencies, as is often required in data projects. So what I am
referring to are skills like networking, project management, stakeholder
management, and change management, which are required to build a data community
step by step as a backbone for Data Governance. Without people, a data manager
will fail! So in my opinion, a recruiter who looks for data managers should
challenge not only technical skills but also these people skills.
Why distributed ledger technology needs to scale back its ambition
There is nonetheless an expectation that DLT can prove to be a net good for
financial markets. Foreign exchange markets have an estimated $8.9 trillion at
risk every day because the final settlement of transactions between two parties
can take days. This is why the Financial Stability Board and the Committee on
Payments and Market Infrastructures have focused their efforts on enhancing
cross-border payments with a comprehensive global roadmap. Part of this roadmap
includes exploring the use of DLT and Central Bank Digital Currencies. The
problem may not be the technology itself, but the aim of replacing current
technology systems with distributed networks. DLT networks are being designed to
completely overhaul and replace legacy technology that financial markets depend
on today. Many pilot projects, such as mBridge and Jura, rely on a single
blockchain developed by a single vendor. This introduces a single point of
trust, and removes many of the benefits of disintermediation.
Why is “information architecture” at the centre of the design process?
The information architecture within a design (both process and output) makes the
balancing within the equation possible. It also ensures the equation is
“solvable” by other people. It does this by introducing logical coherence. It
ensures words, images, shapes and colours are used consistently. And it ensures
that as we move from idea to execution, we stay true to the original intent —
and can clearly articulate it — so that we can meaningfully measure the
effectiveness of our design. Without this internal coherence and confidence that
our output is an accurate, reliable test of our hypothesis, we’re not doing
design. The power of design with a consistent information architecture is
that if we find that our idea (which we translate to intent, experiments and
experiences) is not equal to the problem, we can interrogate every part of the
equation. We may have made a mistake in execution. Maybe our idea wasn’t quite
right. Or even more powerfully, maybe we didn’t really understand the problem
fully.
Improve Your Software Quality with a Strong Digital Immune System
You can improve your software quality with a strong digital immune system,
because such a system is designed to guard against cyberattacks and other sorts
of hostile activity on computer systems, networks, and hardware. It
operates by constantly scanning the network and systems for indications of
prospective threats and then taking the necessary precautions to thwart or
lessen such dangers. This can entail detecting and preventing malicious
communications, identifying and containing compromised devices, and patching
security holes. A robust digital immune system should offer powerful and
efficient protection against cyber threats and assist individuals and companies
in staying secure online. Experts in software engineering are searching for
fresh methods and strategies to reduce risks and maximize commercial impact. The
idea of “digital immunity” offers a direction. It consists of a collection of
techniques and tools for creating robust software programmes that provide
top-notch user experiences. With the help of this roadmap, software engineering
teams may identify and address a wide range of problems, including functional
faults, security flaws, and inconsistent data.
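As a purely conceptual sketch of that "constantly scanning, then taking the necessary precautions" loop, the Python below shows the basic shape of a self-healing watchdog. All names here, such as watch and the toy probes, are hypothetical, and real digital-immune tooling builds observability and automated remediation on top of far more than this.

```python
import time
from typing import Callable, Dict


def watch(checks: Dict[str, Callable[[], bool]],
          remediate: Callable[[str], None],
          interval_s: float = 30.0,
          rounds: int = 3) -> None:
    """Run every health check each round; trigger remediation on any failure."""
    for _ in range(rounds):
        for name, check in checks.items():
            try:
                healthy = check()
            except Exception:
                healthy = False            # a crashing probe counts as unhealthy
            if not healthy:
                remediate(name)            # e.g. restart a service, revoke a token
        time.sleep(interval_s)


# Toy wiring: one passing probe, one failing probe.
watch(
    checks={"disk_space": lambda: True, "auth_service": lambda: False},
    remediate=lambda name: print(f"remediating {name}"),
    interval_s=0.1,
    rounds=1,
)
```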
Security Bugs Are Fundamentally Different Than Quality Bugs
For each one of the types of testing listed above, a different skillset is
required. All of them require patience, attention to detail, basic technical
skills, and the ability to document what you have found in a way that the
software developers will understand and can act on to fix the issue(s). That is
where the similarities end. Each one of these types of testing requires
different experience, knowledge, and tools, often meaning you need to hire
different resources to perform the different tasks. Also, we can’t concentrate
on everything at once and still do a great job at each one of them. Although
theoretically you could find one person who is both skilled and experienced in
all of these areas, it is rare, and that person would likely be costly to employ
as a full-time resource. This is one reason that people hired for general
software testing are not often also tasked with security testing. Another reason
is that people who have the experience and skills to perform thorough and
complete security testing are currently a rarity.
Quote for the day:
"Leadership is particularly necessary
to ensure ready acceptance of the unfamiliar and that which is contrary to
tradition." -- Cyril Falls