Applying particle physics methods to quantum computing
In quantum computing, information is carried by quantum bits, or qubits.
Superposition means that a qubit can represent a zero, a one, or both values at
the same time, enabling unique computing capabilities not possible in
conventional computing, which relies on bits that represent either a one or a
zero, but not both at once. The superposition state is fragile: it is difficult
to maintain and can decay over time, causing a qubit to read out a zero instead
of a one. This is one common example of a readout error. Another source of
readout error in quantum computers is simply a faulty measurement of a qubit's
state due to the architecture of the computer.
In the study, researchers simulated a quantum computer to compare the
performance of three different error-correction (or error-mitigation, or
unfolding) techniques. They found that the iterative Bayesian unfolding (IBU)
method is more robust in a very noisy, error-prone environment, and that it
slightly outperformed the other two in the presence of more common noise
patterns. Its performance was compared with that of an error-correction method
called Ignis, part of a collection of open-source quantum-computing software
development tools developed for IBM's quantum computers, and with a very basic
form of unfolding known as the matrix inversion method.
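To make the comparison concrete, here is a minimal numpy sketch of the two simplest techniques named above, matrix inversion and iterative Bayesian unfolding, applied to a single-qubit readout-error model. The response-matrix values and counts are invented for illustration and are not taken from the study.

```python
import numpy as np

# Hypothetical response matrix: R[i, j] = P(read out i | qubit prepared in j).
# The 0.05/0.10 error rates are illustrative, not from the study.
R = np.array([[0.95, 0.10],
              [0.05, 0.90]])

true_counts = np.array([700.0, 300.0])
measured = R @ true_counts  # what the noisy device would report

# 1) Matrix inversion: a one-shot estimate. With statistical noise it can
#    produce unphysical negative "counts".
t_inv = np.linalg.inv(R) @ measured

# 2) Iterative Bayesian unfolding (IBU): repeated Bayesian updates starting
#    from a flat prior; estimates stay non-negative by construction.
def ibu(measured, R, iterations=20):
    t = np.full(R.shape[1], measured.sum() / R.shape[1])  # flat prior
    for _ in range(iterations):
        predicted = R @ t                        # expected readout spectrum
        t = t * (R.T @ (measured / predicted))   # update each true bin
    return t

t_ibu = ibu(measured, R)
print(t_inv, t_ibu)  # both estimates should approach the true [700, 300]
```

In this noiseless toy setting the two methods agree; the study's point is how they diverge once realistic noise is added, where IBU's non-negativity makes it more robust.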
Common Challenges Facing Angular Enterprises - Stephen Fluin at Ngconf
The top concerns emerging from the conversations that Fluin had in the first
quarter of this year are linked to user experience, micro front-ends,
server-side rendering, monorepositories and code sharing, managing
applications that are only partly Angular-based, and presenting a business
case for upgrading Angular versions. A good user experience means fast
initial load and seamless transitions. Fluin strongly recommended using the
source-map-explorer npm package to monitor and analyze the composition of an
Angular bundle; in enterprise conversations, this was identified as one of the
most valuable things teams had learned. Fluin also mentioned that
simply by keeping up-to-date with the latest Angular versions, Angular
developers will naturally benefit from smaller bundle sizes or an improved
command-line interface implementing configurable optimization strategies
(e.g., better bundling, server-side rendering). Fluin posited that seamless
transitions between routes were already one of Angular’s strengths. He then
explained that the independent deployability characteristic of micro
front-ends may come into tension with the recommended
use of monorepositories to address other issues such as testing, code sharing,
or dependency management.
How Shell is fleshing out a digital-twin strategy
According to Shell, the deployment of the simulation technology will also
enable safe asset life extension by replacing the over-conservative estimates
made with conventional simulation software, with accurate assessments that
reflect actual remaining fatigue life. Elohor Aiboni, asset manager for
Bonga, said: “The Bonga Main FPSO heralded a number of innovative ‘firsts’
when it was built back in 2004, so it’s fitting that it is the first asset of
its kind to deploy something as advanced as a structural digital twin. We are
very excited about the new capabilities that Akselos brings and believe it
will create a positive impact on the way we manage structural integrity. It is
also a great example of digitisation coming to life.” In a recent blog post,
Victor Voulgaropoulos, industry analyst at Verdantix, wrote: “Shell is again in
the spotlight, as it seeks to further accelerate its digital transformation
initiatives by implementing digital-twin solutions across its global portfolio
of assets and capital projects. Shell has signed an enterprise framework
agreement with Kongsberg Digital, a Kongsberg subsidiary, for the deployment
of Kongsberg’s Kognitwin Energy, a cloud-based software-as-a-service
digital-twin solution, within Shell’s upstream, liquefied natural gas, and
downstream business lines.”
How COVID-19 Changed the VC Investment Landscape for Cybersecurity Companies
Businesses have faced the need to find new and inventive ways to survive the
"new normal." For many companies, this means digitizing existing processes and
relying heavily on cloud-based services to enable workers to access corporate
networks from their homes. But this presents myriad new problems for
businesses. While the pandemic provides vast opportunities for digital
transformation, it unfortunately creates the perfect storm for data breaches
and hackers, too. Social distancing restrictions have forced firms to abandon
the protections in the office in favor of enabling employees to work from
home, where they might not have the same robust levels of security. Of course,
VCs have kept their ears to the ground and are looking to cybersecurity and
artificial intelligence (AI) startups as a means to mitigate these new
vulnerabilities. Cybersecurity spending is forecast to grow approximately 9% a
year from 2021 to 2024, according to Gartner, as businesses invest more
heavily in identifying and quickly responding to threats. While large
corporations have traditionally been responsible for huge amounts of private
data that make cybersecurity a priority, the new virtual backdrop across all
industries means that businesses of all shapes and sizes are looking to build
the capabilities and defenses needed to keep malicious actors at bay.
NHS warned over Ryuk spreading through Trickbot replacements
“In recent weeks, we assess with high confidence that BazarBackdoor has been
Ryuk’s most predominant loader,” said the firm. “With lower confidence, we
assess this wave of Ryuk activity may be, in part, in retaliation for
September’s TrickBot disruptions.” Bazar’s components are usually
delivered in spear phishing campaigns operated via Sendgrid, a bona fide email
marketing service. The emails contain links to Microsoft Office or Google Docs
files, and the lure usually relates to a threat of employee termination or a
debit payment. In turn, these emails link to the initial payload, a headless
preliminary loader that ultimately downloads, unpacks and loads Bazar. The
firm added that newer campaigns seem to forgo the spam distribution in favour
of human-operated attacks against exposed admin interfaces or cloud services.
Typically, once they have gained control of the target system using Bazar,
Wizard Spider will download a post-exploitation toolkit, such as Cobalt Strike
or Metasploit, to gather target information and enumerate the network, at
which point they will harvest credentials to move into other systems and
compromise the entire network – then they will deploy Ryuk ransomware. NHS
Digital said current Bazar campaigns could accomplish this in under five
hours.
Implementing a Staged Approach to Evolutionary Architecture
Traditionally, software architecture and design have been treated as initial,
one-time phases, and the architecture decisions made there were considered
valid for the entire life of the system. With accumulated experience and in
reaction to industry transformations, we have started to see architecture as
evolving. This evolution necessitates a different set of approaches oriented
toward continuous planning, facilitated by continuous integration,
dashboards, and tools, thus providing guide rails for systems to evolve. This
article focuses on these approaches and tools to support the journey. We are
in the midst of a rapidly changing environment. As Rebecca Parsons discussed
in a presentation on evolutionary architecture, the changes span across
business models, requirements, and customer expectations. The technology
landscape also changes quite often. In a broader sense, the changes are
happening at an unparalleled rate and with an unparalleled impact on our
environment. ...
Smartphones reached major penetration in the last 10 years. Software, a key
ingredient of all these, changes even faster. Sometimes, the software
frameworks we use are no longer relevant by the time of release.
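One concrete form of the "guide rails" mentioned above is an architectural fitness function run in continuous integration. The sketch below is a hypothetical Python example (the layer names and the rule are invented, not from the article): it parses source code and flags any module in a UI layer that imports the database layer directly.

```python
import ast

# Hypothetical layering rule (invented for illustration): modules in the
# "ui" layer must never import the "db" layer directly.
FORBIDDEN = {"ui": {"db"}}

def imported_modules(source: str) -> set:
    """Return the top-level module names imported by a Python source string."""
    mods = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            mods.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            mods.add(node.module.split(".")[0])
    return mods

def violations(layer: str, source: str) -> set:
    """Forbidden imports found in a module belonging to the given layer."""
    return imported_modules(source) & FORBIDDEN.get(layer, set())

# In CI this would run over every file in the repository; here, one
# offending module is checked by hand:
bad_module = "from db.models import User\nimport json"
print(violations("ui", bad_module))  # {'db'}
```

Wired into a CI pipeline, a failing check like this blocks the merge, turning an architectural decision into an automatically enforced guide rail rather than a document that drifts out of date.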
Digital Business Opportunities Surge With IoT-Based Sensors At The Edge
Sensor data from machines – wherever they are located – carries heightened
importance in a pandemic-driven business environment of unpredictable starts
and stops. That’s because it provides critical visibility into what’s going on
within machines across the business. For example, Wallis reported a surge in
customer inquiries about using IoT to accomplish maintenance tasks
automatically, remotely, and safely. “Interest is high in IoT-enabled
automation from organizations that want to get the job done with minimal
employee risk and fewer productivity losses,” said Wallis. “Remote asset
diagnostics and monitoring gives companies 24/7 visibility about machine
performance, eliminating unnecessary physical maintenance calls. The same
applies to procurement transparency, where sensors on items reduce the need
for physical inspections.” But the benefits of IoT don’t stop there.
Connected IoT-based data from machines was game-changing for a power
generation company based in Italy, turning an essentially commoditized
business into a value-based service that increased customer loyalty. Using SAP
Internet of Things, SAP Edge Services, and SAP Predictive Maintenance and
Service, the company brought together data from the edge, meaning machine
performance at power plants worldwide, with data from various systems,
including supply chain, warehouse management, and machine repair and
maintenance.
Take back control of IT with cloud native IGA
Legacy solutions have painted themselves into the corner of maintaining a large
amount of custom code. This makes upgrades costly, so they don’t happen. That
means customers suffer by not being able to adopt new features, bug fixes and
new capabilities to support their new business and compliance requirements. The
primary reason why legacy software projects don’t get fully completed and go
over budget is known as the 80/20 rule. Organizations can solve 80% of the
problems or challenges they have with the software as it is, but everybody wants
to solve that last 20%. And that 20% isn’t a quick fix: it takes 10 times the
amount of time that the first 80% took. Understandably, organizations want to try to
tackle the more challenging problems, which always require high customization.
It’s very difficult for organizations to maintain in their environments the
highly customized code that the first generation of IGA products required. All those
changes to the code will then need to be maintained. But modern IGA has learned
from all the coding requirements of the past and now provides a much simpler way
to give users different levels of access. The identity governance and
administration market started with highly regulated businesses. However, all
industries are now impacted.
How remote access technology is improving the world as we know it
Globalisation and a dramatic uptick in both the need and desire for remote
working have resulted in a dispersed workforce, in which it is easy to lose
both professional and personal connection. But the unprecedented speed of
digital transformation, technologies such as 5G, and improving consumer
hardware such as smartphones mean that the adoption of Augmented Reality (AR)
in remote support is rapidly closing the connection gap. ... AR can be used to
upskill these employees and to train new ones. When
onboarding a new member of staff, ensuring that the employee is aware of the
correct protocols and procedures is often critical. For example, when a new
employee is familiarising themselves with a machine, an AR-capable smartphone
or tablet can provide relevant training to ensure it’s operated correctly. If
this technology was not available, uncertainties could lead to a break in
compliance, safety issues, or even increased downtime — all critical issues in
multiple industries, including manufacturing. Today, though, this technology
goes beyond needing an AR-capable device to hand. Features such as
session recording and being able to take a screenshot of the live video stream
are increasingly being used to create a pool of expert knowledge that is
readily available on demand.
Value vs Time: an Agile Contract Model
The cost of bug fixes is included in the price, so our interest is to have as
few bugs as possible in our software. This is obviously great value for our
customers, but also for users who will run into fewer bugs while using the
software. To do this, we use the common agile practices and methodologies such
as TDD (test-driven development), Pair Programming, pull/merge request
management, and a strict procedure of verification and manual testing before
releasing to the customer. Also, continuous improvement techniques such as
retrospective meetings and a lot of training help us deploy higher quality
software. We have a clear DoD (Definition of Done) shared with the customer for
each User Story (which also covers the UX / UI mockups for each US), and the
teams are autonomous in managing the implementation part, while respecting the
DoD and a minimum level of quality that is guaranteed by the practices and
processes listed. Including any bug-fix in the User Story development cost also
has a commercial advantage for Zupit. Customers don’t always "digest" that bugs
are part of the software development process and aren’t happy to pay the cost of
fixing them. A model where the supplier takes care of this aspect helps us to
convince customers about the quality of our work and to close contracts more
easily.
Quote for the day:
"The role of leaders is not to get other people to follow them but to empower others to lead." -- Bill George