Scientists Work To Turn Noise on Quantum Computers to Their Advantage
“We know very little about quantum computers and noise, but we know really well
how this molecule behaves when excited,” said Hu. “So we use quantum computers,
which we don’t know much about, to mimic a molecule which we are familiar with,
and we see how it behaves. With those familiar patterns we can draw some
understanding.” This operation gives a more ‘bird’s-eye’ view of the noise that
quantum computers simulate, said Scott Smart, a Ph.D. student at the University
of Chicago and first author on the paper. The authors hope this information can
help researchers as they think about how to design new ways to correct for
noise. It could even suggest ways that noise could be useful, Mazziotti said.
For example, if you’re trying to simulate a quantum system such as a molecule in
the real world, you know it will be experiencing noise—because noise exists in
the real world. Under the previous approach, you use computational power to add
a simulation of that noise. “But instead of building noise in as additional
operation on a quantum computer, maybe we could actually use the noise intrinsic
to a quantum computer to mimic the noise in a quantum problem that is difficult
to solve on a conventional computer,” Mazziotti said.
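The contrast Mazziotti draws, noise added as an extra operation versus noise inherited from the hardware, can be sketched in code. Below is a minimal illustration using Qiskit's Aer simulator; the library choice and the 1% depolarizing rate are assumptions for illustration, not the authors' actual setup.

    # Sketch: the "previous approach" the article describes, paying extra
    # compute to simulate noise explicitly, versus running on hardware whose
    # intrinsic noise would play that role for free.
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator
    from qiskit_aer.noise import NoiseModel, depolarizing_error

    # A toy two-qubit circuit standing in for a molecular simulation.
    circuit = QuantumCircuit(2)
    circuit.h(0)
    circuit.cx(0, 1)
    circuit.measure_all()

    # Explicitly modeled noise: every CNOT is followed by a depolarizing
    # channel, an extra operation that costs simulation effort.
    noise_model = NoiseModel()
    noise_model.add_all_qubit_quantum_error(depolarizing_error(0.01, 2), ["cx"])

    noisy_backend = AerSimulator(noise_model=noise_model)  # noise added by hand
    ideal_backend = AerSimulator()                         # no noise at all

    for label, backend in [("ideal", ideal_backend), ("noisy", noisy_backend)]:
        counts = backend.run(circuit, shots=1000).result().get_counts()
        print(label, counts)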
How to Bring Shadow Kubernetes IT into the Light
Running container-based applications in production goes well beyond Kubernetes.
For example, IT operations teams often require additional services for tracing,
logs, storage, security and networking. They may also require different
management tools for Kubernetes distribution and compute instances across public
clouds, on-premises, hybrid architectures or at the edge. Integrating these
tools and services for a specific Kubernetes cluster requires that each tool or service be configured according to that cluster's use case. The requirements and
budgets for each cluster are likely to vary significantly, meaning that updating
or creating a new cluster configuration will differ based on the cluster and the
environment. As Kubernetes adoption matures and expands, there will be a direct
conflict between admins, who want to lessen the growing complexity of cluster
management, and application teams, who seek to tailor Kubernetes infrastructure
to meet their specific needs. What magnifies these challenges even further is
the pressure of meeting internal project deadlines — and the perceived need to
use more cloud-based services to get the work done on time and within budget.
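A rough sketch of that configuration sprawl in Python; the cluster names, add-on services, and settings below are all hypothetical:

    # Sketch of the per-cluster configuration problem described above: the
    # same add-on services (logging, tracing, storage) need different
    # settings for each cluster's use case and environment.
    cluster_profiles = {
        "prod-us-east": {"cloud": "aws", "log_retention_days": 90, "tracing": True},
        "edge-store-42": {"cloud": "edge", "log_retention_days": 7, "tracing": False},
        "dev-sandbox": {"cloud": "gcp", "log_retention_days": 3, "tracing": True},
    }

    def render_logging_config(name: str, profile: dict) -> dict:
        """Turn one cluster's profile into settings for a logging add-on."""
        return {
            "cluster": name,
            "retention": f"{profile['log_retention_days']}d",
            # Edge sites buffer locally because their uplinks are unreliable.
            "buffer_locally": profile["cloud"] == "edge",
        }

    for name, profile in cluster_profiles.items():
        print(render_logging_config(name, profile))

Every new tool multiplies this matrix: each cluster-tool pair is one more configuration to create, update, and budget for.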
Managing the complexity of cloud strategies
Both polycloud and sky computing are strategies for managing the complexities of
a multicloud deployment. Which model is better? Polycloud is best at leveraging
the strengths of each individual cloud provider. Because each cloud provider is
chosen based on its strength in a particular cloud specialty, you get the best
of each provider in your applications. This also encourages a deeper integration
with the cloud tools and capabilities that each provider offers. Deeper
integration means better cloud utilization, and more efficient applications.
Polycloud comes at a cost, however. The organization as a whole, and each development and operations person within it, needs deeper knowledge about each cloud provider in use. Because an application uses
specialized services from multiple providers, the application developers need to
understand the tools and capabilities of all of the cloud providers. Sky
computing relieves this knowledge burden on application developers. Most
developers in the organization need to know and understand only the sky API and
the associated tooling and processes.
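As a rough sketch of the difference, here is a hypothetical sky-style broker in Python; the provider classes, prices, and the Sky interface are invented for illustration, not a real sky implementation:

    # Sketch of the sky-computing idea: developers code against one "sky"
    # API, and a broker picks a provider underneath. With polycloud, by
    # contrast, developers would call each provider's API directly.
    from abc import ABC, abstractmethod

    class CloudProvider(ABC):
        @abstractmethod
        def run_job(self, job: str) -> str: ...
        @abstractmethod
        def cost_estimate(self, job: str) -> float: ...

    class AwsBackend(CloudProvider):
        def run_job(self, job: str) -> str:
            return f"aws ran {job}"
        def cost_estimate(self, job: str) -> float:
            return 1.20  # hypothetical price

    class GcpBackend(CloudProvider):
        def run_job(self, job: str) -> str:
            return f"gcp ran {job}"
        def cost_estimate(self, job: str) -> float:
            return 0.95  # hypothetical price

    class Sky:
        """The single API developers learn; it brokers across providers."""
        def __init__(self, providers: list[CloudProvider]):
            self.providers = providers
        def run_job(self, job: str) -> str:
            # A simple policy: route each job to the cheapest provider.
            best = min(self.providers, key=lambda p: p.cost_estimate(job))
            return best.run_job(job)

    print(Sky([AwsBackend(), GcpBackend()]).run_job("nightly-etl"))

The trade-off is visible in the sketch: developers only ever touch the Sky class, but the broker's generic interface cannot expose every provider-specific strength the way direct polycloud integration can.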
US, EU Agree to a New Data-Sharing Framework
The Biden administration and the European Commission said in a joint statement issued on Friday that the new framework "marks an unprecedented commitment on the U.S. side to implement reforms that will strengthen the privacy and civil liberties protections applicable to U.S. signals intelligence activities." Signals intelligence involves the interception of electronic signals and systems used by foreign targets. Under the new framework, the U.S. reportedly will apply new "safeguards" to ensure signals surveillance activities "are necessary and proportionate in the pursuit of defined national security objectives," the statement says. It also will establish a two-level "independent redress mechanism" with binding authority, which it said will "direct remedial measures, and enhance rigorous and layered oversight of signals intelligence activities." These efforts, the statement says, place limitations on surveillance. Officials said the framework reflects more than a year of negotiations between U.S. Secretary of Commerce Gina Raimondo and EU Commissioner for Justice Didier Reynders.
Google's tightening key security on Android with a longer (but better) chain of trust
There's a software key stored on basically every Android phone, inside a secure element, kept separate from your own data — separate even from Android itself. The bits required for that key are provided by the device
manufacturer when the phone is made, signed by a root key that's provided by
Google. In more practical terms, apps that need to do something sensitive
can prove that the bundled secure hardware environment can be trusted, and
this is the basis on which a larger chain of trust can be built, allowing things like biometric data, user data, and secure operations of all kinds to be stored or transmitted safely. Previously, Android devices that
wanted to enjoy this process needed to have that key securely installed at
the factory, but Google is changing from in-factory private key provisioning
to in-factory public key extraction with over-the-air certificate
provisioning, paired with short-lived certificates. As even the description
makes it sound, this new change is a more complicated system, but it fixes a
lot of issues in practice.
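To make the chain-of-trust idea concrete, here is a minimal sketch of verifying a certificate chain up to a trusted root, using the third-party cryptography package (version 42 or later); the file names, the EC-signature assumption, and the chain layout are hypothetical, not Google's actual provisioning flow:

    # Sketch: walking a certificate chain from a device certificate up to a
    # trusted root -- the kind of check a short-lived attestation
    # certificate must pass. Assumes EC signatures throughout.
    from datetime import datetime, timezone
    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import ec

    def load(path: str) -> x509.Certificate:
        with open(path, "rb") as f:
            return x509.load_pem_x509_certificate(f.read())

    # Leaf first: device cert, then intermediate, then the root.
    chain = [load(p) for p in ("device.pem", "intermediate.pem", "root.pem")]

    now = datetime.now(timezone.utc)
    for cert, issuer in zip(chain, chain[1:] + [chain[-1]]):  # root signs itself
        # Short-lived certs make this freshness check do real security work.
        assert cert.not_valid_before_utc <= now <= cert.not_valid_after_utc
        # Verify this certificate's signature with its issuer's public key.
        issuer.public_key().verify(
            cert.signature,
            cert.tbs_certificate_bytes,
            ec.ECDSA(cert.signature_hash_algorithm),
        )
    print("chain verifies up to the root")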
How Do I Demonstrate the ROI of My Security Program?
Extended Threat Intelligence: A new approach to old school threat intelligence
Large-Scale, Available Graphene Supercapacitors; How Close are We?
One issue with supercapacitors so far has been their low energy density.
Batteries, on the other hand, have been widely used in consumer electronics. However, after repeated charge/discharge cycles they wear out, and they carry safety issues such as overheating and explosions. Hence, scientists started
working on coupling supercapacitors and batteries as hybrid energy storage
systems. For example, Prof. Roland Fischer and a team of researchers from
the Technical University Munich have recently developed a highly efficient
graphene hybrid supercapacitor. It consists of graphene as the electrostatic
electrode and metal-organic framework (MOF) as the electrochemical
electrode. The device can deliver a power density of up to 16 kW/kg and an
energy density of up to 73 Wh/kg, comparable to several commercial devices
such as Pb-acid batteries and nickel metal hydride batteries. Moreover, standard batteries (such as lithium-ion) have a useful life of around 5,000 cycles, whereas this new hybrid graphene supercapacitor retains 88% of its capacity even after 10,000 cycles.
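As a back-of-envelope reading of those figures (assuming full-depth cycles and a linear capacity fade, which real usage won't match):

    # Rough lifetime energy throughput per kilogram, from the numbers above.
    energy_density_wh_per_kg = 73
    cycles = 10_000
    retention_at_end = 0.88

    # Assume capacity fades linearly from 100% to 88% over the cycle life.
    avg_capacity = (1 + retention_at_end) / 2
    lifetime_wh_per_kg = energy_density_wh_per_kg * cycles * avg_capacity
    print(f"~{lifetime_wh_per_kg / 1000:.0f} kWh per kg over 10,000 cycles")

Under those simplifying assumptions, one kilogram of the device could deliver on the order of 700 kWh over its cycle life, roughly double what the same chemistry would manage if it wore out at 5,000 cycles.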
3 reasons user experience matters to your digital transformation strategy
Simply put, a strong UX makes it easier for people to follow the rules. You
can “best practice” employees all day long, but if those practices get in the
way of day-to-day responsibilities, what’s the point of having them? Security
should be baked into all systems from the get-go, not treated as an
afterthought. And when it’s working well, people shouldn’t even know it’s
there. Don’t make signing into different systems so complicated or
time-consuming that people resort to keeping a list of passwords next to their
computer. Automating security measures as much as possible is the surest way
to stay protected while putting UX at the forefront. By doing this, people
will have access to the systems they need and be prohibited from those that
they don’t for the duration of their employment – not a minute longer or
shorter. Automation also enables organizations to understand what is normal
vs. anomalous behavior so they can spot problems before they get worse. For
business leaders who really want to move the needle, UX should be just as
important as CX. Employees may not be as vocal as customers about what needs improvement, but their feedback is critical information.
Automation Is No Silver Bullet: 3 Keys for Scaling Success
Many organizations see automation as an easy way to enter the market. It is a starting point, but automated testing warrants prioritization: it speeds up not just QA processes but internal processes as well. Maintenance is another area that benefits from automation, through intelligent suggestions and searches. Ongoing feedback helps teams keep up with user expectations, and it is a must-have for agile continuous integration and continuous delivery cycles. Adopting automated testing also brings more confidence in releases and a lower risk of failures, which means less stress and happier times for developers. That is increasingly important given the current shortage of developers amid the great reshuffle: automated testing can help fight burnout and sustain a team of developers who make beautiful, high-quality applications. Its benefits include fewer bugs and better security in final products, which increases the value of the software delivered.
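As a minimal illustration of that release confidence, here is a hypothetical function with pytest-style checks; run automatically on every commit in CI, tests like these catch regressions before they ship:

    # A hypothetical unit under test and two automated checks for it.
    import pytest

    def apply_discount(price: float, percent: float) -> float:
        """Return the price after a percentage discount."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    def test_apply_discount():
        assert apply_discount(100.0, 20) == 80.0

    def test_rejects_bad_percent():
        with pytest.raises(ValueError):
            apply_discount(100.0, 150)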
"Leadership is about carrying on when everyone else has given up" -- Gordon Tredgold