Perimeter Institute Quantum Discussions

This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. The sessions start with an informal exposition of an interesting topic, research result, or important question in the field. Everyone is strongly encouraged to participate with questions and comments.

Quantum information and quantum metrology can be used to study gravitational effects such as gravitational waves and the universality of the equivalence principle. On one hand, the possibility of carrying out experiments to probe gravity using quantum systems opens an avenue to deepen our understanding of the overlap of these theories. On the other hand, incorporating relativity in quantum technologies promises the development of a new generation of relativistic quantum applications of relevance in Earth-based and space-based setups.

Computing ground states of local Hamiltonians is a fundamental problem in condensed matter physics. We give the first randomized polynomial-time algorithm for finding ground states of gapped one-dimensional Hamiltonians: it outputs an

A self-correcting quantum memory is a physical system whose quantum state can be preserved over a long period of time without the need for any external intervention. The most promising candidates are topological quantum systems which would protect information encoded in their degenerate groundspace while interacting with a thermal environment. Many models have been suggested but several approaches have been shown to fail due to no-go results of increasingly general scope.

We propose a non-commutative extension of the Pauli stabilizer formalism. The aim is to describe a class of many-body quantum states which is richer than the standard Pauli stabilizer states. In our framework, stabilizer operators are tensor products of single-qubit operators drawn from the group generated by {\alpha I, X, S}, where \alpha = e^{i\pi/4} and S = diag(1, i). We provide techniques to efficiently compute various properties related to, e.g., bipartite entanglement, expectation values of local observables, preparation by means of quantum circuits, and parent Hamiltonians.
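The generators named in the abstract can be written down directly. A minimal numpy sketch (using the standard matrix definitions of X and S; the specific checks below are illustrative, not from the abstract) shows why the formalism is called non-commutative: unlike Pauli operators, X and S do not commute even up to a sign.

```python
import numpy as np

# Single-qubit operators named in the abstract (standard definitions).
alpha = np.exp(1j * np.pi / 4)                # alpha = e^{i pi/4}
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
S = np.diag([1, 1j])                          # S = diag(1, i)

# Unlike Pauli stabilizer generators, X and S do not commute up to +/-1:
print("X and S commute:", np.allclose(X @ S, S @ X))           # False

# Basic relations in the group they generate:
print("alpha^8 = 1:", np.isclose(alpha**8, 1))                 # True
print("S^4 = I:", np.allclose(np.linalg.matrix_power(S, 4), I))  # True
print("X^2 = I:", np.allclose(X @ X, I))                       # True
```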

We introduce an abstract model of computation corresponding to an experiment in which identical, non-interacting bosons are sent through a non-adaptive linear circuit before being measured. We show that despite the very limited nature of the model, an exact classical simulation would imply a collapse of the polynomial hierarchy. Moreover, under plausible conjectures, a "noisy" approximate simulation would do the same.
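In this model (boson sampling, in the sense of Aaronson and Arkhipov), the probability of each output configuration is given by the permanent of a submatrix of the circuit's unitary; it is this connection to permanents that makes exact classical simulation implausible. As a toy sketch, assuming the standard output-probability formula, the code below reproduces the Hong-Ou-Mandel effect for two photons at a 50:50 beamsplitter: the photons always bunch, and the coincidence outcome (1,1) has probability zero.

```python
import math
import numpy as np
from itertools import permutations

def permanent(M):
    # Naive O(n! * n) permanent; fine for tiny illustrative matrices.
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

def output_prob(U, s, t):
    # P(t | s) = |Per(U_{s,t})|^2 / (prod_i s_i! * prod_j t_j!):
    # the boson-sampling output distribution for input occupations s
    # and output occupations t.
    rows = [i for i, k in enumerate(t) for _ in range(k)]
    cols = [j for j, k in enumerate(s) for _ in range(k)]
    sub = U[np.ix_(rows, cols)]
    norm = math.prod(math.factorial(k) for k in s + t)
    return abs(permanent(sub)) ** 2 / norm

# 50:50 beamsplitter on two modes; one photon enters each input port.
U = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
for t in [(1, 1), (2, 0), (0, 2)]:
    print(t, round(output_prob(U, (1, 1), t), 3))
# Hong-Ou-Mandel: (1,1) has probability 0; (2,0) and (0,2) each 1/2.
```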

Non-Abelian anyons promise to reveal spectacular features of quantum mechanics that could ultimately provide the foundation for a decoherence-free quantum computer. The Moore-Read quantum Hall state and a (relatively simple) two-dimensional p+ip superconductor both support Ising non-Abelian anyons, also referred to as Majorana zero modes.

Privacy and coherence have long been considered closely related properties of a quantum state. Indeed, a coherently transmitted quantum state is inherently private. Surprisingly, coherent quantum communication is not always required for privacy: there are quantum channels that are too noisy to transmit quantum information but can still send private classical information. Here, we ask how different the private classical and the quantum capacities can be. We present a class of channels N_d with input dimension d^2, quantum capacity Q(N_d)

Probabilistic protocols in quantum information are an attempt to improve performance by occasionally reporting a better result than could be expected from a deterministic protocol. Here we show that probabilistic protocols can never improve performance beyond the quantum limits on the corresponding deterministic protocol. To illustrate this result we examine three common probabilistic protocols: probabilistic amplification, weak value amplification, and probabilistic metrology.

A fundamental question in complexity theory is how much resource is needed to solve k independent instances of a problem compared to the resource required to solve one instance. Suppose that solving one instance of a problem with probability of correctness p requires c units of some resource in a given model of computation. A direct sum theorem states that computing k independent instances of the problem requires k times the resource needed to compute one instance.

Quantum many-body problems are notoriously hard. This is partly because the dimension of the Hilbert space grows exponentially with the particle number N. While exact solutions are often considered intractable, numerous approaches have been proposed using approximations.
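The exponential growth is easy to make concrete. For N spin-1/2 particles the Hilbert space has dimension 2^N, so even storing a single state vector (at 16 bytes per complex amplitude) outruns any realistic memory well before N = 50; the memory figures below are a back-of-the-envelope illustration, not from the abstract.

```python
# Memory needed to store one complex128 state vector of N qubits.
for N in [10, 20, 30, 40, 50]:
    dim = 2 ** N                 # Hilbert space dimension 2^N
    gib = dim * 16 / 2 ** 30     # 16 bytes per amplitude, in GiB
    print(f"N={N:2d}  dim=2^{N}  state vector ~ {gib:,.6f} GiB")
```

Already at N = 30 a single state vector takes 16 GiB, and at N = 50 it would take 16 million GiB, which is why approximate methods are unavoidable.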