Conceptual Foundations and Foils for Quantum Information Processing

Conference Date:

Monday, May 9, 2011 (All day) to Friday, May 13, 2011 (All day)

Scientific Areas:

Quantum Foundations

The quest for a complete set of physical or information-theoretic principles that capture the essence of quantum information processing is attracting an increasing number of researchers. This new research trend aims at deriving quantum protocols and connections between them directly from basic principles, without assuming quantum theory and its mathematical dowry of Hilbert spaces and observable algebras. Particularly relevant to the process is the characterization of information-theoretic foils to quantum theory, that is, general probabilistic theories that differ from quantum theory in their information processing capabilities. Ultimately, this research is expected to provide insights into many other areas of fundamental physics and to contribute to the development of a new informational view of nature.

Thanks to the injection of fresh ideas on information processing, the field of quantum foundations is currently experiencing exceptional growth, with a flourishing of new results and an increasing number of researchers joining the community. This conference aims to capture the excitement pervading the field and to boost its future progress by providing an overview of the main recent trends, bringing together researchers working on quantum foundations and quantum information processing, and promoting the exchange of ideas and the development of collaborative work in a relaxed atmosphere. We encourage all interested researchers to take part in this exciting event.

The conference represents the continuation of a series of focused workshops organized in Cambridge and at ETH Zurich during the past four years, and is expected to reach a much broader audience due to the increasing interest in the topic and the growth of the community. In order to cover a wider spectrum of contributions, the conference will also feature a poster session where participants are encouraged to present their work.

We consider theories that satisfy the following: information causality, reversibility, local discriminability, and the measurability of all tight effects. A property of these theories is that binary systems (those with exactly two perfectly distinguishable states and no more) have state spaces with the shape of a unit ball (the Bloch ball) of arbitrary dimension. It turns out that for any dimension other than three these systems cannot be entangled. Hence, the only theory with entanglement that satisfies the above assumptions is quantum theory.

Valerio Scarani, National University of Singapore

3 >> 2

Three-partite quantum systems exhibit interesting features that are absent in bipartite ones. Several instances are classics by now: the GHZ argument, the W state, the UPB bound entangled states, Svetlichny inequalities... In this talk, I shall discuss some on-going research projects that we are pursuing in my group (in collaboration, or in friendly competition, with other groups) and that involve three-partite entanglement or non-locality:

* Activation of non-locality in networks.

* Device-independent assessment of the entangling power of a measurement.

* Can one falsify all models of hidden communication with finite speed?

* Information causality in the three-partite scenario.

I shall conclude with a blind excursion into uncertainty relations and cryptography, which also shows 3>>2, albeit with a different meaning.

It is now exactly 75 years since John von Neumann denounced his own Hilbert space formalism: "I would like to make a confession which may seem immoral: I do not believe absolutely in Hilbert space no more." (sic) [1] His reason was that Hilbert space does not elucidate in any direct manner the key quantum behaviors. One year later, he and Birkhoff published "The logic of quantum mechanics". However, it is fair to say that this program was never successful, nor does it have anything to do with logic. So what is logic? We will conceive of logic in two manners: (1) something which captures the mathematical content of language (cf. 'and', 'or', 'not', 'if ... then' are captured by Boolean algebra); (2) something that can be encoded in a 'machine' and enables it to reason.

Recently we have proposed a new kind of 'logic of quantum mechanics' [4]. It follows Schrödinger in taking the behavior of compound quantum systems, described by the tensor product [2, again 75 years ago], to be what captures the essential quantum behaviors. Over the past couple of years we have played the following game: how many quantum phenomena can be derived from 'composition + epsilon'? It turned out that epsilon can be taken to be very little, surely not involving anything like the continuum, fields, or vector spaces, but merely a 'two-dimensional space' of temporal composition (cf. 'and then') and compoundness (cf. 'while'), together with some very natural, purely operational assertions. In a very short time, this radically different approach has produced a universal graphical language for quantum theory which has helped to resolve some open problems.

Most importantly, it has paved the way to automating quantum reasoning [5,6], and it also enables one to model meaning for natural languages [7,8]. That is, we are now truly talking 'quantum logic'! If time permits, we will also discuss how this logical view has helped to solve concrete problems in quantum information.

In this talk, I'll survey various "foils" of BQP (Bounded-Error Quantum Polynomial-Time) that have been proposed: that is, changes to the quantum model of computation that make it either more or less powerful. Possible topics include: postselected quantum computing, quantum computing with a nonlinear Schrödinger equation, quantum computing with non-unitary linear transformations, quantum computing with hidden variables, linear-optical quantum computing, quantum computing with restricted gate sets, quantum computing with separable mixed states, quantum computing over finite fields, and more, depending on audience interest.

Usually, quantum theory (QT) is introduced by giving a list of abstract mathematical postulates, including the Hilbert space formalism and the Born rule. Even though the result is mathematically sound and in perfect agreement with experiment, there remains the question why this formalism is a natural choice, and how QT could possibly be modified in a consistent way. My talk is on recent work with Lluis Masanes, where we show that five simple operational axioms actually determine the formalism of QT uniquely. This is based to a large extent on Lucien Hardy's seminal work. We start with the framework of "general probabilistic theories", a simple, minimal mathematical description for outcome probabilities of measurements. Then, we use group theory and convex geometry to show that the state space of a bit must be a 3D (Bloch) ball, finally recovering the Hilbert space formalism. There will also be some speculation on how to find natural post-quantum theories by dropping one of the axioms.

In 1964, John Bell proved that independent measurements on entangled quantum states lead to correlations that cannot be reproduced using local hidden variables. The core of his proof is that such distributions violate some logical constraints known as Bell inequalities. This remarkable result establishes the non-locality of quantum physics. Bell's approach is purely qualitative. This naturally leads to the question of quantifying quantum physics' non-locality. We will specifically consider two quantities introduced for this purpose. The first one is the maximum amount of Bell inequality violation, and the second one is the communication cost of simulating quantum distributions. In this talk, we prove that these two quantities are strongly related: the logarithm of the first is upper bounded by the second. We prove this theorem in the more general context of non-signalling distributions. This generalization gives us two clear benefits. First, the rich structure of the underlying affine space provides us with a very strong intuition. Secondly, non-signalling distributions capture traditional communication complexity of boolean functions. In that case, our theorem is equivalent to the factorization norm lower bound of Linial and Shraibman, for which we give an elementary proof.
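As a toy illustration of the quantities the talk relates, one can brute-force the maximal CHSH value attainable with local deterministic strategies and compare it with the quantum (Tsirelson) and no-signalling maxima. This is a minimal sketch for intuition, not material from the talk:

```python
from itertools import product
from math import sqrt

# Each party's deterministic strategy assigns an outcome in {-1, +1}
# to each of its two measurement settings x, y in {0, 1}.
strategies = list(product([-1, 1], repeat=2))

def chsh(a, b):
    # CHSH expression S = E(0,0) + E(0,1) + E(1,0) - E(1,1),
    # where E(x, y) = a[x] * b[y] for deterministic strategies.
    return a[0]*b[0] + a[0]*b[1] + a[1]*b[0] - a[1]*b[1]

# Maximize over all pairs of deterministic local strategies.
local_bound = max(chsh(a, b) for a, b in product(strategies, repeat=2))

tsirelson = 2 * sqrt(2)  # quantum maximum, ~2.828
pr_box = 4               # algebraic / no-signalling maximum

print(local_bound, tsirelson, pr_box)  # 2 2.8284271247461903 4
```

Randomized local strategies are convex mixtures of deterministic ones, so the brute force over deterministic strategies already yields the local bound of 2, strictly below the quantum value.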

Gilles Brassard, Université de Montréal

Is Information the Key?

Consider the two great physical theories of the twentieth century: relativity and quantum mechanics. Einstein derived relativity from very simple principles. By contrast, the foundation of quantum mechanics is built on a set of rather strange, disjointed and ad hoc axioms, reflecting at best the history that led to discovering this new world order. The purpose of this talk is to argue that a better foundation for quantum mechanics lies within the teachings of quantum information science. The basic postulate is that the truly fundamental laws of Nature concern information, not waves or particles. For example, it is known that quantum key distribution is possible but quantum bit commitment is not and that nature is nonlocal but not as nonlocal as is imposed by causality. But should these statements be considered as theorems or axioms? It's time to pause and reflect on what is really fundamental and what are merely consequences. Could information be the key?

Andreas Winter, University of Bristol

Non-contextual correlations in probabilistic models

Non-contextuality is presented as an abstraction and at the same time a generalisation of locality. Rather than in correlations, the underlying physical model leaves its signature in collections of expectation values, which are constrained by inequalities much like Bell's or Tsirelson's inequalities. These non-contextual inequalities reveal a deep connection to classic topics in graph theory, such as independence numbers, Lovász numbers and other graph parameters. By considering the special case of bi-local experiments, we arrive at a semidefinite relaxation (and indeed a whole hierarchy of such relaxations) for the problem of determining the maximum quantum violation of a given Bell inequality.

Alex Wilce, Susquehanna University

Symmetry, Self-Duality and the Jordan Structure of Quantum Theory

This talk reviews recent and on-going work, much of it joint with Howard Barnum, on the origins of the Jordan-algebraic structure of finite-dimensional quantum theory. I begin by describing a simple recipe for constructing highly symmetrical probabilistic models, and discuss the ordered linear spaces generated by such models. I then consider the situation of a probabilistic theory consisting of a symmetric monoidal *-category of finite-dimensional such models: in this context, the state and effect cones are self-dual. Subject to a further "steering" axiom, they are also homogeneous, and hence, by the Koecher-Vinberg Theorem, representable as the cones of formally real Jordan algebras. Finally, if the theory contains a single system with the structure of a qubit, then (by a result of H. Hanche-Olsen), each model in the category is the self-adjoint part of a C*-algebra.

Renato Renner, ETH Zurich

How Fundamental is the Uncertainty Principle?

According to quantum theory, it is impossible to prepare the state of a system such that the outcome of any projective measurement on the system can be predicted with certainty. This limitation of predictive power, which is known as the uncertainty principle, is one of the main distinguishing properties of quantum theory when compared to classical theories. In this talk, I will discuss the implications of this principle for foundational questions. In particular, I will consider the hypothesis that the uncertainty principle, rather than (only) telling us something about reality, may be seen as a manifestation of the limitations of our (classical) methods used to describe reality.

Caslav Brukner, University of Vienna

Quantum correlations with no causal order

Much of the recent progress in understanding quantum theory has been achieved within an operational approach. Within this context quantum mechanics is viewed as a theory for making probabilistic predictions for measurement outcomes following specified preparations. However, thus far some of the essential elements of the theory – space, time and causal structure – elude such an operational formulation and are assumed to be fixed. Is it possible to extend the operational approach to quantum mechanics such that the notions of an underlying spacetime or causal structure are not assumed? What new phenomenology can follow from such an approach? We develop a framework for multipartite quantum correlations that does not presume these notions, but assumes simply that experimenters in their local laboratories are free to perform arbitrary quantum operations. All known situations that respect definite causal order, including signalling and no-signalling correlations between space-like and time-like separated experiments, as well as probabilistic mixtures of these, can be expressed in this framework. Remarkably, we find quantum correlations which are neither causally ordered nor a probabilistic mixture of definite causal orders. These correlations are shown to enable a communication task that is impossible if a fixed background time is assumed and the events are sufficiently localized in time.

Quantum Theory can be derived from six operational axioms. We introduce the operational and probabilistic language that is used to formulate the principles. After the basic notions of system, state, effect and transformation are reviewed, the principles are stated, and their immediate consequences and interpretations are analyzed. Finally, some key results that represent milestones of the derivation are discussed, with particular focus on their implications on information processing and their relation with the standard quantum formalism. The global picture of the presentation highlights quantum theory as a particular operational language emerging from a background of information processing theories, thanks to the purification postulate that singles out the strictly quantum features of information.

Gen Kimura, Shibaura Institute of Technology

On Basic Principles of General Probabilistic Theories

We propose an operationally motivated definition of the physical equivalence of states in General Probabilistic Theories and consider the principle of the physical equivalence of pure states, which turns out to be equivalent to the symmetric structure of the state space. We further consider a principle of decomposability into distinguishable pure states, give classification theorems of the state spaces for each principle, and derive the Bloch ball for 2- and 3-dimensional systems.

Mauro D'Ariano, University of Pavia

A Quantum-Digital Universe

David Deutsch re-formulated the Church-Turing thesis as a physical principle, asserting that "every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means". Such a principle can be regarded as a new theoretical paradigm, whereby all of physics emerges from a quantum computation. But for a theory to be a good one, it must explain a large class of phenomena on the basis of a few general principles. Taking as a general principle the topological homogeneity of the computational network, with graph-dimension equal to the space-time dimension, corresponds to replacing quantum field theory (QFT) with a numerable set of quantum systems in local interaction. This means considering QFT as a kind of Fermi-scale "thermodynamic" limit of a deeper Planck-scale theory, with the quantum field replaced by a giant quantum computer.

In the talk, I will illustrate mechanisms of emergence of physics from the quantum computation in 1+1 dimensions. We will see that the Dirac equation is just the equation describing the free flow of information, leading to an informational definition of inertial mass and of the Planck constant. I will then illustrate the mechanism of emergence of Minkowskian space-time from the computation, how the field Hamiltonian comes out, and how quantum fields are actually eliminated in favor of qubits. We will see that the digital nature of the field leads to an in-principle observable consequence in the form of a mass-dependent refraction index of the vacuum, with the information becoming stationary at the Planck mass. Such a refraction index of the vacuum is a general phenomenon due to unitarity in the discrete, and can also help in solving the speed-of-light isotropy conundrum posed by digitalization of the field in more than one space dimension. We will also see how the quantum nature of the processed information plays a crucial role in other practical informational issues, e.g. the possibility of driving the information in different directions without the need of increasing the complexity of the circuit.

Finally I will briefly comment about gravity as emergent from the quantum computation, and the connection with Verlinde-Jacobson approach.

Antonio Acín, ICFO Barcelona

Guess your neighbor input

We present “guess your neighbor's input” (GYNI), a multipartite nonlocal task in which each player must guess the input received by his neighbor. We show that quantum correlations do not perform better than classical ones at this task, for any prior distribution of the inputs. There exist, however, input distributions for which general no-signalling correlations can outperform classical and quantum correlations. Some of the Bell inequalities associated with our construction correspond to facets of the local polytope. We then discuss implications of this game in connection with recent attempts at deriving quantum correlations from information-based principles, such as non-trivial communication complexity, information causality and Gleason’s theorem. Our results show that truly multipartite concepts are necessary to obtain the set of quantum correlations for an arbitrary number of parties.
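For intuition, the three-player version of the game can be checked by brute force. Below is an illustrative sketch (not from the talk) using the promise distribution in which the inputs are uniform over bit-strings of even parity; maximizing over deterministic local strategies recovers the classical winning probability of 1/4:

```python
from itertools import product

# Promise: (x1, x2, x3) uniform over bit-strings with x1 ^ x2 ^ x3 == 0.
inputs = [x for x in product([0, 1], repeat=3) if x[0] ^ x[1] ^ x[2] == 0]

# A deterministic local strategy is a function {0,1} -> {0,1};
# there are only four of them per player.
strategies = [lambda x, c=c, d=d: (c, d)[x] for c, d in product([0, 1], repeat=2)]

def win_prob(f1, f2, f3):
    # The game is won only if every player guesses the neighbor's input.
    wins = sum(f1(x1) == x2 and f2(x2) == x3 and f3(x3) == x1
               for x1, x2, x3 in inputs)
    return wins / len(inputs)

best = max(win_prob(f1, f2, f3)
           for f1, f2, f3 in product(strategies, repeat=3))
print(best)  # 0.25
```

Since shared randomness only mixes deterministic strategies, 1/4 is the classical optimum; the talk's result is that quantum correlations cannot beat it either, whereas no-signalling boxes can.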

Modal quantum theory (MQT) is a discrete model that is similar in structure to ordinary quantum theory, but based on a finite field instead of complex amplitudes. Its interpretation involves only the "modal" concepts of possibility and impossibility rather than quantitative probabilities. Despite its very simple structure, MQT nevertheless includes many of the key features of actual quantum physics, including entanglement and nonclassical computation. In this talk we describe MQT and explore how modal and probabilistic theories are related. Under what circumstances can we assign probabilities to a given modal structure?
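To illustrate how entanglement survives in such a discrete model, here is a small sketch (my own, under the assumption that states are nonzero vectors over GF(2) and the tensor product is taken component-wise mod 2): a brute-force check that the analogue of |00⟩ + |11⟩ is not a product state.

```python
from itertools import product

# Two-"mobit" states are nonzero vectors in GF(2)^4, with components
# indexed by outcomes 00, 01, 10, 11.
def product_states():
    one_bit = [v for v in product([0, 1], repeat=2) if v != (0, 0)]
    for (a, b), (c, d) in product(one_bit, repeat=2):
        # Tensor product over GF(2): component-wise products mod 2.
        yield (a*c % 2, a*d % 2, b*c % 2, b*d % 2)

ghz_like = (1, 0, 0, 1)   # modal analogue of |00> + |11>
entangled = ghz_like not in set(product_states())
print(entangled)  # True
```

The brute force confirms that no pair of single-system vectors factors this state, so even the possibilistic theory supports entanglement, as the abstract claims.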

Nicolas Brunner, University of Bristol

Data tables, dimension witnesses, and QKD

We address the problem of testing the dimensionality of classical and quantum systems in a ‘black-box’ scenario. Imagine two uncharacterized devices. The first one allows an experimentalist to prepare a physical system in various ways. The second one allows the experimentalist to perform some measurement on the system. After collecting enough statistics, the experimentalist obtains a ‘data table’, featuring the probability distribution of the measurement outcomes for each choice of preparation (of the system) and of measurement. Here, we develop a general formalism to assess the minimal dimensionality of classical and quantum systems necessary to reproduce a given data table. To illustrate these ideas, we provide simple examples of classical and quantum ‘dimension witnesses’. In general, quantum systems are more economical than classical ones in terms of dimensionality, in the sense that there exist data tables obtainable from quantum systems of dimension d which can only be generated from classical systems of dimension strictly greater than d. By drawing connections to communication complexity one can find data tables for which this classical/quantum separation is dramatic. Finally, these ideas can also be used to demonstrate security of one-way QKD in a semi-device-independent scenario, in which the devices are uncharacterized but assumed to produce quantum systems of a given dimension.

A seminal work by Cleve, Høyer, Toner and Watrous (quant-ph/0404076) proposed a close connection between quantum nonlocality and computational complexity theory by considering nonlocal games and multi-prover interactive proof systems with entangled provers. It opened up the whole area of study of the computational nature of nonlocality. Since then, understanding nonlocality has been one of the major goals in computational complexity theory in the quantum setting. This talk gives a survey of this exciting area.

Jonathan Barrett, Royal Holloway

Is the universe exponentially complicated? A no-go theorem for hidden variable interpretations of quantum theory.

The quantum mechanical state vector is a complicated object. In particular, the amount of data that must be given in order to specify the state vector (even approximately) increases exponentially with the number of quantum systems. Does this mean that the universe is, in some sense, exponentially complicated? I argue that the answer is yes, if the state vector is a one-to-one description of some part of physical reality. This is the case according to both the Everett and Bohm interpretations. But another possibility is that the state vector merely represents information about an underlying reality. In this case, the exponential complexity of the state vector is no more disturbing than that of a classical probability distribution: specifying a probability distribution over N variables also requires an amount of data that is exponential in N. This leaves the following question: does there exist an interpretation of quantum theory such that (i) the state vector merely represents information and (ii) the underlying reality is simple to describe (i.e., not exponential)? Adapting recent results in communication complexity, I will show that the answer is no. Just as any realist interpretation of quantum theory must be non-locally-causal (by Bell's theorem), any realist interpretation must describe an exponentially complicated reality.
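The counting behind the comparison can be made concrete with a trivial sketch (mine, not from the talk) of how the two description sizes scale with the number of systems:

```python
# Real parameters needed, to fixed precision, to write down a pure state of
# n qubits versus a classical probability distribution over n bits.
def quantum_params(n):
    # 2^n complex amplitudes, each with a real and an imaginary part.
    return 2 * 2**n

def classical_params(n):
    # 2^n probabilities, minus one normalization constraint.
    return 2**n - 1

for n in (1, 2, 10, 50):
    print(n, quantum_params(n), classical_params(n))
```

Both grow exponentially in n, which is why exponential size alone does not distinguish a state vector from a mere probability distribution; the talk's no-go theorem is needed to rule out the "simple underlying reality" option.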

We will explore generalizations of the Shannon and von Neumann entropy to other probabilistic theories, and their connection to the principle of information causality. We will also investigate the link between information causality and non-local games, leading to a new quantum bound on computing the inner product non-locally.

Dan Browne, University College London

Computation from correlations - in classical, quantum and generalised theories

Operational theories [1], defined in terms of the actions and observations of an experimenter, have been extremely successful as foils to quantum mechanics, providing a generic framework in which families of theories may be compared and classified. One area of particular interest has been the non-classical correlations (often referred to as non-locality) which can arise in quantum (and generalized) theories when measurements are space-like separated. In the context of non-locality, one usually considers the correlations in separated measurements on isolated systems. A similar setting arises in quantum computation theory, in measurement-based quantum computation, a model of computation of equivalent power to the standard circuit model. Measurements are made on isolated non-interacting quantum systems, and the non-classical correlations which arise embody (in some loose sense) the mechanism via which the computation is executed. These measurements are adaptive, meaning that bases are chosen dependent upon the outcomes of prior measurements, but apart from this, the setting is essentially identical to a multi-party Bell non-locality experiment (e.g. [2]). In this talk I will review some recent work [3] in which Bell-type correlations are studied from the perspective of computation, in particular drawing parallels with measurement-based quantum computation. I shall give examples of results [3] which appear naturally in this setting, while being not so self-evident in more conventional approaches. Finally, I shall discuss approaches to, and challenges in, developing non-trivial models of correlation-based quantum computation in general operational theories.

That quantum mechanics is non-local, in the sense of Bell inequality violations and the associated entanglement effects, is by now well known. In my talk however I will argue that quantum mechanics contains also a completely different type of non-locality, that has received virtually no attention until now: It is the fact that the quantum equations of motion are nonlocal. The discovery of dynamic non-locality dates from before that of Bell’s inequalities – it is the nonlocality responsible for the Aharonov-Bohm effect. Its implications however transcend the context in which they were discovered and go directly to the core of quantum physics.

Lucien Hardy, Perimeter Institute

Reformulating and reconstructing quantum theory

I provide a reformulation of finite dimensional quantum theory in the circuit framework in terms of mathematical axioms, and a reconstruction of quantum theory from operational postulates. The mathematical axioms for quantum theory are the following:

[Axiom 1] Operations correspond to operators.

[Axiom 2] Every complete set of positive operators corresponds to a complete set of operations.

The following operational postulates are shown to be equivalent to these mathematical axioms:

[P1] Definiteness. Associated with any given pure state is a unique maximal effect giving probability equal to one. This maximal effect does not give probability equal to one for any other pure state.

[P2] Information locality. A maximal measurement on a composite system is effected if we perform maximal measurements on each of the components.

[P3] Tomographic locality. The state of a composite system can be determined from the statistics collected by making measurements on the components.

[P4] Compound permutatability. There exists a compound reversible transformation on any system effecting any given permutation of any given maximal set of distinguishable states for that system.

[P5] Preparability. Filters are non-mixing and non-flattening.

Hence, from these postulates we can reconstruct all the usual features of quantum theory: States are represented by positive operators, transformations by completely positive trace non-increasing maps, and effects by positive operators. The Born rule (i.e. the trace rule) for calculating probabilities also follows. See arXiv:1104.2066 for more details. These operational postulates are deeper than those I gave ten years ago in quant-ph/0101012.

Entanglement provides a coherent view of the physical origin of randomness and the growth and decay of correlations, even in macroscopic systems exhibiting few traditional quantum hallmarks. It helps explain why the future is more uncertain than the past, and how correlations can become macroscopic and classical by being redundantly replicated throughout a system's environment. The most private information, exemplified by a quantum eraser experiment, exists only transiently: after the experiment is over no record remains anywhere in the universe of what "happened". At the other extreme is information that has been so widely replicated as to be infeasible to conceal and unlikely to be forgotten. But such conspicuous information is exceptional: a comparison of entropy flows into and out of the Earth with estimates of the planet's storage capacity leads to the conclusion that most macroscopic classical information---for example the pattern of drops in last week's rainfall---is impermanent, eventually becoming nearly as ambiguous, from a terrestrial perspective, as the transient result of a quantum eraser experiment. Finally we discuss prerequisites for a system to accumulate and maintain in its present state, as our world does, a complex and redundant record of at least some features of its past. Not all dynamics and initial conditions lead to this behavior, and in those that do, the behavior itself tends to be temporary, with the system losing its memory, and even its classical character, as it relaxes to thermal equilibrium.

Roger Colbeck, Perimeter Institute

Randomness amplification

I will discuss what we know about creating randomness within physics. Although quantum theory prescribes completely random outcomes to particular processes, could it be that within a yet-to-be-discovered post-quantum theory these outcomes are predictable? We have recently shown that this is not possible, using a very natural assumption. In the present talk, I will discuss some recent progress towards relaxing this assumption, providing arguably the strongest evidence yet for truly random processes in our world.

Over the last 10 years there has been an explosion of “operational reconstructions” of quantum theory. This is great stuff: For, through it, we come to see the myriad ways in which the quantum formalism can be chopped into primitives and, through clever toil, brought back together to form a smooth whole. An image of an IQ-Block puzzle comes to mind, http://www.prismenfernglas.de/iqblock_e.htm. There is no doubt that this is invaluable work, particularly for our understanding of the intricate connections between so many quantum information protocols. But to me, it seems to miss the mark for an ultimate understanding of quantum theory; I am left hungry. I still want to know what strange property of matter forces this formalism upon our information accounting. To play on something Einstein once wrote to Max Born, “The quantum reconstructions are certainly imposing. But an inner voice tells me that they are not yet the real thing. The reconstructions say a lot, but do not really bring us any closer to the secret of the 'old one’." In this talk, I hope to expand on these points and convey some sense of why I am fascinated with the problem of the symmetric informationally complete POVMs to an extent greater than axiomatic reconstructions.

Howard Barnum, University of New Mexico

Composite systems and information processing

The talk will focus primarily on recent work with Alexander Wilce in which we show that any locally tomographic composite of a qubit with any finite-dimensional homogeneous self-dual (equivalently, Jordan-algebraic) system must be a standard finite-dimensional quantum (i.e. $C^*$-algebraic) system. I may touch on work in progress with collaborators on composites of arbitrary homogeneous self-dual systems. As motivation I will relate the properties of homogeneity and weak and strong self-duality to information processing phenomena, especially Schrödingerian "steering" and teleportation (touching on earlier work with Wilce and Gaebler, as well as Barrett and Leifer). If time permits I will explain the relation of some category-theoretic notions coming from the approach of Abramsky, Coecke and Selinger, notably compactness and dagger-compactness, to weak self-duality (work with Ross Duncan and Wilce).

Stephanie Wehner, National University of Singapore

Does ignorance of the whole imply ignorance of the parts?

A central question in our understanding of the physical world is how our knowledge of the whole relates to our knowledge of the individual parts. One aspect of this question is the following: to what extent does ignorance about a whole preclude knowledge of at least one of its parts? Relying purely on classical intuition, one would certainly be inclined to conjecture that a strong ignorance of the whole cannot come without significant ignorance of at least one of its parts. Indeed, we show that this reasoning holds in any non-contextual hidden variable model (NC-HV). Curiously, however, such a conjecture is false in quantum theory: we provide an explicit example where a large ignorance about the whole can coexist with an almost perfect knowledge of each of its parts. More specifically, we provide a simple information-theoretic inequality satisfied in any NC-HV, but which can be arbitrarily violated by quantum mechanics. Our inequality has interesting implications for quantum cryptography.

Quantum theory allows for states with the surprising ability to exhibit nonlocal correlations (nonlocality). These correlations, which have recently found many interesting applications, are not as strong as our inability to signal arbitrarily fast would allow [1]. In the quest to understand what fundamental principles constrain nonlocality, locally quantum theory has been proposed [2]. However, there are infinitely many theories with a local description identical to quantum theory that nevertheless allow stronger-than-quantum correlations [3]. We prove that, for the case of two qubits, it suffices to impose the reversibility of dynamics to single out quantum theory and its correlations. If a generalization of our result held for more than two qubits, speed-ups in reversible quantum computing would be completely determined by the local state space of quantum theory. Assuming the reversibility of dynamics as an axiom, theories locally identical to quantum theory but differing in the achievable joint states would be ruled out. Conversely, if an experiment showed stronger-than-quantum correlations, it would either invalidate quantum theory already locally or imply the existence of fundamentally irreversible dynamics.
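
The gap between local, quantum, and no-signalling correlations that this line of work probes can be made concrete with a small sketch (an illustration of standard facts, not part of the submission): enumerating all deterministic local strategies recovers the local CHSH bound of 2, while the Popescu-Rohrlich (PR) box reaches the no-signalling maximum of 4; quantum correlations sit in between, at Tsirelson's bound $2\sqrt{2}$.

```python
from itertools import product

def chsh(E):
    """CHSH value S = E(0,0) + E(0,1) + E(1,0) - E(1,1) for correlators E(x,y)."""
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

# Local bound: maximise over all deterministic strategies a(x), b(y) in {-1,+1}.
local_bound = max(
    chsh(lambda x, y, a=a, b=b: a[x] * b[y])
    for a in product([-1, 1], repeat=2)
    for b in product([-1, 1], repeat=2)
)

# PR box: outputs perfectly correlated unless x = y = 1, so E(x,y) = (-1)^(x*y).
pr_value = chsh(lambda x, y: (-1) ** (x * y))

print(local_bound, pr_value)  # 2 4
```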

In 2003, Leggett proposed an alternative model for non-local correlations which is incompatible with quantum predictions [1]. It was shown that the Leggett model is incompatible with the experimentally observable quantum correlations [2]. Moreover, it was shown that quantum correlations are incompatible with any hidden variable model having a non-trivial local part, such as Leggett's model [3]. Conversely, others constructed a non-local hidden-variable model that reproduces the quantum correlation function for the singlet state [4]. In this poster, without referring to any inequality, we show that the Leggett assumptions and their extension are internally inconsistent. Afterwards, we suggest a new inequality for testing quantum correlations against any hidden variable model having a non-trivial local part, as well as some types of non-local hidden-variable models that reproduce the quantum correlation function.

1. Found. Phys. 33, 1469 (2003)

2. Nature 446, 871 (2007); PRL 99, 210406 (2007); PRL 99, 210407 (2007)

3. Nature Phys. 4, 681-685 (2008); PRL 101, 050403 (2008)

4. arXiv:0907.2619v3; JPA 41, 505301 (2008)

Chris Ferrie, IQC - University of Waterloo

Qubits as embarrassed colleagues: what do tax evasion and quantum state tomography have in common?

Quantum state estimation (a.k.a. "tomography") plays a key role in designing quantum information processors. As a problem it resembles probability estimation -- e.g. for classical coins or dice -- but with some subtle and important discrepancies. We demonstrate an improved classical analogue that captures many of these differences: the noisy coin. Observations on noisy coins are unreliable -- an observer sees probably-heads or probably-tails. So, like a quantum system, a noisy coin cannot be sampled directly. Unlike standard coins or dice, whose worst-case estimation risk scales as $1/N$ for all states, noisy coins (and quantum states) have a worst-case risk that scales as $1/\sqrt{N}$ and is overwhelmingly dominated by nearly-pure states. The resulting optimal estimation strategies for noisy coins are surprising and counterintuitive. We demonstrate some important consequences for quantum state estimation -- in particular, that adaptive tomography can recover the $1/N$ risk scaling of classical probability estimation.
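
As a minimal sketch of the classical analogue (the noise model and names here are illustrative assumptions, not taken from the work itself): if each reading of a coin with bias p is misreported with probability eps, the observed heads probability is a linear function of p, and a naive inversion estimator undoes it exactly. Near the boundary p ≈ 0 or 1, however, the inverted empirical frequency can land outside [0, 1], which is the root of the harder worst-case behaviour described above.

```python
def observed_prob(p, eps):
    """Probability of reading 'heads' on a noisy coin: each flip of a coin
    with bias p is misread with probability eps."""
    return (1 - eps) * p + eps * (1 - p)

def invert(f, eps):
    """Naive inversion of the observed heads frequency f. On the exact
    probability this recovers p; on finite-sample frequencies it can fall
    outside [0, 1] when p is near 0 or 1."""
    return (f - eps) / (1 - 2 * eps)

eps = 0.1
for p in (0.0, 0.25, 0.5, 1.0):
    # Inversion is exact on the true observed probability.
    assert abs(invert(observed_prob(p, eps), eps) - p) < 1e-12
```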

Rodrigo Gallego, ICFO - The Institute of Photonic Sciences

Fully non-local quantum correlations

Quantum mechanics is a non-local theory, but not as non-local as the no-signalling principle allows. However, there exist quantum correlations that exhibit maximal non-locality: they are as non-local as any non-signalling correlations, and thus have a local content, quantified by the fraction $p_L$ of events admitting a local description, equal to zero. Previous examples of maximal quantum non-locality between two parties require an infinite number of measurements, and the corresponding Bell violation is not robust against noise. Here we show how every proof of the Kochen-Specker theorem gives rise to maximally non-local quantum correlations that involve a finite number of measurements and are robust against noise. We perform an experimental demonstration of a Bell test originating from the Peres-Mermin Kochen-Specker proof, providing an upper bound on the local content of $p_L \lesssim 0.22$. Our analysis certifies that these are the most non-local correlations ever reported.
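
The Peres-Mermin proof underlying this Bell test can be checked by brute force. The following sketch (an illustration of the standard argument, not taken from the paper) enumerates all 512 noncontextual ±1 assignments to the nine observables of the magic square and confirms that none satisfies all six product constraints: each row must multiply to +1, and the columns to +1, +1, -1, as the corresponding quantum observables do.

```python
from itertools import product

def satisfies(s):
    """Check the six Peres-Mermin product constraints on a flat 3x3 grid s."""
    rows = all(s[3 * i] * s[3 * i + 1] * s[3 * i + 2] == 1 for i in range(3))
    cols = (s[0] * s[3] * s[6] == 1
            and s[1] * s[4] * s[7] == 1
            and s[2] * s[5] * s[8] == -1)
    return rows and cols

# No noncontextual +/-1 assignment exists: the row constraints force the
# product of all nine entries to be +1, the column constraints force it to -1.
solutions = [s for s in product([-1, 1], repeat=9) if satisfies(s)]
print(len(solutions))  # 0
```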

For a device-independent assessment of a measurement device, we present bounds on the maximal CHSH violation when the measurement device is separable, and also for projective measurements in the restricted case of qubits.

We show that bipartite correlations on the maximally entangled state in generalized probabilistic theories depend on the structure of the local state space. In particular, they respect macroscopic locality if the local state space is strongly self-dual.

We show that the complementarity relation between dichotomic observables leads to the monogamy of Bell inequality violations. We introduce a simple condition on the squares of expectation values of complementary observables that is satisfied by all physical states. This condition is used to study multi-qubit correlation inequalities involving two settings per observer. In contrast with the two-qubit case, a rich structure of possible violation patterns is shown to exist in the multipartite scenario.
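
For a single qubit, one condition of this kind (stated here as a reading of the standard complementarity relation, not necessarily the exact inequality of the abstract) is

```latex
\langle X \rangle^{2} + \langle Y \rangle^{2} + \langle Z \rangle^{2} \;\le\; 1 ,
```

which every physical state satisfies because the Bloch vector has length at most one, with equality only for pure states.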

Raymond Lal, University of Oxford

Causal structure in categorical quantum mechanics

Categorical quantum mechanics provides a new view of both quantum information and quantum foundations. This is achieved by using category theory to describe abstractly the information flow that occurs in scenarios such as quantum teleportation. This intuitive formalism has also led to insights into nonlocality and entanglement. However, until now, causal structure has been only implicit in the formalism. We introduce the notion of a 'causal category' to expose this implicit causal structure, and we show how it formalises the interaction between quantum systems in a way that is compatible with relativistic constraints.

Spekkens has introduced a toy theory [Phys. Rev. A 75, 032110 (2007)] in order to argue for an epistemic view of quantum states. I describe a notation for the theory (excluding certain joint measurements) which makes clear its similarities to, and differences from, the quantum mechanics of stabilizer states. The notation also assists calculations within the theory, for example of the number of possible states and transformations, and enables superpositions to be defined for composite systems.

Sadegh Raeisi, Institute for Quantum Information Science

Micro-Macro Entanglement and Coarse-graining

We study the demonstration of entanglement in a micro-macro system. The entanglement is produced between two photons via a parametric down-conversion process and is amplified through a phase-covariant cloner into entanglement between a photon and a beam of light. We introduce an analogous system which employs an entanglement-breaking amplifier and produces only classical correlation between the two parties. We show that it is hard to distinguish the outcomes of the two systems given even a small inaccuracy in photon-counting measurements. We model these inaccuracies as coarse-graining. The result is that, for certain types of measurements, small coarse-graining makes the classical and quantum-mechanical correlations indistinguishable.

Contrary to the bipartite case, the intricate structure of multipartite entanglement is far from resolved. We present a general criterion to distinguish inequivalent classes of states, e.g. GHZ- and W-states for three-qubit systems, in systems of arbitrary size.

Yutaka Shikano, Tokyo Institute of Technology

Counterfactual Closed Timelike Curve

How should we deal with closed timelike curves (CTCs) in quantum mechanics? CTCs give rise to many paradoxical situations, such as the grandfather paradox. I will present the grandfather paradox from the viewpoint of information processing. Furthermore, I will review Deutsch's answer and ours, the post-selected CTC, which is based on the idea of quantum teleportation. In light of causality, I will discuss the post-selected CTC.

Across a wide range of fundamental physics, the issue of axiomatizing quantum mechanics is as relevant as ever. The idea of achieving the mathematical structure of quantum theory as a representation of a fair operational framework (see refs. 1-4), namely regarding the theory as a set of rules that allow the experimenter to predict future events on the basis of suitable tests, has exhibited an unexpected theoretical power. In addition to causality, the following postulates have recently been considered (see ref. 2): PFAITH (existence of a pure preparationally faithful state) and FAITHE (existence of a faithful effect). These postulates amount to admitting the possibility of preparing any state and calibrating any test by means of local operations only. The two postulates alone are not sufficient to isolate quantum theory in the probabilistic scenario. An interesting further postulate is PURIFY, namely the purifiability of all states. More recently, in ref. 3, a more extensive axiomatic approach has been considered, adding to causality the postulate LDISCR (the local discriminability principle, namely the possibility of discriminating joint states by local measurements) and a stronger version of PURIFY requiring the uniqueness of the purification up to reversible channels on the purifying system. In the present poster, some concrete probabilistic models alternative to quantum mechanics are presented (see ref. 5). The first model, the two-box world, is an extension of the Popescu-Rohrlich model (ref. 6), which achieves the greatest violation of the CHSH inequality compatible with the no-signaling principle. Through a preparationally faithful state, chosen among the non-local states of the original model, it is possible to extend it to a probabilistic theory. The second model, the two-clock world, is actually a full class of models, all having a disk as the convex set of states of the local system. These models satisfy the PURIFY postulate (they allow purification of all mixed states), but the purification is not unique up to reversible channels on the purifying system, as required in refs. 3 and 4. The two-clock world naturally contains the two-rebit world (qubits with real Hilbert space) as a particular realization. This model violates LDISCR, and hence the local observability principle. The third model, the spin-factor, is an n-dimensional generalization of the clock. Here we see that only the dimension n = 3 allows teleportation, in accordance with the fact that the qubit is a realization of the 3-spin-factor. Finally, classical probabilistic theory is revisited in the framework of probabilistic theories. In pointing out which postulates are violated by each model, we also notice deep relations between the local and the non-local structures of probabilistic theories.

REFERENCES
1. L. Hardy, Quantum theory from five reasonable axioms, quant-ph/0101012v4 (2001).
2. G. M. D'Ariano, in Philosophy of Quantum Information and Entanglement, A. Bokulich and G. Jaeger, eds. (Cambridge University Press, Cambridge, UK, 2010). Also arXiv:0807.4383.
3. G. Chiribella, G. M. D'Ariano, P. Perinotti, Probabilistic theories with purification, Phys. Rev. A 81, 062348 (2010).
4. G. Chiribella, G. M. D'Ariano, P. Perinotti, Informational derivation of quantum theory, arXiv:1011.6451 (2010).
5. G. M. D'Ariano and A. Tosini, Testing axioms for quantum theory on probabilistic toy-theories, Quant. Inf. Proc. 9, 95-141 (2010) (Special Issue on Foundations of Quantum Information); also arXiv:0911.5409.
6. S. Popescu and D. Rohrlich, Quantum non-locality as an axiom, Found. Phys. 24, 379 (1994).
7. L. Hardy, Foliable operational structures for general probabilistic theories, arXiv:0912.4740 (2009).
8. L. Bombelli, J. H. Lee, D. Meyer and R. Sorkin, Space-time as a causal set, Phys. Rev. Lett. 59, 521 (1987).
9. L. Lamport, Time, clocks, and the ordering of events in a distributed system, Comm. ACM 21, 558 (1978).
10. G. M. D'Ariano and A. Tosini, Space-time and special relativity from causal networks, arXiv:1008.4805 (2010).
11. G. M. D'Ariano, The quantum field as a quantum computer, arXiv (2010).

Bell correlation functions are often used as the figure of merit for analyzing the strength of correlations of a nonlocal box with binary inputs and outputs. However, stronger Bell correlations do not necessarily imply distillability to stronger nonlocal correlations -- it is known that there is a nonlocal box exhibiting only a vanishingly small inequality violation that can nevertheless be converted to the maximally nonlocal box (the PR-box) by a distillation protocol consisting of local transformations of many copies of a given nonlocal box, without using communication. One could therefore consider the Bell correlation functions insufficient for fully characterizing the properties of nonlocal boxes. To address this, we introduce a set of variables characterizing the anisotropy of nonlocal boxes, namely the difference from the isotropic nonlocal box defined by adding uniformly random, uncorrelated noise to the outputs of a PR-box. We explore the relationship between distillability and isotropy and, by investigating protocol-independent properties of distillation, show that the more anisotropic a nonlocal box is, the more distillable it is.

Negativity in quasi-probability representations of quantum mechanics is often used as an indicator of nonclassical behaviour. We explore its relation to one definition of "nonclassical", namely preparation noncontextuality, and show that noncontextuality limits the computational power of physical theories. In particular, we show that no preparation noncontextual model can be nonnegative for one class of gate sets that are universal for quantum computation. We also discuss the relation between the discrete Wigner function and efficient simulation of a quantum computation.

Modal quantum theory (MQT) is a simplified cousin of ordinary Hilbert-space quantum theory. Several important theorems of actual quantum theory have direct analogues in MQT, including the Bell-Hardy theorem excluding local hidden variables, the Kochen-Specker theorem excluding non-contextual hidden variables, and the Conway-Kochen "free will theorem" about entangled systems. The proofs of these analogue theorems are similar to, but much simpler than, the originals.

Hardy's non-locality paradox is a proof without inequalities showing that certain non-local correlations violate local realism. It is 'possibilistic' in the sense that one only distinguishes between possible outcomes (positive probability) and impossible outcomes (zero probability). Here we show that Hardy's paradox is quite universal: in any (2,2,l) or (2,k,2) Bell scenario, the occurrence of Hardy's paradox is a necessary and sufficient condition for possibilistic non-locality. In particular, it subsumes all ladder paradoxes. This universality of Hardy's paradox is not true more generally: we find a new 'proof without inequalities' in the (2,3,3) scenario that can witness non-locality even for correlations that do not display the Hardy paradox. We discuss the ramifications of our results for the computational complexity of recognising possibilistic non-locality.
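
The possibilistic character of the paradox lends itself to brute-force checking. In one standard form of Hardy's conditions (used here as an illustration, not drawn from the paper), the three events (A1=0, B2=1), (A2=1, B1=0) and (A1=1, B1=1) never occur, yet (A2=1, B2=1) occurs with positive probability. The sketch below enumerates all deterministic local strategies and confirms that none compatible with the three forbidden events can ever produce the Hardy event.

```python
from itertools import product

# A deterministic local strategy fixes outputs (a1, a2) for Alice's two
# settings and (b1, b2) for Bob's. We look for a strategy that respects the
# three "impossible" events yet still outputs the "possible" Hardy event.
witnesses = [
    (a1, a2, b1, b2)
    for a1, a2, b1, b2 in product([0, 1], repeat=4)
    if not (a1 == 0 and b2 == 1)   # forbidden event 1 never occurs
    and not (a2 == 1 and b1 == 0)  # forbidden event 2 never occurs
    and not (a1 == 1 and b1 == 1)  # forbidden event 3 never occurs
    and (a2 == 1 and b2 == 1)      # ...but the Hardy event does
]
print(len(witnesses))  # 0 -- no local model reproduces the conditions
```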

Karl Johan Paulsson, Oxford University

A diagrammatic approach to foundations of quantum mechanics

We look at two recent attempts, by Coecke, Paquette and Pavlovic and by Coecke and Perdrix, to accommodate classical structures in categorical quantum mechanics, through embedding and environmental coupling. We identify limitations of this approach, such as the inability to distinguish which 'wire' encodes which structure. We also look at further directions this line of research could take, in the form of equivalences of the categorical notions of these structures. Finally, we look at the grander picture and propose a way to achieve a diagrammatic notion for reasoning about the foundations of physics. The hope is that we can thereby extend the categorical approach beyond its current limitations into a more physical approach.

We devise a protocol in which general non-classical multipartite correlations produce a physically relevant effect, leading to the creation of bipartite entanglement. In particular, we show that the relative entropy of quantumness, which measures all non-classical correlations among subsystems of a quantum system, is equivalent to and can be operationally interpreted as the minimum distillable entanglement generated between the system and local ancillae in our protocol. We emphasize the key role of state mixedness in maximizing non-classicality: Mixed entangled states can be arbitrarily more non-classical than separable and pure entangled states.

Aleks Kissinger, University of Oxford

Synthesising Physical Theories

An automated theorem prover is a software tool that takes a set of mathematical axioms pertaining to some mathematical theory (e.g. the theory of natural numbers and arithmetic) and automatically searches for proofs of theorems. Typically, a lot of hard manual labour goes into identifying the "right" set of axioms to feed to a theorem prover; however, researchers at Edinburgh have recently developed a technique they call "conjecture synthesis" for doing this automatically. Naive approaches (i.e. generate all "possible" axioms and check them) quickly become computationally infeasible, but conjecture synthesis cleverly incorporates axioms that have already been verified into the search routine, with impressive results. The existing work treats a type of mathematical theory called a "term theory", where one is primarily concerned with showing that two terms are equal (e.g. x * (y + z) = x*y + x*z). This poster explores how the techniques of conjecture synthesis can be extended to so-called "graphical theories", where, instead of terms, we are concerned with the equality of diagrams (e.g. quantum circuits, or Penrose-style tensor diagrams). We then suggest how this extension can be used to explore problems in multipartite quantum entanglement.

Quantum correlations can be nonlocal and violate Bell inequalities. Simulating quantum correlations with alternative nonlocal resources, such as classical communication, gives a natural way to quantify their nonlocality, as well as interesting insights into their peculiarities. For bipartite correlations, Toner and Bacon showed that a single bit of communication suffices to reproduce any correlation obtained by performing von Neumann measurements on a singlet state. Very little is known, however, about multipartite correlations: can these also be simulated classically with a finite number of bits?

We consider the maximum probability of successfully teleporting a quantum state without using a classical communication channel. We show that, when multiple copies of the state are available, the probability of successful teleportation increases, asymptotically attaining the inverse of the Hilbert space dimension. Such a limit value is determined by the amount of classical information that the EPR channel can send back from the future into the past without violating causality.

Patrick Coles, Carnegie Mellon University

The principle behind the uncertainty principle

Expressions of the uncertainty principle in terms of entropy -- entropic uncertainty relations -- are gathering increasing interest, particularly ones that allow for quantum memory or quantum side information, since these are very strong and useful for cryptography [Berta et al., Nature Physics 6, 659 (2010)]. Entropic uncertainty relations come in a variety of forms and are proven in a variety of ways. But is there a common principle responsible for all of these relations? We show that several entropic uncertainty relations, including quantum memory ones, are ultimately due to the monotonicity of the relative entropy. This suggests the uncertainty principle can be viewed as a data-processing inequality, expressing the notion that information cannot increase under evolution in time. Furthermore, finding minimum uncertainty states (i.e. states that satisfy the uncertainty relation with equality) is intimately connected to the question of whether the particular measurement process is reversible, or whether information is irreversibly lost in the process.
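
For reference, the quantum-memory uncertainty relation of Berta et al. cited above has the form (quoted from memory, so it should be checked against the original; $c$ is the maximal overlap of the two measurement bases):

```latex
H(X \mid B) + H(Z \mid B) \;\ge\; \log_2 \frac{1}{c} + H(A \mid B),
\qquad
c = \max_{x,z} \bigl| \langle \psi_x \mid \phi_z \rangle \bigr|^{2} ,
```

where $H(A \mid B)$ is the conditional von Neumann entropy of the state before measurement; it can be negative for entangled states, which is what allows the quantum memory $B$ to reduce the uncertainty bound.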

We investigate the relationships between Kochen-Specker (KS) sets, pseudo-telepathy (PT) games and coding games. It is known that certain types of PT games can be turned into KS sets and vice versa (Renner and Wolf, 2004). Recently it has been shown that entanglement can increase the number of messages that can be sent over a classical channel with zero error probability (Cubitt, Leung, Matthews, Winter, 2010). We call such scenarios PT coding games. It turns out that a relationship similar to the one found by Renner and Wolf exists between KS sets and PT coding games. The connections between KS sets, PT games and coding games raise the question of whether these three objects are in fact just different aspects of the same phenomenon.

We pose the question of whether causal ordering between events is a necessary element in quantum theory. In order to address the problem, we develop a formalism for multipartite quantum correlations that does not assume any underlying space-time or causal structure, but only that local agents are free to perform arbitrary quantum operations. All known situations, including non-signalling correlations between space-like separated observers, signalling ones between observers connected by a channel, and probabilistic mixtures of the two, can be expressed in this formalism. We find that there exist more general possibilities that are incompatible with any underlying causal structure.

Ever since quantum mechanics was first developed, it has been unclear what it really tells us about reality. A novel framework, based on five axioms, is presented here which offers an interpretation of quantum mechanics unlike any considered thus far. It is postulated that physical objects can exist in one of two distinct modes, depending on whether or not they have an intrinsic actual spacetime history. If they do, their mode of existence is actual and they can be described by classical physics. If they do not, their mode of existence is called actualizable and they must be described in terms of a linear superposition of all possible actualizable (not actual) histories. The distinction rests on the axiom that there exists a limit in which spacetime reduces to a version with one fewer dimension, called areatime, and that objects which merely actualizably exist in spacetime actually exist in areatime. The operational comparison of the passage of time for such objects with the passage of time for a spacetime observer is postulated to be made possible by what is called an angular dual bilateral symmetry. This symmetry is isomorphic to the trivial group of order one but, owing to an additional specification of the identity operation, can be decomposed into the superposition of two imaginary phase angles of opposite sign. To describe mathematically the spacetime manifestation of objects which actually exist in areatime, each actualizable history is associated with an actualizable path, which in turn is associated with the imaginary phases. For a single free particle, the complex exponent is identified with a term proportional to its relativistic action, thus recovering the path-integral formulation of quantum mechanics.

Daniel Cavalcanti, CQT

Activation of nonlocality in quantum networks

The results of local measurements on some composite quantum systems cannot be reproduced classically. This impossibility, known as quantum nonlocality, represents a milestone in the foundations of quantum theory. Quantum nonlocality is also a valuable resource for information processing tasks, e.g. quantum communication, quantum key distribution, quantum state estimation, or randomness extraction. Still, deciding whether a quantum state is nonlocal remains a challenging problem. We introduce a novel approach to this question: we study the nonlocal properties of quantum states when distributed and measured in networks. Using our framework, we show how any one-way entanglement-distillable state leads to nonlocal correlations. We then prove that nonlocality is a non-additive resource which can be activated: there exist states that are local at the single-copy level but become nonlocal when several copies are taken. Our results imply that the nonlocality of quantum states strongly depends on the measurement context. Reference: D. Cavalcanti, M. L. Almeida, V. Scarani, A. Acin, Nature Communications 2, 184 (2011).

We present a microscopic decoherence model for a chiral molecule immersed in a buffer gas in terms of a two-level system interacting with a two-qubit environment. Our model allows a simple derivation of a master equation first computed by J. Trost and K. Hornberger [PRL 103, 023202 (2009)]. We employ information-theoretical concepts to explain the stabilization of the chiral states by collisional decoherence in terms of information about chirality flowing out to the environment. This is a prototypical system that illustrates the modern principles of decoherence and the quantum-to-classical transition.

Jonathan Poritz, Colorado State University - Pueblo

TBA

Denis Rosset, Université de Genève - GAP Optique

Bilocality: characterizing the effects of the topology of local variables

When considering tripartite quantum experiments sharing two pairs of local variables, the quantum physicist can ask (i) whether local correlations are really described by a hidden local variable shared by the three parties, or (ii) whether the distribution of local variables should be restricted to the topology of the quantum state whose correlations they try to reproduce. We note that this question is relevant for N-partite systems with N > 2. We will see that the set of correlations obeying (ii) is non-convex and can be characterized in some subspaces by nonlinear equivalents of Bell inequalities.
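
A representative nonlinear inequality of this kind, from the bilocality analysis of Branciard, Gisin and Pironio (quoted from memory, so normalization and conventions should be checked against the original), bounds two linear combinations of tripartite correlators:

```latex
\sqrt{|I|} + \sqrt{|J|} \;\le\; 1,
\qquad
I = \tfrac{1}{4} \sum_{x,z} \langle A_x B^0 C_z \rangle,
\quad
J = \tfrac{1}{4} \sum_{x,z} (-1)^{x+z} \langle A_x B^1 C_z \rangle ,
```

where the square roots make the bilocal set non-convex, in contrast with the linear Bell inequalities that bound the ordinary local set.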