We study a class of quantum measurements that furnish probabilistic representations of finite-dimensional quantum theory. The Gram matrices associated with these Minimal Informationally Complete quantum measurements (MICs) exhibit a rich structure. They are “positive” matrices in three different senses, and conditions expressed in terms of them have shown that the Symmetric Informationally Complete measurements (SICs) are in some ways optimal among MICs. Here, we explore MICs more widely than before, comparing and contrasting SICs with other classes of MICs, and using Gram matrices to begin the process of mapping the territory of all MICs. Moreover, the Gram matrices of MICs turn out to be key tools for relating the probabilistic representations of quantum theory furnished by MICs to quasi-probabilistic representations, like Wigner functions, which have proven relevant for quantum computation. Finally, we pose a number of conjectures, leaving them open for future work.
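The Gram-matrix structure described above can be checked numerically in the smallest dimension. The sketch below is our own illustration, not from the paper: it builds the qubit SIC from four tetrahedral Bloch vectors, forms the POVM elements $E_j = \Pi_j/d$, and verifies that the Gram matrix $G_{jk} = \mathrm{Tr}(E_j E_k)$ takes the SIC value $(d\,\delta_{jk}+1)/(d^2(d+1))$, i.e. $1/4$ on the diagonal and $1/12$ off it for $d=2$.

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Four tetrahedral Bloch vectors define the qubit SIC projectors
vecs = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
projectors = [(I2 + r[0] * sx + r[1] * sy + r[2] * sz) / 2 for r in vecs]

d = 2
povm = [P / d for P in projectors]  # E_j = Pi_j / d

# The POVM elements resolve the identity
assert np.allclose(sum(povm), I2)

# Gram matrix G_jk = Tr(E_j E_k)
G = np.array([[np.trace(E @ F).real for F in povm] for E in povm])

# SIC prediction: G = (d * delta + 1) / (d^2 (d + 1))
expected = (d * np.eye(d * d) + 1) / (d**2 * (d + 1))
assert np.allclose(G, expected)
print(G)  # diagonal 1/4, off-diagonal 1/12
```

The same check generalizes to any dimension where a SIC fiducial is known; only the projector construction changes.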

Quantum set theory (QST) and topos quantum theory (TQT) are two long-running projects in the mathematical foundations of quantum mechanics that share a great deal of conceptual and technical affinity. Most pertinently, both approaches attempt to resolve some of the conceptual difficulties surrounding quantum mechanics by reformulating parts of the theory inside of non-classical mathematical universes, albeit with very different internal logics. We call such mathematical universes, together with those mathematical and logical structures within them that are pertinent to the physical interpretation, ‘Q-worlds’. Here, we provide a unifying framework that allows us to (i) better understand the relationship between different Q-worlds, and (ii) define a general method for transferring concepts and results between TQT and QST, thereby significantly increasing the expressive power of both approaches. Along the way, we develop a novel connection to paraconsistent logic and introduce a new class of structures that have significant implications for recent work on paraconsistent set theory.

In this paper, we will discuss a formal link between neural networks and quantum computing. For that purpose, we present a simple model that describes the neural network by forming sub-graphs of the whole network sharing the same or a similar state. We describe the interaction between these areas by closed loops, the feedback loops. The change of the graph is given by the deformations of the loops, a fact that can be mathematically formalized by the fundamental group of the graph. Furthermore, each neuron has two basic states $|0\rangle$ (ground state) and $|1\rangle$ (excited state). The whole state of an area of neurons is a linear combination of the two basic states, with complex coefficients representing the signals (with three parameters: amplitude, frequency, and phase) along the neurons. Then it can be shown that the set of all signals forms a manifold (character variety), and all properties of the network must be encoded in this manifold. In the paper, we will discuss how to interpret learning and intuition in this model. Using the Morgan-Shalen compactification, the limit of signals with large amplitude can be analyzed by using quasi-Fuchsian groups as represented by dessins d’enfants (graphs used to analyze Riemann surfaces). As shown by Planat and collaborators, these dessins d’enfants are a direct bridge to (topological) quantum computing with permutation groups. The normalization of the signal reduces to the group $SU(2)$ and the whole model to a quantum network. Then we have a direct connection to quantum circuits. This network can be transformed into operations on tensor networks. Formally, we obtain a link between machine learning and quantum computing.
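The normalization step, in which a two-component complex signal state lands in $SU(2)$, can be illustrated with a short sketch. This is our own illustration under stated assumptions: the encoding of a signal's amplitude and phase into a complex coefficient is hypothetical (the frequency parameter, which governs time dependence, is dropped here).

```python
import numpy as np

def signal_coefficient(amplitude, phase):
    # Hypothetical encoding of a signal into a complex coefficient;
    # the frequency parameter is omitted in this static snapshot.
    return amplitude * np.exp(1j * phase)

# A neuron-area state a|0> + b|1> built from two signals
a = signal_coefficient(0.8, 0.3)
b = signal_coefficient(1.5, 2.1)

# Normalization: |a|^2 + |b|^2 = 1
norm = np.sqrt(abs(a) ** 2 + abs(b) ** 2)
a, b = a / norm, b / norm

# The normalized pair determines an SU(2) element
U = np.array([[a, -np.conj(b)],
              [b,  np.conj(a)]])

assert np.allclose(U.conj().T @ U, np.eye(2))  # unitary
assert np.isclose(np.linalg.det(U), 1.0)       # determinant one
```

The determinant works out to $|a|^2 + |b|^2 = 1$ automatically once the state is normalized, which is the sense in which normalization reduces the model to $SU(2)$.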

Epistemic interpretations of quantum theory maintain that quantum states only represent incomplete information about the physical states of the world. A major motivation for this view is the promise to evade the physicality of the “collapse” process and provide a reasonable account of state update under measurement by asserting that it is a natural feature of simply updating incomplete statistical information. Here we demonstrate that known epistemic ontological models of quantum theory in dimension $d\geq3$, including those designed to evade the conclusion of the PBR theorem, cannot represent state update correctly. We do so by introducing orthogonalizing measurements, which place strict constraints on the structure of epistemic models. Conversely, interpretations for which the wavefunction is real evade such restrictions despite remaining subject to long-standing criticism regarding physical discontinuity, indeterminism and the ambiguity of the von Neumann ‘cut’ under the associated physical “collapse” process.

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Jeremy Steeger

Abstract

I defend an analog of probabilism that characterizes rationally coherent estimates for chances. Specifically, I demonstrate the following accuracy-dominance result for stochastic theories in the C*-algebraic framework: supposing an assignment of chance values is possible if and only if it is given by a pure state on a given algebra, your estimates for chances avoid accuracy-dominance if and only if they are given by a state on that algebra. When your estimates avoid accuracy-dominance (roughly: when you cannot guarantee that other estimates would be more accurate), I say that they are sufficiently coherent. In formal epistemology and quantum foundations, the notion of rational coherence that gets more attention requires that you never allow for a sure loss (or ‘Dutch book’) in a given sort of betting game; I call this notion full coherence. I characterize when these two notions of rational coherence align, and I show that there is a quantum state giving estimates that are sufficiently coherent, but not fully coherent.

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Geoff Beck

Abstract

This work outlines the novel application of the empirical analysis of causation, presented by Kutach, to the study of information theory and its role in physics. The central thesis of this paper is that causation and information are identical functional tools for distinguishing controllable correlations, and that this leads to a consistent view, not only of information theory, but also of statistical physics and quantum information. This approach comes without the metaphysical baggage of declaring information a fundamental ingredient in physical reality and exorcises many of the otherwise puzzling problems that arise from this viewpoint, particularly obviating the problem of ‘excess baggage’ in quantum mechanics. This solution is achieved via a separation between the information-carrying causal correlations of a single qubit and the bulk of its state space.

Abstract

A classical origin for the Bohmian quantum potential, as that potential term arises in the quantum mechanical treatment of black holes and Einstein–Rosen (ER) bridges, can be based on 4th-order extensions of Einstein’s equations. The required 4th-order extension of general relativity is given by adding quadratic curvature terms with coefficients that maintain a fixed ratio, as their magnitudes approach zero, with classical general relativity as a singular limit. If entangled particles are connected by a Planck-width ER bridge, as conjectured by Maldacena and Susskind, then a connection by a traversable Planck-scale wormhole, allowed in 4th-order gravity, describes such entanglement in the ontological interpretation. It is hypothesized that higher-derivative gravity can account for the nonlocal part of the quantum potential generally.

We revisit the question of frame equivalence in Quantum Field Theory in the presence of gravity, a situation of relevance for theories aiming to describe the early Universe dynamics and Inflation in particular. We show that in those cases, the path integral measure must be carefully defined and that the requirement of diffeomorphism invariance forces it to depend non-trivially on the fields. As a consequence, the measure will also transform non-trivially between different frames, inducing a new finite contribution to the Quantum Effective Action that we name the frame discriminant. This new contribution must be taken into account in order to assess the dynamics and physical consequences of a given theory. We apply our result to scalar-tensor theories described in the Einstein and Jordan frames, where we find that the frame discriminant can be thought of as inducing a scale-invariant regularization scheme in the Jordan frame.

We study conformal field theories with boundaries, and their boundary renormalization group (RG) flows, using methods from quantum information theory. Positivity of the relative entropy, together with unitarity and Lorentz invariance, give rise to bounds that characterize the irreversibility of such flows. This generalizes the recently proved entropic $g$-theorem to higher dimensions. In $2+1$ dimensions with a boundary, we prove the entropic $b$-theorem — the decrease of the two-dimensional Weyl anomaly under boundary RG flows. In higher dimensions, the bound implies that the leading area coefficient of the entanglement entropy induced by the defect decreases along the flow. Our proof unifies these properties, and provides an information-theoretic interpretation in terms of the distinguishability between the short distance and long distance states. Finally, we establish a sum rule for the change in the area term in theories with boundaries, which could have implications for models with localized gravity.

A decade ago, it was practically gospel truth among physicists that the universe began with a sudden unfurling of space known as cosmic inflation. Physicists also widely believed that the cosmos’s missing dark matter consists of invisible clouds of heavy, inert particles dubbed WIMPs, and that the laws of nature respect supersymmetry, a tidy mirroring of matter and forces. The only thing left to do was gather proof of these solutions to some of the biggest mysteries in the universe.

That proof never came. Today, the cosmos’s origin story is in question, the identity of dark matter is anyone’s guess and supersymmetry is all but off the table, leaving gaps in our laws of nature. Add to that the dark energy mystery, black hole paradoxes and quantum weirdness, and it’s clear that the field of fundamental physics is experiencing both a period of confusion and one of refreshing openness to new ideas.

This year didn’t bring significant clues or answers to any of physics’ fundamental mysteries. If anything, those mysteries deepened. By contrast, condensed matter physicists, who study the exotic emergent behaviors of large numbers of particles, and astronomers, armed with powerful new telescopes, are swimming in data and discoveries.

The celebrated British physicist Stephen Hawking died on March 14 at age 76. Hawking was something of a betting man, regularly entering into friendly wagers with colleagues over key questions in theoretical physics. In 1991, he made a bet that information that falls into black holes gets destroyed and can never be retrieved. He lost that bet; physicists now believe information somehow escapes black holes. But how it gets out — a question raised by Hawking’s discovery of black hole radiation — has become a major driver of fundamental physics research. One increasingly popular approach has been to study the question by treating black holes as holograms.

Scientists reported in March that a set of small radio antennas in the Australian outback known as the EDGES experiment had detected a spectral absorption band coming from the first stars. The intensity of the signal, indicating how much light was absorbed by these stars, was unexpectedly strong, suggesting that the young cosmos was significantly colder than anyone thought. One cosmologist hypothesized that the gas swirling around at the time could have been cooled by interactions with a nonstandard kind of dark matter, but this idea failed scrutiny. Follow-up studies of the putative signal from the cosmic dawn will be well worth keeping an eye on.

Venturing far beyond our solar system, the April release of detailed measurements of more than a billion Milky Way stars by the Gaia space telescope spurred an explosion of new understanding of how our galaxy formed and evolved. Astronomers identified anomalous populations of stars that appear to retain a memory of a long-ago collision between the young Milky Way and a dwarf galaxy. “There’s debris everywhere,” said Cambridge’s Vasily Belokurov.

As researchers puzzle over the universe’s seemingly willy-nilly set of elementary particles and forces, new findings are fueling an old suspicion that these ingredients spring from the logic of strange eight-part numbers called “octonions.” The mathematical physicist Cohl Furey has found some intriguing new links between elementary particles and octonions, but it remains to be seen whether they’ll lead to a breakthrough or a dead end.

Since the summer, physicists have been debating a conjecture that seems to put our universe at odds with string theory. The conjecture states that as the universe expands, the density of energy in the vacuum of empty space must decrease faster than a certain rate. The rule appears to hold in all simple models of universes based on string theory, suggesting that it might be true throughout the “landscape” of possible universes that the theory allows. But the rule violates two widespread beliefs about the actual universe: It deems impossible both the standard picture of the cosmos’s present-day expansion, driven by dark energy, and the leading model of its explosive birth — the theory known as cosmic inflation. The conjecture has caused confusion, but “also, of course, huge excitement,” said the physicist Timm Wrase, because “it has a lot of tremendous implications for cosmology.”

The century-old debate over the meaning of quantum mechanics reheated this year, with several experiments, actual and gedanken, contributing to the discussion of reality in a probabilistic universe. Quanta’s most commented-on story of 2018 was a critique of the increasingly popular “many-worlds interpretation,” which posits a near-infinity of universes that play out all possible realities allowed by quantum mechanics in parallel. Contributing writer Philip Ball deemed the problems with this idea “overwhelming.”