Issues in the philosophy of quantum mechanics include, first and foremost, its interpretation. Probably the best known of these is the 'orthodox' Copenhagen interpretation associated with Niels Bohr, Werner Heisenberg, Wolfgang Pauli, John von Neumann, and others. Beginning roughly in the middle of the twentieth century, philosophers' attention was increasingly drawn to alternative interpretations of the theory, including Bohmian mechanics, the relative-state formulation of quantum mechanics and its variants (e.g., DeWitt's "many worlds" variant, Albert and Loewer's "many minds" variant), and the family of dynamical collapse theories. One interpretational issue that has attracted a great deal of attention since the seminal work of John Bell is the extent to which quantum mechanical systems do or do not admit of a local realistic description. Bell's investigation of the properties of entangled quantum systems, inspired by the famous thought experiment of Einstein, Podolsky, and Rosen, seems to lead to the conclusion that the only realistic "hidden variables" interpretation compatible with the quantum mechanical formalism is a nonlocal one. In recent years, some attention has focused on applications of quantum mechanics and their potential for illuminating quantum foundations, including the sciences of quantum information and quantum computation. Additional areas of research include philosophical investigation of the extensions of nonrelativistic quantum mechanics (such as quantum electrodynamics and quantum field theory more generally), as well as more formal logico-mathematical investigations into the structure of quantum states, state spaces, and their dynamics.

Key works

Bohr 1928 and Heisenberg 1930 expound what has since become known as the 'Copenhagen interpretation' of quantum mechanics. The famous 'EPR' thought experiment of Einstein et al 1935 aims to show that quantum mechanics is an incomplete theory which should be supplemented by additional ('hidden') parameters; Bohr 1935 replies. More on Bohr's views can be found in Faye 1991 and Folse 1985. Inspired by the EPR thought experiment, Bell 2004 [1964] proves what has since become known as "Bell's theorem." This, together with a related result due to Kochen & Specker 1967, serves to revive the discussion of hidden variables and alternative interpretations of quantum mechanics. Jarrett 1984 analyses the key "factorisability" assumption Bell uses to derive his theorem into two distinct sub-assumptions, which Jarrett refers to as "locality" and "completeness". Two important volumes dedicated to the topics of entanglement and nonlocality are Cushing & McMullin 1989 and Maudlin 2002. Among the more discussed alternative interpretations of quantum mechanics are Bohmian mechanics (Bohm 1952; see also Cushing et al 1996) and Everett's relative-state formulation (Everett III 1973). The latter gives rise to many variants, including the many worlds, many minds, and decoherence-based approaches (see Saunders et al 2010). Other notable interpretations and alternative theories include dynamical collapse theories (Ghirardi et al 1986) and the Copenhagen-inspired QBist view (Fuchs 2003, Fuchs manuscript). An attempt to axiomatize quantum mechanics in terms of information-theoretic constraints, and a discussion of the relevance of this for the interpretation of quantum mechanics, is given in Clifton et al 2002; discussion of this and other issues in quantum information theory can be found in Timpson 2004. Key works in the philosophy of quantum field theory include Redhead 1995, Redhead 1994, Ruetsche 2011, and Teller 1995.
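As a point of reference, the factorisability condition at issue can be written, in one standard notation (not necessarily Bell's or Jarrett's own), as

\[
P(x, y \mid a, b, \lambda) \;=\; P(x \mid a, \lambda)\, P(y \mid b, \lambda),
\]

where a and b are the measurement settings on the two wings of the experiment, x and y the corresponding outcomes, and \lambda the hidden state. Jarrett's analysis splits this into parameter independence ("locality"), P(x \mid a, b, \lambda) = P(x \mid a, \lambda), and outcome independence ("completeness"), P(x \mid a, b, y, \lambda) = P(x \mid a, b, \lambda); factorisability holds just in case both do.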

Introductions

Hughes 1989 is an excellent introduction to the formalism and interpretation of quantum mechanics. Albert 1992 is another, which focuses particularly on the problem of measurement in quantum mechanics.

We have recently started to understand that fundamental aspects of complex systems such as emergence, the measurement problem, inherent uncertainty, complex causality in connection with unpredictable determinism, time-irreversibility and non-locality all highlight the observer's participatory role in determining their workings. In addition, the principle of 'limited universality' in complex systems, which prompts us to search for the appropriate 'level of description in which unification and universality can be expected', looks like a version of Bohr's 'complementarity principle'. It is more or less certain that the different levels of description possible of a complex whole (actually partial objectifications) are projected onto and even redefine its constituent parts. Thus it is interesting that these fundamental complexity issues don't just bear a formal resemblance to, but reveal a profound connection with, quantum mechanics. Indeed, they point to a common origin on a deeper level of description.

The modest ambition of this short note is to point out a plausible route from a *perspectival ontology* and McTaggart’s *AB-spacetime* to an AdS/CFT correspondence. There are several minor arguments that would need to be filled in for this route to succeed.

The basic idea of quantum mechanics is that the property of any system can be in a state of superposition of various possibilities. This state of superposition is also known as the wave function, and it evolves linearly with time in a deterministic way in accordance with the Schrödinger equation. However, when a measurement is carried out on the system to determine the value of that property, the system instantaneously transforms to one of the eigenstates, and thus we get only a single value as the outcome of the measurement. The quantum measurement problem seeks to find the cause and exact mechanism governing this transformation. In an attempt to solve this problem, in this paper we will first define what the wave function represents in the real world and will identify the root cause behind the stochastic nature of events. Then, we will develop a model to explain the mechanism of collapse of the quantum mechanical wave function in response to a measurement. In the process of developing the model, we will explain the Schrödinger cat paradox and will show how Born's rule for probability becomes a natural consequence of the measurement process.
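For orientation, the two rules contrasted in this abstract can be stated in standard textbook form (the notation here is generic and not the author's own):

\[
i\hbar\,\frac{\partial}{\partial t}\,\psi(x,t) \;=\; \hat{H}\,\psi(x,t),
\qquad
P(a_i) \;=\; \lvert\langle a_i \mid \psi\rangle\rvert^{2},
\]

the first giving the deterministic, linear evolution of the wave function between measurements, the second (Born's rule) giving the probability of obtaining the eigenvalue a_i when the corresponding observable is measured.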

This conclusion indicates exactly my EDWs! The framework is unbelievably similar to my EDWs. The authors avoid any contradiction by introducing the "theory of causal witnesses", which represents the correspondences between EDWs, no more and no less.

Why do microscopic objects exhibit wave properties (are delocalized) while macroscopic objects do not (are localized)? Traditional quantum mechanics attributes wave properties to all objects. When complemented with a deterministic collapse model (Quantum Stud.: Math. Found. 3, 279 (2016)), quantum mechanics can dissolve the discrepancy. Collapse in this model means contraction and occurs when the object gets in touch with other objects and satisfies a certain criterion. One single collapse usually does not suffice for localization, but the object gets in touch with other objects in a short time, leading to rapid localization. Decoherence is not involved.

Determinism is established in quantum mechanics by tracing the probabilities in the Born rule back to the absolute (overall) phase constants of the wave functions and recognizing these phase constants as pseudorandom numbers. The reduction process (collapse) is independent of measurement. It occurs when two wavepackets overlap in ordinary space and satisfy a certain criterion, which depends on the phase constants of both wavepackets. Reduction means contraction of the wavepackets to the place of overlap. The measurement apparatus fans out the incoming wavepacket into spatially separated eigenpackets of the chosen observable. When one of these eigenpackets, together with a wavepacket located in the apparatus, satisfies the criterion, the reduction associates the place of contraction with an eigenvalue of the observable. The theory is nonlocal and contextual.

Research into ancient physical structures, some known as the seven wonders of the ancient world, inspired new developments in the early history of mathematics. At the other end of this spectrum of inquiry, research is concerned with the minimum of observations from physical data, as exemplified by Eddington's Principle. Current discussions of the interplay between physics and mathematics revive some of this early history of mathematics and offer insight into the fine-structure constant. Arthur Eddington's work leads to a new calculation of the inverse fine-structure constant giving the same approximate value as ancient geometry combined with the golden-ratio structure of the hydrogen atom. The hyperbolic function suggested by Alfred Landé leads to another result, involving the Laplace limit of Kepler's equation, with the same approximate value and related to the aforementioned results. The accuracy of these results is consistent with the standard reference. Relationships between the four fundamental coupling constants are also found.

In a quantum universe with a strong arrow of time, it is standard to postulate that the initial wave function started in a particular macrostate: the special low-entropy macrostate selected by the Past Hypothesis. Moreover, there is an additional postulate about statistical mechanical probabilities according to which the initial wave function is a "typical" choice in the macrostate. Together, they support a probabilistic version of the Second Law of Thermodynamics: typical initial wave functions will increase in entropy. Hence, there are two sources of randomness in such a universe: the quantum-mechanical probabilities of the Born rule and the statistical mechanical probabilities of the Statistical Postulate. I propose a new way to understand time's arrow in a quantum universe. It is based on what I call the Thermodynamic Theories of Quantum Mechanics. According to this perspective, there is a natural choice for the initial quantum state of the universe, which is given not by a wave function but by a density matrix. The density matrix plays a microscopic role: it appears in the fundamental dynamical equations of those theories. The density matrix also plays a macroscopic / thermodynamic role: it is exactly the projection operator onto the Past Hypothesis subspace. Thus, given an initial subspace, we obtain a unique choice of the initial density matrix. I call this property "the conditional uniqueness" of the initial quantum state. The conditional uniqueness provides a new and general strategy to eliminate statistical mechanical probabilities from the fundamental physical theories, by which we can reduce the two sources of randomness to only the quantum mechanical one. I also explore the idea of an absolutely unique initial quantum state, in a way that might realize Penrose's idea of a strongly deterministic universe.
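On a natural reading of this proposal (the notation below is ours, not necessarily the author's), the conditionally unique initial state is the normalized projection onto the Past Hypothesis subspace:

\[
\hat{\rho}(t_0) \;=\; \frac{\hat{P}_{\mathrm{PH}}}{\dim \mathcal{H}_{\mathrm{PH}}},
\]

where \hat{P}_{\mathrm{PH}} is the projection operator onto the low-entropy macrostate subspace \mathcal{H}_{\mathrm{PH}}. Once the Past Hypothesis subspace is fixed, no further statistical choice of initial state is required, which is how the Statistical Postulate drops out.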

How can McTaggart's A-series notion of time be incorporated into physics while retaining the B-series notion? It may be that the A-series 'now' can be construed as ontologically private. How is that modeled? Could a definition of a combined AB-series entropy help with the Past Hypothesis problem? What if the increase in entropy as a system goes from earlier times to later times is canceled by the decrease in entropy as a system goes from future, to present, to past?

Although quantum mechanics can accurately predict the probability distribution of outcomes in an ensemble of identical systems, it cannot predict the result for an individual system. All the local and global hidden variable theories attempting to explain individual behavior have been proved invalid by experiments (violation of Bell's inequality) and theory. As an alternative, Schrödinger and others have hypothesized the existence of free will in every particle, which causes randomness in individual results. However, these free-will theories have failed to quantitatively explain the quantum mechanical results. In this paper, we take a clue from quantum biology to explain the quantum mechanical distribution. Recently it was reported that mutations (which are quantum processes) in the DNA of E. coli bacteria, instead of being random, were biased in a direction such that the chance of survival of the bacteria is increased. Extrapolating this, we assume that all particles, including inanimate fundamental particles, have a will that is biased to satisfy the collective goals of the ensemble. Using this postulate, we mathematically derive the correct spin probability distribution without using the quantum mechanical formalism (operators and Born's rule) and exactly reproduce the quantum mechanical spin correlation in entangled pairs. Using our concept, we also mathematically derive the form of the quantum mechanical wave function of a free particle, which is conventionally a postulate of quantum mechanics. Thus, we prove that the origin of quantum mechanical results lies in the will (or consciousness) of the objects, biased by the collective goal of the ensemble or universe. This biasing of individuals by the group can be called "coherence", which directly represents the extent of life present in the ensemble. So, we can say that life originates from the establishment of coherence in a group of inanimate particles.

This paper discusses the extent to which advances in quantum physics can affect ideas of free will and determinism. It questions whether arguments that conclude from quantum physics to the existence of free will are as valid as they seem. The paper discusses the validity of Searle's philosophy of mind, Robert Kane's parallel processing, and Ted Honderich's near-determinism, as well as dealing with chaos theory, the relationship between 'randomness' and 'unpredictability', and Bell's theorem, discussing how they can be used to answer the question regarding quantum physics and free will. The paper is tentative about forming any definitive conclusion, owing to the ambiguity and confusion surrounding quantum physics, but alludes to the idea that quantum randomness not only detracts from universal determinism but also detracts from human free will, thus theorising a form of universal chaos and unpredictability.

Indeterminacy in its various forms has been the focus of a great deal of philosophical attention in recent years. Much of this discussion has focused on the status of vague predicates such as 'tall', 'bald', and 'heap'. It is determinately the case that a seven-foot person is tall and that a five-foot person is not tall. However, it seems difficult to pick out any determinate height at which someone becomes tall. How best to account for this phenomenon is, of course, a controversial matter. For example, some (such as Sorensen (2001) and Williamson (2002)) maintain that there is a precise height at which someone becomes tall and that such apparent cases of indeterminacy merely reflect our ignorance of this fact. Others maintain that there is some genuine, and not merely epistemic, indeterminacy present in such cases and offer various accounts of how best to account for it. Supervaluationists (such as Keefe (2008)), for example, claim that the indeterminacy with respect to vague terms lies in their not having a single definite extension. Rather, each term is associated with a range of possible precise extensions or precisifications such that it is semantically unsettled which is the correct extension. One precisification of 'tall' might allow that anyone over five feet ten inches is tall, whereas another would only allow those over six feet to qualify; but no precisification will take someone who is five feet to be tall, and someone who is seven feet will count as tall on all precisifications. Thus, while someone who is seven feet tall will be determinately tall and someone who is five feet determinately not so, it will be indeterminate whether someone who stands at five foot eleven is tall. Yet it is important to stress that putative cases of indeterminacy are not limited to vague predicates of this kind. Philosophers have invoked indeterminacy in discussions of topics as diverse as moral responsibility (Bernstein (forthcoming)), identity over time (Williams (2014)), and the status of the future (Barnes and Cameron (2009)). In this paper, we focus on two areas where discussion of various kinds of indeterminacy has been commonplace: physics and fiction. We propose a new model for understanding indeterminacy across these domains and argue that it has some notable advantages when compared to earlier accounts. Treating physics and fiction cases univocally also indicates an interesting connection between indeterminacy in these two areas.

We live in a beautiful and uniform world, a world where everything probable is possible. Since the epic theory of relativity, many scientists have embarked on a pursuit of astonishing theoretical fantasies, abandoning the prudent and logical path of scientific inquiry. The theory is a complex theoretical framework that facilitates the understanding of the universal laws of physics. It is based on the abstract concept of the space-time continuum fabric, and it is well suited for interpreting cosmic events. However, it is not well suited for handling small, local topics such as global warming, local energy issues, and common humanity matters in general. We now put forward many fancy theories and spend unimaginable effort to validate them, even when we are perhaps headed in the wrong direction. For example, in our times, matters of climate change are debated by politicians on the basis of economic considerations that are as illogical as they come. The venerable path of the scientific method, developed over centuries by prominent scientists and philosophers, has been willingly ignored and abandoned for various and prejudiced purposes. Contact email: gondork@yahoo.com.

Among the very architects of the recent re-emergence of emergentism in the physical sciences, Robert B. Laughlin certainly occupies a prominent place. Through a series of works beginning as early as his Nobel lecture in 1998, delivered after he was awarded, together with Störmer and Tsui, the Nobel Prize in Physics for his contribution to the elucidation of the fractional quantum Hall effect, Laughlin openly and relentlessly advocated a strongly anti-reductionistic view of physics, and more particularly of the interface between condensed matter and particle physics, which culminated in what can be considered his emergentist manifesto: A Different Universe: Reinventing Physics from the Bottom Down (2005). In spite of this prominent role in the vindication of emergentism, rare are the philosophers, even among those sympathetic to the idea of emergence, who have paid serious attention to Laughlin's insights. The subtleties of his view, often concealed, it is true, in many technicalities, have accordingly, and somewhat unfortunately, mainly passed unnoticed.

The semantically ambiguous nature of the sign and aspects of the non-classicality of elementary matter as described by quantum theory show a remarkably coherent analogy. We focus on how the ambiguous nature of the image, text, and art work bears functional resemblance to the dynamics of contextuality, entanglement, superposition, collapse, and decoherence as these phenomena are known in quantum theory. These quantum-like properties in linguistic signs have previously been identified in formal descriptions of, e.g., concept combinations and mental lexicon representations, and have been reported on in the literature. In this approach the informationalized, communicated, mediatized conceptual configuration (of, e.g., the art work) in the personal reflected mind behaves like a quantum state function in a higher-dimensional complex space, in which it is time and again contextually collapsed and further cognitively entangled. The observer-consumer of signs becomes the empowered 'produmer' creating the cognitive outcome of the interaction, while losing most of any 'classical givenness' of the sign. These quantum-like descriptions are developed here for four example aesthetic signs: the installation Mist room by Ann Veronica Janssens, the installation Sections of a happy moment by David Claerbout, the photograph The Falling Man by Richard Drew, and the documentary Huicholes. The Last Peyote Guardians by Vilchez and Stefani. Our present work develops further the use of a previously developed quantum model for concept representation in natural language. In our present approach to the aesthetic sign, we extend to individual (idiosyncratic) observer contexts instead of socially shared group contexts, and as such also include multiple idiosyncratic creation of meaning and experience. This irreducible superposition emerges as the core feature of the aesthetic sign and is most critically embedded in the 'no-interpretation' interpretation of the documentary signal.

Some physicists and philosophers argue that unitarily inequivalent representations (UIRs) in quantum field theory are mathematical surplus structure. Support for that view, sometimes called 'algebraic imperialism', relies on Fell's theorem and its deployment in the algebraic approach to QFT. The algebraic imperialist uses Fell's theorem to argue that UIRs are 'physically equivalent' to each other. The mathematical, conceptual, and dynamical aspects of Fell's theorem are examined. Its use as a criterion for physical equivalence is examined in detail, and it is proven that Fell's theorem does not apply to the vast number of representations used in the algebraic approach. UIRs are not another case of theoretical underdetermination, because they make different predictions about 'classical' operators. These results are applied to the Unruh effect, where there is a continuum of UIRs to which Fell's theorem does not apply. Contents: 1 Introduction; 2 Weak Equivalence and Physical Equivalence; 3 Mathematical Overview of Algebraic Quantum Field Theory; 4 Fell's Theorem and Philosophical Responses to Weak Equivalence; 5 Weak Equivalence in C*-Algebras and W*-Algebras; 6 Classical Equivalence and Weak Equivalence; 7 Interlude: Is Weak Equivalence Really Physical Equivalence?; 8 The Unruh Effect; 9 Time Evolution and Symmetries; 10 Conclusions; Appendix.

Empirical studies persistently indicate that the usual explanatory strategies used in quantum mechanics (QM) instruction fail, in general, to yield understanding. In this study we propose an instructional intervention which (a) incorporates into its subject matter a critical comparison of QM scientific content with the fundamental epistemological and ontological commitments of the prominent philosophical theories of explanation, a weak form of which we meet in QM teaching; (b) illuminates the reasons for their failure; and (c) implements an explanatory strategy highly inspired by the epistemological pathways through which, during the birth of QM, science gradually reached understanding. This strategy, an inherent element of which is meta-cognitive and meta-scientific thinking, aims at leading learners not only to an essential understanding of the QM worldview, but also to a deep insight into the 'Nature of Science'.

In the present study we attempt to incorporate the philosophical dialogue about physical reality into the instructional process of quantum mechanics. Taking into account that both scientific realism and constructivism represent, across a rather broad spectrum, prevalent philosophical currents in the domain of science education, the compatibility of their essential commitments is examined against the conceptual structure of quantum theory. It is argued in this respect that the objects of science do not simply constitute 'personal constructions' of the human mind for interpreting nature, as individualist constructivists consider, nor do they form products of a 'social construction', as sociological constructivists assume; on the contrary, they reflect objective structural aspects of the physical world. A realist interpretation of quantum mechanics, we suggest, is not only possible but also necessary for revealing the inner meaning of the theory's scientific content. It is pointed out, however, that a viable realist interpretation of quantum theory requires the abandonment or radical revision of the classical conception of physical reality and its traditional metaphysical presuppositions. To this end, we put forward an interpretative scheme alternative to traditional realism, which is in harmony with the findings of present-day quantum theory and which, if adequately introduced into the instructional process of contemporary physics, is expected to promote the conceptual reconstruction of learners towards an appropriate view of nature.

We study computability theoretic properties of and equivalence structures and how they differ from computable equivalence structures or equivalence structures that belong to the Ershov difference hierarchy. Our investigation includes the complexity of isomorphisms between equivalence structures and between equivalence structures.

A Gedanken experiment is presented in which an excited and a ground-state atom are positioned such that, within the former's half-life time, they exchange a photon with 50% probability. A measurement of their energy states will therefore indicate in 50% of the cases that no photon was exchanged. Yet other measurements would reveal that, by the mere possibility of exchange, the two atoms have become entangled. Consequently, the "no exchange" result, apparently precluding entanglement, is non-locally established between the atoms by this very entanglement. This quantum-mechanical version of the ancient Liar Paradox can be realized with already existing transmission schemes, with the addition of Bell's theorem applied to the no-exchange cases. Under appropriate probabilities, the initially excited atom, still excited, can be entangled with additional atoms time and again, or alternatively exert multipartite nonlocal correlations in an interaction-free manner. When densely repeated several times, this result also gives rise to the Quantum Zeno effect, again exerted between distant atoms without photon exchange. We discuss these experiments as variants of interaction-free measurement, now generalized for both spatial and temporal uncertainties. We next employ weak measurements to elucidate the paradox. Interpretational issues are discussed in the conclusion, and a resolution is offered within the Two-State Vector Formalism and its new Heisenberg framework.

Symmetries have a crucial role in today's physics. In this thesis, we are mostly concerned with time reversal invariance (T-symmetry). A physical system is time reversal invariant if its underlying laws are not sensitive to the direction of time. There are various accounts of the time reversal transformation, resulting in different views on whether or not a given theory in physics is time reversal invariant. With a focus on quantum mechanics, I describe the standard account of time reversal and compare it with my alternative account, arguing why the latter deserves serious attention. Then, I review three known ways to T-violation in quantum mechanics and explain two unique experiments made to detect it in the neutral K and B mesons.

Ruetsche claims that an abstract C*-algebra of observables will not contain all of the physically significant observables for a quantum system with infinitely many degrees of freedom. This would signal that, in addition to the abstract algebra, one must use Hilbert space representations for some purposes. I argue to the contrary that there is a way to recover all of the physically significant observables by purely algebraic methods. Contents: 1 Introduction; 2 Preliminaries; 3 Three Extremist Interpretations (3.1 Algebraic imperialism; 3.2 Hilbert space conservatism; 3.3 Universalism); 4 Parochial Observables (4.1 Parochial observables for the imperialist; 4.2 Parochial observables for the universalist); 5 Conclusion.

Quantum indeterminism seems incompatible with Kant's defense of causality in his Second Analogy. The Copenhagen interpretation also takes quantum theory as evidence for anti-realism. This first article of a two-part series argues that the law of causality, as transcendental, applies only to the world as observable, not to hypothetical objects such as quarks, detectable only by high-energy accelerators. Taking Planck's constant and the speed of light as the lower and upper bounds of observability provides a way of interpreting the observables of quantum mechanics as empirically real even though they are transcendentally ideal.

The aim of this paper is to elaborate a notion of explanation which is applicable to stochastic processes such as quantum processes. The model-theoretic approach was adopted in order to delimit, by defining set-theoretical predicates, appropriate classes of the different kinds of physical transformations that quantum systems undergo, whether transitions or transmutations, whether by interaction or in a spontaneous manner. To explain a singular quantum process consists in showing that it is feasible to model it as an indeterministic process of a certain specified kind.

This paper compares Cassirer's and Bohr's views on symbolic knowledge in quantum physics. Although both of them consider quantum physics as symbolic knowledge, for Cassirer this amounts to a complete renunciation of intuition in quantum physics, while according to Bohr only spatio-temporal images may provide the mathematical formalism of the theory with physical reference. We show the Kantian roots of Bohr's position, and we claim that his Kantian concept of symbol enables Bohr to account for the sensible content of quantum theory as well as for its systematic relation to classical physics.

In this paper, I introduce an intrinsic account of the quantum state. This account contains three desirable features that the standard platonistic account lacks: (1) it does not refer to any abstract mathematical objects such as complex numbers, (2) it is independent of the usual arbitrary conventions in the wave function representation, and (3) it explains why the quantum state has its amplitude and phase degrees of freedom. Consequently, this account extends Hartry Field's program outlined in Science Without Numbers (1980), responds to David Malament's long-standing impossibility conjecture (1982), and establishes an important first step towards a genuinely intrinsic and nominalistic account of quantum mechanics. I will also compare the present account to Mark Balaguer's (1996) nominalization of quantum mechanics and discuss how it might bear on the debate about "wave function realism." In closing, I will suggest some possible ways to extend this account to accommodate spinorial degrees of freedom and a variable number of particles (e.g. for particle creation and annihilation). Along the way, I axiomatize the quantum phase structure as what I shall call a "periodic difference structure" and prove a representation theorem as well as a uniqueness theorem. These formal results could prove fruitful for further investigation into the metaphysics of phase and theoretical structure.

The aim of science is the explanation of complicated systems by reducing them to simple subsystems. According to a millennia-old conception, this is to be achieved by dividing matter into smaller and smaller pieces. The popular superstition that smallness implies simplicity seems to be ineradicable. However, since the beginning of quantum theory it has been possible to realize that the circumstances in nature are exactly the other way round. The idea that "smaller becomes simpler" is useful only down to the atoms of chemistry. Planck's formula shows that smaller extensions are related to larger energies. That more and more energy should result in simpler and simpler structures does not only sound absurd; it is absurd. A reduction to really simple structures leads to the smallest energies and, thus, to maximally extended quantum systems. The simplest quantum structure, referred to as a quantum bit, has a two-dimensional state space, and it establishes a cosmological structure. Taking many such quantum bits allows also for the construction of localized particles. The non-localized fraction of quantum bits can appear as "dark matter".
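The relation alluded to here is presumably the standard quantum connection between spatial extension and energy; in the simplest photon case (this is our gloss, not a formula given by the author),

\[
E \;=\; h\,\nu \;=\; \frac{h\,c}{\lambda},
\]

so that probing or confining a structure at a smaller extension \lambda requires proportionally larger energies E, which is why "smaller" ceases to mean "simpler" below the atomic scale.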

This article offers a contribution to the history of scientific ideas by proposing an epistemological argument supporting the assumption, made by Miller, that Niels Bohr was influenced by cubism when he developed his non-intuitive complementarity principle. More specifically, this essay identifies the Bergsonian durée as the conceptual bridge between Metzinger and Bohr. Beyond this conceptual link between the painter and the physicist, this paper aims to emphasize the key role played by art in the development of human knowledge.

Summary: The paper considers Ernst Cassirer's standpoint with reference to Euclidean geometry and the complementarity principle of quantum theory, interpreted as a choice between a causal description and a space-time description. Cassirer's acceptance of the complementarity principle not only takes him slightly off the Kantian path, but it also leads him to some contradictions and incompatibilities within his own system of thought. 1. Accepting complementarity, Cassirer can no longer hold that there is an infinite hierarchy of objective levels, as he does towards the end of his Determinismus; and 2. accepting complementarity, Cassirer can no longer hold on to the observability principle of Leibniz.