Special Issue Information

Dear Colleagues,

We stand on the brink of another major leap in our understanding of the universe. One of many indications of the need for a radical re-conceptualization is that, in our current understanding, most of the universe seems to consist of something we know next to nothing about: dark energy and dark matter. All our knowledge of physics, however, is based on ordinary energy/matter, which makes up less than 5% of what we know as the universe.

There are several strategies for attacking this problem of understanding physical reality, and already today we can see the beginnings of a new conception of the world, one in which physics is placed in the broader context of human knowledge. It proceeds via the basic ideas of information and computation. This development is a consequence of advances in information-processing technologies, which affect knowledge production and our grasp of fundamental ideas about reality, the human mind and cognition, knowledge, the sciences, the humanities, engineering, and the arts.

Many have already declared that reality is basically an informational phenomenon. To name but a few: Wheeler with "it from bit"; Floridi with informational structural realism; Lloyd, Seife, and Vedral (Decoding Reality); Frieden with Physics from Fisher Information; and more. How does this information relate to energy/matter?

Essential to the new approaches is closure: coming back to the human, who is at the center of all knowledge production about the world. This self-reflective process has traditionally been avoided because of the practical difficulty of addressing it computationally. Today we have tools at our disposal that help us understand self-reflective dynamical structures, so modeling them no longer presents a problem.

The idea is to explore how the framework for knowledge production relates to what can be known. All of our knowledge is structured information, so the laws of physics are information about the informational structure of the world, i.e., meta-information. This connects information with matter/energy as we find it in the world and in the observer of the world.

This Special Issue will explore the many facets of the relationship between the world (the physical world as we know it, in the form of energy/matter) and information.

Abstract: What can we hope for from studies of information related to energy/matter (as it appears to us in space/time)? Information is a concept known for its ambiguity, both in common, everyday use and in its specific technical applications throughout different fields of research and technology. However, most people are unaware that matter/energy today is also a concept surrounded by a disquieting uncertainty. What for Democritus were the building blocks of the whole universe appears today to constitute only 4% of its observed content (NASA 2012) [1]. The rest is labeled "dark matter" (conjectured to explain gravitational effects otherwise unaccounted for) and "dark energy" (introduced to account for the expansion of the universe). We do not know what "dark matter" and "dark energy" actually are. This indicates that our present understanding of the structure of the physical world needs re-examination. [...]

Abstract: We survey a few aspects of the thermodynamics of computation, connecting information, thermodynamics, computability and physics. We suggest some lines of research into how information theory and computational thermodynamics can help us arrive at a better understanding of biological processes. We argue that while a similar connection between information theory and evolutionary biology seems to be growing stronger and stronger, biologists tend to use information simply as a metaphor. While biologists have for the most part been influenced and inspired by information theory as developed by Claude Shannon, we think the introduction of algorithmic complexity into biology will turn out to be a much deeper and more fruitful cross-pollination.
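The contrast this abstract draws between Shannon information and algorithmic complexity can be made concrete with a small sketch. The function names, the DNA-like alphabet, and the use of zlib compression as a rough upper bound on algorithmic (Kolmogorov) complexity are my own illustration, not taken from the paper:

```python
import math
import random
import zlib

def shannon_entropy(s: str) -> float:
    """Empirical Shannon entropy of a string, in bits per symbol."""
    probs = [s.count(c) / len(s) for c in set(s)]
    return -sum(p * math.log2(p) for p in probs)

def compressed_size(s: str) -> int:
    """zlib-compressed length in bytes: a crude upper bound on algorithmic complexity."""
    return len(zlib.compress(s.encode()))

random.seed(0)
periodic = "ACGT" * 250                                     # simple rule, 1000 symbols
rand = "".join(random.choice("ACGT") for _ in range(1000))  # no short generating rule

# Both strings have (near-)maximal Shannon entropy over a 4-letter alphabet
# (~2 bits/symbol), but compression separates them: the periodic string
# collapses to a few bytes, while the random one barely shrinks.
print(shannon_entropy(periodic), compressed_size(periodic))
print(shannon_entropy(rand), compressed_size(rand))
```

The point of the sketch is the one the authors make: symbol statistics (Shannon) cannot distinguish a trivially generated sequence from a genuinely incompressible one, whereas an algorithmic measure can.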

Abstract: Information must have physical support, and this physical universe comprises physical interactions. Hence actual information processes should have a description by interactions alone, i.e., an extensional description. In this paper, such a model of the process of information articulation from the universe is developed by generalizing the extensive measurement theory in metrology. Moreover, a model of the attribute-creation process is presented to exemplify a step of the informational articulation process. These models demonstrate the value of the extensional view and are expected to enhance the understanding of the extensional aspects of the fundamentals of information.

Abstract: The concept of information plays a fundamental role in our everyday experience, but is conspicuously absent in the framework of classical physics. Over the last century, quantum theory and a series of other developments in physics and related subjects have brought the concept of information and the interface between an agent and the physical world into increasing prominence. As a result, over the last few decades, there has arisen a growing belief amongst many physicists that the concept of information may have a critical role to play in our understanding of the workings of the physical world, both in more deeply understanding existing physical theories and in formulating new theories. In this paper, I describe the origin of the informational view of physics, illustrate some of the work inspired by this view, and give some indication of its implications for the development of a new conception of physical reality.

Abstract: In this paper I discuss the question: what comes first, physics or information? The two have had a long-standing, symbiotic relationship for almost a hundred years out of which we have learnt a great deal. Information theory has enriched our interpretations of quantum physics, and, at the same time, offered us deep insights into general relativity through the study of black hole thermodynamics. Whatever the outcome of this debate, I argue that physicists will be able to benefit from continuing to explore connections between the two.

Abstract: It has been proposed that the general function of the brain is inference, which corresponds quantitatively to the minimization of uncertainty (or the maximization of information). However, there has been a lack of clarity about exactly what this means. Efforts to quantify information have been in agreement that it depends on probabilities (through Shannon entropy), but there has long been a dispute about the definition of probabilities themselves. The “frequentist” view is that probabilities are (or can be) essentially equivalent to frequencies, and that they are therefore properties of a physical system, independent of any observer of the system. E.T. Jaynes developed the alternate “Bayesian” definition, in which probabilities are always conditional on a state of knowledge through the rules of logic, as expressed in the maximum entropy principle. In doing so, Jaynes and others provided the objective means for deriving probabilities, as well as a unified account of information and logic (knowledge and reason). However, neuroscience literature virtually never specifies any definition of probability, nor does it acknowledge any dispute concerning the definition. Although there has recently been tremendous interest in Bayesian approaches to the brain, even in the Bayesian literature it is common to find probabilities that are purported to come directly and unconditionally from frequencies. As a result, scientists have mistakenly attributed their own information to the neural systems they study. Here I argue that the adoption of a strictly Jaynesian approach will prevent such errors and will provide us with the philosophical and mathematical framework that is needed to understand the general function of the brain. Accordingly, our challenge becomes the identification of the biophysical basis of Jaynesian information and logic. 
I begin to address this issue by suggesting how we might identify a probability distribution over states of one physical system (an “object”) conditional only on the biophysical state of another physical system (an “observer”). The primary purpose in doing so is not to characterize information and inference in exquisite, quantitative detail, but to be as clear and precise as possible about what it means to perform inference and how the biophysics of the brain could achieve this goal.
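Jaynes's central claim in this abstract, that probabilities are conditional on a state of knowledge rather than intrinsic properties of a physical system, can be illustrated with a toy sketch. The scenario (a box containing a fair coin and a 0.9-biased coin) and all numbers are my own invented illustration, not from the paper:

```python
def posterior_biased(n_heads: int, p_bias: float = 0.9, prior: float = 0.5) -> float:
    """P(coin is biased | n_heads consecutive heads), by Bayes' rule."""
    like_biased = prior * p_bias ** n_heads
    like_fair = (1 - prior) * 0.5 ** n_heads
    return like_biased / (like_biased + like_fair)

# Observer A knows only that the coin has two faces: the maximum-entropy
# assignment over {heads, tails} gives P(heads) = 0.5.
p_heads_A = 0.5

# Observer B additionally knows the coin was drawn from the box and has
# watched it land heads 3 times. Their predictive probability of heads:
pb = posterior_biased(3)
p_heads_B = pb * 0.9 + (1 - pb) * 0.5

print(p_heads_A, p_heads_B)
```

Both observers describe the same physical coin, yet rationally assign it different probabilities, because each probability encodes that observer's state of knowledge. This is the distinction the author argues the neuroscience literature routinely blurs.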

Abstract: This first part of the study introduces an elementary concept of information. Our interest in newness, our curiosity about the new, will be considered a main building block of information, and of reality itself. A typical definition of information (the reduction of uncertainty) needs to be fundamentally inverted: information is a compositional activity, encompassing the inconsistent, the paradoxical, the contradictory, and the incoherent meaning. This study expands on the analysis of the composition of new structure (new macrophysical laws) and on the analysis of the causality and causal state of such structures ("causally active symbols"). The classical, scientific-objective, passive understanding of information gives meaning to the fact that modern information technology does not by itself lead to an increase in human values. However, our social and moral stance is an informational one, and our informational, active conscious process holds the power to mediate and enforce this process towards an enriched life. The indicator of such enrichment is given to us by information, and knowledge of this process will feed us with the energy to move towards an active spirit of ethics, and towards the information society. Part I of this study expands on the foundational basis and on our intrinsic responsibility to release the forces that are based on the active dimension of information. Those forces are required in order to free the so-called information society from its merely metaphorical character (Part II).

Abstract: Interpretations of quantum theory have traditionally assumed a “Galilean” observer, a bare “point of view” implemented physically by a quantum system. This paper investigates the consequences of replacing such an informationally-impoverished observer with an observer that satisfies the requirements of classical automata theory, i.e., an observer that encodes sufficient prior information to identify the system being observed and recognize its acceptable states. It shows that with reasonable assumptions about the physical dynamics of information channels, the observations recorded by such an observer will display the typical characteristics predicted by quantum theory, without requiring any specific assumptions about the observer’s physical implementation.

Abstract: Chemical affinity involves the integration of two different types of interaction. One is the interaction operating between a pair of reactants while forming a chemical bond, and the other is the prior interaction between those reactants when they identify a reaction partner. The context of the environments under which chemical reactions proceed is identified by the interaction of the participating chemical reactants themselves unless the material process of internal measurement is substituted by theoretical artifacts in the form of imposed boundary conditions, as in the case, for example, of thermal equilibrium. The identification-interaction specific to each local participant serves as a preparation for the making of chemical bonds. The identification-interaction is intrinsically selective in precipitating those chemical bonds that are synthesized most rapidly among possible reactions. Once meta-stable products appear that mediate chemical syntheses and their partial decompositions without totally decomposing, those products would become selective because of their ongoing participation in the identification-interaction. One important natural example must have been the origin and evolution of life on Earth.

Abstract: This paper discusses concepts of self-organized complexity and the theory of Coherent Infomax in the light of Jaynes's probability theory. Coherent Infomax shows, in principle, how adaptively self-organized complexity can be preserved and improved by using context-sensitive probabilistic inference. It argues that neural systems do this by combining local reliability with flexible, holistic context-sensitivity. Jaynes argued that the logic of probabilistic inference shows it to be based upon Bayesian and maximum entropy methods, or special cases of them. He presented his probability theory as the logic of science; here it is considered as the logic of life. It is concluded that the theory of Coherent Infomax specifies a general objective for probabilistic inference, and that contextual interactions in neural systems perform the functions required of the scientist within Jaynes's theory.

Abstract: Semiotics is widely applied in theories of information. Following the original triadic characterization of reality by Peirce, the linguistic processes involved in information—production, transmission, reception, and understanding—would all appear to be interpretable in terms of signs and their relations to their objects. Perhaps the most important of these relations is that of representation: one entity standing for or representing some other. For example, an index—one of the three major kinds of signs—is said to represent something by being directly related to its object. My position, however, is that the concept of symbolic representations having such roles in information, as intermediaries, is fraught with the same difficulties as in representational theories of mind. I have proposed an extension of logic to complex real phenomena, including mind and information (Logic in Reality; LIR), most recently at the 4th International Conference on the Foundations of Information Science (Beijing, August, 2010). LIR provides explanations for the evolution of complex processes, including information, that do not require any entities other than the processes themselves. In this paper, I discuss the limitations of the standard relation of representation. I argue that more realistic pictures of informational systems can be provided by reference to information as an energetic process, following the categorial ontology of LIR. This approach enables naïve, anti-realist conceptions of anti-representationalism to be avoided, and enables an approach to both information and meaning in the same novel logical framework.