Context: Integrating data measured in first- and third-person frameworks is a challenge that becomes more prominent as we attempt to refine the ties between the dimensions we assume to be objective and experience itself. As a result, cognitive science has drawn criticism on epistemological and methodological grounds, prompting the emergence of new approaches. Neurophenomenology has been proposed as a means to address these limitations. The methodological application of this discipline, even in its mildest form, enriches the methodology typically used in the cognitive sciences. Problem: Psychological studies are currently difficult to replicate. In order to replicate the results published in a previous study, and thereby develop a methodological adaptation suitable for electroencephalographic (EEG) measurements in a subsequent experiment, first-person accounts from the participants in our pilot study were incorporated into the construction of the experiment. This study’s objective is to show the benefit of including a mild-neurophenomenology-inspired approach in the adaptation of an original paradigm, which requires, foremost, the ability to replicate the original results. Method: Interviews with open and semi-structured questions were carried out at the end of an Approach-Avoidance Task (AAT). The first-person reports, together with the behavioral outcomes of each pilot, informed the development of the next piloting phase until the original results were replicated and the final experimental design was elaborated. Results: Across a sequence of four pilots, the integration of third- and first-person information, derived from subjects’ behavior and their reported experiences while performing the tasks, yielded the behavioral replication we sought to achieve, providing support for a first-person-enriched cognitive science paradigm.
Implications: Systematically including first-person accounts during the development and performance of classic cognitive paradigms ensures that those paradigms are measuring what they claim to measure. This is the next logical step to improve replication rates, refine the explanation of results, and avoid confounds in the interpretation of third-person data. Constructivist content: Including first-person experiences, and acknowledging the active role that participants’ experiences of the paradigm played in shaping its final version, accords with a constructivist stance.

Neurophenomenological (NP) methods integrate objective and subjective data in ways that retain the statistical power of established disciplines (like cognitive science) while embracing the value of first-person reports of experience. The present paper positions neurophenomenology as an approach that draws on the traditions of cognitive science but includes techniques that challenge some of its standard practices. A baseline study is reviewed for “lessons learned,” that is, the potential methodological improvements that will support advances in understanding consciousness and cognition using neurophenomenology. These improvements, we suggest, include (1) addressing issues of interdisciplinarity by purposefully and systematically creating and maintaining shared mental models among research team members; (2) ensuring that NP experiments meet high standards of experimental design and execution, so as to achieve variable control, reliability, generalizability, and replication of results; and (3) conceiving of phenomenological interview techniques as placing the impetus on the interviewer in interaction with the experimental subject.

A key challenge in modern computing is to develop systems that address complex, dynamic problems in a scalable and efficient way: the increasing complexity of software makes efficient, flexible systems ever harder to design and maintain. Biological systems are thought to possess robust, scalable processing paradigms that can automatically manage complex, dynamic problem spaces, and several of their properties may be useful in computer systems. The biological properties of self-organisation, self-replication, self-management, and scalability are addressed in an interesting way by autopoiesis, a descriptive theory of the cell founded on the concept of a system’s circular organisation, which defines its boundary with its environment. In this paper, therefore, we review the main concepts of autopoiesis and then discuss how they could be related to fundamental concepts and theories of computation. The paper is conceptual in nature and the emphasis is on the review of other people’s work in this area, as part of a longer-term strategy to develop a formal theory of autopoietic computing.

Covalent bonding within chemical molecules and the internal electronic structure of atoms involve closure of phase relations in electronic wave functions, as suggested by de Broglie many years ago. The structures of crystals involving positive and negative ions can be understood in terms of the replication of unit cells, which may be classified in terms of symmetry. The main principle involved in crystal symmetry can be understood by examining possible patterns in decorative borders. A more widely applicable type of chemical closure occurs in oscillating reactions (dissipative structures), in which an autocatalytic process is balanced by some exit reaction. As is the case for the other types of chemical coherence, the number of distinct types of oscillating reactions is rather small. Otherwise puzzling aspects of human social and organizational behavior may be clarified by analogy with chemical oscillating reactions.
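One well-known textbook model of the kind of oscillating reaction described above, in which autocatalysis is balanced by an exit reaction, is the Brusselator (A → X, 2X + Y → 3X, B + X → Y + D, X → E). The sketch below, not part of the original text and using hypothetical parameter values, integrates its rate equations with a simple Euler scheme to show sustained oscillation:

```python
def brusselator(a=1.0, b=3.0, x0=1.0, y0=1.0, dt=0.001, steps=40000):
    """Euler-integrate the Brusselator rate equations:
        dx/dt = a + x^2*y - (b + 1)*x   (autocatalytic production of X)
        dy/dt = b*x - x^2*y             (conversion of X into Y)
    Returns the trajectory of x. Parameters are illustrative only."""
    x, y = x0, y0
    xs = []
    for _ in range(steps):
        dx = a + x * x * y - (b + 1.0) * x
        dy = b * x - x * x * y
        x += dt * dx
        y += dt * dy
        xs.append(x)
    return xs

trajectory = brusselator()
```

With a = 1 and b = 3, the fixed point (a, b/a) is unstable (b > 1 + a²), so the trajectory settles onto a limit cycle: concentrations keep oscillating rather than decaying to equilibrium, which is the signature of a dissipative structure.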

Open peer commentary on the article “A Cybernetic Computational Model for Learning and Skill Acquisition” by Bernard Scott & Abhinav Bansal. Upshot: Scott and Bansal’s assessment of the limitations of their work relies on a concept of simulation that I find problematic. It assumes that the ultimate goal of a model is a replication of the phenomena to which it applies, whereas a limited model produces only simulations. I argue that this position leads to unfortunate epistemological results, and that it ends up assigning an unduly exclusive role to the study of the biochemical substrate of cognition.

Proto-organisms probably were randomly aggregated nets of chemical reactions. The hypothesis that contemporary organisms are also randomly constructed molecular automata is examined by modeling the gene as a binary (on-off) device and studying the behavior of large, randomly constructed nets of these binary “genes.” The results suggest that, if each “gene” is directly affected by two or three other “genes,” then such random nets: behave with great order and stability; undergo behavior cycles whose length predicts cell replication time as a function of the number of genes per cell; possess different modes of behavior whose number per net predicts roughly the number of cell types in an organism as a function of its number of genes; and, under the stimulus of noise, are capable of differentiating directly from any mode of behavior to at most a few other modes of behavior. Cellular differentiation is modeled as a Markov chain among the modes of behavior of a genetic net. The possibility of a general theory of metabolic behavior is suggested. Analytic approaches to the behavior of switching nets are discussed in Appendix 1, and some implications of the results for the origin of self-replicating macromolecular systems are discussed in Appendix 6.
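The binary-gene model above can be sketched directly as a random Boolean network. The following minimal Python illustration (not the paper’s original code, with hypothetical parameter values) wires each of n genes to k randomly chosen input genes with a random Boolean function, updates the net synchronously, and iterates until a state recurs, yielding the length of a behavior cycle:

```python
import random

def random_boolean_net(n_genes, k, seed=0):
    """Each gene gets k random input genes and a random Boolean
    function, stored as a truth table over its 2**k input patterns."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n_genes), k) for _ in range(n_genes)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)]
              for _ in range(n_genes)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronously update every gene from its inputs' current values."""
    new = []
    for g in range(len(state)):
        idx = 0
        for inp in inputs[g]:
            idx = (idx << 1) | state[inp]  # pack input bits into an index
        new.append(tables[g][idx])
    return tuple(new)

def transient_and_cycle(n_genes=20, k=2, seed=0):
    """Iterate from a random initial state until a state repeats;
    return (transient length, behavior-cycle length)."""
    rng = random.Random(seed + 1)
    inputs, tables = random_boolean_net(n_genes, k, seed)
    state = tuple(rng.randint(0, 1) for _ in range(n_genes))
    seen = {state: 0}
    t = 0
    while True:
        state = step(state, inputs, tables)
        t += 1
        if state in seen:
            return seen[state], t - seen[state]
        seen[state] = t

transient, cycle = transient_and_cycle(n_genes=20, k=2)
```

Running this over many random nets with k = 2 is one way to probe the claim that such nets fall into short, stable cycles despite the 2ⁿ possible states.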

The aim of this paper is to characterize a type of causality, relevant to the study of the closure of complex systems, that we call formal causation. By this term we understand the existence of a new (not materially inherent) causal relation among constituents, generated through an autonomous process of closure. Once a certain level of organization is reached, material systems can generate internal constraints that, through recursive processes, construct their own identity. We study two different forms of closure: closure in dissipative systems and closure in template self-replication. Finally, these two forms merge and bring forth a new one: informational closure. We show how complex forms of organization are based on informational closure, which is an explicit, recorded type of formal causation allowing a functional articulation between individual organizations and larger, collective and historical (meta)organizations.

I present here an analysis of the core of biological organization from a genealogical perspective, attempting to identify the driving forces, or principles of organization, leading from the physico-chemical world to the biological one. From this perspective the essential issue is to understand how new principles of generation and preservation of complexity could appear. At the beginning, the driving force towards complexity was nothing but the confluence of several principles of ordering, such as self-assembly, template replication, or self-organization, merged in the framework of what I have called a nontrivial self-maintaining organization. The key to this process is functional recursivity, namely, the fact that every novelty capable of contributing to a more efficient form of maintenance will be recruited. This leads us to the central concept of autonomy, defined as a form of self-constructing organization that maintains its identity through its interactions with its environment. As such, autonomy grasps the idea of (minimal) metabolic organization, which, in turn, is at the basis of what we mean by (minimal) organism. Finally, starting from the concept of autonomy, I try to show how it has generated a new and more encompassing system in which evolution by natural selection takes over, generating in turn a new form of individual organization (genetically instructed metabolism) that erases the previous ones.

A theory of emergent or open-ended evolution that is consistent with the epistemological foundations of physical theory and the logic of self-reference requires complementary descriptions of the material and symbolic aspects of events. The matter-symbol complementarity is explained in terms of the logic of self-replication and the physical distinction between laws and initial conditions. Physical laws and natural selection are complementary models of events. Physical laws describe those invariant events over which organisms have no control. Evolution by natural selection is a theory of how organisms increase their control over events. A necessary semantic closure relation is defined relating the material and symbolic aspects of organisms capable of open-ended evolution.