We propose a novel account of the distinction between innate and acquired biological traits: biological traits are innate to the degree that they are caused by factors intrinsic to the organism at the time of its origin; they are acquired to the degree that they are caused by factors extrinsic to the organism. This account borrows from recent work on causation in order to make rigorous the notion of quantitative contributions to traits by different factors in development. We avoid the pitfalls of previous accounts and argue that the distinction between innate and acquired traits is scientifically useful. We therefore address not only previous accounts of innateness but also skeptics about any account. The two are linked, in that a better account of innateness also enables us better to address the skeptics.

An algorithmic procedure within a logical system to generate DNA chains through a formal rule, up to the generation of a STOP codon signal. This work was developed under the direction of the Mexican Professor Hugo Padilla Chacón.
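The abstract gives no implementation details, so the following is only a minimal sketch of the generation rule it describes ("extend the chain codon by codon until a STOP codon appears"); the function names, and the choice of a uniform random draw over bases, are illustrative assumptions, not Padilla Chacón's formal system. The three stop codons are those of the standard genetic code.

```python
import random

STOP_CODONS = {"TAA", "TAG", "TGA"}  # stop codons of the standard genetic code
BASES = "ACGT"

def generate_chain(rng):
    """Append random codons to a growing chain; halt at the first stop codon."""
    chain = []
    while True:
        codon = "".join(rng.choice(BASES) for _ in range(3))
        chain.append(codon)
        if codon in STOP_CODONS:
            return chain

chain = generate_chain(random.Random(1))
print(len(chain), chain[-1])  # chain length and the terminating stop codon
```

By construction, the last codon is always a stop codon and no earlier codon is, which is the halting property the abstract's formal rule requires.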

Causal selection is the task of picking out, from a field of known causally relevant factors, some factors as elements of an explanation. The Causal Parity Thesis in the philosophy of biology challenges the usual ways of making such selections among different causes operating in a developing organism. The main target of this thesis is usually gene centrism, the doctrine that genes play some special role in ontogeny, which is often described in terms of information-bearing or programming. This paper is concerned with the attempt to confront the challenge posed by the Causal Parity Thesis by offering principles of causal selection that are spelled out in terms of an explicit philosophical account of causation, namely an interventionist account. I show that two such accounts that have been developed, although they contain important insights about causation in biology, nonetheless fail to provide an adequate reply to the Causal Parity challenge: Ken Waters's account of actual-difference making and Jim Woodward's account of causal specificity. A combination of the two does not do the trick either, nor does Laura Franklin-Hall's account of explanation (in this volume). We need additional conceptual resources. I argue that the resources we need consist in a special class of counterfactual conditionals, namely counterfactuals whose antecedents describe biologically normal interventions.

Over the course of human history, the sciences, and biology in particular, have often been manipulated to cause immense human suffering. For example, biology has been used to justify eugenic programs, forced sterilization, human experimentation, and death camps—all in an attempt to support notions of racial superiority. By investigating the past, the contributors to _Biology and Ideology from Descartes to Dawkins_ hope to better prepare us to discern ideological abuse of science when it occurs in the future. Denis R. Alexander and Ronald L. Numbers bring together fourteen experts to examine the varied ways science has been used and abused for nonscientific purposes from the fifteenth century to the present day. Featuring an essay on eugenics from Edward J. Larson and an examination of the progress of evolution by Michael J. Ruse, _Biology and Ideology_ examines uses both benign and sinister, ultimately reminding us that ideological extrapolation continues today. An accessible survey, this collection will enlighten historians of science, their students, practicing scientists, and anyone interested in the relationship between science and culture.

Developments in the sequencing of whole genomes and in simultaneously surveying many thousands of transcription and translation products of specific cells have ushered in a conceptual revolution in genetics that rationally introduces top-down, holistic analyses. This revolution has emphasized the futility of attempts to reduce genes to structurally discrete entities along the genome, and the need to return to Johannsen's definition of a gene as 'something' that refers to an invariant entity of inheritance and development. We may view genes either as generic terms for units of inheritance whose referents are pragmatic, ad hoc, and context-dependent, or as (epistemologically) representing entities of cell functions. It is cellular functions that determine the structural referents along the DNA. Structures that happened to secure specific functions that were essential for or conducive to the survival of cells were selected for. With natural selection being the etiological background of genes as functions, genes regain their theoretical role as intervening variables, abstractive variables that purely 'summarize' characters. The importance of DNA sequences is that, of all possible phenotypes, they are the most basic ones, from which we can read off the genotype directly.

Developmental systems theory (DST) is a wholeheartedly epigenetic approach to development, inheritance and evolution. The developmental system of an organism is the entire matrix of resources that are needed to reproduce the life cycle. The range of developmental resources that are properly described as being inherited, and which are subject to natural selection, is far wider than has traditionally been allowed. Evolution acts on this extended set of developmental resources. From a developmental systems perspective, development does not proceed according to a preformed plan; what is inherited is much more than DNA; and evolution is change not only in gene frequencies, but in entire developmental systems.

Edited by Alessandro Minelli and Thomas Pradeu, Towards a Theory of Development gathers essays by biologists and philosophers, which display a diversity of theoretical perspectives. The discussions not only cover the state of the art, but broaden our vision of what development includes and provide pointers for future research. Interestingly, all contributors agree that explanations should not just be gene-centered, and virtually none use design and other engineering metaphors to articulate principles of cellular and organismal organization. I comment in particular on the issue of how to construe the notion of a ‘theory’ and whether developmental biology has or should aspire to have theories, which four of the contributions discuss in detail while taking opposing positions. Beyond construing a theory in terms of its empirical content, my aim is to shift the focus toward the role that theories have for guiding future scientific theorizing and practice. Such a conception of ‘theory’ is particularly important in the context of development, because arriving at a theoretical framework that provides guidance for the discipline of developmental biology as a whole is more plausible than a unified representation of development across all taxa.

Neurobiological disorders have diverse manifestations and symptomology. Neurodegenerative disorders, such as Alzheimer's disease, manifest late in life and are characterized by, among other symptoms, progressive loss of synaptic markers. Developmental disorders, such as autism spectrum, appear in childhood. Neuropsychiatric and affective disorders, such as schizophrenia and major depressive disorder, respectively, have broad ranges of age of onset and symptoms. However, all share uncertain etiologies, with opaque relationships between genes and environment. We propose a 'Latent Early-life Associated Regulation' (LEARn) model, positing latent changes in expression of specific genes initially primed at the developmental stage of life. In this model, environmental agents epigenetically disturb gene regulation in a long-term manner, beginning at early developmental stages, but these perturbations might not have pathological results until significantly later in life. The LEARn model operates through the regulatory region (promoter) of the gene, specifically through changes in methylation and oxidation status within the promoter of specific genes. The LEARn model combines genetic and environmental risk factors in an epigenetic pathway to explain the etiology of the most common, that is, sporadic, forms of neurobiological disorders.

This research was carried out in order to verify Mendel’s laws by simulation and to seek clarification, from the author’s point of view, of the Mendel-Fisher controversy. It was demonstrated, from the experimental procedure and the first two steps of the Hardy-Weinberg law, that the null hypothesis in such experiments is absolutely and undeniably true. Consequently, when repeating hybridizing experiments such as those reported by Mendel, it makes sense to expect a close correspondence between the observed and the expected cell frequencies. By simulation, 30 random samples were generated with size equal to the number of observations reported by Mendel for his single-trait trial, in this case seed shape, assuming complete dominance, with genes A and a; likewise, results were simulated for the experiment with two traits segregating on separate chromosomes, in this case seed shape, as before, and albumen color, with genes B and b, both loci with complete dominance. In the case of a single trait, the data showed evidence for rejecting the null hypothesis (H0) in only 1/30 samples (P < 0.05). In the case of the 30 samples of the two-trait experiment, H0 was rejected only 3/30 times with α = 0.05. In both simulations there was a high correspondence between the observed and expected cell frequencies, which is simply due to the fact that H0 is true, and under these conditions that is what one would expect. It was concluded that Mendel had no reason to manipulate his data in order to make them coincide with his beliefs. Therefore, in experiments with a single trait, and in experiments with two traits assuming complete dominance, segregation ratios are 3:1 and 9:3:3:1, respectively. Consequently, Mendel’s laws, under the conditions described, are absolutely valid and universal.
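The paper's exact simulation code is not reproduced in the abstract; the following is a minimal sketch of one replicate of the single-trait case under stated assumptions: each F2 offspring shows the dominant phenotype with probability 3/4, the sample size 7324 is Mendel's reported seed-shape count, and the fit to the expected 3:1 ratio is assessed with the standard chi-square goodness-of-fit statistic (critical value 3.841 for df = 1 at α = 0.05). Function names are my own.

```python
import random

def simulate_monohybrid(n, rng):
    """Draw n F2 offspring from an Aa x Aa cross; dominant phenotype has prob 3/4."""
    dominant = sum(1 for _ in range(n) if rng.random() < 0.75)
    return dominant, n - dominant

def chi_square_3_to_1(dominant, recessive):
    """Goodness-of-fit statistic of observed counts against the expected 3:1 ratio."""
    n = dominant + recessive
    expected = (0.75 * n, 0.25 * n)
    observed = (dominant, recessive)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

rng = random.Random(0)
dom, rec = simulate_monohybrid(7324, rng)  # Mendel's seed-shape sample size
chi2 = chi_square_3_to_1(dom, rec)
# With df = 1, H0 is rejected at alpha = 0.05 only when chi2 > 3.841
print(dom, rec, round(chi2, 3), chi2 > 3.841)
```

Running such a replicate 30 times and counting rejections mirrors the design described in the abstract: because H0 is true by construction, rejections should occur in roughly 5% of replicates, as the paper reports (1/30 and 3/30).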

Talk of a “genetic program” has become almost as common in cell and evolutionary biology as talk of “genetic information”. But what is a genetic program? I understand the claim that an organism’s genome contains a program to mean that its genes not only carry information about which proteins to make, but also about the conditions in which to make them. I argue that the program description, while accurate in some respects, is ultimately misleading and should be abandoned. After that, I sketch an alternative framework which is better suited to capturing the full informational nature of genes. This framework is centered on the notion of a signaling game, as originally developed by David Lewis, but expanded upon considerably by Brian Skyrms in more recent years. On the view I develop, genes turn out to be the producers and consumers of regulatory or developmental information, rather than entities encoding such information. This finding has consequences that link up with a broader debate in the philosophy of biology concerning inheritance systems. I take this to be one form of theoretical payoff that results from applying the signaling games framework to genes.

In 1990 Robert Lickliter and Thomas Berry identified the phylogeny fallacy, an empirically untenable dichotomy between proximate and evolutionary causation, which locates proximate causes in the decoding of ‘genetic programs’, and evolutionary causes in the historical events that shaped these programs. More recently, Lickliter and Hunter Honeycutt argued that Evolutionary Psychologists commit this fallacy, and they proposed an alternative research program for evolutionary psychology. For these authors the phylogeny fallacy is the proximate/evolutionary distinction itself, which they argue constitutes a misunderstanding of development and its role in the evolutionary process. In this article I argue that the phylogeny fallacy should be relocated to an error of reasoning that this causal framework sustains: the conflation of proximate and evolutionary explanation. Having identified this empirically neutral form of the phylogeny fallacy, I identify its mirror image, the ontogeny fallacy. Through the lens of these fallacies I attempt to solve several outstanding problems in the debate that ensued from Lickliter and Honeycutt’s provocative article.

A standard norm of reaction (NoR) is a graphical depiction of the phenotypic value of some trait of an individual genotype in a population as a function of an environmental parameter. NoRs thus depict the phenotypic plasticity of a trait. The topological properties of NoRs for sets of different genotypes can be used to infer the presence of (non-linear) genotype-environment interactions. While it is clear that many NoRs are adaptive, it is not yet settled whether their evolutionary etiology should be explained by selection on the mean phenotypic trait values in different environments or whether there are specific genes conferring plasticity. If the second alternative is true, the NoR is itself an object of selection. Generalized NoRs depict plasticity at the level of populations or subspecies within a species, species within a genus, or taxa at higher levels. Historically, generalized NoRs have routinely been drawn, though rarely explicitly recognized as such. Such generalized NoRs can be used to make evolutionary inferences at higher taxonomic levels in a way analogous to how standard NoRs are used for microevolutionary inferences.
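The topological criterion the abstract invokes (genotype-environment interaction inferred from the shape of NoRs across genotypes) can be illustrated in a deliberately simple linear toy setting: parallel NoRs mean the phenotypic difference between genotypes is constant across environments (no interaction), while non-parallel or crossing NoRs mean the difference varies with the environment (interaction). The linear form and function names here are illustrative assumptions, not the abstract's own formalism.

```python
def norm_of_reaction(intercept, slope):
    """Linear NoR: phenotypic value as a function of an environmental parameter E."""
    return lambda e: intercept + slope * e

def has_gxe_interaction(nor_a, nor_b, envs, tol=1e-9):
    """GxE signal: the phenotypic difference between two genotypes varies across environments."""
    diffs = [nor_a(e) - nor_b(e) for e in envs]
    return max(diffs) - min(diffs) > tol

g1 = norm_of_reaction(10.0, 0.5)  # plastic genotype
g2 = norm_of_reaction(12.0, 0.5)  # same slope as g1: parallel NoRs
g3 = norm_of_reaction(8.0, 1.5)   # different slope: NoRs cross

envs = [0.0, 1.0, 2.0, 3.0]
print(has_gxe_interaction(g1, g2, envs))  # → False (parallel lines)
print(has_gxe_interaction(g1, g3, envs))  # → True (non-parallel lines)
```

A generalized NoR in the abstract's sense would replace the individual genotypes above with population-, species-, or higher-taxon-level mean responses, with the same parallel-versus-crossing logic applied at that level.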

How does a complex organism develop from a relatively simple, homogeneous mass? The usual answer is: through the execution of species-specific genetic instructions specifying the development of that organism. Commentators are sometimes sceptical of this usual answer, but of course not all commentators. Some biologists refer to master control genes responsible for the activation of all the genes responsible for every aspect of organismal development; and some philosophers, most notoriously Rosenberg, buy this claim hook, line, and sinker. Here I explore both the seeming plausibility of the usual position, and also its ultimate inadequacy.

The genetic code appeared on Earth with the first cells. The codes of cultural evolution arrived almost four billion years later. These are the only codes that are recognized by modern biology. In this book, however, Marcello Barbieri explains that there are many more organic codes in nature, and their appearance not only took place throughout the history of life but marked the major steps of that history. A code establishes a correspondence between two independent 'worlds', and the codemaker is a third party between those 'worlds'. Therefore the cell can be thought of as a trinity of genotype, phenotype and ribotype. The ancestral ribotypes were the agents which gave rise to the first cells. The book goes on to explain how organic codes and organic memories can be used to shed new light on the problems encountered in cell signalling, epigenesis, embryonic development, and the evolution of language.

Laudan's thesis that conceptual problem solving is at least as important as empirical problem solving in scientific research is given support by a study of the relation between the chromosome theory and the Mendelian research program. It will be shown that there existed a conceptual tension between the chromosome theory and the Mendelian program. This tension was to be resolved by changing the constraints of the Mendelian program. The relation between the chromosome theory and the Mendelian program is shown to be a good illustration of the influence of science itself on the rational standards governing scientific development.

Jaegwon Kim’s exclusion argument is a general ontological argument, applicable to any properties deemed supervenient on a microproperty basis, including biological properties. It implies that the causal power of any higher-level property must be reducible to the subset of the causal powers of its lower-level properties. Moreover, as Kim’s recent version of the argument indicates, a higher-level property can be causally efficient only to the extent of the efficiency of its micro-basis. In response, I argue that the ontology that aims to capture experimentally based explanations of metabolic control systems and morphogenetic systems must involve causally relevant contextual properties. Such an ontology challenges the exclusiveness of micro-based causal efficiency that grounds Kim’s reductionism, since configurations themselves are inherently causally efficient constituents. I anticipate and respond to the reductionist’s objection that the nonreductionist ontology’s account of causes and inter-level causal relations is incoherent. I also argue that such an ontology is not open to Kim’s overdetermination objection.

Recent and not so recent advances in our molecular understanding of the genome make the once prevalent view of the genome as a passive container of genetic information (i.e., genes) untenable, and emphasize the importance of the internal organization and re-organization dynamics of the genome for both development and evolution. While this conclusion is by now well accepted, the construction of a comprehensive conceptual framework for studying the genome as a dynamic system, capable of self-organization and adaptive behavior, is still underway. This work deals with the effect of such a conceptual shift on evolutionary thought. Specifically, I try to articulate the conceptual commitments and obligations of views that privilege, explanatorily or causally, the genome, its dynamics and mechanisms, over genes. I refer to this class of views as belonging to ‘the genome perspective’.