[1] "... there is a natural feeling (among scientists) that one of the greatest of our figures should not be dissected, at least by one of us", cited in Darwin and the Mysterious Mr. X, Loren Eiseley, London, J. M. Dent & Sons, 1979.

Theory Group, School of Physics and Astronomy, University of Manchester, Manchester, M13 9PL, UK

Available online 12 June 2006.

Hubbell's neutral theory of biodiversity has challenged the classic niche-based view of ecological community structure. Although there have been many attempts to falsify Hubbell's theory, we argue that falsification should not lead to rejection, because there is more to the theory than neutrality alone. Much of the criticism has focused on the neutrality assumption without full appreciation of other relevant aspects of the theory. Here, we emphasize that neutral theory is also a stochastic theory, a sampling theory and a dispersal-limited theory. These important additional features should be retained in future theoretical developments of community ecology.

‘When we look at the plants and bushes clothing an entangled bank, we are tempted to attribute their proportional numbers and kinds to what we call chance. But how false a view is this!’ In this statement, Darwin clearly summarized his philosophical position: there is no place for stochasticity in population biology [1]. In 2001, Stephen Hubbell [2], after more than 25 years working on the population and community ecology of tree species in tropical forests, presented an explanatory theory formulated entirely in terms of chance. Given that On the Origin of Species by Means of Natural Selection [1] is one of the most influential scientific books ever written, it is no wonder that Hubbell's ideas have generated so much controversy among ecologists [3-10].

Here, we do not enter into a philosophical discussion of the nature and origins of randomness in the world around us, but instead take an operational approach and argue that chance should be taken into account in any attempt to gain insight into the structure and functioning of ecological communities [11,12]. We discuss the ability of neutral theory to generate new insights in community ecology, which, in the end, might not support neutrality. We also discuss the limitations and potential applications of neutral ideas to biodiversity assessment in empirical settings.

Neutral theory is an ideal theory

Most previous articles on neutral theory highlight its failure to capture the complexity of ecological communities [5,13-15]. However, here we emphasize its merits and argue that neutral theory in ecology is a first approximation to reality. Ideal gases do not exist, and neither do neutral communities. Like the kinetic theory of ideal gases in physics, neutral theory is a basic theory that provides the essential ingredients from which to explore theories involving more complex assumptions [16,17].

What are the essential ingredients of neutral theory? First, and foremost, it is a neutral theory: the interactions among species are assumed to be equivalent on an individual, ‘per capita’ basis [2]. Second, it is a stochastic theory, based on mechanistic assumptions about the processes controlling the origin and interaction of biological populations at the individual level (i.e. speciation, birth, death and migration). Because interactions are assumed to operate at the individual level, whereas the regularities we would like to explain are truly macroscopic, this feature is reminiscent of the statistical thermodynamics approach in physics [2,18,19]. Third, it is a sampling theory: because it is built upon the sampling theory of selectively neutral alleles in population genetics [20], the sampling nature of the theory is guaranteed. We interact with the system under study through the sampling process, and the measures we obtain from our sample are related to those in the real system in a way clearly specified by the theory. Fourth, and most innovatively, it is a dispersal-assembled theory [2]: dispersal is assumed to have a leading role in structuring ecological communities. Dispersal and sampling are intertwined, however, and a nonrandom sampling scheme can be formulated that incorporates dispersal limitation [17] (see Glossary).
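The zero-sum, individual-level dynamics described above can be sketched in a few lines. This is a minimal illustration, not Hubbell's full model (there is no speciation here, and the parameter names J, m and the uniform metacommunity are our own simplifying assumptions): every individual, regardless of species, has the same chance of dying and of being replaced by an immigrant or by a local offspring.

```python
import random
from collections import Counter

def neutral_community(J=500, m=0.1, metacommunity=100, steps=50_000, seed=1):
    """Zero-sum neutral drift: at each step one individual dies and is
    replaced either by an immigrant drawn from a uniform metacommunity
    (probability m) or by the offspring of a random local individual.
    Per capita equivalence: species identity never enters the rules."""
    rng = random.Random(seed)
    community = [0] * J  # start as a monodominant stand of species 0
    for _ in range(steps):
        dead = rng.randrange(J)
        if rng.random() < m:
            community[dead] = rng.randrange(metacommunity)  # immigration
        else:
            community[dead] = community[rng.randrange(J)]   # local birth
    return Counter(community)  # species -> abundance

abundances = sorted(neutral_community().values(), reverse=True)
print(len(abundances), "species; top abundances:", abundances[:5])
```

Varying m or the number of steps changes the rank-abundance curve even though no species differs from any other in its vital rates, which is the sense in which drift and dispersal limitation alone can generate community structure.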

The originality of Hubbell's neutral theory lies in the combination of the fact that it (i) assumes equivalence among interacting species; (ii) is an individual-based stochastic theory; and (iii) can be formulated as a dispersal-limited sampling theory. Previous formulations of neutral theory lacked some of these aspects and no niche-based dynamical theory for ecological communities has been formulated as a sampling theory from scratch. Here, we discuss the relevance and limitations of each of these features.

Neutrality assumption

At the beginning of the 20th century, communities were viewed as superorganisms that develop in a particular and fixed way to form a well-established climax community [21]. A community is then a group of species whose competitive interaction strengths are determined by their niche overlaps, and new species originate through adaptation to new niches. This view was challenged by MacArthur and Wilson with their equilibrium theory of island biogeography [22], which was extended by Hubbell [2]. The importance of random mutations and genetic drift was formalized as the neutral theory in population genetics by Kimura and Crow [23]. As reviewed elsewhere [24], these ideas readily found an ecological interpretation (Box 1). Although Watterson [25], Caswell [26] and Leigh and co-workers [27] had already translated neutral models from population genetics into community ecology, Hubbell's [2] original intuition was that, in addition to neutral drift, random dispersal is the main factor controlling the assembly of ecological communities. Migration had also been studied in population genetics, but had never taken such a prominent role as in Hubbell's theory.

[1] “It is the aim of the present volume to elaborate on this distinction between the origination (innovation) and the diversification (variation) of form by focusing on the plurality of causal factors responsible for the former, relatively neglected aspect, the origination of organismal form. Failure to incorporate this aspect represents one of the major gaps in the canonical theory of evolution, it being quite distinct from the topics with which population genetics or developmental genetics is primarily concerned.” (p. 4)

[2] “In other words, neo-Darwinism has no theory of the generative.” (p. 7)

The Origin-of-Life Prize ® is offered through The Gene Emergence Project ® of The Origin-of-Life Foundation, Inc. ® (life@us.net)

Description and Purpose of the Prize

"The Origin-of-Life Prize" ® (hereafter called "the Prize") will be awarded for proposing a highly plausible mechanism for the spontaneous rise of genetic instructions in nature sufficient to give rise to life. To win, the explanation must be consistent with empirical biochemical, kinetic, and thermodynamic concepts as further delineated herein, and be published in a well-respected, peer-reviewed science journal(s).

The one-time Prize will be paid to the winner(s) as a twenty-year annuity in hopes of discouraging theorists' immediate retirement from productive careers. The annuity consists of $50,000.00 (U.S.) per year for twenty consecutive years, totalling one million dollars in payments.

The ability of the Foundation to underwrite these payments and to administer the Project is monitored by the well-known accounting firm of Young, Brophy & Duncan, PC, Certified Public Accountants.

Formal application by submitters is required to win. Submitters must expressly consent to abide by all terms and conditions of the Prize before judging of their paper(s) can begin.


Other than announcements in scientific journals, The Prize will not be publicly advertised in lay media. The Origin-of-Life Foundation, Inc. wishes to keep the project as quiet as possible within the scientific community. No media interviews will be granted until after the Prize is won.

Purpose of the Prize

"The Origin-of-Life Prize" ® is being offered to stimulate research into chaos, complexity, information, probability, self-organization, and artificial life/intelligence theories as they relate directly to biochemical and molecular biological reality. The Foundation wishes to encourage the pursuit of natural-process explanations and mechanisms of initial "gene" emergence within nature. The subject of interest is the genesis of primordial functional information itself rather than its physico-chemical matrix of retention and transmission. Bioinformation fits into the category of "prescriptive information" ("instruction," rather than mere probabilistic combinatorics [Abel, 2000]). By what mechanisms do stochastic ensembles acquire instructive/integrative potential? In other words, what are the processes whereby random biopolymeric sequences self-organize into indirect, functional code?

Central questions of interest relate to the definition and nature of "genetic instructions" and "biomessage." Is genetic recipe adequately represented and described by "mutual entropy" (shared, correlative uncertainty between transmitter and receiver)? At what point and by what processes do "biofunction" and "biosystem" enter into the chemical evolution of bioinformation?

What is a reasonable, empirically-accountable definition of "minimal life"? How does nature's genetic programming achieve such long sequences of highly functional decision-node selections?

Genes are linear, digital, quaternary decision-node strings. Nucleotide selections represent four-way algorithmic switch-settings. These switch-settings are covalently-bound into primary structure. The string's specific sequence precedes secondary and tertiary folding. Folding results from forces such as hydrogen bonding, charge attractions/repulsions, and hydrophobicity. These forces are much weaker than the covalent binding that has already determined sequence. Folding space is primarily constrained by this pre-existing nucleotide sequencing. Ultimately, the algorithmic programming instantiated into the nucleotide-selection sequence determines biofunction.

The problem is that natural selection works only at the phenotypic level, not at the genetic level. Neither physicochemical forces nor environmental selection choose the next nucleotide to be added to the biopolymer. Mutations occur at the genetic level. But environmental selection occurs at the folding (functional) level, after-the-fact of already strongly set sequence, and after-the-fact of already established algorithmic function of the folded biopolymer.

By what mechanism did prebiological nature set its initial algorithmic switch-settings to program the first few (RNA?) genes?

How was RNA folding function anticipated when covalently-bound primary structure was forming?

Suppose a self-replicative oligoribonucleotide analog sequence occurred spontaneously out of sequence space. How did this self-replicative strand simultaneously anticipate folding needs for metabolic utility? Any evolution toward folding fitness would tend to mutate the sequencing away from self-replicative fitness. What was the bridge between both functions?

How could random mutations simultaneously contribute to both disparate functions? How did so many biochemical pathways get integrated into one coherent, unified, and sophisticated metabolic process?

Clarification of what the Foundation is looking for

We are primarily interested in how certain linear digital sequences of monomers acquired three-dimensional dynamic function. The Prize offer is designed to stimulate focused research on the origin of initial genetic instructions themselves. Much life-origin work centers on biochemical factors, but biopolymers catalyzed by clay surfaces, for example, do not necessarily contain any functional (prescriptive) information. How does an algorithmically complex sequence of codons, which finds phenotypic usefulness only after translation into a completely different language (an AA sequence), arise in nature? How did natural processes so indirectly produce the hundreds of three-dimensional protein catalysts needed for life to begin?

Mathematically, it is impossible to go backwards from 20 AAs to 64 codons: there is no way to know which of, for example, four or six codons coded a given AA when one tries to run the "Central Dogma" in reverse. The prescriptive information has been lost. Various models of code origin often pursue primordial codon systems built from two nitrogen bases per codon rather than three. At some point, such a two-base codon system must evolve into a three-base codon system, but catastrophic problems such as global frame shifts would have resulted from such a change midstream in the evolution of the genetic code.
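The many-to-one structure that makes back-translation ambiguous is easy to quantify. The codon counts below come from the standard genetic code table; the function name and the choice of amino acids are our own illustration:

```python
# Degeneracy of the standard genetic code for a few amino acids.
CODONS = {
    "M": ["ATG"],                                     # methionine: 1 codon
    "W": ["TGG"],                                     # tryptophan: 1 codon
    "L": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],  # leucine: 6 codons
    "S": ["TCT", "TCC", "TCA", "TCG", "AGT", "AGC"],  # serine: 6 codons
    "R": ["CGT", "CGC", "CGA", "CGG", "AGA", "AGG"],  # arginine: 6 codons
}

def back_translations(peptide):
    """Number of distinct DNA sequences that translate to `peptide`."""
    n = 1
    for aa in peptide:
        n *= len(CODONS[aa])
    return n

print(back_translations("MW"))    # 1: translation is invertible here
print(back_translations("MLSR"))  # 216 = 1 * 6 * 6 * 6 possible genes
```

Translation is a well-defined function from codons to amino acids, but its inverse is one-to-many: even a four-residue peptide containing degenerate residues corresponds to hundreds of candidate coding sequences.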

Environmental selection, if it existed at all in a prebiotic environment, is nothing more than after-the-fact differential survival and reproduction of certain stochastic ensembles in certain environments. How did the initial genetic code, that is, certain sequences of codons, come to specify only certain three-dimensional sequences of amino acid strings that "work"?

The winning submission will likely provide both a novel and a cardinal conceptual contribution to current biological science and information theory. The Foundation welcomes theoretical models of a more direct primordial instruction system (one that might have preceded codon transcription and translation), provided the model explains a continuous transition (abiding by the "continuity principle") to current prokaryotic and eukaryotic empirical life.

Inanimate stepping stones of abiotic evolution are essential components of any natural-process theory of the molecular evolution of life. Full rein must be given to the exploration of spontaneously forming complexity and to self-ordering inanimate systems. But reductionistic attempts to provide models of life development must not sacrifice the very property of "life" that biology seeks to explain. Coacervates, micelles, vesicles, and various primordial quasimembrane models, for example, may resemble membrane equivalents and merit considerable ongoing research, but should not be confused with the active transport membranes of the simplest known free-living organisms.

Criteria for winning

Major issues

Applicants must provide

A. a well-conceived, detailed hypothetical mechanism explaining how the rise of genetic instructions sufficient to give rise to life as defined in "Definitions" below might have occurred in Nature by natural processes, and an

B. empirical correlation to the real world of biochemistry and molecular biology - not just mathematical or computer models - of how the prescriptive information characteristic of all known living organisms might have arisen.

The mechanism must address four topics:

- The simplest known genome's apparent anticipation and directing of future events toward biological ends, both metabolic and structural;

- The ability of the genome to convey instructions, deliver orders, and actually produce the needed biological end-products;

- The indirectness of recipe-like biological "linguistic" message code - the gap between genotypic prescriptive information (instruction) and phenotypic expression. How did the first genetic instruction arise in its coded format prior to phenotypic realization of progeny from which the environment could select? If a protobiont's genetic code and phenotype were one and the same, how did such a simple system self-organize to meet the nine minimum conditions of "life" enumerated below under "Definitions"? How did stellar energy, the four known forces of physics (strong and weak nuclear forces, electromagnetic force, and gravity), and natural processes produce initial prescriptive information (instruction/recipe) using direct or indirect code?

- The bizarre concentration of singlehanded optical isomers (homochirality of enantiomers) in living things - how did a relatively pure population of left-handed amino acids or right-handed sugars arise out of a chemical environment wherein reactions ordinarily give rise to roughly equal numbers of both right- and left-handed optical isomers?

Definitions

a. By "theory," the Foundation means a thorough explanation and mechanism explaining how natural events might have given rise to a phenomenon like the genetic sign system, algorithmic programming, and code bijection. The Prize is not being offered for creating life in vitro, but for a plausible, empirically supported theory of mechanism within nature.

b. By "mechanism," the Foundation means a scenario of sequential, cause-and-effect (or at least "functionally dependent"), empirically correlated events explaining how genetic prescriptive information (instruction) arose naturally within Nature sufficient to give rise to current life.

c. By "prescriptive information," the Foundation means the instructions necessary for biochemical function, the end-product-oriented specification of monomeric sequence, the "recipe" and "biomessage" of messenger molecules which are so manifest in all known forms of phenomenological life. As pointed out by Hubert Yockey, all known genetic information is observed in a physical matrix sequence that is linear, segregatable, and digital. These linear sequences must be translated via code to other linear sequences for prescriptive information (instruction) to be received at the receiver end of any Shannon channel. They are in effect algorithmic sequences of decision-node configurable switch settings. Algorithms alone produce sophisticated biofunction. A phase space of stochastic ensembles has never been observed to produce even the simplest of biochemical pathways. See "algorithms" under the "Discussion" section.

By "prescriptive information (instruction)," the Foundation does not mean mere order or structure, as in a snowflake. By "prescriptive information (instruction)," the Foundation does not mean mere pattern or periodicity, as in a sine wave, kaleidoscopic image, or redundant inanimate crystal. By "prescriptive information (instruction)," the Foundation does not mean mere physical "complexity," as many complex conglomerates contain no instructional information. By "prescriptive information (instruction)," the Foundation is not merely referring to the probabilistic uncertainty concepts of Shannon, nor to Maxwell-Boltzmann-Gibbs entropy, nor to Kolmogorov-Solomonoff-Chaitin compression theory of mere sequence "complexity" alone. Internal algorithmic compression of alphanumeric symbol sequences defines "complexity." But such complexity has nothing to do with external algorithmic meaning or function. Sequence complexity is no measure of algorithmic utility. "Function" extends into additional dimensions altogether from mere sequence "complexity." External algorithmic function changes its environment and accomplishes some task external to itself.
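The distinctions drawn here can be made concrete with a toy computation. The sketch below is our own illustration, and zlib-compressed length is only a crude stand-in for Kolmogorov-type complexity: a periodic string and a random string separate cleanly on both measures, yet neither measure says anything about whether a sequence instructs any function.

```python
import math
import random
import zlib
from collections import Counter

def shannon_bits(s):
    """Empirical Shannon entropy per symbol, in bits."""
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in Counter(s).values())

def compressed_len(s):
    """Crude proxy for algorithmic (Kolmogorov-type) complexity."""
    return len(zlib.compress(s.encode()))

random.seed(0)
ordered = "AT" * 500                                         # crystal-like
scrambled = "".join(random.choice("ACGT") for _ in range(1000))

for name, s in [("ordered", ordered), ("random", scrambled)]:
    print(f"{name}: {shannon_bits(s):.2f} bits/symbol, "
          f"{compressed_len(s)} bytes compressed")
```

The periodic string compresses to a handful of bytes; the random one barely compresses at all. A functional gene would score much like the random string on both measures, which is the Foundation's point: entropy and compressibility are blind to prescriptive function.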

d. By "genetic code," the Foundation means "the linguistic-like, symbolic representation of commands from one alphabet and syntactic language (e.g., codon sequence) to another (e.g., AA sequence), conveying seemingly conceptual biological instructions to cell systems." Code is a one-to-one correspondence or "bijection" from one alphabetical system/language to another. See "The source of genetic code in nature" in the "Discussion" section.

Genetic instructions represent a form of "prescriptive information (instruction)" rather than just "Shannon combinatorial/probabilistic information." Genetic "recipe" has been called "aperiodic specified complexity." See "Aperiodic specified complexity" under "Discussion."

e. By sustained, free-living "life," the Foundation means any system which from its own inherent set of biological instructions can perform all nine of the following functions:

1. Delineate itself from its environment through the production and maintenance of a membrane equivalent, most probably a rudimentary or quasi-active-transport membrane necessary for selective absorption of nutrients, excretion of wastes, and overcoming osmotic and toxic gradients,

2. Write, store, and pass along into progeny prescriptive information (instruction) needed for organization; provide instructions for energy derivation and for needed metabolite production and function; symbolically encode and communicate functional message through a transmission channel to a receiver/decoder/destination/effector mechanism; integrate past, present and future time into its biological prescriptive information (instruction) content,

3. Bring to pass the above recipe instructions into the production or acquisition of actual catalysts, coenzymes, cofactors, etc.; physically orchestrate the biochemical processes/pathways of metabolic reality; manufacture and maintain physical cellular architecture; establish and operate a semiotic system using "signal molecules"

All classes of archaea, bacteria, and every other known free-living organism meet all nine of the above criteria. Eliminate any one of the above nine requirements, and it remains to be demonstrated whether that system could remain "alive." RNA strands, DNA strands, prions, viroids, and viruses shall not be considered free-living organisms, since they fail to meet many of the above well-recognized characteristics of independent "life."

Even in historical science, there must be some degree of empirical accountability to our theories. Proposing a mechanism that explains the origin of life must not consist of "defining down" the meaning and essence of the observable phenomenon of "life" to include "nonlife" in order to make our theories "work." Any scientific life-origins theory must connect with "life" as we observe it (the "continuity principle").

Science will never be able to abandon its empirical roots in favor of purely theoretical conjecture. On the other hand, science must constantly guard itself against Kuhnian paradigm ruts. We must be open-minded to the possibility that life has not always taken the form that we currently observe. We must take into consideration the problems inherent in any historical science where the observation of past realities is impossible.

Biophysicist Hubert P. Yockey makes the unique observation that "there is nothing in the physico-chemical world [apart from life] that remotely resembles reactions being determined by a sequence and codes between sequences. The existence of a genome and the genetic code divides living organisms from non-living matter." (Computers and Chemistry, 24 (2000) 105-123). This may well constitute the most concise and parsimonious dichotomization of animacy from inanimacy available in the literature.

We must remember, however, that the full complement of nucleic acid code, ribozymes, and protein enzymes is still present immediately after cell death. Life, therefore, would appear not to be reducible to coded prescriptive information (instruction) alone. Life is also not "a bag of enzymes." "Life" is characterized by ongoing homeostatic metabolic process and algorithmic function, including development, growth, and reproductive potential. The inability of mules to reproduce has no relevance to discussions of protocellular viability.

Discussion

1. Entropy

By "entropy" as it relates to information theory, the Foundation adopts Hubert P. Yockey's distinction between Maxwell-Boltzmann-Gibbs entropy, Shannon probability-distribution entropy, and Kolmogorov-Solomonoff-Chaitin sequence/algorithmic complexity (see Information Theory and Molecular Biology, Cambridge University Press, 1992, sections 2.2.2 and 2.4.1-2.4.6; Yockey, H.P. (1974) "An application of information theory to the Central Dogma and the sequence hypothesis," Journal of Theoretical Biology 46, 369-406; Yockey, H.P. (1981) "Self Organization, Origin of Life Scenarios, and Information Theory," Journal of Theoretical Biology 91, 13-31; and Yockey, H.P. (2000) "Origin of life on earth and Shannon's theory of communication," Computers and Chemistry 24(1), 105-123). Yockey argues that there is no "balancing act" between algorithmic informational entropy and Maxwell-Boltzmann-Gibbs-type entropy. The two are not on the same see-saw; the two probability spaces are not isomorphic. Information theory lacks the integral of motion present in thermodynamics and statistical mechanics. In addition, there is no code linking the two "alphabets" of stochastic ensembles. Kolmogorov-Solomonoff-Chaitin complexity does not reside in the domain of the stochastic ensembles of statistical mechanics. The two have no relation, despite endless confusion and attempts in the literature to merge them.

"Highly ordered" is paradoxically the opposite of "complex" in algorithmic information theory. The emergent property of "instructions," "organization," and the "message" of "messenger biomolecules" is simply not addressed in Maxwell-Boltzmann-Gibbs equations of heat equilibration and energy flux between compartments. Surprisingly, the essence of genetic "prescriptive information" and "instructions" is not addressed by current "information theory" either. Shannon information theory concerns itself primarily with data transmission, reception, and noise-reduction processing without regard for the essence of the "message" itself. The Foundation questions whether "order," physical "complexity," or "shared entropy" are synonymous with "prescriptive information," "instructions," or "organization."

Christoph Adami emphasizes that information is always "about something, and cannot be defined without reference to what it is information about." It is "correlation entropy" that is "shared" or "mutual." Thus, says Adami, "Entropy can never be a measure of complexity. Measuring correlations within a sequence, like Kolmogorov and Chaitin (and Lempel-Ziv, and many others) is not going to reveal how that sequence is correlated to the environment within which it is to be interpreted. Information is entropy "shared with the world," and the amount of information a sequence shares with its world represents its complexity." (Personal communication; see also PNAS, April 25, 2000, 97, #9, 4463-4468).
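Adami's notion of information as shared entropy can be sketched numerically with a plug-in estimate of mutual information (the setup, the names, and the 10% noise level are our own illustration, not Adami's code):

```python
import math
import random
from collections import Counter

def entropy(items):
    """Empirical Shannon entropy of a list of symbols, in bits."""
    n = len(items)
    return -sum(c / n * math.log2(c / n) for c in Counter(items).values())

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

random.seed(0)
env = [random.choice("ACGT") for _ in range(10_000)]
# A sequence correlated with the 'environment': a copy with 10% noise.
correlated = [random.choice("ACGT") if random.random() < 0.1 else x
              for x in env]
# A sequence with the same per-symbol entropy but no correlation.
unrelated = [random.choice("ACGT") for _ in range(10_000)]

print(f"I(env; correlated) = {mutual_information(env, correlated):.2f} bits")
print(f"I(env; unrelated)  = {mutual_information(env, unrelated):.2f} bits")
```

Both sequences look equally "complex" in isolation (about two bits per symbol); only the mutual information reveals that one of them is about the environment and the other is not, which is Adami's distinction between entropy and information.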

Differences of perspective among information theorists are often definitional. "Complexity" and "shared entropy" (shared uncertainty between sender and receiver) have unfortunately often been used synonymously with "prescriptive information (instruction)." But are they the same? Mere complexity and shared entropy seem to lack the specification and orchestrational functionality inherent in the genetic "instruction" system of translation.

The confusion between algorithmic instruction and Maxwell-Boltzmann-Gibbs entropy may have been introduced through the thought experiment imagining Maxwell's Demon - a being exercising intelligent choice over the opening and closing of a trap door between compartments. Statistical mechanics has no empirical justification for the introduction of purposeful control over the trap door.

Solar energy itself has never been observed to produce prescriptive information (instruction/organization). Photons are used by existing instructional mechanisms which capture, transduce, store, and utilize energy for work. Fiber optics is used by human intelligence to transmit meaningful prescriptive information (instruction) and message. But raw energy itself must not be confused with functional prescriptive information/instructions. The latter is a form of algorithmic programming. Successions of certain decision-node switch settings determine whether a genetic "program" will "work" to accomplish its task.

2. Is life autonomous?

Some argue that life exhibits the characteristics of autonomy. Life is orchestrated by its prescriptive information (instruction) content and by the inherent systems within which it finds itself. It is directed (passive voice) by its inherited genome. Because of this, others argue that even a prokaryote's autonomy is apparent rather than real. Cells are fully dependent upon "recipe" and control mechanisms delivered from a prior and external source. Cells also remain dependent upon their environment, especially for energy. But without the transducing mechanisms instructed by its genome, energy would only accelerate a cell's demise.

3. Does life display Negentropy?

1. It is mathematically impossible for entropy to be a negative entity. (Yockey, 1992, Information Theory and Molecular Biology, Cambridge University Press, p 84)

2. Organisms do not violate the Second Law of Thermodynamics any more than any other physical entity in open systems far from equilibrium. Their existing genetic instructions, command and control mechanisms, and machinery simply allow them to process incoming nutrients and energy in full accord with the Second Law. The problem lies in the derivation of the functional biological information that makes both of these processes possible. By what mechanism did initial instructions arise sufficient to produce such highly conceptual metabolic and replicative systems? This is the quest of The Gene Emergence Project, and the object of The Origin-of-Life Prize offer.

4. Specified aperiodic complexity

Many investigators, such as Leslie Orgel (Origins of Life, 1973, New York, John Wiley, 189-190), have for many years regarded genetic information as having an additional component besides mere complexity. Matrices of prescriptive-information retention are not only extremely improbable ensembles, but ones that are specified in a way that yields biofunction. "Specified complexity" means that only a certain few out of a large sample space of potential or real ensembles will produce metabolic function. Specified complexity instructs and integrates biochemical pathways into homeostatic metabolism.

5. Algorithmic instruction

Life is an integration of many algorithmic processes that give rise to biofunction and overall homeostatic metabolism. Algorithms are processes or procedures that give rise to useful function. Algorithms are not merely linear sequences of symbols; they do something. Each symbol represents a choice from among symbol options, and each choice is critical to the determination of eventual function. There is an organizational property - a certain element of seeming conceptuality - to the biological information/instructions that produce the citric acid cycle, for example. This aspect raises bioinformation to a more instructive, orchestrational, recipe-like level than mere physical order, complexity, probabilistic uncertainty, or "mutual uncertainty" between two sets.

Algorithmic compression schemes are valuable in defining plain complexity. Such algorithms address internal sequence compressibility only. But these types of algorithms tell us nothing about whether the sequence instructs any function external to itself. The degree of compressibility is not critical to defining prescriptive information (instruction). The functionality the sequence produces is.

Algorithms are usually schema of successive decision-node "choices" that lead somewhere useful. The sequence of choices accomplishes some useful task. Each decision node represents a fork in the road; one false turn, and the potential function at the end of the sequence can be lost. Dendrograms of these decision-node choice options give rise to many orders of magnitude of potential terminal tree branches, and only a few of these branches usually yield sophisticated function. A biopolymer represents such a sequence of decision-node symbol "choices." Even when homologous protein sequences from genetic drift are considered, sometimes the equivalent of only one branch out of 2^30 (≈10^9) branches in sequence space "works" as the needed enzyme. Life-origin scenarios must provide explanation and mechanism for how such unlikely algorithmic strings come together at the same place and at the same time to produce not only the local function of each individual algorithmic program, but the integration of many hundreds of such strings into homeostatic metabolism.
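The branch counting above is plain arithmetic and easy to check. The helper below is our own sketch; `options_per_node=2` matches the 2^30 figure in the text, while 4 would correspond to nucleotide choices along a 30-monomer string:

```python
# Each decision node multiplies the number of terminal branches.
def hit_probability(nodes, functional_branches=1, options_per_node=2):
    """Chance that a blind walk through the dendrogram ends on one of
    the `functional_branches` branches that yield a working sequence."""
    return functional_branches / options_per_node ** nodes

print(f"{2**30:,} binary branches")  # 1,073,741,824 (about 1e9)
print(f"{hit_probability(30):.1e}")  # ~9.3e-10 per blind trial
print(f"{hit_probability(30, options_per_node=4):.1e}")  # 30-nt string
```

Note that 2^30 is roughly a billion (10^9), and a 30-nucleotide string with four options per position spans 4^30 = 2^60 branches, so the odds of a blind walk hitting a single functional branch shrink geometrically with string length.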

6. The source of genetic information in nature No theory of genetic information is complete without a model of mechanism for the source of such prescriptive information within Nature. It is not sufficient for a submission to the Prize to limit discussion of prescriptive information (instruction) theory to its replication, transmission, modification, or matrix of information retention. All submissions must address the source of the prescriptive information through non-supernaturalistic natural processes. Which of the four known forces of physics, or what combination of these forces, produced prescriptive, functional information, and how? What is the empirical evidence for this kind of prescriptive information (instruction) spontaneously arising within Nature?

7. Genetic code In all known phenomenological life, genetic code manifests
• the conveyance of a functional coded message, using a sign system, to distant sites through an information channel to energy-consuming decoding receivers - ribosomal "machines,"
• symbolic, indirect representation of that message from one alphabet into another (e.g., codons of nitrogen-base "language" being translated into the end-product physical amino acid sequence "language"),
• prespecification of extremely unlikely and complex future events (see Dembski in suggested readings below) suggesting "apparent intent," "apparent planning," or "apparent purpose" (or, as Richard Dawkins describes it, "apparent design"),
• instructions capable of effecting and affecting many individual manufacturing processes, and of mediating the cooperation of all of those diverse processes toward the one organismal and seemingly "conceptual" end of being and staying alive,
• the ability of that information (instruction) not only to give the directions or orders of what should be done, but to bring those orders to pass in the form of actual physical molecules, products, and life processes,
• the seemingly "irreducible complexity" argued by Michael Behe (see suggested readings below), and
• the initial writing of this prescriptive information by nature, not just the modification of existing genetic instruction through mutation.
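The symbolic, alphabet-to-alphabet translation described above can be sketched with a toy fragment of the standard genetic code. Only five of the sixty-four codons are included here, purely for illustration:

```python
# A small, illustrative subset of the standard genetic code (DNA codons).
CODON_TABLE = {
    "ATG": "M",   # methionine (start)
    "TTT": "F",   # phenylalanine
    "AAA": "K",   # lysine
    "GGA": "G",   # glycine
    "TAA": None,  # stop codon
}

def translate(dna: str) -> str:
    """Map one symbol system (codon triplets) onto another (amino acids)."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE[dna[i:i + 3]]
        if aa is None:          # stop codon: end of the coded message
            break
        protein.append(aa)
    return "".join(protein)

print(translate("ATGTTTAAAGGATAA"))   # MFKG
```

The mapping is indirect and conventional: nothing in the chemistry of the letters "ATG" physically resembles methionine, which is the sense in which the representation is symbolic.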

8. Scaffolding models a. "Scaffolding" models of prelife (e.g., silicate/clay matrix models) constitute acceptable submissions as long as detailed and plausible hypothetical mechanisms, along with empirical correlation, are provided linking such inanimate crystal matrix models to "the arch" of current carbon-chemistry life. Such models would have to explain how random defects in crystal layers became arranged into functional genetic prescriptive information (instruction). The issue is not the medium or matrix that retains the information; the issue is the source of prescriptive information (instruction) itself in any medium or matrix of Nature. b. "Scaffolding" models would also have to explain how inanimate crystals, or the ions/radicals adsorbed onto them, progressively acquired the nine minimum functions and capabilities of living organisms listed under the provided definition of "life."

9. Biochemical correlation a. The hypothetical mechanism must demonstrate correspondence with "the real world" of biochemistry. b. The submission must provide adequate empirical support strongly suggesting that such a hypothetical scenario can take place naturally in a prebiotic environment. Simulation of abiogenesis must be independent of the factor of human intelligence that is so often subconsciously incorporated into computer hardware/software experimental design and simulation. c. Thermodynamic realities must be clearly addressed, including specific discussion of any supposed pockets of momentary exception to the Second Law of increasing Maxwell-Boltzmann-Gibbs entropy. The Foundation's view is that Prigogine's dissipative structures, and life itself, operate within the constraints of the Second Law. Maxwell-Boltzmann-Gibbs entropy must not be confused with statistical Shannon entropy or Kolmogorov-Chaitin-Solomonoff-Yockey "complexity"; the latter two are nonphysical, abstract, mathematical constructs. All physical matrices of prescriptive information retention, however, are subject to Maxwell-Boltzmann-Gibbs entropy: they manifest a tendency toward deterioration in both closed and open systems. Repair mechanisms for these messenger biomolecules therefore require all the more starting instructional integrity. Prescriptive information would have been necessary in any primordial life form's genome to correct for continuous noise corruption of its functional protogenes. Deterioration of an existing recipe in a physical matrix is far more probable than the spontaneous writing of new, conceptually complex metabolic algorithms. Building-block synthesis, for instance, would have required something like a reductive citric acid cycle, and there are no simple algorithms for integrating such a multistep, highly directional pathway. d. Empirical support does not have to be original research, but can be gleaned from existing scientific literature. Previously published empirical support must be described in detail and well referenced within the applicant's published research paper, explaining exactly how those controlled observations demonstrate empirical correlation with the applicant's theory.
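The distinction drawn in point (c) can be made concrete: Shannon entropy is a dimensionless statistic of symbol frequencies, computable for any string whatsoever, and carries no thermodynamic units (joules per kelvin) at all. A minimal sketch:

```python
import math
from collections import Counter

def shannon_entropy_bits(seq: str) -> float:
    """Statistical (Shannon) entropy of a symbol string, in bits per symbol.
    A purely mathematical quantity - not Maxwell-Boltzmann-Gibbs entropy."""
    n = len(seq)
    return sum(-(c / n) * math.log2(c / n) for c in Counter(seq).values())

print(shannon_entropy_bits("AAAAAAAA"))  # 0.0 - a constant string carries no statistical surprise
print(shannon_entropy_bits("ACGTACGT"))  # 2.0 - the maximum for a 4-symbol alphabet
```

Note that the measure is indifferent to function: a random string and a functional gene of the same symbol frequencies have identical Shannon entropy, which is precisely why the two kinds of "entropy" must not be conflated.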

10. "Design" anthropomorphisms It is easy to dismiss "apparent instruction" or "apparent design" as projections of human intelligence onto the data. Yet, based on current knowledge of molecular biology, any protocell or protobiont imaginable clearly must have manifested such "concept" to come to life long before any humans appeared on the scene to project or anthropomorphize anything. The Foundation believes that use of the words "order" and "complexity" is a grossly inadequate euphemism for the clearly observable prescriptive information, genetic instructions, biomessage, and functional biochemical pathways inherent in the simplest free-living organisms. The Foundation further believes that these empirical properties of genetic instruction and life can be investigated scientifically.

The simplest known living organisms are replete with empirical evidence of organizational unity and coherence that directs future biochemical events toward undeniable ends and purposes. Prokaryotes such as Archaea exhibit the integration of multiple biological systems into extraordinary organismic cooperation. The Foundation interprets such observable genetic instructions as undeniable empirical evidence of objectively existent "concept" independent of human mentation. The "unreasonable effectiveness of mathematics" in physics offers further evidence. Such conceptual capacity predated human intelligence in any evolutionary paradigm. A hypothetical mechanism is therefore needed for two aspects of life origin: • how seemingly unintelligent natural processes could have written such highly prescriptive recipe/message linguistic-like code, and • how such an indirect system could have arisen which effects (brings into existence) so many hundreds of integrated and far-removed phenotypic processes and products in the simplest organisms.

11. Appeals to unknown laws Appealing to unknown "laws" as the source of biological instruction constitutes a "category error" in logic. "Laws" do not cause anything; they are merely human generalizations, mental constructions, and mathematical descriptions of existing forces and mass-energy relationships. Even "chance" is a probabilistic rational construct. Neither chance nor "laws" cause effects. Unknown laws, therefore, cannot provide a mechanism for the genesis of prescriptive information (instruction). Appealing to unknown laws constitutes a "naturalism of the gaps," corresponding to supernaturalists' appeal to a "God of the gaps" for scientific explanation. Neither is acceptable in naturalistic science.

12. Infinity issues Appeals to multiple or "parallel" cosmoses or to an infinite number of cosmic "Big Bang/Crunch" oscillations as essential elements of proposed mechanisms are not acceptable in submissions due to a lack of empirical correlation and testability. Such beliefs are without hard physical evidence and must therefore be considered unfalsifiable, currently outside the methodology of scientific investigation to confirm or disprove, and therefore more mathematically theoretical and metaphysical than scientific in nature. Recent cosmological evidence also suggests insufficient mass for gravity to reverse continuing cosmic expansion. The best cosmological evidence thus far suggests the cosmos is finite rather than infinite in age.

13. Computerization a. Computerized models must be free of subtle, inherent teleological design flaws that become incorporated into the model itself. The insidious role of human intelligence in both hardware and software must be acknowledged, addressed, and somehow divorced from the hypothetical models themselves. b. Models based on conditional probabilities must justify empirically why the environment would have selected for each plateau along the way. For example, why would the environment have favored and preserved the intermediary steps in many metabolic pathways of archaea when no useful product is produced until the last step? Many of these multistep, indirect manufacturing pathways constitute all-or-none processes. Such biochemical pathways have no phenotypic "plateaus" in physical biochemical reality to support theoretical arguments of selectable incremental function. Yet the random occurrence of the full pathway as a whole is statistically prohibitive in a trillion billion years, let alone in the mere 15-billion-year age of our cosmos. c. Other factors limit the number of statistical trials available for an exploding cosmic egg to randomly give rise to such sophisticated pathways in 15 billion years. The finite number of nucleons available in the cosmos to react with one another (10^80?) is one example. We can no longer appeal to an infinity of particles, space, or time as a means of overcoming the statistical prohibitiveness within the only cosmos with which we have empirical experience. Abundant data, mathematical proofs, and our best theories all suggest that our cosmos is finite, not infinite. We have no scientific knowledge of any other cosmos, let alone an infinite number of imaginary cosmoses. d. Parallel computer models must similarly have direct empirical correlation with naturally occurring environmental, chemical, biochemical, and molecular biological scenarios.
"Directed evolution" experiments must not incorporate the artificial selection of investigators into their experimental design. "Directed evolution" is a self-contradictory phrase. Evolution by definition is never directed. Evolution in fact has no goal or purpose.
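The finite-resources reasoning in point (c) of section 13 is, at bottom, a back-of-envelope multiplication: particles available, times events per particle per second, times seconds elapsed. The sketch below uses the text's 10^80 nucleon estimate; the per-particle event rate of 10^45 per second is an assumed, deliberately generous figure (in the style of Dembski-type universal-bound arguments), not one taken from the rules:

```python
import math

# Illustrative assumed bounds, in the spirit of point (c) - not established figures:
particles      = 10 ** 80              # nucleons available (the text's estimate)
events_per_sec = 10 ** 45              # generous per-particle event rate (assumption)
age_seconds    = int(15e9 * 3.156e7)   # 15 billion years, in seconds (~4.7e17)

# Upper bound on the total number of physical trials the cosmos could host:
max_trials = particles * events_per_sec * age_seconds

print(f"upper bound on trials: ~10^{math.log10(max_trials):.0f}")
```

However the individual factors are chosen, the product remains a finite number, which is the only point the sketch is meant to make: any proposed mechanism must fit its required number of trials under such a bound rather than appeal to infinity.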