This article offers alternatives to Darwin’s theory of natural selection as a complete explanation of evolution. Two other possibilities are discussed, including a revision of Lamarckism. The argument offered is that while there is probably no direct, immediate mutative response to environmental changes, as in Lamarck’s model, a feedback mechanism might exist within the interactions of DNA, mRNA and protein synthesis vis-à-vis environmental shifts; such shifts induce organic stress and lead to a state of genomic agitation (i.e., negative feedback) that creates uncertainty in biochemical assembly and, over time, increases the probability of trait mutations.

Topsy-Turvy Naturalism

Sometimes it seems the idea of randomness as applied to adaptation is hard for even Darwinian adherents to conceptualize, possibly because it is non-deterministic and perhaps a bit alien to scientists. Darwinian advocates often allude to “organisms developing large molars to accommodate a vegetarian diet,” or perhaps “a coat color mutation creating a blend with tall grass to facilitate stealth and predation.” That random mutations could prove advantageous and be preserved by natural selection is not in question here. Undoubtedly that process occurs at some (trial and error) level of probability. Yet it seems incomplete.

Take tooth development. The notion that there is an implicit congruence between a vegetarian diet and having small canines and large molars implies several things that are sequentially confusing. One pertains to the question of whether certain creatures began eating plants, then evolved teeth more suitable to grinding or whether the mutation to smaller canines and enlarged molars came first, leading to a “decision” by the creature to shift to a plant diet. Since most creatures have brains, evolution can arguably never be separate from cognition – a point that will be revisited in the discussion on Neo-Lamarckism.

It raises other questions. For example, why would a creature change its dietary preference? If it never ate plants before the mutation, its taste buds, perceptual attractions, digestive tract, metabolism and behavioral instincts would have to change along with its teeth. If it did eat plants prior to the dental mutation, why the need for large molars and small canines? If the creature had subsisted as a herbivore prior to the mutation, what benefit would the tooth restructuring provide – bearing in mind that many primates with large canines feed primarily on plants? With regard to the efficiency of tooth structure, meat eating involves gulping morsels down as opposed to chewing, but one could do that with lettuce as easily as with a leg of zebra.

Another problem with Darwin’s theory is his notion of sexual selection. In choosing mates, females could certainly select some traits over others, but that selection process might run counter to evolutionary change; not just because the meshing of paternal and maternal genes tends to stabilize the gene pool but because females typically select males with traits that represent the current state of the species. For example, human females value language capacities in men, which happen to be unique to our species. Some female birds select males who sing well or those with elaborate plumage that exemplifies the best status quo traits of the species. In that sense sexual selection would seem to militate against evolutionary change. The females in any given species cannot be prescient enough to select mates with unusual traits that may or may not prove advantageous down the road. They do not mate by chance but by purpose, which would tend to skew the evolutionary process toward species norms.

Another problem lies in the fact that the biological world and all its creatures operate by a homeostatic process. When there is a disturbance in any physiological system, the tendency is for cybernetic (corrective) responses to restore stasis. The fact that there are regulatory genes in the genome and also communicative interactions between RNA and DNA regarding the appropriate synthesis of proteins suggests homeostasis operates at the biochemical level. That suggests a holistic process in gene alignment rather than peculiarities cropping up here and there that would be subject to nature’s scrutiny.
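The cybernetic, corrective process described above can be sketched in a few lines of code. The following toy loop is an illustrative analogy, not a physiological model; the set point, gain and temperature values are invented for demonstration. It shows how negative feedback drives a disturbed value back toward stasis:

```python
def restore_stasis(value, set_point, gain=0.5, steps=20):
    """Negative feedback: each step applies a corrective response
    proportional to the deviation from the set point."""
    history = [value]
    for _ in range(steps):
        error = set_point - value      # detect the disturbance
        value += gain * error          # corrective (cybernetic) response
        history.append(value)
    return history

# A body temperature knocked down to 34.0 returns toward a 37.0 set point.
trace = restore_stasis(value=34.0, set_point=37.0)
print(round(trace[-1], 3))
```

Each iteration corrects a fixed fraction of the remaining deviation, so the disturbance decays geometrically – the defining signature of a negative feedback loop.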

Finally, there is the problem of chronology. Whenever an opponent of natural selection argues that it is impossible for the order and complexity seen in all organisms to occur over time the counter argument is that life has been around for well over a billion years and that humans cannot conceive of such extended time spans – that they can’t impose anthropocentric judgments on evolutionary probabilities.

In some ways that argument is legitimate, except for one thing. Evolution does not happen smoothly “over time.” It often occurs in short bursts – in other words, it is punctuated. For most of those billion or so years, organisms did not change much at all. Then, in dramatic spurts, they did. That pattern is hard to explain unless one assumes some cause and effect relationship between relatively sudden changes in the environment and an increased rate of mutation.

Alternatives

Based on such snags in the theory of natural selection, there are alternative ideas that would not refute Darwin’s theory but might augment, or even surpass, it as a model. One, proposed by this author, is progressive encoding (DePaolo 2007). To explain this requires a look back at the origin of life. Many have grappled with the question of how life began on earth. Since even bacteria are incredibly complex, it is clear such systemic entities did not arise from spontaneous generation (Wald 1954), from a floating DNA molecule that out-competed other macromolecules that did not replicate (Muller 1966), or from some sort of proto-biotic protein that could build somatic structures and also reproduce (Fox 1977). One question raised by Shapiro (2006), who wrote a very insightful book on the subject of origins, was how a system comprising life – with its self-sustaining and reproductive capacities – cropped up in the first place.

Bits of Existence

Progressive encoding is based on Information Theory concepts. Without going too far off the subject, this theory can be whittled down to two main concepts. One element is noise, which is a super blend of elements without distinction, thus without information content. In simple terms, if every component in a whole is exactly the same there can be no identities within the structure, thus no capacity to separate one item from another. Nor could any one element, or the “super blended whole,” have any capacity for communication, because information transmission requires at least two separate signals.

The other component is information, or a code, which does feature distinctions that operate outside the overall “blend” and, while interacting with it, are not incorporated into the whole. As an illustration, consider a room full of people who all look exactly alike, have the exact same name, and walk and talk exactly alike. In such an instance there would be no personhood. On the other hand, if one person “broke loose from the pack,” found a way to talk and look differently and assigned himself a separate name, those distinctions would comprise three bits of information: one bit (or distinction) being unique language, a second bit pertaining to a distinct appearance, a third resulting from his having a unique name.

Information can be quantified. The general formula is that each resolution of noise or uncertainty (undoing of the blend) comprises one bit of information (Shannon & Weaver 1949; Pierce 1961; McEliece 2002). One interesting aspect of this scenario is that despite being differentiated from the others, the “information man” would still have to interact with them. If he remained isolated he would eventually become a monotonous system himself, with no distinction or information content. He would regress toward a “noisy” blend. Thus the influence of information dynamics in nature is sequential and mandatory. A component can break out of a uniform system and become informed/encoded, but to remain solvent it must interact with other components or systems that are in turn distinct from it, lest it lapse into an insular state of noise, i.e., entropy.
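The quantification the sources above describe can be sketched numerically. The snippet below is an illustrative aside (not drawn from the cited texts): it computes Shannon entropy in bits for the “room full of people” scenario, where an undifferentiated blend carries zero information and each resolved distinction adds a bit.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A "noisy" room: one undifferentiated state -> no information.
print(entropy_bits([1.0]))        # 0.0 bits

# Two equally likely identities -> one resolved distinction, one bit.
print(entropy_bits([0.5, 0.5]))   # 1.0 bit

# Eight fully distinct individuals -> three bits (2**3 == 8), matching
# the three distinctions (language, appearance, name) in the example.
print(entropy_bits([1/8] * 8))    # 3.0 bits
```

The pattern generalizes: each doubling of distinguishable states adds exactly one bit, which is why three distinctions suffice to single one person out of eight look-alikes.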

Yet that process has a major snag. Any system that expands and becomes more interactively variable will run the risk of chaos. In order to maintain the integrity of the overall system there must be interactive rules – that is, a governor. In the hypothetical social example discussed above the rules might revolve around a common language and perhaps rules of social probity. In the body the rule is homeostasis, i.e., an oversight process that recognizes errors in the overall system and can summon substrate organs to make readjustments regarding body temperature, blood flow, caloric count, metabolism, cellular maintenance, etc.

From People to Molecules

The proto-biotic molecules on earth, poised potentially to ratchet into life forms, were faced with a “noise” problem. Such molecules no doubt cropped up periodically in some sort of assimilable form, but due to disruptive lightning storms, water flow and extreme shifts in temperature between day and night they would break up and re-blend with the surrounding milieu. For life to evolve into a system with anchor-point structures and functions, a mechanism was required by which it could separate from its surroundings; specifically, a semi-permeable membrane. Membranes insulated cells from the tumult of the outer world and enabled them to function independently. The advent of the first semi-permeable membrane did not allow for complete independence but rather created a capacity to remain separate yet communicate with the environment, so the cell could absorb and discard energy and obtain and act upon information about the outside world. Membranes consist of lipids, which are fats. Fats provide an ideal insulation against water and other agents. With their incorporation into the cell structure came a new process of evolution, characterized not just by natural selection but by an increasing tendency toward organic insulation. First came membranes to insulate the cell proper. Then, with the advent of eukaryotic cells, came nuclei and other organelles to provide further layering between the inner and outer world. And as organisms proceeded to become larger and more complex came further insulation in the form of organ structures with specialized functions, such as the alimentary tract, lungs (or gills), muscles, hearts and brains.

The advent of separate and distinct organs led to an increase in bio-information content. An important aspect of information content (even in a biological context) is redundancy. By way of explanation, the main purpose of all biological systems is cellular integrity. Cells need nutrition (including oxygen) to survive. Having lungs to inspire, a heart and vessels to pump and transmit, and muscle to burn sugars and signal the need for replenishment creates a very efficient division of labor that insulates the organism against a failure in any one system and increases the odds of cellular survival. In short, the number of layers (distinctions/bits) in the body provides both increased insulation from the environment and a parallel increase in bio-information content. In that context the amount of information contained in any given organism delineates its separation quotient from the environment and correlates highly with adaptation and survival.
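The claim that layered redundancy insulates the organism against a failure in any one system can be illustrated with a toy reliability calculation. The failure probabilities below are invented for illustration only; the point is the multiplicative shrinkage when independent backup layers are stacked.

```python
def total_failure_probability(layer_failure_probs):
    """Probability that every independent backup layer fails at once."""
    p = 1.0
    for q in layer_failure_probs:
        p *= q   # independent failures multiply
    return p

# Hypothetical per-layer failure odds for keeping cells supplied
# (lungs, heart/vessels, muscular signaling) -- numbers are invented.
one_layer = total_failure_probability([0.1])
three_layers = total_failure_probability([0.1, 0.1, 0.1])
print(one_layer, three_layers)  # three layers cut total failure odds about 100-fold
```

Under this (idealized) independence assumption, each added layer is another “bit” of insulation: the system only fails when every layer fails simultaneously.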

Snags in Complexity

According to Progressive Encoding Theory, life did not merely emerge and evolve as a retroactive means of adapting to its environment but also as a means of avoiding environmental influence through the complexity of organic structural and functional insulation – in other words, via enhanced information content. The progressive encoding process provided organic stability, complexity, and inter-organic communication/cooperation, and also produced a template by which organisms could continue to get larger, more internally complex and polystable.

While speculative, this concept could be used to explain why mammals became homeothermic and why humans developed a brain capable of imaginative cognition. Being able to interact covertly through imagination, anticipatory thought and planning might have been just another step in the progressive encoding sequence that insulated us yet further against the influence of the outside world. This might have been a continuation of the rudimentary noise reduction process that created the initial separation of cell from environment, and it might help explain why human brain expansion enabled us to create the separate, non-experiential and insular worlds of art, science, empathic morality and politics. In other words, the same mechanism that made life’s onset possible was also responsible for the paintings in the Sistine Chapel.

Another Alternative

Even if progressive encoding can be viewed as a co-causal process in organic evolution, it is probably not the only complementary factor. This author is fond (in an ironic sort of way) of statements by dyed-in-the-wool scientists that either personify basic biological phenomena or state them in deterministic language. For example, Richard Dawkins (1976) and Carl Sagan (1980) both alluded to the idea of the “selfish gene”; the idea being that mere macromolecules are capable of instructing organisms on how to behave and think, all for purposes of maintaining the gene pool. The dictatorial qualities they assign to genes are interesting on many levels. First, it implies that an entity without mind is controlling entities with minds (raising the question of what brains are for in the first place). Second, it suggests that morality, social cooperation, sexual interest, even the use of deceptive behavior are driven by molecules; the rest of our bodies oblivious enactors of plans drawn up in the primordial soup several billion years ago. Perhaps we haven’t changed much after all.

At face value the selfish gene concept might seem dubious. Then again, haven’t biologists discovered the highly communicative interactions between mRNA, DNA and protein synthesis? Is it not the case that chemicals correct errors, set up complex chemical pairings on the double helix, perform editing functions via RNA interruptions and tell protein composites how and where to line up in gestation? All these well documented functions do exist. It turns out many of the decisions we consider cognitive occur in the smallest of contexts. That leads to discussion of a newly emerging concept of evolution based on the initially refuted theory of Jean Baptiste Lamarck.

The Origin of Theory

Jean Baptiste Lamarck was one of several early thinkers on the subject of evolution. His model, often referred to as “soft inheritance” or “use/disuse” theory, held that organisms evolved traits in response to environmental pressures. Unlike Darwin, he saw organic change as more purposeful than purely accidental. The key element in his theory was not that individual organisms could change in the face of environmental pressures but that such changes could be passed on to subsequent generations. One flaw in his model was the notion that evolution inevitably proceeds toward order – that there is an implicit (almost Platonic) drift toward organic perfection. That was a bit too anthropocentric for most scientists. The purposeful adaptation vs. random change distinction is important because it is well known that any given creature can alter its morphology in response to environmental changes. A thin person living in a cold climate can “fatten up” by eating certain foods and by becoming less active – two mechanisms for sustaining energy reserves. The real question is whether such changes can be carried over to new generations. In other words, will the person’s newfound girth and metabolic shift show up in the body type or metabolism of his offspring?

Early research seemed to disprove Lamarck’s theory, the most notable example being Weismann’s study (1889), which showed that mice whose tails were severed did not, over several generations, produce tailless offspring. However, in hindsight this and other studies seem suspect. For example, severing tails in an artificial, experimental context had nothing to do with extant environmental pressures occurring over time. The experiment did not include environmental pressures militating against the need for tails – as, for example, if long tails over time made the animal more susceptible to predation, i.e., easier to catch.

Weismann’s refutation prevailed in any event and natural selection remains a mainstay of evolution theory. On the other hand science never stands still and recent research has led to a modification of use/disuse theory in a new model supported by the idea of epigenesis.

Softer Inheritance

Based in part on questions regarding natural selection raised by Gauthier (1990) and others, Neo-Lamarckian theories have arisen, with roots in several areas of study. All of these models adhere to the notion that traits acquired in light of environmental changes can be passed on to subsequent generations. All are refutations of the germ plasm theory, which derives from natural selection and holds that the somatic experiences of one individual or generation will not register with the DNA and consequently are not heritable. Studies in the field of trans-generational epigenesis have shown that cellular and physiological traits that do not correlate with changes in the DNA sequence are heritable by daughter cells (Jablonka & Lamb 1995; Jablonka 2006). That would seem to challenge the idea of an exclusive connection between genes and mutations. One such study showed that altering the diet of mice with dietary supplements led to changes in expression of the Agouti gene, which is involved in color, weight and cancer proneness. Thus it seems generational changes can transfer to the traits of offspring even without changes in the genetic code.

Other studies have offered challenges to the natural selection model. For instance, the function of stem cells as macro-generators of more specific cells raises questions about the direct link between specific genes and inherited traits (Skinner 2015). In this instance changes were determined by stem cell generativity without a corresponding change in the DNA of specific traits. Offering still another challenge to natural selection are “prion” studies, which have shown that proteins can catalytically convert and reduce a protein’s activity, and that micro-RNA can cause a delay or disruption in the communication between messenger RNA and protein synthesis (Krakauer, Zanotto et al. 1998). In a sense epigenetics turns the entire concept of evolution upside down. It not only brings into question the legitimacy of natural selection theory but also offers an alternative mechanism for how traits are passed on.

Quite obviously, the simple notion that genetic mutation, superimposed on environmental change, determines which traits emerge and which organisms survive is a bit lacking as a complete explanation. Still, it is not a model easily abandoned, in part because of its simplicity. The question is: where does one find a good fit among the progressive encoding, epigenetic and natural selection models?

Feedback

A key element in evaluating evolutionary thinking lies in the concept of feedback. In a sense the epigenesis studies demonstrate that on some level a feedback/registration mechanism does exist between the outside world, the genes and the soma. It seems the genome (ancient and unmalleable as it might seem) is aware of an organism’s response in light of environmental pressures. Yet evolution is such a long term process that one must explain exactly what happens in that interaction; in other words, if genes can change in response to the environment, why would this occur only after millions of years, or in punctuated fashion during dramatic environmental shifts?

One possible resolution is to assume, in line with the progressive encoding theme, that environmental changes can, over time, cause genomic discord, featuring disturbances in the alignment process in the form of negative feedback between organs and genes, which leads to disturbances in the pristine structure of the genetic code. If the genome and soma do communicate with respect to prolonged hormonal, anatomical and physiological duress, might the genome begin to quaver a bit, producing noise in the system? Furthermore, might the noise go unresolved for long periods of time, perhaps exacerbated so much during environmental disasters such as glaciation or volcanic-induced shifts in terrain that the amount of noise increases to the point where genetic restructuring becomes more rapid? In other words, with greater genomic discord, do mutations resolve so much noise as to produce a leap forward of manifest traits by the thrust of emerging information content?
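The hypothesis sketched here – that accumulated genomic “noise” raises the probability of trait mutation – can be stated as a toy model. Everything below is an illustrative assumption, not established biology: stress adds noise each generation, the per-generation mutation probability grows with accumulated noise, and a mutation resolves the noise. The result is long quiescence under a stable environment and rapid, punctuated change under upheaval.

```python
import random

def simulate(generations, stress, noise_gain=0.01, seed=1):
    """Toy model: environmental stress accumulates as genomic 'noise';
    the chance of a trait mutation per generation scales with that noise.
    A mutation 'resolves' the uncertainty, resetting noise to zero."""
    random.seed(seed)
    noise = 0.0
    mutation_generations = []
    for g in range(generations):
        noise += noise_gain * stress           # stress feeds the noise level
        if random.random() < min(noise, 1.0):  # more noise -> likelier mutation
            mutation_generations.append(g)
            noise = 0.0                        # mutation resolves the noise
    return mutation_generations

calm = simulate(1000, stress=0.1)    # stable environment: sparse mutations
crisis = simulate(1000, stress=5.0)  # upheaval: punctuated bursts of change
print(len(calm), len(crisis))
```

Under these invented parameters the “crisis” run mutates far more often than the “calm” run, mimicking punctuated equilibrium; whether any real somato-genetic channel behaves this way is precisely the empirical question posed below.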

Looking at evolution in information terms allows for the implied purposefulness seen in epigenetic studies. In some sense this idea is reminiscent of Freud’s tension reduction theory, in that the organism can be viewed as a physical system governed by homeostasis. If feedback communications between genes and soma do exist, then the genes, “concerned” as they are with survival and propagation of the pool, would tend to process threats to that mandate.

One way to prove or disprove an information-based model of evolution would be through research; specifically, around the question of whether dramatic or prolonged environmental changes correlate with increased errors in genetic/molecular alignment, skewed reactions in mRNA, or a proliferation of discard genes that appear to have no influence on trait manifestation but can signify an increase in noise in the genomic system. If such a cause-effect tumult can be proved to exist, there might be a theoretical shift beyond the scope of natural selection and epigenesis toward an information-based model of evolution that assumes environmental shifts, increased organic duress and an increase in somato-genetic uncertainty can lead to an increased probability of evolutionary change. If so, then perhaps Heisenberg’s description of Information Theory as the “theory that decides” might prove accurate.

This article discusses a potential restructuring of the psyche in response to the increased volume of media, including the monitoring of behavior through various techno-media outlets. The argument is made that given current trends, the triadic id-ego-superego personality structure could regress in a maladaptive sequence of extreme inhibition, sociopathy and finally broad social apathy.

Sigmund Freud’s theory of personality was part biological and part social (Laplanche & Pontalis 1973). He proposed that the id was the bio-natural component of human nature that did not necessarily jibe with the demands of complex human society and was oblivious to phenomena such as ethics, mutuality and restraint. Being a neo-Darwinian, he saw this as an aspect of mind designed to aid in the survival and propagation of the human species. He also believed (somewhat ironically) that the id actually provided an ergonomic impetus for all psychic functions, including the conscience (Freud 1920; Freud 1933).

Conversely, in Civilization and its Discontents he viewed the ego and superego as social phenomena that only arose after agricultural society became more complex, sedentary and interactive (2002 reprint). Large, diverse groups of people from different backgrounds, united only by shared labor – tilling the soil, building walls, temples and pyramids, having to combine forces in war against rival armies – might not know or like one another, might resent the differences between their various gods, rituals and languages, but they still had to work together to sustain their proto-sovereignties.

While the id would not, as per its basic biological topography, change over time, the ego and superego did evolve in accord with socio-cultural changes. Behavior forbidden in one era might become not only tolerated but commonplace in another. Superego evolution is of particular interest, since it provides a moral framework for both the individual and society. As Freud wrote, it can be too rigid, for example when mores and the influence of conscience create such behavioral restraint as to stifle creativity and paralyze the functional psyche. Or it can be too lax, such that diffuse primal energies cannot be systematically extricated for socially productive pursuits (Freud 1987 reprint).

In fact the triadic mind operates by an algorithm of proportion. While laws, religions and ethics are created to maintain order, rescue the soul and preserve social probity, they all derive from a bio-social blend of instinct, vigilance and reason that provides not just order and spirituality but, as Freud suggested, marriage, art, politics, humor and music (Matte 2001; Glover 2014). Since the relation between the ego and superego will shift with changing times, one might expect the current (and future) proliferation of media to affect the relationship between the two.

Forward and Back…

There seems little question that a major shift in the psyche has paralleled the spread of liberalism in western society; a phenomenon not confined to the post-’60s peace-love-and-drugs cultural explosion. While anyone alive during that time is aware that in prior decades sex without marriage, ribald media, drug use and other liberal manifestations were taboo, societies had been veering in a liberal direction for centuries. Much of this was in response to cruel, punitive policies enacted by various social agencies and systems. In a limited context liberalism served humanity well. It also had a somewhat paradoxical effect on human culture. On one hand it led to disease (West 2008), substance abuse and social dependency (Rossiter 2014). On the other hand, newer forms of art, music and literature emerged. It is a theme recurrent throughout history: new freedoms create both cultural progress and social duress.

Arguably, in more recent times, there has been a contraction of liberalism as a result of increased media scrutiny. As more subtle and sophisticated gadgets have been developed to monitor, view and record behavior, the clouds of social rejection, defamation and ostracism have loomed. Having lived through an extended period of liberalism, man now faces an abrupt U-turn. Like the imminent rampage of converging tectonic plates, the previously liberated ego is about to collide with a newly aroused and dominant superego.
The end result might be manifest in various ways. One possibility is that acts of hyper-surveillance will be heavily reinforced by society and consequently be shaped into a fairly permanent, habitual behavior pattern. Just as B.F. Skinner’s pigeons obtained food-pellet rewards by pecking at a colored disc, so shall the ministers of surveillance be reinforced for pecking away at the reputations of others. Human behavioral dynamics tends to support that assumption.

The Fruits of Observation…

It seems our primate origins sometimes clash with our democratic principles. The very makeup of our genes suggests we are in fact a duality. While the notion of equality goes as far back as the reign of Darius II in Persia, and certainly gathered historical steam under the collective ideas of John Locke and Thomas Jefferson, we also have a fairly ingrained tendency toward social hierarchies. Some people are considered more beloved, more iconic (or less worthy) than others. That unfortunate polarity has persisted throughout human history.

An inevitable byproduct of a hierarchical mindset is competition, i.e., the quest to obtain more assets and social feedback than the other person. Any number of rewards, including fame, fortune and sexual access, can arise from that. So the person who seeks out and records bad acts by others can attain a degree of notoriety. Obviously, as per the First Amendment, some acts of surveillance are necessary to prevent tyranny and can serve society well. On the other hand, there is, as with all other aspects of human experience, a threshold beyond which a necessity can become a liability. Moving beyond that point could have psychiatric as well as social implications.

For example, it could lead to social isolation. The extreme inhibition resulting from fear of exposure would tend to manifest itself in idiosyncratic thoughts, self-socializing, excessive fantasy and, ultimately, social inadequacy. To the extent that social inadequacy can be a prelude to social misperception and conflict, it also augurs poorly for the future of human culture.

This is true with regard to both journalistic and social media. The latter is virtually risk-free regarding the expression of imprecise, often critical, at times incendiary language. That has significant implications. A “blogger” does not have to read facial expressions, gauge vocal tone or discern other signs of emotional expression on the Internet, which is a safe zone for hostility and deception. Hiding behind technology precludes retaliation. Being able to “trash talk” without consequences also reinforces negative expressive habits that in time can become entrenched as high-probability behaviors.

The problem with any technological advancement is that, no matter how progressive, it can never exceed the parameters of human nature. We are finite creatures with a substantial but circumscribed temperament and range of capacities, and we really can’t go beyond that without jeopardizing our social and political equanimity. The rapid development and enhancement of tools by which to observe one another cannot unfold without negative consequences. The blogger might avoid repercussions. Society will not.

The key in all this is whether the ultimate benefit outweighs the cost. We invented automobiles and airplanes that enable us to travel to all ends of the earth at breakneck speed. That led to expanding economies, greater access to medical facilities, and vast interaction among peoples from other lands. The fact that many thousands of people die each year as a result of these modern inventions does not deter our need and appreciation for them, because we know full well that the social, temporal, medical, vocational and political advantages outweigh the risk.

Secret Sins…

Another possibility also derives from a human duality. While the ego is described by Freud as a creative regulator of psychic energy, the personality has another feature, best captured by Gestalt theorists and Jungians: the former wrote about a bad me/good me dichotomy within the psyche (Perls 1957), the latter about a psychic component Jung referred to as “the shadow,” which he felt was the seat of creativity (Jung 1951). By their reckoning both aspects are necessary for holistic, integrative psychological functioning. This concept is a bit more concise than Freud’s concept of the id, but in both instances the existence of the primal aspect of human nature is deemed necessary for psychological functioning.

Yet not all negative impulses can be channeled into prosocial pursuits. Like miscellaneous (throw away) genes that have no apparent function, some id manifestations are fairly superficial. They have little overall effect on either social norms or creative pursuits. These are often manifest as mini-anti-social actions almost everyone engages in for a variety of reasons, including release, convenience, socialization, identification and need gratification. In other words we all exhibit behaviors derived from the “bad me” – regardless of how moral we claim to be. Speeding on a highway, expressing frustration privately to a friend using foul language, telling off-color jokes, engaging in “off limits” flirting at the office, ordering a fat-filled donut despite doctor’s orders: whether these acts serve any psycho-social benefit is uncertain but they do serve to assert one’s individuality in the face of social controls.

In that context, excessive surveillance can put too many restrictions on such behavior and lead to an unraveling of the collective psyche. The question becomes: why should even low-end anti-social behavior be tolerated?

One possibility is the quest for independence. We are, as a species, rather continuously tense and prone to both pessimistic and optimistic cognition. For a creature at once dependent on the security of the Hobbesian state yet absorbed by the quest for individuation, such acts separate us from the pack. Conformity is an act of compliance. As such it has no aggressive component and cannot resolve tension-induced conflict. On the other hand, marginal rebellion against rigid norms does precisely that.

One can extrapolate from that regarding the future of human psychological development. If such pockets of expression are necessary yet increasingly stifled by the proliferation of surveillance, a potential sea change in the psyche could occur. Some of those changes might prove to be problematic.

As discussed above, the fear of being monitored constantly will skew the psychic balance toward superego dominance. That in turn will produce behavioral inhibition, social withdrawal and a decline in creative risk taking. There will be a tendency to think more in terms of future consequences than in the present. In other words, remove impulse and you remove present-sense thought and action. In that context we could become slaves to social scrutiny, and thus be rendered incompetent at solving problems in the here-and-now. Parenting, education, artistic expression, financial exploits, relationships – virtually all human endeavors will accede to the mandates of a pathologically blown-up superego. But that is perhaps only the first step toward a more ominous outcome.

A Second Adaptation…

An analogy to Newton’s third law of motion might be applicable, i.e. that for every action there is an equal and opposite reaction. In behavioral terms, one might expect to see a massive rebound effect typified by the dis-inhibition of behaviors, emotions and attitudes. There is research to support that assumption (Keltner, Grenfeld et al. 2003). It portends a heightened rate of sociopathic behavior whereby we could conceivably become morally oblivious to one another. Allegiance will shift toward technology (which we will come to trust) and away from people (whom we won’t). Beyond that, new moral standards and restrictions on human behavior would be created that no one could possibly meet. That could lead to a broad intra-species disdain – a sense that we are all bad actors. Mass estrangement could result, which could exacerbate violent and/or amoral tendencies. However, it is possible the psychic transition would not end there.

Infolation…

A third possible step in the psychic sequence has to do with an information glut. Like money, the value of information will decline in proportion to its increasing volume. That is because, in Information Theory terms, more input creates more noise/uncertainty, and thus less pure, encodable information (McEliece 2002). The impact of info-volume could create a trend toward what sociologists have termed “anomie” – a widespread cultural disengagement devoid of meaning, passion and vigilance. Under such conditions there will be an increasing lack of distinction between moral and immoral, beauty and ugliness, quality and sloppiness, attitudinal focus and general apathy. The new breed of techno-sapiens will be tethered to nothing but love for the technology itself. The dreaded, unexpected consequence of hyper-information (surveillance and otherwise), which purports to make us more aware and better citizens, will have a reverse effect. Instead we will become desensitized to input: unmoved, unable to determine relevance or gain a sense of historical, cultural and personal time and place. It would be a final countdown to apathy, with far too many of us (previously) socially dependent creatures plagued by emotional futility and depression – upright-walking beasts unable to make their way through the grasslands of history for lack of an experiential anchor point.
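The claim that rising noise leaves less pure, encodable information can be stated in Shannon’s terms. The sketch below is illustrative only, not drawn from McEliece’s cited text; it uses the textbook binary symmetric channel, whose capacity 1 - H(p) shrinks to zero as the bit-flip probability p approaches one half:

```python
import math

def binary_entropy(p):
    """Shannon entropy, in bits, of a binary event with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity in bits per symbol of a binary symmetric channel
    whose noise flips each transmitted bit with probability p."""
    return 1.0 - binary_entropy(p)

# As noise rises, the fraction of each symbol that carries
# recoverable information shrinks monotonically toward zero.
for p in (0.0, 0.1, 0.3, 0.5):
    print(f"flip probability {p}: capacity {bsc_capacity(p):.3f} bits/symbol")
```

The same monotone decline holds for any symmetric channel, which is the formal sense in which sheer volume of noisy input yields diminishing usable information.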

Hopefully the possible events outlined here will not occur, but one can anticipate that some negative after-effects will result from hyper-information. Whether that means we will abandon humanism and self-determination in favor of a new and subservient technological idolatry is anyone’s guess.

Perls, F. (1957) Interview with Adelaide Bry in Gestalt Kritik, where Perls stated: “Resistance consists of an impulse and resistance to that impulse, considered like a dichotomy. Resistance is frequently referred to as ‘bad,’ and in the context of regulation grows into just personal dictates of the client and not the therapist. Taken as a polarity it is as integral to health as the trait being resisted.”

This article discusses an adaptive cognitive mechanism, applicable in therapeutic settings, whereby the client/patient learns to make experiential discriminations based on the alternating need for broad/encompassing vs. narrowly focused decisions in adapting to psychological circumstances and maintaining emotional stability via an intact, functional ego. This involves a conceptually simple self-regulatory process and a modification of the classic definition of the ego.

The Ego: Defined and Simplified

The Freudian concept of the ego is complex, inasmuch as it involves the orchestration of experiences, emotions and actions across a broad range of circumstances and social norms (Henriques, 2013). In its essence, the ego is presumed to be a moderator between the instinctive component of mind, which worked in the wild and indeed enabled our species to survive, and the conscience-driven mind, which adheres to the broad moral and ethical concepts inherent in increasingly complex human societies.

The psychoanalytic definition of the ego is as a hierarchical mechanism that can override both excessive need gratification and excessive guilt in the pursuit of external and internal equanimity. The ego must see, hear and feel the totality of human experience, including the rewards for abiding by rules and the repercussions involved in not doing so. It involves a balance between self and others, between immediate and long-term moralities, and at times it must operate by proportion – e.g., is it wrong to steal food from a grocery store if doing so is the only way to feed one’s family?

Because it involves so many variables, the ego is hard to translate into a therapeutic teaching tool. Perhaps that is why neo-Freudian ego therapies have been so reductionist – for example, basic exercises in logic (Wenzel, Brown et al. 2011). The latest revisions are seen in cognitive-behavior therapy, which uses consequential logic to counteract irrational thoughts and actions. Although this method has yielded positive results, it lacks a definitive structure and lends itself to various interpretations of what is “rational” and what is not. In some instances it also neglects to include existential aspects of the client’s experience.

For example, it seems logical to suggest to a client that overeating leads to health problems, poor self-image and social rejection. Yet while eating, the obese client is actually solving a problem – for example, alleviating anxiety and prompting the release of endorphins that can overcome temporary bouts of depression.

Co-considerations of the Ego

It is possible to introduce a slightly different concept of the ego without diluting its importance, but to do so requires some discussion of the polarities involved in this aspect of mind. Ordinarily, newly trained clinicians are taught that the healthy ego facilitates patience and weighs all the ramifications and repercussions of experience before acting. It might best be captured in the phrase: all things come to him who waits.

The problem with that model is that waiting does not always equate with a positive outcome. In a competitive society, it is often the impulsive, aggressive and self-centered person who wins. Not only does he obtain material rewards but also sexual opportunities and other benefits. In other words, it is conceivable that in some instances an ego-dominant personality might be maladaptive – or at least non-productive. That is in part because, while Freud assumed human society transcends the primal world, many aspects of the wild remain in play. Competition, aggression, jealousy, hoarding and tribal hostility have never really been erased from human experience. While we discourage these behaviors we also know they have a place when it comes to survival.

With regard to impulse-driven success, it could be argued that money and success do not equate with mental health. Yet that argument can be refuted. For instance, it has been shown that a high correlation between behavior and positive feedback (outcomes) does have bearing on one’s sense of hope and behavioral resilience on the one hand, and on susceptibility to depression on the other (Zimmerman, 1990). Clearly even an impulsive connection between action and reward can be conducive to feelings of pleasure and equanimity.

A second consideration involves the relationship between the ego and superego – specifically the question of whether, and to what degree, inhibition and guilt are good or bad for the person. To wit: is it possible that in some circumstances a higher morality can be self-destructive?

In either case the need for psychic balance is obvious. However, given its sheer complexity, the id-ego-superego psychic system seems to require more than logic for a true, useful and functional understanding by both clients and therapists. For that reason it might be helpful to consider an alternative.

Lights, Camera, Stability

All of the above ego-related factors can be encompassed in the word perspective. That word has a broad meaning, and it is difficult to apply in the ever-changing life of a client because it has little cue value. However, it is possible to use a concise cognitive cue and teaching piece to help clients cope with duress and improve the chance of self-actualization.

Signal Shifting

The cue referred to here is derived from the simple mechanics of a video camera – the old-fashioned kind with a knob to zoom in and out depending on the need for varying visual perspectives. In some sense a zoom mechanism is consonant with how the human brain works (Leukowicz, Ghadanfar 2009). We alternately narrow and broaden neuro-experiential searches depending on circumstances (Spector, Maurer 2009). In fact it has been argued that one function of emotion is to create a narrow focus, because when faced with fight-or-flight exigencies we must act quickly and focally (Platonic contemplation does not often work when one is chased up a tree by a leopard).

Much of human experience (and for that matter the history of clinical psychiatry) revolves around the mental dexterity involved in shifting between broad and narrow concerns. Knowing when to block out superego-driven psychic noise in adhering to focal goals – in order to problem-solve and instate a hope-inspiring response/reinforcement relationship – vs. when to await longer-term considerations is quintessentially important. There is nothing new in this assertion, as it permeates the writing of Jung, via his polarities theme (Sandford, 1980), Adler, as per his social interest concept (Watts, 2003), and Erikson, with his stage-oriented description of perspective (Gross, 1987). As discussed earlier, even behaviorists have demonstrated the importance and benefit of language as a mediating mechanism that reinforces secondarily while also enabling the person to postpone reward attainment (Boersman, 1966).

The zoom lens cueing system is really a mechanism to prompt effective “cognitive switching.” It teaches the client how to apportion immediacy and perspective on the notion that both can be appropriate and necessary in sustaining mental health. Such a simple concept might help clients switch between these two mindsets via cognitive streamlining, avoid the muddle-headed vernacular of psychotherapy, and serve clients with varying levels of cognitive ability across a wide variety of circumstances.

Methodology

All forms of counseling are arguably didactic. Even client-centered and psychoanalytic methods involve the teaching of logical thought patterns, self-expansion and other habits deemed necessary in maintaining mental health. However, not all therapies use a concrete cueing method to fortify the ego. In fact, at the end of most successful therapeutic forays clients are hard put to describe the formulas that “worked” to get themselves on track. Even the rational therapies tend toward dialectic interactions whereby clients’ irrational thought patterns are challenged in various contexts. No single cue or concept with the potential to govern adaptation across circumstances is typically taught. That might be why even many therapists cannot describe how or why their clients improved. In that sense a visual, concrete teaching model based on a “zoom lens” adaptive cue might be effective.

Application of a “zoom cue” might be exemplified in the following way…

Having discussed the nature and origins of stress leading to the referral as well as the ultimate desired outcome, the client/counselor can eventually gravitate toward discussions of whether “resolving your problem requires turning the zoom up or down.” For under-assertive, anxiety-prone clients the development of narrow, goal-oriented “close up zooming” might be the didactic focus. For impulsive, aggressive clients with depleted ego functions focusing on a distal view might be more therapeutic.

Ultimately all clients could benefit from being able to assess when the narrow vs. broad zooming is most beneficial. It would entail an elasticizing of the ego to serve both self and society- as Freud ultimately intended.

Clinical Considerations

It might seem a bit simplistic to state that overly narrow and/or broad mindsets are at the core of psychopathology. Yet a glance at the bipolar nature of psychological disorders does lend support to that idea. Aside from the psychoses – which are by now viewed as having biological causation – a great many of the diagnostic categories in DSM IV bear some relationship to either narrow or broad psychological thinking (Payne, Hirst, 1957). One type involves the narrow (id-dominant) style; for example explosive disorders, borderline personality disorders, anti-social personality disorders and narcissistic personality disorders. In those instances, self-concern and the exclusion of broader considerations tend to produce not only pathologies but antisocial behavior patterns with negative outcomes, such as drug addiction, incarceration etc. Meanwhile the anxiety disorders are often accompanied by overly broad, inhibitory patterns, whereby the person is virtually blocked in his attempt to meet needs due to fear of retribution, rejection and guilt. In such cases the person is not only psychically immobilized but also rendered incapable of being “narrowly selfish” enough to meet his needs, self-actualize and develop an enduring sense of hope and emotional resilience.

One of the most significant byproducts of over-inclusive (distal-zoom) cognition is depression. The inhibition and extreme restraint that result from overly broad thinking carry with them a number of secondary effects. In not being able to meet needs and self-actualize, the person with an excessively distal zoom perspective can end up with a sense of futility, hopelessness and self-deprecation. In psychobiological terms this can exacerbate feelings of depression because, just as an adequate relationship between behavior and positive feedback leads to the production of catecholamines (pleasure-related neurotransmitters) (Guerra, Silva 2010), so too could an absence of behavioral success create an opposite effect. In that sense there is an obvious connection among hope, chemistry and the “cognitive zoom.”

Cognitive therapies address this issue in broad terms, mostly with regard to the client’s specific experiences on a moment-by-moment basis. And there are templates used in some forms of this method – as for example the phrasing employed by Ellis in Rational-Emotive Therapy (Ellis, Dryden, 2007). However, the use of a zoom lens cue as a self-regulatory guide offers an economic, heuristic mechanism with which to elasticize the ego and insulate clients against self-destructive thought, emotion and behavior patterns and various pathologies.

Learning and Simplicity

A relevant question to ask is whether a simple cognitive cue is sufficient. One possible answer lies in the nature of learning itself. Educators have historically (and effectively) used concise, poly-applicable symbols and cues to enhance both learning and memory: for example the alphabet song, the grammatical rule “i before e except after c,” and various math formulas such as the Pythagorean theorem. Being able to whittle down information into a single rule or concept does improve learning, and to the extent that counseling involves learning, such a mechanism might help improve clients’ mental resilience.

The Counseling Process

As with all therapeutic interventions, the use of a “formula” can never replace the necessary preliminary features of counseling such as empathy, relationship building, gathering of background information etc. However at some point a zoom cue model could be effective, with the counselor perhaps asking questions like the following…

1. Do you believe it is sometimes okay to look out for your own interests – even if that might seem selfish?

2. Do you believe that it is sometimes appropriate to delay pleasures because a greater good can come from patience and perspective?

3. If both of those statements are true, does it not follow that life involves a proportion between the concerns of self and broader concerns, between immediate goal seeking and delaying immediate pleasures in favor of greater rewards down the road?

4. Do you think therefore that mental health means figuring out how and when to employ one or the other?

5. If so, we now have our “golden mean” – our ideal proportion. So let’s now review what’s happening in your life and see if we can figure out when to narrow the zoom or widen it, as well as the reasons for those decisions.

It could be argued that the above model runs the risk of oversimplifying life, especially since human experience entails so many gray areas. Yet psychotherapy need not address every need or set of circumstances. It is not a religion. What it can do is provide a learning tool useful in dealing with occasional duress, ameliorate the oftentimes excruciating juggling act between self-actualization and social responsibility, and aid in the lifelong quest for psychological equanimity and social adaptation.

This article discusses the therapeutic effect of language in an evolutionary context. The point is made that while the evolution of human language served a number of social/interactive, analytical and action-planning purposes, a parallel benefit accrued as well: enhanced capacities for self-control as insulation against environmental dangers and setbacks, and an ability to use internal deception to provide psychic endurance past the point of apparent helplessness.

Parameters

The origin of human language is a very complicated topic. It is one reason why some insist that man in some ways transcends the rest of the animal kingdom (Ptolemy 2009). Indeed a creationist might refer to human language development as proof of a higher power, or at the very least a super-organizational entity.

A glance at the language development of a young child provides grist for the mill. The infant begins with a vast number of loosely connected neurons in its brain, which are barely able to coordinate basic motor skills for the first several years. In that same time frame the child acquires, exponentially, a capacity to think and speak in terms of past, present and future.

This is particularly interesting because the child’s brain is developing on several fronts during the first few years. Perceptual circuits, motor circuits, emotional (mid-brain) wiring, and the arrangement of vertical and horizontal pathways in the cerebral cortex are also developing – and must, in order for cognition (which depends on such experiential content) to expand. Yet while the child does not go from taking his first steps to running a 9.5-second hundred-meter dash in three years, his language skills broaden at an astonishingly rapid pace. His brain development is skewed in the direction of language acquisition. One can speculate about how this ties in to human evolution.

The Anatomy of Pleasure

Some aspects of human language evolution were predictable due to the emergence of opportunistic human anatomy. For example, the lowering of the hyoid bone in the throat enabled our ancestors to extend vowel sounds. This would have enhanced their phonetic variety and provided several advantages. One would have been esthetic.

Due to enhanced breath control, the capacity to sustain and vary vowel sounds could have produced an auditory end product (including song) attractive not only to members of the opposite sex but to all members of the social group. The reason for this assertion derives from a common theory of esthetics, stated at various times by St. Augustine, Descartes, and in more recent times by Berlyne (1960, 1975) and Zeki (2009). The theory states that any stimulus that includes both familiarity and novelty will be perceived as pleasurable, and that the pleasure could take any form or serve any purpose, including music, inspiration, seduction, even humor.

The underlying reason has to do with the two aspects of the pleasure response – search and closure. Pleasure is not a free experience. It requires some degree of perceptual work (Freud, 1928). One must earn it by converting irresolution into resolution. Faced with a totally unfamiliar stimulus complex, the perceiver will tend to withdraw from it because he cannot “wrap his brain” around the input. By the same token, encountering a completely familiar stimulus complex will also lead to avoidance because no perceptual work is required. Any experience that precludes a successful search cannot result in resolution, i.e. closure. In that context, the ideal “pleasure proportion,” featuring a workable, resolvable combination of sameness and novelty, would have accompanied human expression from the outset.

The First Crooners

Many believe descent of the hyoid bone first appeared with Neanderthal, though it seems unlikely that, even if Homo erectus’ hyoid position precluded extended breath control and vowelization, his cognitive abilities could have developed without some sort of linguistic impetus (bearing in mind that it is also possible to speak without vowels, as exemplified by the click language of the !Kung people in Africa).

Whether or not hyoid descent was confined to Neanderthal and Sapiens is not certain, but considering the Darwinian concept of conversion – whereby a trait initially favoring one skill is eventually used for others – it seems likely that early versions of both species quickly learned to profit from this newfound oral capability. It is also likely that this capacity was refined over time alongside its physiological correlate, upright walking (bipedalism allows for such changes in bodily structures, including the spinal cord, digestive tract and voice box). This trait was obviously passed on to our own species.

The Expansive Song List

As his phonetic breadth increased, early Homo sapiens’ expressive skills became more finely orchestrated. That would have enabled him to learn search-and-closure expressive sound patterns that were attractive to listeners. Considering man’s primate-consonant penchant for mimicry, this pleasurable communicative trick would have caught on. Early man would have created a new, dual aspect of experience, combining pleasure with object, action and descriptive labels that could be applied to experiences and people.

At that point the stage was set for a new version of “alpha,” who gained his or her influence not by the typical primate criteria of physical prowess and dominance but by esthetic expression. Over time, as human society expanded and roles proliferated, the expressive alpha tactic filtered down through the ranks. Musicians, prophets, healers, ministers, writers, poets, minstrels, actors, politicians, mind healers and eventually psychotherapists were included in the neo-paradigm, each able to establish themselves in the hierarchy by word as well as deed.

In terms of that scenario one could argue that the benefits derived from being an “esthetic communicator” would have been enough to set human speech on its developmental path. Unfortunately, the analytic complexity of human language makes that doubtful. Running parallel to the esthetic factor is the element of meaning. In order for the listener to respond to the esthetic value of language, he must understand the concepts and vocabulary of the statements. In terms of our own sociobiological evolution, that required an additional capacity within the brain: to gauge the level of intelligence and overall cognitive demeanor of the listener, to process his facial expressions, body posture and movement trends in distinguishing approval/comprehension from confusion/frustration, and to adjust the message accordingly.

The Evolution of Empathy

With regard to how, in the course of evolution, Homo sapiens developed a capacity to read the reactions of others, one possibility comes to mind. The brain of a primate is large relative to body size, but while that allows for a variety of perceptual, motor and emotionally induced behavior patterns, the most basic function of the primate brain might lie in a capacity to imitate.

Contrary to the oft-stated notion that the “brain is too complex for humans to understand” (Doglas 2006), one can reasonably view the brain as a kind of copy machine. If not for that foundation, long-term memory would be a pipe dream and our emotional depth would be restricted to momentary fight/flight experiences. When inputs impinge on the brain, it appears the brain’s neuronal patterns replicate the energy signature patterns of the outside source in isomorphic fashion (Lehar, 1999). For example, the visual energy signals of a tiger are replicated as if the color and dimension of the tiger were housed in the perceiver’s brain. Over time various associations can be developed around that neural process, but in the first instance the brain copies what it sees, hears and feels.

Since imitation is conceivably neuro-primal, one can assume that a person interacting with another is in some sense incorporating that person into a neural scheme via a representation process, e.g. “The other person, place or thing becomes me.” This notion is supported by studies showing a tendency for people to mimic the patterns of others unconsciously. For example, the menstrual cycles of females living together tend toward temporal synchrony, as per the so-called McClintock effect (Arden, Dye et al. 1999). Also, the emotional arousal levels of one person will tend to rise and fall in sync with those of the person with whom he is interacting (Kovalinka, Xyglatas et al. 2011). This is one reason why social interaction can be so stressful to therapists, or for that matter anyone involved in a commiserative exchange.

However another question arises; specifically what can be inferred about how internal mental processes such as imitation, empathy and communicative esthetics play out in evolutionary terms?

The Ice Age

The climate was in an extreme cooling phase during the first human occupation of Europe and what is now called the Middle East. Living on frozen tundra had enormous ramifications for survival. In order to understand this, one must look beyond traditional paleo-anthropological concepts toward one more psycho-anthropological. This is not to minimize the challenges of the physical environment. For example, with a scarcity of plants, the Neanderthals and Cro-Magnons had to rely more on a diet of meat. That altered their behavior patterns in a quite natural way. Also, most of the game animals at the time were massive, and with only Acheulean axes and perhaps spears, early man had to refine his hunting and planning skills.

There was a problem inherent in that. Early man had to work extremely hard and long to carry on these and other arduous, lengthy tasks, yet cold weather has a dampening effect on hormones, activation level, metabolism, mood and motivation, especially if a species migrates from the African savanna to western Europe without sufficient time for physiological adaptation (Leppauoto, Hassi 1991). In modern times we recognize that seasonal shifts from fall to winter are often accompanied by psychopathologies. While naturalists typically distinguish mammals from reptiles based on our constant body temperature and their lack thereof, mammals also experience downward physiological shifts in cold weather. Indeed, the fact that human metabolism slows down considerably in winter and speeds up in spring is one reason why suicides tend to be more frequent in the crossover between the two seasons. The transition from a dormant season to an active season can be physiologically as well as emotionally overwhelming for depression-prone individuals.

Carrying out the strenuous activities needed for survival in a cold climate (bearing in mind that there was no techno-culture to offer support) required a reliable, internally driven mechanism by which early humans could summon and sustain activity levels when nature created an opposite effect. In simple terms, it required the development and implementation of an energy-regulating mechanism – the psyche.

The emergence – one hesitates to say “evolution” – of such an ergonomic aspect of mind was probably less than completely adaptive because, as both Freud and Maxwell suggested, energy is a neutral phenomenon that can create or destroy. The new aspect of mind was more complex and vagarious than, say, upright walking or tool use, and had to be channeled properly to advance the cause of survival.

An Adaptive Paradox

It is generally assumed that Neanderthal had a sense of his own mortality, as per his habit of burying the dead (Than 2013). That speaks to his psychological sophistication. Yet in some ways mourning is an irrational behavior. The deceased is no longer functional in the group, since his absence has no subsequent influence on survival and adaptation. Moreover, since man was and is quite familiar with his own mortality, there is no logical reason to feel emotionally “robbed” by the death of a family member or colleague. Yet death is a devastating experience, largely because we internalize it beyond mere logic: “How can I bring him back – through memory and legend?”…”What if I’m next to go?”…

Such contemplative, emotional largesse, combined with a cold climate, could have led to feelings of helplessness, futility and depression – a double whammy – if not for an adaptive conversion that modern clinicians will undoubtedly recognize.

The Self and the Artist

The adaptive mechanisms enabling early man to overcome these obstacles were likely in the language domain – not just communication but three other forms that create a capacity for self-deception: art, self-talk and self-regulation. With respect to art, having an internal language capacity would have led to an ability to imitate and represent the outside world. In his cave paintings early man could depict the various animals around him and create symbols of people within his social group. He might even come up with the idea of an alpha even higher in rank than those in his tribe, who could not only rescue him from his obscurity within the group but orchestrate all of nature – a god.

The artistic depictions at Lascaux and Altamira might have recaptured life, but perhaps only due to a prior skill set. In order to find his inspiration the artist had to preplan as to subject, background, color scheme etc. In so doing he would have begun with a self-talk exercise (not necessarily overt but perhaps covert) whereby his expanding frontal lobes vacuumed associations from Broca’s area in the frontal lobe, so that the vast inhibitory circuits of the former could whittle language down into an inhibitory, fractional version of speech resulting in inner contemplation.

Through that guiding process, size, angulation and action depictions could be altered. All aspects of life could be restructured so that victories could be imagined, dangers avoided, animal spirits created from scratch, and joyful outcomes conjured up in the absence of direct experience or concrete possibilities.

The benefits of this new, internal capacity would have proliferated as a result of social influence and mimicry. It would have facilitated artistic expression and, through its capacity to re-shape experience, enabled our ancestors to summon and sustain arousal, motivation and behavioral activity in a sparse and hostile terrain.

Up to that point cruel nature had decided which creatures could carry on and which would fade into oblivion. Then man came along and was able to outwork, and to an extent circumvent, its whim.

From the Cave to the Couch

Any psychotherapist can see the connection of the above elements to the present-day use of defense mechanisms in providing emotional equanimity during difficult times. Homo sapiens appears to have embarked on a somewhat ironic evolutionary path. In taking a ramp off the road of natural selection we developed a secondary, alternative and internal world in which we can distort experience at will in accord with our needs and frailties. Darwin’s theory of natural selection holds that organisms survive through random mutations that over time are selected or rejected, based on their juxtaposition with the outside world. That rule was overturned to an extent with the emergence of Homo sapiens. Indeed one could argue that we have actually been able to separate ourselves from the outside world and that human evolution offers a partial exception to the Darwinian rule. Does this mean we are transcendent, or as Darwin and Huxley surmised, merely a biological after-effect arising from the initiative of mindless nature? Such a question is highly philosophical, maybe even moot. Yet it is difficult to refute the notion that the rise of mankind was marked by an ongoing competition between nature and the imagination.

This article discusses the dynamics of human aggression and specifically the proclivity toward killing strangers, via the mass murder scenario. The point is made that a package of traits involving perceived estrangement, pseudo-familiarity toward the victims, a nihilistic outlook (depression), obsession and ego-dysfunction combines to increase the probability of such an act.

Genes and the Ego…

If one asks a liberal democrat why people engage in mass murder he or she will typically say it has to do with the availability of guns in American society. If one asks a conservative the same question, he or she will likely say it has to do with mental illness and that the mere presence of guns does not correlate with the act of committing mass murder – any more than watching violent movies automatically foments violent behavior. If one were to conduct a statistical correlation study of the number and/or percentage of gun owners vs. the number and percentage of people who use guns to commit mass murder, the correlation would be so low as to be statistically insignificant (Peterson, 2014; Kennedy, Skeem et al., 2012). The same would be true with regard to people diagnosed with mental illness, including psychotic disorders. In fact it takes a fairly organized mind to plan such an act, as well as an intact enough reality orientation to select targets, obtain weapons, and plan on time, place etc.

Of course guns are used to kill, and some mass murderers have been presumed to have mental disorders. Yet developing a predictive model by which to identify potential mass murderers would seem to require more than a liberal or conservative argument and perhaps encompass more than gun ownership or severe mental illness. That raises a question. To wit, if neither mental illness nor gun ownership is a necessary and sufficient antecedent, where does one look to find a preventative model?

Genes and Detachment…

A number of biologists have surmised that embedded in virtually all organisms is a disposition to behave altruistically toward those within the same gene pool, and/or those with whom one has familiarity on a regular basis (Thompson, Hurd et al., 2013; Wendell, 2013) – especially if the relationship involves cooperative, mutually beneficial interactions. On the other hand it seems that as the genetic and social ties drift off into dissimilarity and non-mutuality the potential for aggression increases. It isn’t necessarily that our genes foment hate based on the differences between organism A and organism B. It’s just that with differences (real or perceived) the protective instinct dissipates. With respect to the psychology of Homo sapiens it is left to the individual to fill in the gaps. Various outcomes can arise from that sense of estrangement.

As the most social of the primates we are extremely dependent on social interaction. Various studies on public speaking and other performance-related experiences, as exemplified by a survey conducted by the Self Help Collective (2014), suggest humans fear ostracism more than death. When children are neglected or ignored in early development they not only become un-socialized but resentful and aggressive.

In a sense the necessary ties we have with one another comprise our greatest strength; giving us hospitals, caring parents, mentors, supportive agencies, even clerics willing to hear about our sins, trials and tribulations and offer solace. However when that “social compact” (i.e. the expectation of nurturance and reciprocity) is violated it is often perceived as betrayal. That can lead to retaliation.

If detachment-fueled revenge were all there is to it, prediction and prevention would be fairly easy and, rather ominously, there would be more potential mass murderers. Obviously there is more to it than that. In fact another, paradoxical and dual psychic component seems prerequisite. One part consists of a self-perceived closeness to the target group or individual, because it is difficult to hate, let alone kill, people who are inconsequential in one’s life. In order to develop the essential mindset enabling one to pass beyond the regulatory barriers of ego and superego the avenger must categorize his victims to a point of utter familiarity. A second part (comprising the paradox) involves the perception that the targets are alien and distressingly unrecognizable.

It is a paradigm adopted by Adolf Hitler, who first espoused clear separation between the “Aryan race” and virtually all others, then assigned distinctive traits to his targets (Jews and others), turning them into familiar entities. Recognition – separation: these elements can set the stage for ostensibly any anti-social act, including mass murder.

Such a dynamic can be seen in what might be called estrangement murders as well. Mark David Chapman (the murderer of John Lennon) is a prime example, as is Ted Bundy, who targeted women who resembled a woman who had rejected him prior to his rampage yet who were in fact complete strangers. The dynamic has also typified the onset of wars throughout human history, whereby specific traits and distinctions were assigned to enemies – by race, religion, ethnicity – thus releasing the demons within our species to freely kill and maim.

The Freedom of Psychic Rigidity…

Thus far, the elements of estrangement and quasi-familiarity have been viewed as requisite psychological components. In fact there are other variables to consider. One is considered here to be a kind of “theory of everything” with respect to anti-social behavior – the obsession. In order to discuss this it might be best to begin with Freud’s concept of the ego. This notion is a bit general but can be said to correspond to the self-awareness, linguistic, planning and regulatory functions of the prefrontal cortical lobes. Functionally the ego is designed to prevent extremes. It is a psychic clearing house, an arbiter of reason invariably weighing factors such as risk, reward, self-image congruence, social ramifications and emotional repercussions in the process of behavioral selection (Strachey, 1990). If it can be bypassed, all those modulating capabilities can go by the boards, making each of us capable of extreme behavior patterns. With a depleted ego man can indeed potentially devolve into a “killer ape.”

The intact ego is pan-influential, so most of us operate via multiple faculties, weaving cognition, emotion, perception, memory and behavior into socially acceptable patterns. That’s why most people with guns do not murder and why most viewers of violent movies do not go out and commit violent crimes. The ego filter is a greater deterrent to sociopathy than gun laws or artistic restrictions. It provides a pro-social tether that can rein in aggression so we can think beyond the moment. On the other hand the tether can be severed by obsessive-compulsive features.

Overriding the ego’s considerable breadth and influence requires extraordinary force. An emotion-induced obsession suits that purpose very well. But it must be more than an episodic occurrence produced, for example, by situational or “state” anxiety. It must be a fairly ingrained trait, created either by innate temperament, doctrinaire conversion, trauma or neurologically induced rigidity – the latter of which can produce hyperactive, perseverative tendencies in the brain, forcing it to operate on a single track to the exclusion of peripheral inputs and sensations.

Language and Tunnel Vision…

As Simon (1967) demonstrated, emotional states tend to create a narrow focus. This trait is adaptive in nature but often maladaptive in society. Being pursued by a predator certainly requires a singular focus on escape. On the other hand many social situations require subtlety even in emotional circumstances. Thus the human animal must be able to modulate what is essentially an adaptive mandate. The only way for this to occur is through the self-regulatory capacities of language. The ego is ultimately a language medium. Expressive skills are not enough; two other linguistic skills are needed to rein in destructive emotional behavior. One is a capability (honed through practice) to talk one’s self through duress. The other is a set of experiences whereby, through interactions with others, a sense of consensual reality is developed so that the individual can “think as others think” and thereby preclude extreme thoughts and feelings.

All of those skills are ego-driven as Freud suggested, thus language habits (particularly a capacity for self-reflection and regular opportunities to “check in” with others in conversation) are a critical component in deterring the urge to kill.

Depression…

Seligman (1975) wrote about depression as being typically accompanied by a sense of helplessness, i.e. the perception that one’s behavior cannot resolve disparities between needs and frustrations. While depression might seem a harmless emotional state – at least in its impact on others – it is in fact quite dangerous. Running out of solutions leads to nihilism, i.e. a belief that nothing matters. That leads to a ‘nothing to lose’ mindset which can not only remove the fear of retribution but make it seem somehow attractive.

Aggression…

Thus far several variables have been discussed with respect to the psychological makeup of a mass murderer. One obvious feature – aggression – has been left out. One might assume that engaging in mass murder requires an aggressive outlook. There is one problem with that, however. People get angry all the time yet do not act out in such extreme ways. Moreover, mass murderers often do not seem to be responding to grudges or momentary pique. Nor do they necessarily have a history of assaultive behavior (Hsu 2012). Indeed the act can sometimes appear random, devoid of imminent rancor toward those around the perpetrator prior to the act. Thus mass murder might not be purely an act of rage. As strange as it seems, the act might be in its essence more cognitive than emotional. The question is… what cognitive process would it involve?

Totem Pole Demons…

Recently, President Obama (and many others) expressed frustration over the fact that mass murder happens much more often in American society than in all other advanced societies – thus his call for stricter gun laws. He is probably wrong about that, since many nations have experienced mass murder, including Russia, England, Germany and many other advanced countries, going even further back than Jack the Ripper. In that context it makes sense to look beyond a specific nation for the solution – indeed to the very nature of our species. Homo sapiens is a primate, or at least shares many neurobehavioral traits with other primates. One such trait is a large brain, which in turn disposes all primates to be highly social. The reason is fairly obvious. Big brains enable their owners to perceive faces, behaviors, physical characteristics, vocalizations and visual inputs in more detail. As a result primates are better able to distinguish among members of the group; for example to determine which ones are dominant, submissive, sexually receptive, emotionally unsteady, intelligent, wily, deceptive and so on. As Pinker (1997) has noted, some of the impetus for the expansion of the human brain was to enhance social perception to meet the demands of increasingly complex human interaction.

One of the most significant byproducts of this process in humans is something that can be referred to as the “equi-hierarchy.” In chimp and gorilla social groups there is little question of rank. The silverback runs the show in gorilla groups and the alpha male rules in chimp society. Most of this is based on size and strength. Yet the chimps in particular experience political revolutions from time to time, as lower-ranked males stage a coup against the alpha male. That is in part because they can see beyond size and strength. They see numbers as an advantage. They can also consider the age and physical decline of the alpha, and other signs that he is “losing it,” and plot against him in accord with those perceptions.

Human perception enables us to see an even greater number of variables. Size, strength, intelligence, artistic ability, social notoriety, sexual prowess, financial status and so many other features are symbols carved out on the totem pole of human experience. The net result is that the human hierarchy is so fluid as to be virtually in the eyes of the beholder. One might even argue that beyond historical stimuli such as the Magna Carta, Locke’s treatise on the social contract or the Declaration of Independence, it was basic human neuro-psychology that created an inevitable drift toward democracy – particularly after the advent of the handgun, which really leveled the playing field.

The equi-hierarchical phenomenon has enormous impact on human thought, which is reflected in the obsessive human need for attention, approval, fame etc. It disposes us to feel entitled to totem pole ascendancy; not just as a possibility but as a basic right. One byproduct of a doctrine of equality (not just before the law – which was the sole intent of the constitutional framers – but in all aspects of life) is potential conflict, both within the individual and in his interactions with society. This dynamic can provide justification for a mass murderer. The attitude… “if society deems me equal to all others yet I fail to ascend the ranks, it is the fault of society and I am no longer obliged to live by its rules”… can prevail, making mass murder less an emotional act than an intellectual assessment.

In that context, President Obama might more accurately say that such crimes are more prevalent in societies with a low emphasis on acceptance, with highly fluid versions of equality, and in societies where (perhaps through a media-fueled emphasis on universal fame and notoriety) citizens are unable to find satisfaction in life’s smaller conquests and achievements. By that reasoning a philosophical society, or one with religious overtones – for example via the teaching of acceptance and a capacity to defer to a higher power – might well produce fewer violent individuals than a non-philosophical or godless society.

Diagnostics and Prevention…

The ultimate question is whether the above discussion can be applied to prevention. While no system is even close to perfect in predicting human behavior there are criteria that could increase the likelihood of detection and prevention of mass murder. For example an individual might be disposed to mass violence when…

1. There has been a long period of social isolation wherein ego functions could have been depleted by a lack of normalizing social interaction, and where an internal language capacity honed by faith or moral teaching is absent or diminished.

2. The individual expresses a sense of threat and betrayal due to a perceived disparity between his aspirations, self-perceptions and idealized status on the one hand and his actual life outcomes on the other.

3. When parents and family dynamics emphasize the “specialness” of the individual despite his or her average traits and accomplishments – which would set the stage for threat-inducing cognitive dissonance.

4. When cognitive and language habits reflect ultra-familiarity with the traits, habits and motives of potential targets with whom the individual has no actual familiarity.

5. The presence of mental rigidity, in the form of repeated complaints, stock phrases in the characterization of others, reliance on rituals and habits, extreme emotional resistance to change, and even rigid eye focus – i.e. a lack of the shifting eye movements that correspond to deliberation and the weighing of ideas, and that reflect a pause capability between feeling and acting indicative of working ego functions.

6. The presence of depressive traits, with accompanying feelings of helplessness and a ‘nothing to lose’ mentality. That particular trait removes to an extent the fear of getting caught and punished and skews the risk-reward ratio toward anti-social behavior.

Each of the above factors could be seen in the behaviors of various mass murderers and serial killers. Certain factors might be more pronounced than others; for example some individuals might have ample social contact but excessive rigidity, dissonance between self image and actual status, and a closeness-detachment fixation toward others. Just how these factors could be used for prediction – whether by clinicians, family members, police or the courts – is hard to say. It could be a question of proportion, whereby some of these factors might be non-existent while others might be pronounced.

This hopefully provides at least a beginning model for identifying individuals at risk, one that can be used even in non-clinical settings as a means of reducing, if not eliminating, the occurrence of such acts.

REFERENCES

Freud, S. (1990). The Ego and the Id. In J. Strachey (Ed.), The Standard Edition of the Complete Works of Sigmund Freud. W. W. Norton.

This article discusses the roots of cognition and decision making. Those phenomena are assumed to originate with the first molecular interactions in semi-closed systems made possible by the advent of lipid-secured membranes, which create interactive feedback as opposed to the “drift” seen in open systems. This prototype is viewed as a precursor to advanced cognitive abilities, culminating in human cognition.

Whatever is meant by terms such as cognition, intelligence and memory cannot readily be described in concrete organic or neuro-functional terms, because there does not seem to be any direct correlation between the wiring or interconnections within the animal brain and the capacity to use basic behavioral mechanisms such as flight, aggression, altruism, or even instinct (Timmer 2012). Part of the problem lies in semantics, particularly since it is Homo sapiens sapiens doing the classifying. We are the only scientists in nature, thus our descriptions of nature are both confined to and embellished by our own values and experiences. As owners of a massive cerebral cortex we typically view functions such as altruism and long-term planning as the result of big brains. Indeed several renowned thinkers in fields as diverse as psychology, anthropology and neurology have surmised that such functions are luxuries bestowed on us through the evolutionary expansion of the frontal cortical lobes (Rowe, Bulloch et al., 2001). Others have through meticulous research concluded that limbic circuits within the human brain, like the hippocampus, provide memory consolidation capacities (Suzuki, Yaneke et al., 2004). It appears that during a learning task the hippocampus becomes active once the task has been learned, analogous to a librarian who gathers the books and re-organizes them into proper catalogue sequence after the readers have gone home.

On the other hand, most animals and all plants lack a hippocampus and have no frontal lobe. For example a female cobra’s brain is minuscule compared to ours, yet once she recognizes she is pregnant, she digs nests to hide her young, obviously contemplating the possibility that predators and/or unfriendly climatic conditions might jeopardize their safety.

So many other ostensibly complex cognitive abilities are seen in this neurologically simple creature. To prepare for the birth of offspring, the female cobra must on some level be a naturalist, cognizant of the habits of predators. She must be a prognosticator, able to guess as to possible variables related to temperature and terrain. She must also be empathic enough to put herself in the shoes of a hungry predator who feeds on cobra eggs. In effect she must sense their hunger by extrapolating from her own. She must be a chronologist, capable of determining when her offspring will emerge from the eggs and altruistic enough to not only become enraged and frightened over threats to her offspring but also to feel satisfaction when her protective gestures lead to successful outcomes as her offspring make their way into the world safe and sound.

Interestingly, one can attribute all these cognitive and emotional traits to organisms even simpler neurologically than the cobra. It is even possible to extend the discussion of so-called cognitive abilities to single-celled creatures, because many appear to make decisions in adapting to their surroundings. For example amoebas engage in foraging behaviors tantamount to a hunting strategy (Cumming, Hall et al., 2010). Even at the level of the most primordial organisms – bacteria and viruses, for that matter mere molecules such as RNA and DNA – there are decisions being made without the use of brains, language or memory substrates, at least as viewed in a conventional sense.

The Search for a Cognitive Monolaw…

In discussing the nature of cognition, one can attempt to gain closure in several ways. One way is by assuming all these behavioral traits seen in brainless and meagerly encephalized organisms are instinctive, thus built into their rigid neurologies. In that sense nature can be said to provide them with the luxury of not having to use cognition or make decisions. A second way is to assume that our concept of cognition is anthropocentrically derived and a somewhat reductionist take on the nature of mind. Unfortunately, neither approach resolves very much.

The instinct argument has flaws, as was pointed out by Maslow (1954). To assume organisms are endowed fortuitously with an ability to act in adaptive ways with little or no neural mechanism by which to do so, one has to first ask about the sources of these fortuitous abilities. Did nature say “presto” – you now have an instinct that just happens to coincide with the demands of the outside world? Unlikely. Even if one invokes the Darwinian argument that various behaviors were honed by evolution over time, it does not explain the structural and functional origins of the behavior to begin with. While it is possible that some organs and behavior patterns evolved to serve purposes other than the ones for which they were eventually employed – for example the wings of insects and birds, which favored temperature regulation before being employed in flight – this still does not address the question of why structures evolved coincident with purpose, as opposed to being random and inconsequential. It also raises the question of how many behaviors had to be tried, winnowed out and selected before the (temporarily) right ones arrived through evolution. In fact that juxtaposition of the random with the purposeful is what makes natural selection seem a bit problematic (Russell 2007).

In his book How the Mind Works (1997) Steven Pinker addressed this problem by proposing a quasi-computer model of mind. He suggested that each layer of the nervous system has built-in modules, e.g. for language, perception etc., and that they arise not through random mutations but through a priori functional circuits requiring little or none of the trial-and-error process inherent in natural selection. He also suggested they blend smoothly with previously established brain sites both structurally and functionally – a hugely deterministic, if not fatalistic, concept for a Darwinian advocate.

Even if Pinker’s well-written thesis is correct, it is nonetheless difficult to explain decision making at the anencephalic and molecular levels.

It would seem there is another way to resolve this issue. It is by assuming the existence of a transcendent law or mechanism in nature that requires no brain yet provides for and encompasses phenomena such as memory, stimulus recognition, decision making and cognition. The obvious question is where to look for such a cognitive monolaw.

The Roots of Cognition…

How can decisions be rendered, circumstances recognized, prior experiences referenced and recalled without a neural mechanism by which to conduct these operations? One way to address that question is by analyzing the most basic equipment available to the first organisms. A basic requirement of a life form is the capacity to replicate via the transactions between RNA and DNA that align genes in the proper sequence. Another requirement is the alignment of amino acids, coupled with the release of water for binding during this process, to build the proteins leading to the construction of cells and organs. Life also requires mechanisms to react to the outside world for energy replenishment and to facilitate approach and avoidance responses. Yet there is another, often overlooked component – a capacity to undergo flux and at the same time maintain internal stability; in other words a homeostatic, self-correcting (cybernetic) mechanism. If not for the latter, life could have come and gone many times over without lasting long enough to undergo changes in complexity.
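The homeostatic loop just described can be sketched in a few lines of code. The following is purely an illustrative toy – the function, its parameters and its numbers are invented for this sketch, not drawn from any biochemical model. Random perturbations stand in for environmental flux, and a corrective term proportional to the deviation from a set point stands in for the self-correcting (cybernetic) mechanism.

```python
import random

def homeostat(setpoint=1.0, gain=0.5, steps=200, seed=42):
    """Toy cybernetic loop: perturbations push the state around,
    and a correction proportional to the error pulls it back
    toward the set point, so flux and stability coexist."""
    random.seed(seed)
    state = setpoint
    history = []
    for _ in range(steps):
        perturbation = random.uniform(-0.2, 0.2)  # environmental flux
        error = setpoint - state                  # deviation from stasis
        state += perturbation + gain * error      # correction opposes drift
        history.append(state)
    return history

trace = homeostat()
print(min(trace), max(trace))  # the state never wanders far from the set point
```

With the correction removed (gain set to 0), the same perturbations simply accumulate as a random walk – the “come and gone many times over” scenario: flux without anything holding the system together.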

Resilience…

In many ways the latter typifies the essence of life more than any other quality, because longevity is the key to development and fitness. While replication is often cited as the sine qua non of life, and while protein synthesis is considered similarly important, crystals and other non-biotic entities divide under certain chemical conditions in a process that runs parallel to replication (Ferro, 2010). Moreover, as Shapiro (1986) has pointed out, amino acids can come together to form proteins yet not in themselves comprise anything resembling what we might call “life.” Any number of pre-organic entities might have formed in the primordial soup, some providing organic essentials. Yet many undoubtedly broke apart from the onslaught of temperature shifts between day and night, oceanic tumult and/or the fragility of the compound itself. In that sense, for life to ensue required the holding together of a proto-organic entity, so that its mechanical resilience would enable the prototype to expand and become more systemically complex (for example prokaryotic cells incorporating other components to evolve into nucleated eukaryotic cells). Thus if one seeks the foundation of biological existence it might be described in one word… “glue.”

The Origin of Stasis

A most essential glue factor in the advent and continuation of life forms was, of course, carbon – an element that is relatively scarce in the earth’s crust and atmosphere but one that makes up for its scarcity with extreme resilience. That resilience is most easily seen in the diamond, the hardest naturally occurring substance on earth, which consists exclusively of carbon atoms. Yet carbon is not just firm and resilient; it is also highly chemically accommodating… or “social” if you will. It is capable of bonding with a wide variety of other elements and molecules, and it provided a perfect base for not only the origin but the continuance of early life forms. Because it has these communal properties it can be said to be the most communicative of the elements – a central grammar in the expression of biochemical synthesis.

Yet if carbon provided the mechanical and attractive necessities for life to form, what then provided the functional properties, i.e. the reactive essentials that enabled living things to multiply, heal, move, and ultimately make decisions? To an extent the answer might lie in yet another single term… electricity. Nothing in nature, including carbon, can interact with (attract or repel) other components without the impetus of an electrical charge. Positively charged particles repel other positively charged particles, while they are attracted to negatively charged ones. (The reason probably has to do with the inherent symmetry of the universe, but that is beyond the scope of this article.) In any event, on some level, stability, resilience, organic communication and genetic alignment – all those things that comprise life – are ultimately rooted in attractions and repulsions. The centralizing feature of carbon is itself rooted in that process. Therefore in seeking the roots of pan-organic cognition this might be a good place to start.

Where have you gone Albert Einstein?..

Watson and Crick, discoverers of the structure of the DNA molecule, can be compared to Isaac Newton with respect to the latter’s contribution to an understanding of gravity. Newton used precise mathematical terms to determine the “what” of gravity. However it was not until Einstein came along and developed his theories of relativity that we came to know the “why.” As of now there is no “why” (or how) when it comes to an understanding of how life began, and more particularly of how organisms – indeed molecules such as messenger RNA – are able to make decisions. Some discussion of electricity might be helpful in that respect.

The Roots of Communication…

Communication can be discussed in other than anthropocentric terms without compromising the nature of human communication. In other words, there can be said to be a substrate or monolaw of all communication in nature that encompasses, and set the stage for, human language. In its essence communication is really nothing more than an interaction of two components with a systematic (repeatable, predictable) behavioral outcome, i.e. a non-random interaction. When a positively charged atom interacts with a negatively charged atom, a galvanization occurs. That is a form of communication.

There are several types of communication; for convenience’s sake three can be discussed. One type is unilateral – like a breeze pushing an old newspaper along the street. In this instance the newspaper does not have much impact on the breeze, other than perhaps some minor degree of interference from friction. Another type of communication is reciprocal, exemplified by a protozoan moving back and forth in response to temperature and light differentials.

Communication can also be multi-factorial, whereby complex channels within a system interact with one another to produce fluctuations without loss of systemic integrity. In each instance the level of decision making is different. In the unilateral model (the breeze and the newspaper) the decision is so basic as to be inconsequential. The paper drifts off and does not turn back on the breeze to cause a readjustment. In a two-way, reciprocal model the system is modifiable and can be defined as proto-cognitive because the interaction does not end with the first stimulus. In a sense the advent of feedback produces a communicative imperative, seen in the to-and-fro reactions of the protozoan. At that level, and as a result of feedback, decisions can be said to exist on every organic level.
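The contrast between unilateral and reciprocal communication can be sketched as a toy simulation. Everything here is an illustrative assumption rather than a model from the literature: the function names, the "preferred" temperature, and the gain value are all hypothetical, chosen only to show how feedback, and nothing else, produces self-correcting behavior.

```python
def unilateral(position, breeze=1.0, steps=5):
    """Breeze pushes the newspaper; the paper never feeds back on the breeze."""
    for _ in range(steps):
        position += breeze  # one-way influence: input is received, never answered
    return position

def reciprocal(temp, preferred=20.0, gain=0.5, steps=20):
    """Protozoan-style negative feedback: each reading shapes the next response."""
    for _ in range(steps):
        error = preferred - temp  # the stimulus is compared, not merely received
        temp += gain * error      # the response feeds back into the next input
    return temp

print(unilateral(0.0))   # the paper simply drifts further away each step
print(reciprocal(30.0))  # the system converges toward its preferred state
```

The unilateral system drifts without limit, while the reciprocal one settles toward a steady state: the "decision" resides entirely in the closed loop, not in any added intelligence.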

One does not need a brain, or even a primordial nerve bundle, to exercise cognitive decision making. One needs only electrically fostered, closed, reciprocal interactions. In fact, as Cenik, Cenik et al. have demonstrated with respect to the origin of life, the mystical, heretofore unexplained ability of RNA to “send messages” to DNA in creating protein structures in the body might be the result of a simple electrochemical reciprocity within a closed system (actually semi-closed is a more apt description) where feedback forces mechanisms to interact in an orderly, systemic (and redundant) fashion. The charge originates from a nucleic acid compound, which conveniently contains a highly reactive sugar and energy catalyst known as ribose. A message is conveyed because the system is contained, rather than unilateral inputs drifting off into the ether. In effect, reciprocity is trapped within the membranes. Boundaries enable feedback to occur. A similar process applies to the famous Miller-Urey experiments, where electrical discharges impinging on chemical compounds helped produce amino acids (1953).

Membranes are crucial. The availability of lipids in the earth’s earliest environment was necessary, since lipids can form a sheath that insulates cells and organisms from the outside milieu yet enables nutrients to pass through its border, so that organisms can extract energy from the sun and other sources. Semi-permeability is a crucial requirement for sustaining life.

From the Simple to the Complex…

In terms of the above discussion, the most essential difference between cognition at the most basic level and cognition within the realm of human thought might lie in the difference between simple and complex feedback systems.

Complex Cognition…

In some ways complexity is how life overcomes the law of entropy, which holds that all systems in nature will proceed toward decay. The more diverse yet internally regulated the variables in any system, the less likely it is that decay will ensue. Since we all die, life does eventually proceed toward decay, but only after staging an enduring battle against entropy in the face of constant changes in metabolism, maturation, injury and general wear and tear. Organisms endure and can delay entropy due to what W. Ross Ashby called a cybernetic process. Here he was referring to a self-correcting system in which higher-order rules govern the ebb and flow of internal variations. This applies to both brain and soma, because while the mind recognizes dangers, for example by fleeing from or avoiding unpleasant experiences, so do the immune system and all other bodily structures and functions. Indeed, one could argue that a cognitive process exists on every level of organicity and in its essence has little to do with brain or mind.

Ultra-stability…

In that regard, a related concept proposed by Ashby was that of ultra-stability. This involves a system whose local events can fluctuate but, being tethered to a larger system, ultimately re-synthesize in adherence to the rules of that larger system. As confusing as this might seem, the game of chess offers a comparison. Each move by an opponent leads to a counter move. One player might see a direct line between his opponent’s rook and his queen and choose to move the queen to evade the threat while avoiding the problem of exposing the king. However, in contemplating the move the player notices that his knight is within striking range of the opponent’s rook, and that taking out the rook would save both the queen and the king. During all this the main “code” of the game (capture the opponent’s king and protect your own) remains the same, but within that rubric a number of sub-plans are orchestrated.
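Ashby’s ultra-stable system can be caricatured in a few lines of code: local parameters (the “sub-plans”) are free to fluctuate, but whenever an essential variable leaves its safe bounds, the system reconfigures itself at random until the higher-order rule is satisfied again. This is a minimal sketch of the homeostat idea under assumed numbers; the function name, bounds, and values below are hypothetical, not taken from Ashby’s actual device.

```python
import random

def ultrastable(disturbance, bounds=(-1.0, 1.0), trials=1000, seed=0):
    """Randomly re-select a local parameter until the essential variable
    returns to its safe bounds (a toy version of Ashby's step-mechanisms)."""
    rng = random.Random(seed)
    gain = rng.uniform(-2.0, 2.0)           # current local "sub-plan"
    for _ in range(trials):
        essential = disturbance * gain      # effect on the essential variable
        if bounds[0] <= essential <= bounds[1]:
            return gain                     # higher-order rule is satisfied
        gain = rng.uniform(-2.0, 2.0)       # random step-change, then retry
    raise RuntimeError("no stable configuration found")

# A strong disturbance forces the system to search out a small compensating gain.
g = ultrastable(disturbance=5.0)
print(abs(5.0 * g) <= 1.0)  # the essential variable ends up back in bounds
```

The detail of which gain is finally adopted is arbitrary, like the particular knight move in the chess example; what is conserved is the governing rule that the essential variable must stay within its limits.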

There is a difference between cognition per se and a game of chess. There are also similarities. The similarity owes to the fact that both operate by multiple sub-plans and actions revolving around a central theme. The point of playing chess is to win. The point of garden-variety cognition is neural stability. In that context, cognition is a dual process serving two functions: one being neurobiological ultra-stability, the other involving the experiential correlates of that closure.

Arousal, Quiescence and the Sheffield Study…

The above reference to ultra-stability should not be confused with the Freudian notion of tension reduction. With regard to the restoration of stability, a mechanism seen in all organisms, it is not a question of arousal or tension being reduced by either neural or experiential closure. It simply refers to the restoration of a steady state, which, as conveyed in the classic study by Sheffield, could take the form of tension reduction or tension induction (1966).

This view of cognition can theoretically be applied to a wide variety of decisions rendered in nature, and perhaps explain phenomena like messenger RNA and Dawkins’ notion of the “selfish gene” (1976). While in many ways this might make human thought seem less unique, perhaps even discouraging to the anthropocentrist, it provides a possible clue as to how nature initially paved the way for human intelligence. While speculative, it does offer a possible connection between the hunting amoeba, the maternal behaviors of the cobra, and Newton’s calculations on gravity.