Star Larvae

Sunday, October 16, 2016

As we near the end of 2016 a discontent grows among the practitioners of evolution theory.

Research results keep outrunning the theory’s capacity to herd them—the results, that is, and the practitioners, too—and corral them inside the Modern Synthesis.

Evidence of this discontent keeps piling up. For example, in July 2008 researchers from various fields of biological science convened in Altenberg, Austria, to formalize a so-called Extended Synthesis of evolutionary theory. MIT Press published the conference papers as a sourcebook, Evolution: The Extended Synthesis.

The book's contributors reassure readers that the findings presented pose no fundamental threat to the Darwinian model. The collective attitude seems to be that new discoveries in genomics, epigenetics, ecology and related fields merely complicate—without undermining—the natural selection model of evolution. These reassurances reek of cover, of maneuvering to avoid the label “fringe,” or its academic equivalent. Maybe “traitor.”

A discontent among those working in the evolution biz is brewing, if not already boiling over.

It seems that the evolution theory we learned in school is getting an earnest retooling. The breadth and depth of that retooling divides scientific opinion. It might involve turning a few knobs, or it might involve a fundamental rebuild—a Kuhnian paradigm shift. Which way things shake out remains to be seen.

Saturday, February 20, 2016

Update: on Feb 27, 2016, I posted this book review, minus the hyperlinks, to Amazon. The next day I received the following reply via email:

Your review could not be posted.
Thanks for submitting a customer review on Amazon. Your review could not be posted to the website in its current form. While we appreciate your time and comments, reviews must adhere to the following guidelines:
http://www.amazon.com/review-guidelines

No specifics were given regarding my violation of the guidelines. After reading the guidelines, my best guess is that, because my review included the word "turd," Amazon’s filterbot flagged the review as obscene or somehow otherwise objectionable.

So I removed the offending syllable and re-posted the review. I changed the phrase "turd duly polished" to "spin duly spun" -- not sure if that makes sense, if spin can be spun, but it's no less coherent than much of what circumvents the filters and gets posted. In any case, I hoped that that might do it.

Well, spinning the turd did no good. The next day, March 3, I got another rejection email from Amazon, a duplicate of the original. The revised review still violated some undisclosed aspect of Amazon's Customer Review Creation Guidelines. Amazon provides zero guidance as to what precisely violated what in those guidelines, so I am (as are, I suppose, other reviewers in the same boat) left guessing as to how to remedy the situation.

Having said that, I have to wonder if the sentiments I expressed in the review were the real cause of the censorship. Here's what I attempted to post:
________________________________________________

The author of The Influencing Machine, Brooke Gladstone, works for National Public Radio (NPR), which means that she is linked financially to the patronage of grant-bequeathing foundations, those social-engineering tools wielded by the wealthy, who have an interest in maintaining the status quo. That explains why the author of this psy-op artifact (delivered in dumbed-down comic-book format for easy digestion by the semi-literate) is so intent on defending, rationalizing and justifying things as they are, on leaving an ostensible well enough alone.

The book is a transparent apologia for mainstream media and the media’s version of journalism (namely, the parroting of pronouncements of politicians and bureaucrats and their agents). But anyone who dives into today's online alt media with a mind toward objectivity (an elusive thing that the author seems to think is forever out of reach) will find plenty of primary documentation and reasoned analysis and skilled presentation of political, economic and cultural occurrences—and outright plots—that the mainstream media and its government cohorts would rather keep from public view. And the curious delver into alt media also will find plenty of garbage. And so one has to keep one’s wits about one. But underscoring such a practical approach is no part of this author’s agenda. Her mission is to provide cover for the powers that be.

Dear friends, when you imbibe mass media you are not gazing into a mirror, as author Gladstone contends. You are internalizing an image of the world and a set of concerns manufactured by the wealthy for the purpose of maintaining a certain socioeconomic distance between themselves and you. They are the guys who, in effect, employ this author, and they are interested in nurturing, so as to maintain, class distinctions.

The book kicks off with a familiar bromide, namely that if you suspect an organized malfeasance lurking, then you're mentally ill. This is the takeaway implied by the story of Natalija A., a turn-of-the-(20th)-century psychiatric patient who happened to be deaf and mute and is said to have written about a machine that used some kind of broadcast signal to influence her thoughts and behaviors. If you suspect that mass media constitute such an influencing machine, then you're not just a reasonably observant person. No. You are damaged goods and psychological kin of Natalija and in need of psychiatric attention. You fail to understand that when you amuse yourself with media you are absorbing your own neuroses projected outward, creating a tightening gyre that you have no business complaining about. Your suspicions about someone else writing the script and zooming the camera can be swept aside with a broad brushstroke of watered-down psychobabble.

Friday, January 01, 2016

Intelligent Design (ID) theory proposes that her complex workings make nature look so much like an artifact that we should take her to be one: an artifact that, according to ID discourse, was willed into being by a pre-existing creative intelligence. This stands in distinction to the prevailing explanation of nature offered by science.

Problem is, the putative intelligence responsible for nature is, alas, a stunningly peculiar species of intelligence. According to the IDers, it is unlike any intelligence anyone ever has encountered anywhere outside of theology (or science fiction). IDers propose a brainless intelligence and an omniscient one at that.

Somehow, despite all inductive evidence to the contrary, there exists, they argue, an intelligence that requires no brain, nor even solid-state circuitry, to attend to its business. Nature’s intelligent designer doesn’t need a body of any sort at all. The designer belongs to a unique class of intelligences, of which it is the sole member.

This odd construct, a disembodied supermind, derives from a faulty analogy. IDers, observing human invention, notice that nature in suggestive ways looks and acts like a high-tech machine. They reason that since explaining machinery requires reference to an intelligent designer (or several), then so must explaining nature. This line of analogical thinking constitutes a case study in the disreputable practice known as cherry picking.

Sunday, November 15, 2015

This screenshot shows the same YouTube page in two browsers: Google Chrome at left and Mozilla Firefox at right (though which is which is not important - I don't think). Notice that the comments are for the same video, which is the premiere Unspun show with Jan Irvin and Joe Atwill. When I first watched this vid, I left what I thought was a reasonably thoughtful comment. And it posted just fine. No reason to think anything was screwy. There it was.

But when I went back to re-watch the vid the other day, being not logged in, the comments for the vid did not include mine. Puzzled, I opened a second browser, went to YouTube, and logged in. And then, there was my comment.

So, displaying the same vid simultaneously in two browsers--one, where I'm logged in and one where I'm not logged in--gives me two different displays. When YouTube knows it's me, because I'm logged in, it includes my posted comment. When it doesn't know (at least by way of log in) who's watching, then my comment doesn't appear.

Notice that each screen gives a different count for ALL COMMENTS. One gives five and one gives six. The latter count includes my comment. So, whether the comment shows is not a function of whether my comment is among the newest or most popular.

Given the subject matter of the show and of my comment--the machinations of the ruling class--it's tempting to believe that the selective display of the comment is a function of some algorithm's politically calculated filtering.

Anybody else notice selective editing of ALL COMMENTS by YouTube--your comments displayed in ALL COMMENTS only when you're logged in?

Sunday, October 25, 2015

Intelligent Design (ID) advocates call attention to the ways in which nature works. They say it looks like a complex machine. And they have a point. Photosynthesis in plants, nucleosynthesis in stars, complicated nanoscale assemblies, and so on, make nature look like a high-tech science project.

But the metaphor—nature as machine—does not reveal nature as having been intelligently designed. It’s a metaphor.

Invert the assumption: Maybe it’s not the case that nature is a machine, with an implied design (as if technology were primordial and nature modeled on its principles). Maybe it’s the obvious case that nature came first, with humans being smart enough to study nature’s ways and apply that experience to building tools and towns and space stations. Nature inspires. But that does not make it an example of what it inspires.

Technicians make progress when their designs harmonize with natural law, but that does not mean that nature itself, with all its laws, was designed. Nature is just ground level, ontological bedrock. It does not come with a requirement that something supernatural, behind the scenes, got it started or propels it along.

Certain chunks of nature resemble humanly crafted artifacts, and so some people conclude that nature must itself be an intelligently designed artifact. But many things resemble things that they are not. Sometimes the similarities are striking.

A bat is in striking ways similar to a bird. Both are warm-blooded vertebrates, send out distinctive vocal signals, eat insects, flap wings to fly, congregate in social groups and so on. But an expedition in search of bat eggs will end up with egg on its face.

The God of classical theology, a deity that preceded and designed the physical universe, is a kind of bat egg.

The conclusion that bat eggs and a classical God must exist is justified if bats are birds and nature is an artifact. But if the similarity between bats and birds and between nature and high-tech is just a resemblance, then both conclusions land in the dump.

Sunday, October 18, 2015

Evolution theorists seem to assume that a successful creature marches into its environment equipped with an armamentarium of adaptations. Such a creature inherits its adaptations from its parents. The adaptations bestow fitness upon the creature. This fitness in turn bestows reproductive success, at least relative to that of the creature’s local conspecific peers/competitors. The reproductive success conveys the creature’s DNA to its progeny, who thereby themselves enjoy an armamentarium of adaptations similar to that of their parents (unless a changing environment turns those adaptations into neutral or even deleterious traits).

Point is, reproductive success is taken to be an effect caused by adaptations.

This position, stated otherwise, contends that, but for the adaptations, the creature would experience reproductive failure relative to its local conspecific peers.

But we can invert the assumption: Maybe the default position ought to be that organisms normally enjoy reproductive success, absent any factors, endogenous or exogenous, that undermine that success.

Organisms are integrated wholes, adapted, but not possessing adaptations. Organicism suggests that a creature doesn’t possess adaptations any more than an atom possesses protons. A proton, or a fused bundle of them, just is an atom. Atoms typically come decked out with other particles, the neutrons and electrons, but no protons, no atoms.

Similarly, a creature does not possess anatomical and physiological adaptations. It simply is its anatomy and its physiology. It comprises them, and they compose it. No physiology or anatomy, no creature.

In the case of a bird, for example, wings might be called an adaptation, but lay a couple on the ground and not much will happen. Lay a wingless bird on the ground and not much will happen beyond the suffering and demise of the bird. Whatever gets designated as an adaptation contributes no more to the rest of the creature than the rest contributes to it. Without the wings there just is no viable “rest of the bird,” that happens not to own wings. For evolutionary purposes, there just is no critter.

So, does the presence of adaptations enable, or the absence of lethal/sterilizing circumstances allow, a creature to live and reproduce?

Saturday, July 18, 2015

Ongoing research in molecular biology has delivered to the world the peculiar finding that the term gene refers to no determinable thing at all.
Gene is an idea. It grew from an assumption. The assumption, implied by the prevailing scientific philosophy of reductionism, is that there must be a smallest unit of biological expression. And a smallest unit of biological inheritance. And that atomic entity was called a gene.

Despite gene losing its power to denote, the science of genomics continues to advance and looks determined to keep doing so. It does so even as reductionism’s shortcomings become increasingly apparent, as research reveals the seemingly intractable interwovenness of the processes and subprocesses of biological metabolism.

Reductionism fails because everything that goes on inside a living cell depends on—is caused by, directly or indirectly—everything else that goes on inside the cell. And then, adding more complication, there are exogenous influences. And above it all, no locus of control. There is a lacuna of control. The cell has no brain.
The sequencing of the human genome, that recent triumph of reductionism, like the cataloging of elementary particles, provides a compendium, but it resides far from the macrostructure, far from an accounting of gross outcomes. Structures and processes that define the macro world do not map readily onto elementary bits.

Friday, June 19, 2015

The processes that operate inside a living cell might tempt us to credit for their organized complexity some kind of executive intelligence. And any such intelligence as might be involved in the metabolic churnings of a cell must reside either beyond this world (i.e., in deity) or within this world (e.g., in DNA). Admittedly, the former conjecture asserts intelligence literally, while the latter attributes it more or less figuratively.

Point is (aside from the prospect that intelligence is no natural kind at all but a construct of human definitions and usages), recommending either approach toward understanding nature’s organized processes is to anthropomorphize: Both approaches, metaphysical and merely material, project human capacities onto things that are not human. To project intelligence onto supernatural entities or onto master molecules is to anthropomorphize, a conceit that inquiries into nature ought to avoid.

To suppose that a deliberating mind is needed to design or operate the biochemical levers that trigger or impede processes inside a cell is to anthropomorphize. To suppose that somewhere physically inside the cell is a something that makes such decisions as are made is to anthropomorphize. This latter observation is particularly the case now that research into gene regulatory networks demonstrates that the biochemistry inside a cell operates as an organic whole. There are dependencies and interdependencies, but no executive intelligence sits atop a hierarchy of control.

We have “intelligent” and “design,” “master genes,” “control switches,” “codes” and “programs” from which to construct an understanding of the cell as a representative organism. Such concepts are fine workaday metaphors, but literalizing and projecting them onto nature is a detour into anthropomorphism. Nature is not designed or programmed by an intelligence or anything else. Nature is not a whew! of chance. Nature is not of gods or fortunate happenstance. Nature is neither a miracle nor a machine.

Peel back the curtain, and there’s nothing to see. Nature, in all its messy complexity, in all its nurturing and desolation, in all its unlikely satisfactions is all there is: Organism. Nature earns its living by weaving novelty, habit, objects and subjects into ever more intense, elaborate and sublime aesthetic processes and experiences. It suffers the setbacks inherent in being alive. Its animate soul inspires each new universe it bears. Nature is ontologically animate, exuberant, irreducible, and non-contingent. This is the broad sense of organicism, the last philosophy left standing once dumb dead matter and disembodied consciousness have slapped one another silly.

Tuesday, June 09, 2015

Q: How does a fertilized egg cell give rise to such a variety of cell types as compose the body of a complex organism?

A: That fertilized egg cell’s DNA arrived pre-loaded with the genetic information needed to craft the specialized cell types that compose the body of that complex organism.

Although each cell in that body inherits all of the fertilized egg cell’s genes, specific genes get switched on and off by other genes that produce regulatory molecules, and those genes are switched on and off by other genes and the molecules they produce according to what is needed for each type of cell. And all of this biochemical management occurs by feedback loops. The biochemistry that oversees the differentiation (and stabilization) of cell types in the developing organism is organized into gene regulatory networks (GRNs), very elaborate chemical feedback loops.

The point to be made about GRNs is that, by regulating gene expression, the cellular machinery can coax from a highly conserved set of genes (those of the original fertilized egg) a liberal diversity of cell types (skin, muscle, nerve, etc.).
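The switching described above can be sketched as a toy Boolean network, a common simplified model of GRNs. Everything below is invented for illustration (the gene names, the rules, the signals): the point is only that one fixed rule set, standing in for the conserved genome, settles into different stable states, standing in for different cell types, depending on the signals it receives.

```python
# Toy Boolean gene-regulatory network: one fixed "genome" (the rules),
# different initial signals settle into different stable states ("cell types").

RULES = {
    # each gene's next state as a function of the current state dict
    "A": lambda s: s["signal"],              # A follows an external signal
    "B": lambda s: s["A"] and not s["C"],    # A activates B; C represses it
    "C": lambda s: not s["A"],               # A represses C
    "signal": lambda s: s["signal"],         # the signal is held constant
}

def settle(state, max_steps=20):
    """Synchronously update all genes until the state stops changing."""
    for _ in range(max_steps):
        nxt = {gene: rule(state) for gene, rule in RULES.items()}
        if nxt == state:
            return state
        state = nxt
    return state

# Same rules (same "genome"), two different signals, two different "cell types".
type1 = settle({"A": False, "B": False, "C": False, "signal": True})
type2 = settle({"A": False, "B": False, "C": False, "signal": False})
print(type1)  # A and B on, C off
print(type2)  # A and B off, C on
```

Real networks involve thousands of interacting molecules rather than three hypothetical genes, but the feedback-loop principle is the same.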

So far so good. But one noteworthy development is the increasing significance that evolutionary theorists ascribe to GRNs. Gene regulatory networks, not the acquisition of new genes, manage the differentiation of species from their common ancestors in much the same way as they manage the differentiation of cells in a body from their common ancestor, the fertilized egg cell.
Diverse descendant phenotypes lurk within the DNA of ancestral genomes and genotypes alike, waiting to be switched on.

The science of comparative genomics confirms a fundamental conservation of DNA across species, a finding that came as a surprise to many researchers. The genetic similarities seen across species are too striking to sweep under the rug, and at least some researchers are candid about the new data’s impact on evolutionary theory.

"In fact, our present understanding of morphogenesis indicates that new phyla were not made by new genes but largely emerged through the rewiring of the gene regulatory networks (GRNs) of already existing genes"

This observation does double duty, because it also describes how a fertilized egg cell gives rise to descendant cell types. The descendant cell types emerge through the rewiring of gene regulatory networks.

Evidently, we can say that the genes for descendant species were there already in remote ancestors, or that the GRNs for descendant species were there already, or both were there already. In any case, what were they doing there? If they functioned one way, or were silent, in ancestors, how is it that they just happened to be re-wirable to produce highly dissimilar, yet “adapted,” descendants? That seems far-fetched. But it makes sense and is to be expected if evolution is a case of development. It looks like GRNs manage the differentiation of species in an ecology in some manner similar to that in which such networks regulate the differentiation of cells in a body, namely by rearranging patterns of development.

Consider this adaptation of the quote above, "In fact, our present understanding of cellular differentiation in developing organisms indicates that new cell types are not made by new genes but largely emerge through the rewiring of the gene regulatory networks (GRNs) of already existing genes."

In Universal Genome in the Origin of Metazoa (Cell Cycle 6:15, August 2007) researcher Michael Sherman also argues that diverse species develop from a common, conserved genome. His case rests largely on the presence of anomalous genes in ancestral species that are needed by descendant species, a circumstance called pre-adaptation.

Monday, December 08, 2014

During the past few weeks I'd been enjoying an engaging exchange about evolution theory on an online science forum. One of the threads I was posting in posed the question as to whether evolution theory adheres to scientific method. My comments in the thread focused on the veracity of the concept of natural selection. I implied that natural selection theory was lacking.

Things seemed to be going well, but then *Kablooey*, the moderator recognized, rightly, that I had diverted the conversation into a rabbit hole. I had wandered off script. And he threw me in the penalty box -- my posts in this thread were relegated to the Trash Can.

Given the unpopularity of my posts -- my arguments won no converts -- it's tempting to believe that their final resting place in the Trash Can had something to do with that unpopularity. Nonetheless, the moderator did recognize that I was not addressing the topic of scientific method directly and deserved some kind of rebuke. But the Trash Can?

Sunday, September 21, 2014

During the development of a complex organism, a fertilized ovum, or zygote, divides in two, then again into twice as many cells and eventually into all the cells that compose the organism's body. As the cells proliferate, they differentiate in form and function into the various cell types of that particular kind of body. This differentiation into skin, stomach, nerve, and other cell types occurs even though the cells of a developing body all share a common genotype, that of the original zygote. The paradox of one genotype yielding many cellular phenotypes has been resolved, in a general sense, through the mechanisms of epigenetics. A relatively new branch of molecular biology, epigenetics addresses issues related to gene regulation and gene regulatory networks. The new discipline aims to explain how, during development, genes get turned on and off and when (as in larval or adult forms of organisms) and where (as in spleen or kidney) they do.

The new discipline is an upstart. Epigenetics would seem to demote DNA from being the cell's chief executive to its merely utilitarian, dumb server. DNA includes an archive of messenger-RNA templates (and the messenger RNA molecules transcribed from the templates still pass through an editing suite before being escorted to the ribosome, where they get translated into proteins). The molecular machinery of epigenetics, through normal chemical bonding, excites or inhibits DNA "expression" or "action." The countless combinations of sections of DNA that can be expressed and repressed here and there in sequence or in tandem produce multiform cellular phenotypes from the highly conserved DNA of the original zygote.

From a complex database a skilled operator can extract many kinds of reports, by slicing the data this way then that. DNA is such a complex database, responding to many and diverse calls for data. The creatures of the Earth are reports summoned from DNA, not expressions of any executive talent that resides in the DNA. This is the new view of things from the world of epigenetics.
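The database analogy can be made concrete with a toy sketch. The records, tissue names, and gene labels below are invented for the example; the point is only that one unchanging data store yields different "reports" depending entirely on which query gets run.

```python
# One conserved "database," many possible "reports" -- the report that emerges
# depends on the query, not on any executive talent residing in the data.

records = [
    {"tissue": "skin",   "gene": "K1",  "expressed": True},
    {"tissue": "skin",   "gene": "MYH", "expressed": False},
    {"tissue": "muscle", "gene": "K1",  "expressed": False},
    {"tissue": "muscle", "gene": "MYH", "expressed": True},
]

def report(tissue):
    """Slice the same data a different way for each caller."""
    return [r["gene"] for r in records if r["tissue"] == tissue and r["expressed"]]

print(report("skin"))    # ['K1']
print(report("muscle"))  # ['MYH']
```

Same records both times; only the query differs, and so the two reports differ.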

Thursday, August 21, 2014

The inhumanities of the Holocaust swept from public view the until-then popular cause of eugenics. But before World War II eugenics thrived as a civic cause in the United States and the United Kingdom. Improving human stock through selective breeding and the sterilizing, or even the euthanizing, of the "unfit" had come to be regarded widely as scientifically sound public policy. The Wikipedia entry for "eugenics" provides an introduction to this mostly forgotten fashion of the times, which Nazi ideology took to extremes of horror and which today is banished, at least in its overt forms, from policy discussion, though social critics continue to uncover its covert forms: advocacy of eugenics continues under the banner of population control and similar euphemisms.

The Anglo-American eugenicists of the early 20th century invoked Darwinian natural-selection theory to gird their ideological bent. But, according to the arguments and evidence that Alan Bennett presents in "Evolution Revolution", these social engineers did not hijack Darwinism, nor twist it into service in a way to which Darwin would have objected. On the contrary, Darwin embraced the eugenicist agenda from the outset. Not only did Darwin himself promote eugenics, but the agenda's advocates also included Darwin's half-cousin, Francis Galton, who formalized the concept and propounded it as civic duty; Thomas Henry Huxley ("Darwin's Bulldog"); and Huxley's grandsons, Julian and Aldous Huxley, Julian serving for a time as president of the British Eugenics Society and Aldous sketching a blueprint for a caste society in his "Brave New World."

The objective of the Darwinian offensive was twofold, as Bennett summarizes:

Cast the working class in the role of the unfit.

Denigrate religion.

This history, the dark heart of Darwinian theory, presented in thoroughly referenced detail, makes up the first major portion of the book.

But the anti-Darwinian angle of Bennett's argument unwinds in a complicated way and extends beyond discrediting the motives of Darwin and his acolytes. That is, the attack is not merely ad hominem. Bennett establishes it as a point of historical fact that the concept of "descent with modification" had been around for some time prior to Darwin. Victorian society was not hostile to the idea of evolution, which it saw as evidence of God's wisdom, in His having crafted natural law so as to give rise to the diversity of life.

Neither was the mechanism of natural selection original with Darwin. It too was a concept familiar to Victorian scientists. But natural selection failed to gain traction as a scientific idea, before Darwin and his propagandists took up the cause, because the scientists of the day perceived that it was inadequate to account for the diversity of life. Under the influence of an optimizing mechanism, such as natural selection, they reasoned, phenotypes should converge, not diverge, with the passing of generations.

Natural selection theory never has rested on solid scientific evidence or reasoning, although, by appealing to statistics and common prejudice, Darwinians grafted onto it the trappings of a science. As a result, the sequentially amended theory became almost infinitely elastic in its capacity to absorb anomalous findings. It managed consistently to re-describe "how nature works" in ways contrived to preserve a niche for itself in the explanatory scheme. From the time Charles Darwin foisted it upon the world, natural selection theory effectively served the ideological ends of diverse brands of racists and elitists, despite its lack of scientific rigor.

However, if we follow Bennett in rejecting natural selection as the primary engine of evolution, then we are left with a process minus any explanation as to how it works. We still have to account for evolution's particular outcomes. Bennett proposes to fill the void, but the mechanism that he nominates to serve as evolution's centerpiece arrives with its own baggage.

Saturday, April 19, 2014

"The new forces, elevating in their nature though they be, do not act upon the social fabric from underneath, as was for a long time hoped and believed, but strike it at a point intermediate between top and bottom. It is as though an immense wedge were being forced, not underneath society, but through society. Those who are above the point of separation are elevated, but those who are below are crushed down."

— Henry George, Progress and Poverty

Move over 1984 and Brave New World. Tyler Cowen conjures his own bleak, dystopian future for humankind. Writing fiction is not his intent, but we have to hope that fiction he writes.

Cowen is an economist at George Mason University. He achieved fifteen minutes of fame via a 2011 e-book, The Great Stagnation. The e-book created a buzz loud enough to grab the attention of a publisher with a printing press. Stagnation insinuated its way between hard covers, from where it continued to make the case that a low-wage, slow-growth economy is something that the world had better get used to. It’s the new normal.

Average is Over, evidently a hurried sequel to Stagnation, reprises Cowen’s message: Extreme income inequality is here to stay. We can't tax the rich, after all, because they have too many channels through which to transfer the burden to the middle class and the poor.

Cowen offers up education as a tool that sub-millionaires might use to elevate themselves economically, but such education as Cowen conceives of might more candidly be called instruction, or schooling, or obedience training. So, operating in a bimodal economy—one that concentrates wealth at the tippy top and diffuses poverty across the broad bottom of the barrel, with no middle between—how does the top dispose of the barrel bottom? Cowen seems to think that that’s a problem to be solved and that he has a solution:

“What if someone proposed that in a few parts of the United States, in the warmer states, some city neighborhoods would be set aside for cheap living?”

Cowen describes the housing in these cheap zones as being modest but not ramshackle, and it all seems fuzzily commonsensical. But then,

“We also would build some makeshift structures there, similar to the better dwellings you might find in a Rio de Janeiro favela. The quality of the water and electrical infrastructure might be low by American standards, though we could supplement the neighborhood with free municipal wireless (the future version of Marie Antoinette’s famous alleged phrase, “Let them watch internet!”). Hulu and other web-based TV services would replace more expensive cable connections for those residents. Then we would allow people to move there if they desired. In essence, we would be recreating a Mexico-like or Brazil-like environment in part of the United States, although with some technological add-ons and most likely greater safety.”

Ok, so let’s fill another flute with bubbly and kick back to get a good look as the great unwashed descend ever deeper into collective poverty. It’s a kind of spectator sport for the well-heeled, really. Let’s anticipate this decline in quality of life and contain the newly impoverished in ghettos modeled after Brazilian slums. If we call these habitats for impoverished humanity camps maybe they’ll seem almost recreational. Maybe FEMA would do a good job running these camps, keeping everything orderly and responding to emergencies. They might even cook up a motivational slogan. Maybe something like Arbeit Macht Frei.

No one should find the favela prospect objectionable, Cowen opines, after all, “no one is being forced to live in these places. Some people might prefer to live there. I might prefer to live there if my income were low enough.” He goes on to remind readers that some neighborhoods deteriorate naturally into shantytowns: “The end result is no different from the deliberate shantytowns already discussed.”

Sunday, February 23, 2014

A pair of insidious memes is making the rounds in the mainstream media. These memes have to do with your standard of living, and they declare that the "Great Recession" is here to stay. George Mason University economist Tyler Cowen calls the persistent economic anemia, "The Great Stagnation," and in his book of that title he argues that we had better get used to hard times. The second meme of the pair is a corollary and is one of the primary legs on which Cowen's argument rests. This meme concerns "the end of innovation." All of humankind's potential inventions evidently already are with us. Or, at least we've picked all of the technological "low-hanging fruit." Innovations currently occupying the pipeline consist of embellishments; they are not game changers, like the inventions of the last century.

This assertion about technological innovation admittedly invites debate, given the ongoing computerization of pretty much everything. Nonetheless, say the pessimists, for whatever reason, recent breakthroughs lack the economic potency of the technological breakthroughs of the early to middle twentieth century. Those innovations fueled deep and broad economic gains. More recent breakthroughs are of a different nature. They are hood ornaments glued onto tried-and-true technologies, and they concentrate their relatively meager economic gains selectively in the pockets of the already rich. Evidently, the "New Normal" applies only to the middle class and the poor. Cowen seems too cavalier about this implication of the new economic order: It's just the way the cookie crumbles.

Columnist Paul Krugman attributes much of the economic stagnation to a lack of demand for goods and services, essentially telling us that the legendary "job creators" won't be creating jobs any time soon, not until they see more cash in the pockets of prospective customers. If that's a major hold-up, keeping the economy from recovering, then putting cash in the pockets of consumers might be a good thing--the kind of economic stimulus that might work. If only that were the objective of the professional theoreticians and technicians "working" on the problem.

Despite these implications of Cowen's and Krugman's diagnoses, the prescription from both sides is simple: Get Used To It. Krugman glowingly cites economist and political advisor Lawrence (Larry) Summers as having arrived at the same conclusions about the New Normal and The End of Innovation. Now, there's a champion for the working bloke.

Push-back from the contrarians, those who reject the end-of-innovation premise, predictably cites the ubiquity of computing and communications technologies. The new devices have reshaped thoroughly the habits of consumers and big business alike. Computers streamline manufacturing and make comparison shopping as simple as swiping a touchscreen. The impact of the more recent technologies can hardly be said to pale next to that of the twentieth century's marvels. Nonetheless, the scope of their impact is easier to appreciate when one looks under the hood and sees that tiny little integrated circuit, or IC.

Saturday, December 21, 2013

Let's see. A protein normally inhibits expression of genotypic "mutations" (that is, their expression into variant phenotypic traits). But under stress, production of the protein itself becomes suppressed and, as a result, more of the genotypic "mutations" become phenotypically expressed. And those "mutations", which have been riding along silently in the genome of this fish, having originated we know not how, just happen to include variants (when phenotypically expressed) that are adaptive in the environment that produces the stress.

It's just too pat. Carrying around a reservoir of unexpressed mutations? Just for fun? Just "in case"? Just "by chance"? I don't think so.

Thursday, August 29, 2013

The scene at right occurs a few minutes into the episode, "Say My Name," in Season 5 of the cable TV show "Breaking Bad." In the screenshot, the lower right quadrant of the sky features four or more more-or-less parallel white streaks. Maybe stunt pilots just finished an air show. Who knows? Maybe the streaks are just oddly configured cirrus clouds. Nah. They can't be normal jet-engine contrails, because those dissipate too quickly. They wouldn't hang around long enough to share the sky with the next flights to come along in the same path. Hmmmm . . . . One thing these pop culture insertions accomplish is to help normalize the new sky. The camera doesn't linger, so we can't know whether or not in an hour or so these streaks unfolded themselves into feathery, striated, filamentous fields of rippled gauzy gray haze that finally swallow the sky.

Sunday, July 21, 2013

The Braindead Megaphone is George Saunders' metaphor for mainstream corporate news and related media. Saunders' problem in crafting the metaphor is that he aims at, and hits, the wrong target. The braindead voice that irritates him is not an organic creature, but a connivance. Saunders acknowledges as much in the title essay, at least implicitly, and thereby undercuts the metaphor.

Saunders acknowledges that the media's braindead incantations are agenda driven: "[. . . ] it's clear that a significant and ascendant component of that voice has become bottom-dwelling, shrill, incurious, ranting, and agenda-driven." So, he's got himself into a contradiction, though you have to untangle the essay to get a good look at it.

He continues, "It strives to antagonize us, make us feel anxious, ineffective, and alone; convince us that the world is full of enemies and of people stupider and less agreeable than ourselves; is dedicated to the idea that, outside the sphere of our immediate experience, the world works in a different, more hostile, less knowable manner. This braindead tendency is viral . . . ."

Why would he characterize the deliberate undermining of an accurate picture of the world as reflecting braindeadness? The propaganda ministers who employ the megaphone are hardly braindead. How about crazy like a fox?

Saunders aims too low. He mistakes the messenger for the composer, the program for the programmer, the on-air personality reading a teleprompter for the executive directors of the telecommunications conglomerate who employ the teleprompter reader. The collage on the cover of the first paperback edition makes clear his target: the on-air script readers.

But these poor toilers are just reciters, pieces of the corporate complex's manufactured public face. Why does Saunders fail to target the people who write the scripts that the on-air personalities are forced to read? Or their bosses? Or theirs? He calls the on-air personalities "informants," as if they possess information and share it in a spirit of civic mindedness, in a marketplace of ideas, hobbled only by their stupidity. Oh, and the profit motive. But they don't possess anything like information. They possess a knack for projecting an amiable facade. Witting or unwitting, they serve nonbraindead masters.

Sunday, July 14, 2013

You are looking at a photo of a sheet from the June 9, 2013, edition of the StarTribune newspaper, published in Minneapolis, MN. Left-hand page is page A8. Right-hand page is page A5.

Notice the advertisement that occupies the bottom half of page A5. It is an airline ad for Air France.

The grid of trails in the sky evokes patterns commonly associated with chemtrails, with the ropes from the swings contributing to the aerial coverage. The ad does not include any headline or body copy that references playground activity, vacationing, family travel, or anything to create a context for the kids on swings. They are context-free props, but for the chemtrail ropes. Since this ad ran, the skies over the Minneapolis-St. Paul area have received extensive, repeated trail coverage.

What went through the mind of the graphic designer who laid out this ad, or the art director who approved it, or the agency rep who sold it to Air France? It is implausible to suppose that the reference to chemtrails is unintentional. If similar ads appear where you live, you might want to prepare for a heavy dose of heavy metals.

A common rebuttal to warnings about chemtrails is that they are equal-opportunity toxifiers, that not even the perpetrators could avoid inhaling the contents of the sprays. It might be that, through advertisements, the perpetrators signal their cohorts as to where the whammy will fall, giving them fair warning to take a vacation or other leave of absence.

Sunday, April 28, 2013

This frustrating exposition, after all, yields a satisfying conclusion. Nagel frustrates with his opaque writing style, and, nearly as frustrating, he rejects the NeoDarwinian account, when it comes to the origin of consciousness, rationality and value—yet retains it in his teleological alternative. That is, he schizophrenically honors the heart of the Darwinian doctrine, natural selection, while insisting that the doctrine is almost certainly wrong. Go figure.

Nagel's criticisms echo those of Jerry Fodor and Massimo Piattelli-Palmarini in their book, "What Darwin Got Wrong." But Nagel dismisses the centerpiece of their attack—the incoherence of natural selection theory—cavalierly, to my mind, opining in a footnote that they misinterpret the theory. Really? How so? Now, that would be worth reading.

Which is not to say that the present book is not worth reading. When someone of Nagel's stature presents secular objections to the NeoDarwinian paradigm, feathers are bound to fly, as they do in any number of critical reviews of the book. Science seems to feel pressured to circle the wagons around the Darwinian account no matter the veracity of the counterarguments. It's time to take a break.

Conventional thinkers who keep natural selection theory at the top of their list of explanatory tools can use it to explain any aspect of organic nature. They need only contend that whatever is observed, say consciousness and rational thought, or blue feathers and big beaks, is as it is because that phenotypic trait was "more adaptive" than the alternatives expressed in the ancestral population. If in some instances that explanation seems implausible, then the explanation is “genetic drift.”

In any case, Nagel sidesteps this catch-all application of Darwinian reductionism by pointing out that nature from the outset must have had the potential to sprout living beings with minds. Nothing in conventional scientific thinking accounts for this potential inhering in nature. The prospect transcends the materialist, NeoDarwinian paradigm, or at least that's Nagel's contention.

Sunday, March 31, 2013

The United States is a Christian nation, right? That's what the TV preachers tell us. Christianity might be good enough for the masses who toil in the private sector, but their masters who rule from the public sector look to the classical world for inspiration.

This struck me during a visit to the Minnesota state capitol building in St. Paul. The artwork inside the building's rotunda is uniformly and inescapably pagan. I take responsibility for the poor quality of the photos that follow, but the inherent pagan imagery is clear. Nothing Christian to be gleaned.

Outside the capitol building is the iconic gold-leafed copper and steel statuary group, "Progress of the State." According to the Minnesota Historical Society, "The four horses represent the power of nature: earth, wind, fire and water. The women symbolize civilization and the man standing on the chariot represents prosperity." Why should secular government pay homage to the elementals?
(image from the MN Historical Society web site)

And if you're visiting Minnesota's twin cities, be sure to check out the lower level of the old Hennepin County courthouse in downtown Minneapolis. There you'll find this impressive statue of the Father of Waters, a personification of that venerable elemental force, the Mighty Mississippi.

A visit to the Wisconsin state capitol, in Madison, confronted me with similar imagery in the form of statuary, shown below.

Where in all this are Adam and Eve? Noah and the ark? The exodus from Egypt? The arrival at the promised land?

Where's the immaculate conception? The loaves and fishes? The crucifixion and the resurrection?

Isn't the United States a Christian nation?

I suppose that the dearth of Biblical imagery in seats of government has something to do with the notion of a Constitutional wall that separates church and state. And the reason pagans get a free ride might have to do with that word, "church," as opposed to, say, "temple." Curiously, Mormons maintain both churches and temples, covering the bases, I suppose.

In any case, it's clear that Christianity is good enough for Joe and Josephine Blow, but paganism the elite reserves for its own veneration.

Saturday, March 09, 2013

August 4, 2013, update to this post: Narrative Science is a company that helps businesses communicate by turning data into stories. The company's software program, Quill, "is an artificial intelligence engine that generates, evaluates and gives voice to ideas as it discovers them in data." The company's website elaborates:

"Quill imports your data and builds an appropriate narrative structure to meet the goals of your audience. Using complex Artificial Intelligence algorithms, Quill extracts and organizes key facts and insights and transforms them into stories, at scale. Quill uses data to answer important questions, provide advice and deliver powerful insight in a precise, clear narrative."

Once upon a time, there was a public policy think tank at the University of Virginia, called the Miller Center. In October 1998, the Center invited historians, editors and journalists to mull over “the state of the art” in political history. There was a problem to be addressed, which apparently was that historical narrative had slipped through the hands of its rightful authors. And with it went those failing authors’ control of public sentiment. Implied in the conference’s mission was the felt need to figure out not only how this situation came about, but also how history’s rightful authors could recapture the historical narrative. Controlling narratives, people in high places seem to believe, facilitates controlling mass behavior, making it a subject worth studying.

The Conference on Contemporary Political History kicked off with remarks from then director of the Miller Center, Philip Zelikow. (A few years later, during a leave from his directorship, Zelikow kept on a short leash the Kean-Hamilton Commission, also known as the 9/11 Commission, which he steered as its staff executive director.) In his opening remarks to the history conference, Zelikow laid out his concerns regarding contemporary political history:

“‘Contemporary’ is defined functionally by those critical people and events that go into forming the public’s presumptions about its immediate past. This idea of ‘public presumption’ is akin to William McNeill’s notion of ‘public myth’ but without the negative implication sometimes invoked by the word ‘myth.’ Such presumptions are beliefs (1) thought to be true (although not necessarily known to be true with certainty), and (2) shared in common within the relevant political community. The sources for such presumptions are both personal (from direct experience) and vicarious (from books, movies, and myths).”

“First, public presumptions can be ‘generational.’ They are formed by those pivotal events that become etched in the minds of those who have lived through them [. . . ]. The current set begins in approximately 1933, although the New Deal generation is fading. The Second World War and Vietnam, however, continue to resonate powerfully.

“Second, particularly ‘searing’ or ‘molding’ events take on ‘transcendent’ importance and, therefore, retain their power even as the experiencing generation passes from the scene. In the United States, beliefs about the formation of the nation and the Constitution remain powerful today, as do beliefs about slavery and the Civil War. World War II, Vietnam, and the civil rights struggle are more recent examples.

“Third, public presumptions often concern 'dramatic stories plucked out of time,' such as the Alamo, Pickett’s Charge, or the Titanic.

“Fourth, some public presumptions gain currency because they have a particular resonance for us today, either because they invoke powerful analogies to the present [. . . .] or because they offer a causal link and seem to explain ‘why we are the way we are today.’ Taken together, we see that presumptions that remain ‘contemporary’ are—with few exceptions from the 18th and 19th centuries—events and episodes from the last 60 years.”

Depending on which conspiracy theory one subscribes to, that of the Bush administration and the Kean-Hamilton Commission or some variety suggested by the 9/11 Truth Movement, it takes little imagination to perceive in Zelikow’s assessment a premonition of events to come, his remarks about searing events taking on transcendent importance being uttered just three years prior to the attacks of 9/11.

And nowhere in his comments will one find any concern that public presumptions should reflect history accurately.

Sunday, March 03, 2013

The cover, which appears on Discover's March, 2013, issue, looks like something dredged from a 1950s science-fiction serial. A guy in a suit lifting off to his next sales meeting represents Evolution's Next Stage? Does it get any more pedestrian, or vintage, than that? What's the next breakthrough—lawn-mowing robots?

America's visionary futurists peered over the edge—and all I got was this lousy jet pack.

Is it really plausible that the editors behind this popular science magazine can't envision a future grander than one that might have dazzled them in grade school? Or, maybe technological deprivation is an idea that media owners would like us to get used to.

The cover image is so dated that its cheesiness looks insidious. Is it an artifact of the project to deliberately dumb us down? Another exercise in normalizing middle-class expectations of a stagnant-to-declining standard of living?

Surely the technological imagination can conjure something more exciting than this to sell us as Evolution's Next Stage. The next stage in evolution? Try THIS.

Sunday, January 27, 2013

Lecture: Evolution in Four Dimensions
Click the image for an excellent talk by Eva Jablonka, in which she describes provocative new findings in epigenetics and animal behavior. The findings move natural selection farther toward the periphery of evolutionary theory. Phenotypes, as they differentiate during evolution, seem to self-organize, as do the differentiating cells in a developing organism. My contention is that evolution and development resemble one another because they are two appearances of the same process, which is development.

If one could lay Darwin's concept of natural selection next to the current model of evolution, one would have a hard time finding much in common between them. Outside of the vaguest generalization of the evolutionary process, captured in Darwin's phrase, "Descent with modification," nothing much of the original formulation survives to contribute to the current model, the so-called Extended Synthesis.

Epigenetics, niche construction, phenotypic plasticity and other intriguing new developments in bioscience sit at the center of that synthesis and throw into question the foundations of the Darwinian model.
Despite evolution theory's provisional character, however, the fossils record descent’s modifications, and they taunt us: What during descent accounts for these modifications?

Saturday, November 10, 2012

If you believe a lie then learn you’ve been lied to, you feel betrayed, embarrassed and angry. Maybe you’re not the shrewd operator, after all. Maybe you’re just a chump.

This realization produces a discomfort that psychologists call cognitive dissonance. It happens when a person holds beliefs that are incompatible, as in “I am a shrewd operator” and “I’ve been chumped.” A wave of cognitive dissonance is washing over the U. S. electorate, as voters face up to their chumpdom.

The electorate is seeing that political and economic fundamentals rest on something other than which party perches on the branches of the U.S. government. Wings left and right fly in and out of office, but election wins of neither camp deflect entrenched trajectories:

U.S. troops and their private-sector surrogates occupy more foreign turf.

Our sources of sustenance become increasingly adulterated.

My friends who navigate the evening news from the cargo bay of the left-right circus-train explain away the immunity of these trends to election results. Like UFO cultists whose deadline for the messianic rescue has come and gone, they circle the wagons tighter. It was a fluke, they say--the next time our team wins, that’s when things will change.

Fortunately, self-deception has its limits.

Philosopher of science Thomas Kuhn found in the history of science examples of self-deception's limits, and he described what happens when those limits are breached. In The Structure of Scientific Revolutions, Kuhn pointed out that when scientific experiments produce unexpected results (and eventually they will), scientists explain away the anomalies, somehow or other, to preserve their foundational beliefs, what Kuhn called their paradigm. Scientists accommodate the anomalous results by amending the paradigm. In the above example, if “I am a shrewd operator” is the prevailing paradigm, and “I’ve been chumped” is an anomalous occurrence, then “I must have had a bad day” might be an adequate amendment to accommodate the anomalous data while preserving the paradigm.

But a paradigm can be stretched only so far. Eventually the anomalies, such as strings of coincidences or runs of bad days, accumulate beyond the elastic threshold of the paradigm, and it snaps. When that happens, the paradigm’s adherents convert or, eventually, die off, and a new generation adopts a new paradigm that accommodates the anomalous data.

The Structure of Political Revolutions.

Politics, like science, rests on foundational beliefs, or paradigms. A majority of the electorate, for example, seems to believe that the ruling class comprises more or less decent human beings who act in good faith (more or less) toward the ruled. But when outcomes contradict this belief, when its expectations are not met, then the electorate stretches the paradigm to accommodate apparent acts of bad faith, such as, for example, a pile of broken campaign promises.

A generally accepted amendment grants that our elected officials and their advisers are beneficent actors who act in good faith but are stupid. (This is the honest-but-stupid hypothesis.) Maybe they make big mistakes, because deep down they are boobs, or, more charitably, they just lack sufficient smarts to figure out how to keep their promises. The economy, the Middle East, the environment—it’s all so complicated. Sometimes things go wrong. The rulers mean well, but they bungle. Dang. Or, maybe they’re honest, well meaning and competent, but cowardly when opposed. (This variant is the honest-but-spineless hypothesis.)

Armed with these amendments, adherents of the prevailing paradigm can mitigate the dissonance created by anomalous outcomes. They can still gather and console themselves under the old paradigm. But the paradigm must have an elastic threshold; it cannot accommodate an infinite number of anomalous outcomes and amendments.

Sunday, September 23, 2012

The issues are in the tissues.

A massage therapist once shared with me that piece of trade wisdom. She was making a point about the interplay between mental and physical discomforts. She was not the first to link the two.

In the 1960s and '70s, the Scottish psychiatrist R. D. Laing, following on from Freud and Jung, proposed that earliest experiences in life inform not only personal psychopathologies but also the myths of the tribe. He was interested particularly in myths that share a pattern with prenatal and early postnatal developmental histories and suggested that the mythic tales might recapitulate the adventures of zygotes, embryos, and fetuses.

Uterine Endometrium Adopting The Newly Arrived Blastocyst

In The Facts of Life, Laing cites the story of King Sargon. As an infant, the future king was placed by his parents in a reed basket and sent downriver. He drifted until Akki, the gardener, rescued him, and Akki then raised the foundling to adulthood. The adult Sargon then stepped into the world to become king of Akkad. Laing, as have others, pointed out the parallels between this story and that of the biblical Moses. The infant Moses also was placed by his parents in a basket and set adrift, was adopted and raised by his rescuer and eventually left the home of the rescuer to assume a position of prominence. Laing could have included among his examples a more contemporary hero, Superman, whose parents placed him as an infant into a "basket"—a rocket ship—and sent him downstream—through interstellar space—from the doomed planet, Krypton, to Earth, where the Earthlings Jonathan and Martha Kent discovered and adopted him. The Kents raised the infant, whom they named Clark, to adulthood. The adult Clark Kent then relocated from Smallville to Metropolis, where he assumed his superhero identity.

The stories recapitulate the journeys and development of the zygote, embryo, and fetus, suggested Laing. That is, the stories parallel the prenatal story: A zygote, encased in a membrane, called the zona pellucida, travels downstream—down the fallopian tube—until, as a blastocyst, it is adopted by the uterine endometrium. Attached to the uterus, it matures into a fetus, and at the requisite time it is born into the world.

Laing also cites correspondence between Freud and Jung in which the psychoanalysts discussed another natal motif, that of the doppelganger, the hero's atrophied and subordinate twin. Examples of hero/doppelganger pairs include Gilgamesh and Enkidu, Romulus and Remus, and Don Quixote and Sancho Panza. Cain and Abel could be cited. The psychoanalysts interpreted the weak or unfortunate twin as representing that lost, discarded companion, the placenta.

Although Jung is most associated with the notion of a collective unconscious, Freud, too, toyed with the idea of a sort of phylogenetic memory. In a manuscript he shared with colleague Sandor Ferenczi, published posthumously as A Phylogenetic Fantasy, Freud speculated that some psychopathologies developed as reactions to conditions encountered by our ancestors during the ice ages.

From DNA to Phenotypes

The arrival of the first neurons provides the embryo, prospectively, with a mechanism that can record experiences, that is, with a memory. But the developmental stages prior to the appearance of the first neurons must record experiences by other means (if at all). Some other means must also come into play if phylogenetic memory is to be a realistic prospect.

Saturday, August 18, 2012

The guardians of intellectual propriety at RationalWiki.org (un)kindly have taken notice of the star larvae hypothesis. Here are some of the things that you can learn about the hypothesis by reading its entry on the wiki.

It is a creationist hypothesis. (The star larvae hypothesis has no use for, nor does it address, supernaturalism, so how it qualifies as creationism is hard to figure, unless creationism has an infinitely elastic definition.)

Its mix of ideas includes “religious creationist arguments” and “paranormal topics.” (The hypothesis includes creationist arguments only insofar as it includes arguments that are critical of the theory of natural selection, but the criticisms of natural selection have no root in any religious consideration. The hypothesis has no use for, nor does it address, paranormal topics, unless one is using “normal” in the Kuhnian sense, in which case any reference to anomalous data is a reference to something “para”normal.)

It is guilty of “quote mining and misrepresenting the Gaia hypothesis and panspermia ideas of Fred Hoyle.” (The hypothesis cites sources in the ordinary way that such presentations do. If there’s any mining, it’s in the sidebar quotes, but those are for color. They’re not essential to the hypothesis. The accusation of misrepresentation is strange, but mudslingers tend not to aim very carefully.)

It denies macroevolution and claims there are no transitional fossils. (This characterization could be made only by someone who has not read, or understood, the hypothesis.)

And so on goes the entry, into assertions about the author’s lack of relevant education, his religious fundamentalism, and his paranoia about “a conspiracy by the scientific community to deny his hypothesis”. Actually, the scientific community doesn’t seem to have taken any notice of the hypothesis; I’d be thrilled if a scientist took time to bash it.

I can’t help but ponder the P.R. cliché about no publicity being bad publicity. Since the wiki entry appeared, visits to the star larvae site have ticked up a bit.