How the next apocalypse will be caused by this instance of science run amok

I suppose we can thank Wired magazine for adding "technology entrepreneur-heroes" to Number Two above. I love Wired magazine, and I'd really love to write for Wired, but I gotta tell you, the feeling that I'd have to include some variant of Number Two in every story I'd query makes me hold my nose and write for my blogs instead. I wonder if Wired deliberately chose this narrative path, a variant of the cyberpunk anti-hero cowboy, jacked into the system, or if it fell into it with the tech boom. It's been copied by other entrepreneur-hero magazines ad nauseam (Fast Company, Business 2.0).

So if mainstream media is screwing up its coverage of science and producing pathetic science writing, I'd hope bloggers, many of whom work in technology fields, would do something to correct the larger errors in those mainstream narratives.

This is for those bloggers who enter these spaces not to write endlessly about themselves, but rather, to use their blogs as a way to SHARE what they find interesting in the world, to say "Hey, lookee here!" and let their blogs be a way to interact with the wonder and silliness and oddness and create or enter watercooler conversations about these things (Boing Boing has too much oddness and not enough conversation, if you ask me).

Translating science into something relevant and understandable to a wider audience could bring real joy to bloggers who love and are fascinated with science, even if they've had to face the fact that most jobs in science don't allow them to indulge very much of the wonder that first drew them to the field. Yeah, I'm talking Popular Mechanics readers, Wired readers, sure, but ANY workshop tinkerer, ANYONE who loves to read and engage with how things work. Science is NOT impossible arcana. It can be accessible to those who don't have precise disciplinary specializations.

Just as I preach the need for public intellectuals in the Blogosphere, I also hope this space can help popular interdisciplinary science to thrive, so that it becomes an avocation that doesn't stop when you discard your childhood chemistry set, or rock tumblers, or that fun build-it-yourself radio kit.

Yeah, that was me as a kid. Somewhere along the line, someone in school (maybe me) decided my aptitudes ran toward the humanities (they surely do), but I never fell out of love with science and curiosity and a desire to invent. I know there are others out there like me.

Michael Bugeja has some advice below for would-be science writers, and by extension, bloggers. Bloggers have an opportunity to do mainstream media one better, because they don't have the commercial imperatives of advertisers to force every science story into one of those three godawful narratives. Bloggers are expanding this world of discourse, so I'm hoping they can expand on these narrow-minded science story templates.

But if bloggers excerpt and condense these stories, they run the same risks as journalists. They can reduce things to sound bites, decontextualize, and oversimplify just as badly.

Sound Science or Sound Bite?

I direct a journalism program at a science-oriented university where
my colleagues are modern-day alchemists, turning corn into fuel,
conjuring twisters in wind tunnels, or morphing visitors at our virtual
reality lab into plant cells during photosynthesis.

These professors rank among the most ingenious, passionate people I have ever met.

Put some of them in front of a reporter, however, and all bets are off.

Being misquoted in the media is commonplace, especially when the
topic concerns science. Depending on the error, a quotation out of
context can catapult a scientist into the national spotlight where the
person gets to clarify the remarks and do it again, only this time for
a mass audience.

[...]

Journalists, of course, are partly to blame for overselling science.
True, big national newspapers and broadcast outlets have seasoned
correspondents. Science happens everywhere, including college towns
like mine, Ames, Iowa, where agricultural biotechnology is on display
in fields and on shelves of supermarkets. Many reporters who cover
science do not fully grasp it, interviewing sources with polar
viewpoints on genetically modified products or exotic animal diseases.

My colleagues diagnose mad cows. Reporters love mad cows because the
beasts in question have or do not have the disease. Better yet, we eat
on average 67 pounds of beef annually per person, ensuring the story
will be read. But the science of immunohistochemistry to test for
bovine spongiform encephalopathy at the U.S. Department of Agriculture
laboratory is, on occasion, an arcane topic for the reporter who also
does restaurant reviews.

[...]

To put this into perspective, consider this: The scientist who
visited my university and who reportedly made that comment happens to
be the same person who wrote the essay, titled, “Creation Myths: What
scientists don’t — and can’t — know about the world” in the journal In
Character. His name is Robert Hazen, author of the extraordinary book, Gen-e-sis: The Scientific Quest for Life’s Origins, and a professor of earth science at George Mason University.

Read Hazen’s book, if you haven’t already. When you do, you realize that his comment as reported in the Ames Tribune
actually is based on the molecular fossil record. Most reviews of his
work note how fair and balanced his theories actually are.

You can’t deduce that, however, by reading the 387 words in the
story about his talk at Iowa State University on February 3, 2006. You
need to glean the 339 pages of Hazen's hardcover book.

And in this numerical comparison is also the problem at hand.

Bites from Books

Below are some of the most influential books that helped shape a century of science, according to The American Scientist,
the magazine of the Scientific Research Society. To illustrate my
point, I have reduced each work’s premise or conclusion into a sound
bite — an excerpt taken out of context — [...]

What would be the outcome, I wondered, if reporters attended
lectures by authors of these great books, quoting them out of context
in the year of publication, given the social mores of those times?

Aldous Huxley, The Doors of Perception & Heaven and Hell
(1954): “Although obviously superior to cocaine, opium, alcohol and
tobacco, mescaline is not yet the ideal drug. Along with the happily
transfigured majority of mescaline takers there is a minority that
finds in the drug only hell or purgatory” (p. 66).

Pierre Teilhard de Chardin, The Phenomenon of Man (1959):
“[M]an is seen not as a static centre of the world—as he for long
believed himself to be — but as the axis and leading shoot of
evolution, which is something much finer” (p. 36).

Rachel Carson, Silent Spring (1962): “Future historians
may well be amazed by our distorted sense of proportion. How could
intelligent beings seek to control a few unwanted species by a method
that contaminated the entire environment and brought the threat of
disease and death even to their own kind?” (p. 8.)

Benoit B. Mandelbrot, Fractals (1977): “Why is geometry
often described as ‘cold’ and ‘dry’? One reason lies in its inability
to describe the shape of a cloud, a mountain, a coastline, or a tree.…
Mathematicians have disdained this challenge, however, and have
increasingly chosen to flee from nature by devising theories unrelated
to anything we can see or feel” (p. 2).

Jane Goodall, In the Shadow of Man (1988): “Who knows what
the chimpanzee will be like forty million years hence? It should be of
concern to us all that we permit him to live, that we at least give him
the chance to evolve” (p. 252).

Steven Weinberg, Dreams of a Final Theory (1992): “If
there is a God that has special plans for humans, then He has taken
very great pains to hide His concern for us. To me it would seem
impolite if not impious to bother such a God with our prayers” (p. 251).

Denise Schmandt-Besserat, How Writing Came About (1996):
“[W]riting emerged from a counting device.… Each change of reckoning
device — tallies, plain tokens, complex tokens — corresponded to a new
form of economy: hunting and gathering, agriculture, industry” (p. 122).

If you have read these books, you realize that the above
citations require substantiation. Those excerpts make great pull quotes
in print or sound bites on air. However, taken out of context, they
also provoke as much as inform. That is why I caution scientists to at
least qualify similar remarks with humbler disclaimers, especially if
they believe passionately in their assertions.

[...]

[Robert Hazen says]

“So what’s a scientist to do? My approach is to explain three things:

“First, describe what we think we know about the topic (and, if
possible, provide a little background about the measurements and theory
that support that knowledge). How do we arrive at our conclusions?

“Second, explain what we DON’T know about the topic, including the
uncertainties, the controversies, and a sense of how much weight to
place on different ideas. It’s always best to be honest about our
imperfect state of understanding.

“Third, and equally important, explain what we’re doing to find out more.”

According to Dr. Hazen, science is a never-ending adventure.

I feel the same way about journalism.

[...]

Michael Bugeja, who directs the journalism school at Iowa State University, is author of Interpersonal Divide: The Search for Community in a Technological Age (Oxford University Press, 2005).

I like the idea of thinking about this, although I don't know if I like the writer's prescription of how to go about it. I don't want to necessarily become a neo-Victorian (thanks Neal Stephenson!) keeping my tidy Commonplace Book (what the devil is a blog, anyway?).

But I do think too many of us read like students and multi-taskers: skimmers, in other words. The beauty of a beach novel is that you can pause after a paragraph and stare into space, ponder an idea the passage raises, reflect, question, argue in the margins. But how many so-called "beach novels" are worthy of such a ponderous approach? And are you ready to take Middlemarch to the beach?!

More appropriate, I think, for blog authors and would-be blog authors, would be to reflect on what makes a piece of writing valuable outside of the moment that spawned it, that might even make it stand up longer, make it stand the test of time.

I had a non-fiction workshop teacher who urged us to think about our long essay-like pieces as an attempt to be THE definitive take on that particular topic. Now how's that for something to shoot for?

Bloggers tend to respond so much to the moment, and there's nothing wrong with that. But Thomas Paine was also responding very much to his moment, and his little pamphlet surely has stood the test of time. We could say the same about Ben Franklin's little aphorisms, or even what he wrote under the throwaway name "Silence Dogood." (Of course Ben Franklin also believed in his own aphorism, "Fart Proudly," which I think conveys both the right degree of irreverence and timeliness about those things we might wish could stand the test of time.)

Learning How to Read Slowly Again

The demise of print looks as if
it will be a long, drawn-out affair. John Sutherland, the chairman of
last year’s Man Booker Prize Committee, offers an arresting statistic:
Today more novels are published in one week than Samuel Johnson had to
deal with in a decade. As he calculates it in “How to Read a Novel,” it
would take approximately 163 lifetimes to read the fiction currently
available, at the click of a mouse, from Amazon.com.

So what to read? That’s the question. But as Mr. Sutherland’s title
suggests, there’s a second question entangled with the first, addressed
in several new books devoted to the lost art of reading. It’s a
Malthusian problem. The amount of printed material increases
exponentially, but the time available for reading remains static or, in
many cases, decreases arithmetically. So once we have decided what to
read, the question then becomes, How to read? And the paradoxical
answer is, Much more slowly.

In “Reading Like a Writer” the novelist Francine Prose
shows how to do it. She forces the act of slow reading by singling out
excerpts from her favorite writers and zeroing in on single words, then
sentences, then paragraphs, teasing out the specifics that transmute
raw language into style and an artistically meaningful form. She has a
notion, quite correct in my experience, that all readers start out
slow, savoring individual syllables and words. Gradually, under
pressure, they speed up, consuming more but enjoying and absorbing less.

Reading becomes information processing. The sheer bliss of the
childhood reading experience comes to seem like a lost Eden, recaptured
only in thrilling fits and starts or when time, mercifully, stands
still. Prison and vacation make good readers.

Ms. Prose sets out to rewire the reader’s circuitry and get the
electricity flowing the right way again. She has excellent taste, and
she picks fights, which is fun. She heaps scorn, for example, on the
standard advice that a writer should show rather than tell. She also
admits to a prejudice against using brand names in fiction. It’s the
lazy writer’s way of placing a character or establishing a social
setting. Nothing can date a work more quickly, she writes, “than a
reference to a brand of bed linen that no longer exists.”

This argument raises an intriguing question. Balzac and Dickens did
not rely on brand names, but they did minutely describe clothing to
indicate social status and character. Like obsolete brand names, these
styles and, in many cases, the articles of clothing themselves have
become extinct. Only period experts understand the meaning of clothes,
carriages and interior decoration in the world of Turgenev or Flaubert.
What’s a literary realist to do?

[...]

These impediments do not figure at all for Edward Mendelson, who
holds seven classic novels up to close moral scrutiny in “The Things
That Matter.” Each book is chosen because it sheds light on a
significant stage in human life, beginning, naturally, with birth (Mary
Shelley’s “Frankenstein”) and ending with death, or at least the uneasy
prospect of a future minus us (Virginia Woolf’s “Between the Acts”). In
between, Mr. Mendelson, a professor of English and comparative
literature at Columbia University, tackles “Wuthering Heights,” “Jane Eyre,” “Mrs. Dalloway,” “Middlemarch” and “To the Lighthouse.”

All seven novels are by women. Three, perversely, are by one author,
Woolf. Mr. Mendelson defends his choices in a rather sophistical
introduction, but then gets right down to the heavy work of close
reading. He can be oppressively earnest. “The Things That Matter” can
seem like an endless sermon or a higher form of Cliffs Notes (“ ‘Jane
Eyre’ records a journey out of a childish world into an adult one and a
journey out of inequality and into equality”), but the author’s shovel
work generally turns up riches. He takes the reader deep into the moral
universe of his authors and pulls together thematic threads with
extraordinary skill. He is a good reader. Not my kind of reader,
perhaps, but he thinks books are important and reads them as if his
life depended on it.

So do the 55 contributors to “You’ve Got to Read This Book!” That’s
how excited they are about “the book that changed their life.” They
need an exclamation point to express it.

There’s something profoundly depressing about seeing “The Seven
Habits of Highly Effective People” listed as someone’s No. 1
life-changing reading experience. But so it was for Lisa Nichols,
described as “a motivational speaker, personal coach and the founder
and C.E.O. of Motivating the Teen Spirit.” Uplift and go-to-it
entrepreneurship trump Virginia Woolf and George Eliot, although a few
fiction titles, like “To Kill a Mockingbird” and “A Tree Grows in
Brooklyn,” make the grade. Otherwise it’s “The Science of Getting Rich”
and “How to Make Millions With Your Ideas.” Maybe the Europeans are
right about us, after all.

I'm not a natural science fiction fan, or I didn't come to it naturally; my M.F.A. biased all my reading toward the literary side. I came in through cyberpunk, during another grad program.

But the story of James Tiptree Jr. being a pseudonym of Alice Sheldon held only mild interest for me when I first heard it, some years ago. This Salon article by Laura Miller opens up a whole new side of things, and I'm totally fascinated. Maybe it's how Miller constructs it, but like Tiptree's correspondents, I may even be seduced.

Stranger than science fiction

Before JT Leroy there was James Tiptree Jr. -- the writer and alter
ego of Alice Sheldon, a beautiful woman who struggled under the weight
of her talent, depression and sexuality.

By Laura Miller

Aug. 10, 2006 |
People are understandably fascinated by the lives of great artists. We
scrutinize them for the formative experience or the light-bulb flare of
inspiration -- whatever it is that pushes a human being beyond the rim
of the merely good and results in a work for the ages. But in a way,
the lives of the near great are just as illuminating. They're more like
us in both their fears and their limitations, and they're often better
at showing us where the threshold is by not quite managing to cross it.
With them, you can see the precise point when nerve failed,
perseverance ran out, vision faltered.

Take the case of James Tiptree Jr., who for a few years during the
heyday of science fiction's "New Wave," in the 1960s, ... The reclusive Tiptree carried on involved,
intimate correspondences with at least a dozen other writers and
editors. They knew that their friend had gone on safari in Africa at
the age of 6, learned to fly a plane and shoot a gun, worked for
military intelligence during World War II and for the CIA afterward,
published a short story in the New Yorker and obtained a Ph.D. in
clinical psychology. What they didn't know was that he didn't exist, or
not exactly. The person writing under the name James Tiptree Jr. was
actually Alice Sheldon, a woman in her 50s, living with her husband in
suburban McLean, Va.

[...] Yet "James Tiptree, Jr.: The
Double Life of Alice Sheldon" offers a rich exploration of the
attractions and perils of writerly personas, ...
Alice Sheldon, as Phillips portrays her, was a woman who struggled all
her days to do justice to her own knotted and painful experience of
life; she came closest in Tiptree's fiction. But this biography conveys
the pervasive sense of a gift thwarted on the verge of consummation,
and Phillips' meditations on why that happened make this book
exceptional.

What's particularly
provocative about James Tiptree is that almost everything "he" told his
epistolary friends about himself -- down to several passionate but
doomed infatuations with unavailable women -- was essentially true.
Sheldon lived an extraordinary life, and was a woman of immense charm,
intelligence and talent. Yet somehow, she needed the mask, or rather
the alter ego, of Tiptree to write her best fiction. When Tiptree's
real identity was discovered ...
nothing she wrote afterward "was ever as direct, honest and exciting as
her work before she was exposed."

The most difficult
and preoccupying relationship in Sheldon's life was with her mother,
and it's not hard to see why. Mary Bradley was a popular author (she
supported the family with her writing when her husband's business
interests faltered during the Depression), a glamorous Chicago
socialite and a fearless adventurer.

[...]

In a letter,
Sheldon described her mother as "a kind of explorer-heroine, highly
literate (Oxford & Heidelberg), yet very feminine whatever that is.
You help her through doors -- and then find out she can hike 45 miles
up a mountain carrying her rifle and yours. And repeat the next day.
And joke. And dazzling looks ... I am still approached by doddering old
wrecks, extinguished Scandinavian savants and what have you who want to
tell me about Mother as a young woman."

... Sheldon would spend
most of her 72 years trying to figure out how to be a woman. A chief
obstacle was her own mother's manifest success at doing whatever she
wanted while remaining "feminine whatever that is." Sheldon, who
accomplished enough in her time to make the child of a more ordinary
mom feel exceptional, wrote that her mother "didn't provide a model for
me, she provided an impossibility."

[...]

Sheldon and her
mother were very much alike -- but not exactly ... As a
stylish debutante, she was photographed by admiring society
journalists. Then she eloped with a bad-boy poet to live the boho life
of a painter in 1930s California. Six stormy years of marriage ended in
divorce, whereupon Sheldon joined the Army as one of the first WACs.
She got into the burgeoning intelligence field known as
photointerpretation (studying aerial reconnaissance photographs for
enemy installations and activity). Stationed in Paris, she challenged
an Army colonel to a game of chess, played blindfolded, beat him and
shortly thereafter married him.

... Alice returned to
the U.S. and the couple spent a few quiet years running (of all things)
a chicken hatchery in New Jersey. In the 1950s, they moved to
Washington to work for the CIA. Ting ranked high enough to sit in on
National Security Council meetings with the president, but Alice soon
got tired of photointerpretation and went back to school to study
clinical psychology. She eventually earned her Ph.D., studying the
effect of novelty on lab rats, and struck up a lifelong correspondence
with the great psychologist Rudolf Arnheim.

Sheldon had loved
pulp science fiction... but didn't make a concerted attempt to write it until
she was past 50, when research psychology was turning out to be as hard
to stick to as anything else she'd tried. She picked the name James
Tiptree as a lark, inspired by a jar of Tiptree jam in a supermarket...

[...]

This new s.f.,
Phillips writes, aimed for "real characters, atmosphere, social
criticism, style" at a time -- the 1960s -- when speculation about
social change was in the air. Tiptree's first important story, "The
Last Flight of Dr. Ain," coolly recounts a multistop international
journey by a doctor who is in love with a mystical female vision of
Earth. It gradually becomes clear that he's intentionally spreading a
lethal influenza virus as he goes, wiping out the human race to save
the planet.

[...]

Tiptree's stories
fused themes of sex, death and alienation in ways that many of his
readers hadn't encountered before. "I read the first two sentences and
felt like I'd fallen off a high tower," one critic wrote. Tiptree's
fiction gained a following, and the persona blossomed as Sheldon began
regularly exchanging letters with such innovative s.f. writers as
Philip K. Dick, Ursula K. Le Guin, Harlan Ellison and firebrand
feminist Joanna Russ. Sheldon was a charismatic correspondent. (Under
her own name she wrote fan letters to mainstream writers like Tom Wolfe
and Italo Calvino; Calvino was so impressed he wrote back asking to see
her stories, but she never responded.) Those who exchanged letters with
Tiptree felt they really knew him, and both Russ and Le Guin have
confessed to being more than a little in love with him. "Tiptree was a
man designed by a woman," Phillips writes, "and that made him as
appealing as any Darcy or Heathcliff."

[...]

Yet when the
truth about Tiptree was finally revealed, Sheldon didn't feel
liberated. Her writer and editor friends were overwhelmingly supportive
and many were intrigued by Tiptree's true gender. But despite freeing
herself from a deception that had become unwieldy, creatively, Sheldon
felt enervated and wary; she'd interpret the slightest friction in any
interaction with editors and publishers as a sign of her demotion in
status from male to female.

Sheldon wrote in her journal of Tiptree, "I had through him all the
power and prestige of masculinity, I was -- though an aging
intellectual -- of those who own the world. How I loathe being a woman ... Tiptree's 'death' has made me face ... my self-hate as a woman."

[...]

Sheldon's distaste
for her gender wasn't consistent. She was an enthusiastic supporter of
second-wave feminism who joined NOW and subscribed to Ms. Magazine from
the outset. She started and abandoned several sympathetic treatises on
the dilemma of women, especially those women with "atypical" ambitions
and desires.

[...]

Still, Phillips
believes that Sheldon never shook off the ill effects of a youth spent
trying to live up to her parents' expectations and her mother's
example. In school, Phillips writes, "Alice had the bad luck to be
extremely pretty. If she hadn't been, she might have given up the
popularity contest. She might have studied harder, prepared for a
career, and not cared what people thought ... Instead, she cared about
appearances, practiced femininity and flirtation, and got addicted to
the rewards for being a pretty girl." The result was a woman of
tremendous charm who felt exhausted by the company of other people,
even those she liked. Every interaction was a life-sapping performance.

Phillips suggests that if Sheldon had been able to accept those
parts of herself that defied her parents' image of a good girl --
homosexual desires, anger and grief -- she might have been able to
integrate Tiptree into Alice and sustain a brilliant career as an
author without resorting to disguises.

[...]

Sheldon also
suffered from some more commonplace creative problems. Throughout her
life, she rushed into a profession -- painting, the military, clinical
psychology, writing -- with idealistic, grandiose notions of how things
ought to be done. Inevitably, she was stymied by the inglorious
practicalities. She worshiped Mexican muralist José Orozco, only to be
disappointed, upon meeting him in Mexico City, when she learned that he
was painting a rich woman's portrait for the money. Her hopes for
finding a utopia of female empowerment in the WAC were dashed when the
women insisted on behaving like the imperfect human beings they were.
She refused to accommodate the realities of academic life -- department
budgets, grantsmanship -- and thereby quashed her chances at a real
career in science.

Sheldon's
struggles remind me of a famous conversation between the minor British
writer Stephen Spender and the great poet T.S. Eliot. The young Spender
told Eliot that he had always wanted to be a poet. Eliot's reply was
that he'd never understood this thing of wanting "to be a poet"; all he
understood was having some poems you wanted to write.

When what you
really want is to write some poems, you don't let the ultimately
ancillary issues of how a poet should live or whether you're an
exceptional talent get in the way. Often, the difference between a
minor writer and a great poet is a matter of insufficient -- or,
rather, misplaced -- commitment.

With Sheldon, the nagging problem of her identity, who she wanted to be -- a genius, an artist, a scientist, a writer -- kept interfering with the things she wanted to do.
By creating the persona of James Tiptree Jr., she was temporarily able
to finesse the block. In time, though, the puzzle of identity intruded
again, as this new imaginary self sucked up more and more of her time
and energy. (Ellison, complaining that Tiptree wasn't producing a
promised novel, insisted that all that letter writing was the cause.) If she'd managed to
solve her identity dilemma, she might have, as Phillips suggests,
figured out how to write about a girl growing up into a "whole woman."
On the other hand, if she had cared more deeply, obsessively and
passionately about any one of the half-dozen types of work she tried in
her life, she might have looked up from it one day to find that the
whole woman had arrived unbidden.

August 03, 2006

Egad! Should we start watching for these sell-out bloggers to appear on a spammer-scale? Will machines generate these blogs, these three-links-per fake-promo-link-farm posts, and then massively spam the comments field too?

I mean, direct mail folks (and by extension, spammers) operate under a low low LOW percentage response rate, five percent or something, but they look at that five percent (or whatever the number is) as rock solid, an entitlement that justifies a calculus of MILLIONS, gazillions perhaps, of no-friction messages sent out, just to get that rock solid single digit return on an investment of next-to-nothing.
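That calculus can be sketched in a few lines. All the numbers below are hypothetical assumptions for illustration; the post only gestures at a rough five percent figure, and actual spam response rates run far lower:

```python
# Back-of-envelope sketch of the spammer's "calculus" described above.
# Every number here is a hypothetical assumption, not a figure from the post.

def expected_profit(messages_sent, response_rate, revenue_per_response, cost_per_message):
    """Expected profit from blasting out near-frictionless messages at scale."""
    responses = messages_sent * response_rate
    return responses * revenue_per_response - messages_sent * cost_per_message

# Even a minuscule response rate pays off when each message costs next to nothing:
profit = expected_profit(
    messages_sent=1_000_000,    # "a calculus of MILLIONS"
    response_rate=0.0005,       # far below the rough 5% quoted for direct mail
    revenue_per_response=20.0,  # hypothetical revenue per respondent
    cost_per_message=0.001,     # near-zero cost per spam message
)
print(f"Expected profit: ${profit:,.0f}")  # 500 responses x $20, minus $1,000 in costs
```

The point the arithmetic makes: the sender's break-even barely depends on the response rate at all once the per-message cost approaches zero, which is exactly why the externalized cost (the fouled well) never enters their equation.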

I believe they think it is a valid method of creating value out of thin air. And by their calculations, it may be, but actually, it destroys far greater values to create that single digit value, in the same way fouling your own well does. They say dogs at least know enough not to defecate where they live, but it is a lesson many humans apparently never learned.

Suppose vast numbers of bloggers accept that blog version of an envelope-stuffing "job" below. Would that degrade the signal-to-noise ratio enough to disrupt the ecosystem of the blogosphere itself?

Comment spam and trackback spam are disruptive, but marginal (at least they are now, thanks to CAPTCHAs, though they were once out-of-control enough to radically turn an interactive space into one-way monologues). Google and Technorati already have a hard time parsing link farms out of their records. What if it became impossible because humans were conned into becoming willing link-farm agents?

The Turing Test generally holds when it comes to distinguishing humans from machines... but what happens when humans are hired as willing volunteers to become the machines, because machine-generated spam blogs can technically still be detected (AI not really being good enough yet to simulate uniquely human-style randomness)?

Blogsvertise is looking for bloggers to post their ads:

Blogsvertise.com is looking for bloggers who are
interested in getting paid to post a variety of assigned topic entries
in their already-existing blogs.

Bloggers will be asked to post short entries on a variety of topics,
including three links to an advertiser's website in each entry. You
decide what to write; it is not necessary to endorse the advertiser's
product or service.

Each completed task will pay $10 via Paypal after the entry has been online for 30 days.

This is a great way for bloggers to earn a small income in addition to or instead of featuring pay-per-click advertisements.

I include the addresses above, hoping someone with the capability to spam this person will have at it, as karma for the blogosphere pollution this ridiculous link-whoring will generate.

The blogosphere exists because some bloggers chose to make a stand on integrity, to be REAL in a world increasingly of media marketing-created surfaces, the NON-reality-based universe.

Now those media-marketing types want to remake the blogosphere in their image, to make the world safe for NOTHING BUT mindless promotions of every sort.

I have nothing against honestly using advertising on a blog, nothing at all. I do, however, have a problem with the influx of the direct mail/spamming forces who want to apply math on an absurd scale, in order to create highly questionable returns, and in the end, to ultimately destroy the system by overwhelming the search engines necessary to creating the blogosphere's value.

I remember a time before there were search engines on the web, in 1993 and 1994. Spiders, we called them then, when they first started appearing: WebCrawler and Lycos.

BEFORE search engines. Can you wrap your mind around that concept? It's sort of like imagining the world without nearly-universal electrical service (during this massive heat wave, that brings the point home). I overheard a news report the other day referring to the very IDEA of living without electricity as "primitive."

You know, I about gagged at such utter stupidity. Had this news anchor ever read Victorian literature? Would people in the modern age call Victorians, with their drawing rooms and excruciatingly correct manners and social customs, "primitive"?!

Ah, nobody reads much anymore, and they wear their ignorance out in the open without even the good sense to be embarrassed by it. I was on a Jane Austen kick again last night, so I'm filled with outrage.

There can be civilized worlds without electricity, and yes, Virginia, the Web once did exist in its present form without search engines (I'm really not counting Gopher, Archie, and Veronica, because those were tools for the Internet proper, the blinking cursor Internet, which is not to put down their important role in the development of the Net).

My caution here is that if the "Well" of the Internet becomes so fouled by these entities or agents as to render search engines inoperative, if their scale overwhelms even Google's massive server farms (server farms, good, link farms, bad), we may find ourselves in the same sort of anarchy online as the world would go into without electricity.

But the Blogosphere COULD survive.

What we'd have to do (and maybe this is obvious to old-timers from the 1990s, but I want to make it explicit to newer arrivals) is simply recreate the original purpose of the Blogroll, as a TRUE ROADMAP to what would become an utterly roadless world, a chaotic sea of information and expression.

That, btw, is what made one of the first web sites to rack up millions of hits a success, in 1994. John December's Internet Web Text was a guide for a world without search engines. Because of the carefully-selected value of his travel-agent's guide to the Web, the space became intelligible to people as a landscape, in a way that the blinking cursor and Gopher never could create for most of us.

If glorified link-farm spam comes close to killing the blogosphere, this is something we will have to do as well, to rebuild all the roadmaps for a world without search.

May 20, 2006

A high school student did this terrific independent study, and I just LOVE the questions below that came out of it. Most excellent. Should note, also, as Ben does below, that he worked with a really extraordinary teacher, so I've gotta put a link and a shout-out to that teacher's blog too: Jesse Berrett teaching at San Francisco University High School.

Kudos, y'all! And I may be emailing for the whole document of your research as well. Fascinating stuff. I'm especially fond of the second bullet point below.

Is the “wisdom of crowds” always better than the opinion of one, and if so, how does that wisdom get “mined” on the web?

This bullet point makes my socks roll up and down too!

Is objectivity in media “a view from nowhere”? In covering any controversial story, the media tends to simply let whoever has been defined as "the sides" dictate their beliefs and just do an "X said, but then Y said" story.

From September-December (first semester) I embarked on an academic study on blogging and the intersection of journalism, media, and the 'net through my school's Independent Study program. I thus received academic credit for this work (I know, I was elated too!). My faculty sponsor was Jesse Berrett - he's the chair of our History department but well versed in a broad range of topics including popular culture and internet stuff. As a book critic for the San Francisco Chronicle, Salon, and others on the side, and armed with a handy dandy PhD from Cal, he brought a healthy dose of skeptical perspective I needed. (His own blog is, for now, brief reviews of books he reads - all genres and types - and pictures of his baby. In 2004 he read 254 books, and he reflects on that year of reading here.)

Excerpts from the results of our research and work follow. If you'd like the whole document, email me.

Questions discussed:

Do all conversations lead somewhere? How effective are conversations with many talking compared with one person lecturing?

Is the “wisdom of crowds” always better than the opinion of one, and if so, how does that wisdom get “mined” on the web?

What process do people go through to change an opinion? Are opinion blogs and the ensuing spirited conversations changing anyone’s opinion? How often do blogs (or any conversation, for that matter) go beyond “I think this” and “I think that”?

What role do blogs have besides the obvious one of being a watchdog/critic of mainstream media?

What is the future of “hyperlocal journalism” where neighbors and community members write local stories in an online format?

Is objectivity in media “a view from nowhere”? In covering any controversial story, the media tends to simply let whoever has been defined as "the sides" dictate their beliefs and just do an "X said, but then Y said" story.

What are the limits/constraints of the blogging form versus the future possibilities?

Does a lack of referees on the web tend to support an everyone-has-his-own-truth world where “truth” is up for grabs? Is it realistic to hope for a higher-up authority to separate truth from fiction either on the web or offline? How does the increasing lack of trust in institutions in America affect this?

[...]

Here's another good observation:

There seems to be a “truth” coalescing about blogs in mainstream media, which is that they are usually good watchdogs, but tend to be prey to all sorts of crazy rumors, speculations, and conspiracies. Thus, “the jury is out.” This may not be true, but this is the account that most major papers run whenever there’s a story covering aspects of blogging—a couple good things, a couple bad things. In essence the “they say X, these others say Y” story.

One venture capitalist blogger I read a few days ago said that his bet for the “the next big thing” is around an emerging “architecture of participation” or as he put it, “the revolution of the ants.” Everyone getting into the action. The participatory nature of blogs versus the one-way lecture of mainstream media is crystallized for me every time I read a column in the New York Times or Chronicle that I want to talk to someone about. I may agree or disagree or want to learn more. How can I scratch that itch? I can write a blog post linking to the column with my thoughts and solicit feedback or read others who have blogged about that column.

[...]

I like that, "revolution of ants." Granular. Cumulative. Asymmetrical. I just find myself nodding through this whole thing, yes, yes, yes. Vulcan mind meld, dude! I wish I wrote this well when I was in high school.

Blogs At Their Worst

“One of the biggest criticisms of blogs is that so many are self-absorbed tripe. No doubt, most are interesting only to the writer, plus some family and friends,” writes Dan Gillmor in We the Media. He goes on to say that’s no reason to dismiss the genre, but it does raise an important question: does society need a lot more people voicing opinions or thoughts, and does that produce more intellectual, cultural, moral, etc. progress? I mentioned in the “best” section that my blog gives me a voice. It would be arrogant to argue that my voice needs to be heard, but not that nut-job propaganda-spreading conspiracy-theorist's. The leading bloggers and pioneers in this field seem to agree that there should be virtually no restrictions or exclusivity in the blogosphere, with a bet being placed on the notion that the best blogs will bubble to the top through links.

[...]

Do blogs promote an opinion-first, evidence-later trend in our society? Jay Rosen sees a new trend unrelated to blogs pertaining to information-gathering: first get opinions, then analysis, then hard news. One could extend this trend to people first expressing opinions, then maybe finding some articulate analysis to back up their opinions, and possibly some real data supporting their points. It is easy to blog an opinion or rant. A good footnoter is also a good linker, hence the emphasis by respected bloggers on linking to sources or other sites to back up posts. But without some sort of “authority” deciding what has some foundation versus simple crazy rants, the blogosphere can house bundles of unsubstantiated opinions.

Young author's book has passages similar to other published work

BOSTON --The debut novel of a Harvard University sophomore includes several passages that are similar to a work by another author published in 2001.

Kaavya Viswanathan's "How Opal Mehta Got Kissed, Got Wild, and Got a Life" was published in March by Little, Brown and Co., which signed her to a hefty two-book deal when she was 17.

On Sunday, the Harvard Crimson reported the similarities on its Web site, citing seven passages in Viswanathan's book that parallel the style and language of "Sloppy Firsts," a 2001 novel by Megan McCafferty published by Random House.

Viswanathan, whose book hit 32nd on the New York Times' hardcover fiction best seller list this week, did not immediately return a phone message seeking comment. When reached by the Crimson on Saturday, she said: "No comment. I have no idea what you are talking about."

Michael Pietsch, the publisher of Little, Brown, said Sunday that the company planned to investigate the similarities.

"I can't believe that these are anything but unintentional," Pietsch said. "She is a wonderful young woman."

McCafferty told The Associated Press in an e-mail Sunday that some of her readers pointed out the likenesses.

[...]

Viswanathan said last month that she wrote the book during her spare time during her freshman year at Harvard, clicking away on a laptop in Lamont Library.

She is the youngest author signed by Little, Brown in decades, and the movie rights for the novel have already been sold to DreamWorks.

McCafferty is a former editor at Cosmopolitan who has written three novels.

"I do think this is one of the most difficult situations for an author," Joanna Pulicini, McCafferty's agent, told the Associated Press in an e-mail.

Difficult situation for an author? WHICH author? It's difficult for the person who had entire passages from her work lifted and put into another work by some Harvard freshman. It's difficult to see such a person climb the best-seller list when perhaps you didn't.

The ONLY way this isn't plagiarism is if you believe you can stick 100 monkeys in a room with typewriters and eventually one of them will type the complete works of Shakespeare.

April 20, 2006

One morning last month, I woke early, finished a book I'd been reading, and shut down my blog. I had kept the blog for nearly five years, using it as a repository for personal anecdotes, travelogues, and the occasional flight of fiction—all of which I hoped, eventually, might lead to a novel. And then, somewhere between the bedsheets and 6 a.m., I realized something: Blogging wasn't helping me write; it was keeping me from it.

[...]

[What she writes here below... I remember feeling EXACTLY this when forced to crank out crappy, forgettable articles for newspapers, either the stress from constantly meeting new people and feeling shy while at the same time having to commodify them, or the personal humiliation at knowing I was just dumping my interview stories into a rote formula. That was in the 1980s, though, so I went into an MFA program in creative writing instead, to learn to write for myself and my own standards again. And now, many many years later, blogs have taken me past the creativity-cramping I used to feel after some of my poems got published in good literary magazines, the fear of writing a poem that was pure crap. Blogs are works in progress, and publications. They're a dessert topping AND a floor wax. Can't beat that.]

Just prior to that, I'd been writing for an alt-weekly in Austin, Texas. What began as a great job had curdled into an anxiety nightmare. I would burn to write a certain profile and then, deadline looming, I would stare at the computer as another beautiful Saturday ticked away. I can remember crossing the street one night and thinking, absently, "If I got run over by a car, I wouldn't have to finish that story!" Don't get me wrong—I didn't want to die. I just wanted a really long extension. Thus my decision to leave the job. Thus my journey to the southern hemisphere. Thus the blog that I started, thinking no one would read it and secretly hoping they would. The blog was the perfect bluff for a self-conscious writer like me who yearned for the spotlight and then squinted in its glare. [...]

Eventually, I began enjoying my writing again. I stopped worrying about deadlines, audience, editors, letters to the editor, all the stuff that had smothered me before. I was writing so fast that I didn't have time to double-think my sentence structure or my opinions. What came out was sloppier but also funnier and more honest. I started getting e-mails from people I'd never met, and they were actually encouraging.

[...]

OK, now here's the bit that's my favorite quotation. Maybe I'll put this line up in the banner rotation on my blog. If it isn't too long.

At times, I started to feel that jokes and scenarios and turns of phrase were my capital, and that my capital was limited, and each blog entry was scattering more of it to the wind, pissing away precious dollars and cents in the form of punch lines I could never use again, not without feeling like a hack. You know: "How sad. She stole that line from her own blog."

[...]

[Isn't that cool? And it's just SO true. I've also felt that way about some rants I've written to a listserv I manage. I never know what's going to bite me in the butt, or when the stars will line up and the universe or the tenth muse is going to speak through me, so stuff comes out that wouldn't otherwise. I have to appreciate that. But if that's the ONLY place where the universe is channeled through my writing, then I'm pissing away my best ideas too.

It did help a lot when I was writing my dissertation, to be active in private listserv discussions in the Xenaverse, the fandom community I was studying. Invaluable, actually, and I owe those people so much for being a sounding board, and for arguing with me when I got it wrong. Sometimes, now, when I read my dissertation, I wonder who wrote it, because I do feel that the tenth muse, or at least Xena and Gabrielle, were using me for a channel. Doesn't feel like I wrote it at all, and on really good days, I swear the dissertation wrote itself. I don't know too many harried and harassed grad students who can say that about the process, but it was one of the best times of my life.

There's one more bit that the writer has here that I thought was a particularly pithy observation:]

I suspect I'll come back to blogging eventually. It will be something I quit on occasion, like whiskey and melted cheese, when the negative effects outweigh the benefits. Practically every blogger I know has taken their site down at some point—for personal reasons, for business reasons, for boredom reasons. It's no different from the way we have to turn off our cell phones or stop checking e-mail so that we can actually focus on something. As much as I loved writing online, it's a relief writing offline: taking time to let a story unspool, to massage a sentence over an afternoon's walk, to stew for days—weeks, even—on a plot line. What a modern luxury. Now, if I could just turn off the TV, I think I could finally get started.

I've been there and done that, and yes, it most certainly did feel very good. I really started blogging in earnest with Radio Userland in early 2002, and the blog I was posting to at that time (nameless here forevermore) rose up respectably in the blogosphere, and in the mostly tech-blog atmosphere of that time, got linked to by the A-listers of the day, which satisfied me but wasn't the be-all and end-all of my existence, having gone through that previous newspaper experience that drove me into the Arkansas MFA program. Regardless of who linked to me, I was really still writing for myself.

So by early 2003, and for certain by the start of the Iraq War, I was immersed in war coverage at work, putting in a lot of overtime, worrying about some war-bloggers in Iraq who are my friends, and I just needed some time off. I went inward, started learning the I Ching, and filled four handwritten journals in six months. Felt real good. And then I started blogging again. That felt good too. It's called "being your own boss."

March 10, 2006

I've been obsessively following all the Octavia Butler obituaries and kicking myself up and down the street for not having found more of her work when she was alive. The more I read and learn about her and her unique sensibilities, the more I wish I'd had a chance to meet her. But it is unlikely that I would have, because she was an intensely shy and private woman. Yet the impact of her words ripples across the Internet and the blogosphere while it barely causes a blink in the world of mainstream media or even the walking-around life of a lot of people.

Why is this? I hear all the time assumptions about "all people" or "most people" or about how much "people" don't read or think about anything except what they're told to read or think about, and even then, "they" just respond deterministically, as buttons to be pushed, automatons to be easily manipulated. Folks I work with tell me that such people rule the world. Some mass media types I know even truly believe that somnambulistic masses are all that exist, and in that world view, I guess they are, because the few people who choose to live their lives in different ways are so marginalized into irrelevancy.

And I think, "Oh, 'those people' must be the kind who never get around to reading the works of Octavia Butler."

Don't think I'm about to go into a "Philistines at the gate" rant, because that's not what I'm drawing a bead on at all.

Rather, I want to simply highlight a peculiar divide, one that may be obvious to many people, so obvious as to go without saying, but I think it's a divide that we do need to make explicit, make visible, make conspicuous, even if it is obvious.

The divide isn't the so-called "Great Divide" between literacy and illiteracy. I don't believe in that as some kind of magical cognitive shift. It isn't the growing divide between the technology-haves and have-nots either. This is a self-selected divide, but even so, it exerts social force. Power? Maybe, or maybe not. (I do like reading stuff about something called "The Long Tail.")

Those on the side of the divide who choose to read and think, and do so actively, pro-actively, and interactively, seem to be migrating away from passive media, advertising inundation, and rhetoric that insults their intelligence, and TOWARD the Internet, blogs, TiVo, and podcasting. Not that those are more intelligent spaces (generally, those spaces are as mixed in intelligence as our general society is mixed, among the ideas that get published and spread around), but rather because the mix of stuff out there yields results for those who do think and actively search and pursue their own research questions, or ornery and contrarian questions, or burning personal questions. These people know how to analyze and sort through a mixed bag of information and ideas.

Those who are frustrated in the wild and woolly spaces of the Internet may not have good sorting or analysis skills, or may prefer a passive style of media that rewards a lack of independent curiosity ("push"), or I just frankly don't know what, because I confess I have a hard time wrapping my mind around the motivations of such folks, the idea is so far away from my own experiences. But that is the divide I see self-sorting itself into our society.

Except that I perceive that the active and interactive thinkers, writers, and searchers are more marginalized and disempowered in U.S. culture right now, and the manipulators of the passive masses appear to be strongly pushing for dominant hegemony. This feels counter-intuitive to me, since cleverness ought to trump dullness, to my mind. I rarely see that happening, so perhaps I am wrong. Somehow people with a greater dullness of thought are being allowed to oppress people who are involved with more active thought.

One friend of mine likes to compare the (hermetic?) insider codes of the active online groups to other subcultures that have to move through and remain largely invisible inside a more dominant and restrictive mainstream culture, like the gay subculture, for instance.

This has happened in history, so I shouldn't be too surprised by it, I guess, with the rise of fascism and totalitarianism in the 1930s, with the Inquisition, the medieval period, various so-called "dark ages" when those who want to know less become righteous and demand that all people be like them, while seekers and thinkers are oppressed.

Are Internetties the "freemasons" of our growing dark ages, passing on hermetic secrets in traditions designed to preserve certain kinds of knowledge and skills that are being forgotten in our society as a whole?

I turn to Octavia Butler, just as I would Ursula Le Guin, as in "The Dispossessed." And I see all around me online people turning to Octavia Butler as well. This is a hopeful sign, except when I look outside the closed world of the Internet and the blogosphere's Octavia Butler fans. Where beyond here is her influence felt?

Octavia Butler, A Lonely, Bright Star Of the Sci-Fi Universe

There she was, this woman of great intellect, of immense talent, of tremendous passion, and, it seems, so very much alone. Her death on Friday after falling and hitting her head outside her home in Seattle has rattled those who loved her work. She was 58.

There she was, a tall, awkward and shy black girl thinking that she wanted to write science fiction, of all things. A young woman who believed the genre could deal with more than ray guns and transporters, and that she had a right to create fiction that tackled race and class and what it meant to be human in worlds where humanness had all but been obliterated. Publisher after publisher must have been puzzled. How could science fiction be set on a plantation?

Octavia Butler showed them how.

She was an African American woman claiming her space in a literary universe dominated by white men. After years of rejection, she eventually won science fiction's most prestigious awards, the Nebula and the Hugo. She picked up other honors along the way, too, including a PEN West Lifetime Achievement Award and a MacArthur Foundation "genius" grant.

Her following was loving and loyal -- protective even -- for they seemed to know instinctively how precious and powerful and simultaneously tender and fragile a spirit like hers had to be.

"That's terrible, terrible, terrible news," my mother kept saying over and over at word of Butler's death. A die-hard science fiction fan, she is one of those people who gobbled up many of Butler's 11 novels. I was proud of myself for having turned her on to Butler's first work, "Kindred." Soon she was devouring the other works, among them "Dawn" and the highly regarded "Parable of the Sower."

[...]

The public and private lives of Butler, Due says, were remarkable to watch. "It's almost as if she lived in two worlds."

"I'm very happy alone," Butler once told Post writer David Streitfeld. "If I had to change myself into something else, I'd probably be unhappy."

She grew up poor in Southern California, where her father shined shoes before he died when she was a young girl, and her mother cleaned houses. Butler was a young black woman coming of age at a time when black women were mainly invisible. And when she was noticed, it was with unkind eyes. She was six feet tall by the time she was in her teens, a girl with deep brown skin and short hair. She was sometimes mistaken for a man, she would say. Early on, as a child, she cocooned herself in a world of books and nurtured audacious ambitions.

"She obviously had spent a tremendous amount of her early life feeling very, very alone," Barnes said. "She had no tribe. She didn't fit in any place. Her own family thought she was nuts . . . because of what she wanted to do with her life."

At one time Barnes lived just six blocks from Butler and they would spend time together, having dinner or just talking. One of the questions she seemed to care greatly about was, "Why is it that we are so cruel to each other?" Barnes says.

"The fact that she was so concerned with that made me think she had faced a lot of that" cruelty in her life, he adds.

She explored the question in a field that was forced, whether it wanted to or not, to acknowledge her talents.

"Women in general were rare in the science fiction field, and black women, ha," Barnes says.

She had to cloak her ideas thickly in metaphor, he says. "She was forced to speak through layers of obfuscation." Those challenges may have ultimately made her a better writer but must have taken their toll.

"It was like trying to drive in the Indy 500 with your brakes on," Barnes says. "You burn up."

God, I love that metaphor!

[...]

They worried about her, up there alone and probably pushing herself far too much, both in her writing and her travels. But she was drawn to the Pacific Northwest, they say, with its natural beauty and its opportunities for true solitude. Due wanted to call, but worried about interrupting her writing, the words that seemed so hard to come by lately.

I wonder if in all that aloneness, in all her solitude, she knew just how beautiful she was and that she was loved.

September 19, 2005

A very interesting use of a class blog for high schoolers. Also of interest to teachers will be the instructor's statement of responsibilities and introduction to blogging on the site. Bernie Heidkamp is the teacher, and it looks like some good things are happening in the classes.

Each week you have a blogging assignment. You need to provide original entries to the blog as well as engage in a discussion of other entries by your classmates and teachers. You can always post an entry on any topic you think is related to the course (quickly linking to a story or a website that you think is informative, for example, or maybe offering a full-blown cultural analysis of a TV show you just watched) – but in order for your post to count for the assignment, your entries have to be connected to one of the following:

*The text we are presently reading (either your own individual response to the text outside of class or a response to the discussion we are having about the text within class)

or

*The “Big Idea” we have associated with the text we are presently reading (or other Big Ideas that you see as clearly informing the text)

or

*A piece of contemporary culture that relates to at least one of the two above

Note: When I say they must be “connected” to one of the above, I mean that in a very broad sense. The blog is your space for your ideas. The only absolute requirement is that your response be original.

Each week you must

~Post at least one substantive entry on one of the above topics (at least one paragraph of 8-10 sentences or, more appropriately for the web, a series of smaller paragraphs) and

~Post at least two thoughtful, constructive comments on another recent entry (while you may disagree with the entry, do so respectfully, always trying to give credit along with criticism)

June 04, 2005

I stew about this all the time. Just had a discussion with someone last night over dinner about whether U.S. television viewers are pure Play Dough in spin doctors' hands, or whether critical thinking and questioning can make a difference.

Sophists would claim that words can be shaped to accomplish any desired result, with enough skill. Does it follow then that an audience is simply a mechanistic button to be pushed by the likes of Karl Rove? If that is the case, I argued at this wonderful Bangladeshi restaurant, why is the Bush Social Security pitch foundering?

I mean, when folks in the media (I work in TV news) saw the study showing that people who supported the invasion of Iraq believed more factual inaccuracies (in other words, got wrong more "fact points" that had been clearly enunciated in multiple sources and could be easily checked, verified, and confirmed) than people who were better informed and basically COULD pass a short multiple-choice current events quiz, our incredulous response was something like, "wow, people will believe anything, even if it is wrong."

In other words, large numbers of people ARE Karl Rove's Play Dough.

It was also interesting to look at the correlation to where the folks got their information. Turns out Fox News viewers routinely FLUNK those short current events quizzes, while NPR listeners score the highest.

So the question is, why is the Social Security button-push "persuasion" failing? It is using all the same rhetorical techniques that Karl Rove has used to rewrite every PR and spin doctor textbook in the country. Why isn't the Play Dough cooperating?

Perhaps, as the columnist considers below, audiences never were Play Dough. Or in the immortal words of someone we know too well, "fool me once, shame on you. Fool me… can't get fooled again."

Is Persuasion Dead?

Speaking just between us - between one who writes columns and those who read them - I've had this nagging question about the whole enterprise we're engaged in.

Is persuasion dead? And if so, does it matter?

The significance of this query goes beyond the feelings of futility I'll suffer if it turns out I've wasted my life on work that is useless. This is bigger than one writer's insecurities. Is it possible in America today to convince anyone of anything he doesn't already believe? If so, are there enough places where this mingling of minds occurs to sustain a democracy?

The signs are not good. Ninety percent of political conversation amounts to dueling "talking points." Best-selling books reinforce what folks thought when they bought them. Talk radio and opinion journals preach to the converted. Let's face it: the purpose of most political speech is not to persuade but to win, be it power, ratings, celebrity or even cash.

By contrast, marshaling a case to persuade those who start from a different position is a lost art. Honoring what's right in the other side's argument seems a superfluous thing that can only cause trouble, like an appendix. Politicos huddle with like-minded souls in opinion cocoons that seem impervious to facts.

The politicians and the press didn't kill off persuasion intentionally, of course; it's more manslaughter than murder. Persuasion just isn't relevant to delivering elections or eyeballs. Pols have figured out that to get votes you don't need to change minds. Even when they want to, modern media make it hard. They give officials seconds to make their point, ignore their ideas in favor of their poll numbers or showcase a clash of caricatures, believing this is the only way to make "debate" entertaining. Elections may turn on emotions like hope and fear anyway, but with persuasion's passing, there's no alternative.

There's only one problem: governing successfully requires influencing how people actually think. Yet when the habits of persuasion have been buried, the possibilities of leadership are interred as well. That's why Bill Clinton's case on health care could be bested by savage "Harry and Louise" ads. And why, even if George Bush's Social Security plan had been well conceived, the odds were always stacked against ambitious reform.

I'm not the only one who amid this mess wonders if he shouldn't be looking at another line of work. A top conservative thinker called recently, dejected at the sight of Ann Coulter on the cover of Time. What's the point of being substantive, he cried, when all the attention goes to the shrill?

[...]

But beyond this, the gap between the cartoon of public life that the press and political establishment often serve up and the pragmatic open-mindedness of most Americans explains why so many people tune out - and how we might get them to tune back in. Alienation is the only intelligent response to a political culture that insults our intelligence.

I know I've been bad with posting news of my trip and arrival in Missoula, but here are some photos of the stunning scenery I've been looking at since I got to Big Sky Country.
I've been busy with the start of school and building blogs like mad, but I did find time this Labor Day to take the dog out to the Rattlesnake Wilderness Area and climb a ways up Blue Mountain on some horse trails. I was out there feeling right at home on the range, or something like that. Not too high, but it did give me a nice view of the Missoula valley, which was once a massive inland lake holding about as much water as Lake Erie and Lake Ontario combined, kept in place by a glacial dam during the Ice Age. Neat, huh? When the glacial dam busted through, it sent a 500-foot wall of water all the way to the Pacific, or so they tell me (I'm from Alaska, where we always try to BS the new people in town).
On some of the mountains around town, you can see the old shorelines of the lake, but sorry, they're not visible in any of these pictures. If you see a flash of water in some of the shots, that's the Bitterroot River, which comes down from the Bitterroot Mountains, which I hear are big and gorgeous. That will be my next destination!
We also have a confirmed Lewis and Clark campsite here (confirmed by the chemical content found in the latrine, heh), with bicentennial events running Sept 8-11, the exact dates in 1805 when Lewis and Clark slept here and used their latrine. Woo woo.

There were snow warnings in the passes and a fresh dusting of snow on some mountains. And unbeknownst to us, a bunch of boulders had collapsed into one of the tunnels the day before (according to AP) and closed the trail. But our outing of UMT J-school grad students and faculty mostly had just a cold wind to contend with at the top of the trail, plus a 1.7-mile curved tunnel with no lights but our own head and bike lamps to get us through. Wooo-eeee-oooo! I should look up the exact number, but there were 7 tunnels or so, and about as many trestle bridges, including that one looong one you see in the pictures. Way cool! My inner clock was messed up though, because we kept crossing back and forth over the Montana/Idaho state line, so we kept gaining an hour, losing an hour, gaining an hour, losing an hour...

The weekend before Thanksgiving I went to a neat, unpretentious ski place in the Bitterroot Mountains to the south, called "Lost Trail: Powder Mountain," on a tip from some folks at the ski shop. That weekend only two places were open (the other was over by where I took that bike trip from the other photo album, Lookout Mountain). What terrific luck! The lodge is rough and a crowded mess, people clomping all over: total nostalgia for skiing at places that don't assume everyone is filthy rich. Locals tell me there was so little snow last year that Lost Trail was the only place that could even stay open. Reminds me of Alyeska in Alaska in the late 1970s, long before anyone even thought of putting in a tram. The only difference was I didn't see anyone skiing in Carhartt coveralls like they used to in Alaska.
It turned out to be a stunning day, and I had so much fun I'm going back there the Sunday after Thanksgiving, rather than to the closer Snowbowl, which is only open at the very top and still doesn't have much snow. But Lost Trail got a bunch of new snow the last few nights, so it should be great. I'm hoping the back mountain lifts open up too.

Last ski trip in Montana, unless I go again. I wanted to go at least once to one of the famous Montana ski places, and Big Mountain in Whitefish was just right. What amazing views of Glacier National Park to the east and clear into the Canadian Rockies to the north! I stayed at Pine Lodge in the cute little town and took the free Snow Bus up the mountain. In the pictures that follow, you'll see the odd effect of a temperature inversion that left the valley in the single digits and socked in with fog, but gave us balmy upper 20s and gorgeous sunshine on the slopes. A few other things you should know: the fog/cloud deck and snow frost the trees into strange shapes, like those above. At Big Mountain they call them "Snow Ghosts," and clearly a lot of folks love the tree skiing among the ghosts. I stayed on the groomed runs, as the freezing and thawing made the rough stuff too challenging. The intermediates were great fun, and really easy too, since the moguls I'm more used to had been groomed down to corduroy. Not icy at all. Grooming seemed to turn the intermediates into granny runs, but they were definitely still steep. Hey, I never fell once, and skied hard right up to darkness and closing lifts.