
There is nothing quite as catchy as a great pop song that deploys whistling. I was reminded of this truth last night at a show by the New Pornographers at Oakland’s majestic Fox Theater, where the band’s generous set included “Crash Years” — a song from its new album, Together, that features an infectious whistling chorus.

(I have to admit that the whistling volley loosed by the NPs last night was so solid, indeed so flawless, no stray sibilance or off-key wobbles, that I did wonder if it was live or sampled. I mean, the band members were whistling into their mikes. But these days, who knows?)

I was all set to write up a post about other great whistling songs, but soon discovered that it’s been done already.

The Spinner list is a pretty good one. But it’s heavy on songs that use whistling as a drop-in solo or a bridge or an outro. Those are great, but this approach neglects examples in the grand “Colonel Bogey March/Bridge on the River Kwai” tradition — where the whistling carries the entire tune of a refrain.

My own favorite in this genre is Brian Eno’s “Back in Judy’s Jungle” — the missing link connecting the world of Colonel Bogey with that of “Crash Years.”

As for the rest of the New Pornographers show? With eight, sometimes more, people on stage, they have turned into indie power pop’s equivalent of a Big Band. Indeed, at times, with their tight harmonies and deep catalog of songs that feel like instant classics, they made me think of our era’s equivalent of the Band — with roots dug not in the country-folk tradition but instead in the now-long history of eccentric smart pop. Great, complex music: we’re lucky to have it.

I’m not interested in the argument over whether this new year marks the end of the decade-with-no-name. Since we celebrated the end of the millennium 10 years ago, I think we’re stuck. And you can bet that when 2019 rolls over to 2020 we’ll do the same.

My list, for your pleasure, is the decade in music — my personal bests. It will be no surprise to longtime readers here. This is the stuff that stuck with me through the years, that kept my body moving, my mind working and my heart opening. I’ve made most of these entries in pairs (or more) — because I can.

RUNNERS-UP:

Beck: The Information (2006)

The Decemberists: The Crane Wife (2006)

The Gaslight Anthem: The ’59 Sound (2008)

Richard Thompson: 1000 Years of Popular Music (2003)

Wrens: The Meadowlands (2003)

XTC: Wasp Star (Apple Venus Vol. 2) (2000)

TOP TEN (IN ELEVEN):

(11) GarageBand and Rock Band: Apple’s software put remarkably high-quality basement-taping music-making tools onto every Mac. Rock Band may be a toy, but it’s irresistible, and it schools young minds and bodies in the notion that music is to be made as well as consumed.

(10) Pernice Brothers: The World Won’t End (2001); Discover a Lovelier You (2005) — Definitely the sleeper in this bunch for me. When I first heard Joe Pernice’s work on 1998’s Overcome by Happiness I was impressed but a bit bored. Over time I came to appreciate, then crave, the combination of lush pop arrangements and astringent lyrics.

(9) They Might Be Giants: No (2002); Here Come the ABCs (2005) — For me this decade was all about raising a pair of twin boys. TMBG’s forays into children’s music were that process’s soundtrack — and frequent tonic. “No” offered my three-year-olds an early introduction to absurdism, and its charming animations proved an endless diversion. (“Robot Parade” introduced them to the term “cyborg” — and gave them a chance to misremember it as “borg-cy,” which we will never forget.) And even though, by the time “ABCs” came along, the alphabet had long been mastered, the music (and great accompanying videos) won over kids and grownups alike.

(8) The Long Winters: When I Pretend to Fall (2003); Putting the Days to Bed (2006) — Sharp tuneful alt-rock with an edge and a brain. My only complaint about singer/songwriter John Roderick? Low productivity!

(7) The Fiery Furnaces: Blueberry Boat (2004) — The Friedbergers, brother and sister, moved from the more forthright songwriting of their early tracks to the increasing obscurity of their more recent work. But along the way they created this masterpiece of baroque verbiage and extravagant music.

(6) Tobin Sprout: Lost Planets and Phantom Voices (2003) — Deep autumnal soundscapes and pop paintings from a maestro of gentle melody. The former Guided by Voices songwriter, far less profligate with his talent than that group’s leader, Robert Pollard, hasn’t put out an album since; he seems to be concentrating on painting these days. Too bad!

(5) Green Day: American Idiot (2004); and The Thermals: The Body, the Blood, the Machine (2006) — Two punk operas about Bush-era America. Green Day’s megahit album drafted Who-style song suites and hook-laden power-trio riffs in the service of a narrative about disaffected no-future youth; the Thermals channeled a Buzzcocks sound for their grim portrait of a young couple trying to escape a fundamentalist/fascist America.

(4) Mekons: Natural (2007) — These veterans kept producing challenging, creative work through the decade. Each album, from Journey to the End of the Night (2000) to OOOH (2002) to Natural, improved on its predecessor. Natural is the band’s version of pastoral — a contemplative, acoustic-heavy set of laments for the end of nature.

(3) Frank Black/Black Francis: Dog in the Sand (2001); Bluefinger (2007) — FB/BF has been as prolific with his songs as he is fickle with his stage name. These albums were his peaks of the decade. Dog in the Sand ranged from fierce Stones-style rockers to the almost unbearably beautiful “St. Francis Dam Disaster.” Bluefinger used the story of Dutch glam-rocker Herman Brood as the spine for a memorable set of Black classics.

(2) The New Pornographers: Twin Cinema (2005), Challengers (2007) — I do not know how A.C. Newman and his cohorts do it, but each album adds to my respect for their genius. When I read somewhere in an interview that Newman is a big fan of Eno’s “Taking Tiger Mountain (By Strategy)” it all made sense.

(1) The Mountain Goats: Tallahassee (2003), We Shall All Be Healed (2004), The Sunset Tree (2005) — Don’t think I’d have made it through these years without John Darnielle’s music. Thank you. Happy new year!

The feature noted that photographer Edgar Martins “creates his images with long exposures but without digital manipulation.” Now it turns out the Times has removed the photos from its website and posted an embarrassing editor’s note admitting that the photos had been “digitally manipulated”: “Most of the images,” the editors wanly declare, “did not wholly reflect the reality they purported to show.” It seems that, in some sort of misguided effort to create more pleasing images, Martins duplicated and then flipped portions of some photos to create a barely perceptible mirror image: a sort of fearful — but now, we know, bogus — symmetry.
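Purely as an illustration of the kind of edit described (not Martins’ actual workflow, which we don’t know in detail), here is a tiny sketch of a duplicated-and-flipped “mirror” manipulation, performed on a toy grid of pixel values: the right half of each row is replaced with a reflection of the left half, yielding exactly the sort of barely perceptible symmetry at issue.

```python
def mirror_right_half(rows):
    """Replace the right half of each pixel row with a mirror image of
    the left half, producing a duplicated-and-flipped symmetry.

    `rows` is a list of rows, each row a list of pixel values."""
    out = []
    for row in rows:
        w = len(row)
        left = row[: w // 2]
        # When the width is odd, the single center column survives untouched.
        middle = row[w // 2 : w - len(left)]
        out.append(left + middle + left[::-1])
    return out

# A 1x4 "image": the right half becomes a reflection of the left.
print(mirror_right_half([[1, 2, 3, 4]]))
```

Run on a real photograph, the same operation applied to a region (rather than whole rows) would be nearly invisible, which is presumably why the manipulation took so long to notice.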

Before I get any more Borgesian on you, let me point you back to the interviews I did with the photographer and multimedia artist Pedro Meyer back in the early 90s — one from the San Francisco Examiner, and one from Wired. (Please note that the Wired piece got mangled somewhere between the magazine and the Web; the intro paragraph appears at the end.)

This, from the Examiner piece:

Pedro Meyer points to one of his photographs and says, “Tell me what’s been altered in this picture.”

The photo shows a huge wooden chair on a pedestal – a Brobdingnagian seat that looms over the buildings in the background with the displaced mystery of an Easter Island sculpture.

It’s difficult to say what’s going on here: A trompe l’oeil perspective trick? Or the product of digital special effects?

Meyer is a serious artist and philosopher of technology, but today he’s playing a little game of “what’s wrong with this picture?”… The truth about the chair photo is that it’s a “straight” image: It’s just a really big chair.

Meyer says he took the shot outside an old furniture factory in Washington, D.C. But the self-evidently transformed pictures that surround it in his exhibit – like that of a pint-sized old woman on a checkerboard table carrying a torch toward an angelic girl many times her size – call its accuracy into question. We stare and distrust our eyes.

So is Pedro Meyer, who started out as a traditional documentary photographer, out to subvert our faith in the photographic image, our notion that “pictures never lie”? You better believe it.

“I think it’s very important for people to realize that images are not a representation of reality,” Meyer says. “The sooner that myth is destroyed and buried, the better for society all around.”

[You can see that chair photo in the “Truths and Fictions” gallery available off this page — click through to screen 26.]

And this, from the Wired interview:

I’m not suggesting that a photograph cannot be trustworthy. But it isn’t trustworthy simply because it’s a picture. It is trustworthy if someone we trust made it.

You’re interviewing me right now, you’re taking notes and taping the conversation, and at the end you will sit down and edit. You won’t be able to put in everything we talked about: you’ll highlight some things over others. Somebody reading your piece in a critical sense will understand that your value judgments shape it. That’s perfectly legitimate. Turn it around: let me take a portrait of you, and suddenly people say, That’s the way he was.

We don’t trust words because they’re words, but we trust pictures because they’re pictures. That’s crazy. It’s our responsibility to investigate the truth, to approach images with care and caution.

After learning what Meyer was trying to teach me, I can’t get too huffy about Martins’ work. There is no sharp easy line between photos that are “manipulated” and those that aren’t; there is a spectrum of practice, and when a photo is cropped or artificially lit or color-adjusted or sharpened or filtered in any way it is already being manipulated, even if Photoshop is never employed. Martins’ pictures are beautiful and arresting, and if he’d simply told the world what he was up to, I don’t think anyone would be too upset.

Of course, if Martins had been forthright the Times would probably not have printed his work, because it has an institutional commitment to, I guess, attempt to “wholly reflect” reality. Somehow.

I was sad to read of the recent death of John Mortimer — playwright, author, bon vivant and barrister. Here’s the story of my own extremely distant connection with him.

I never had much luck applying for internships in college. Part of it was, I’m sure, the times (the late ’70s and early ’80s were almost as brutal a time in publishing as the present) and part of it was my own belief that self-promotion was uncool and my talents spoke for themselves. But my junior year I did finally land an internship reporting and writing for the American Lawyer monthly — something I now recognize as a startup company led by a young journalist named Steven Brill. My heart lay in writing arts criticism, but I had a good head for investigative reporting and I knew a little about the law, so I took the job and got a few clips, and got to know a colorful (and incredibly talented) gang of future luminaries like Jim Cramer, Jill Abramson, James Stewart, Connie Bruck and many others.

John Mortimer

I wrote stories about lawyers and law firms, but I really wanted to write about playwrights and artists. So when I started freelancing full time after graduation I pitched the editors at American Lawyer with ideas for pieces about the occasional overlap cases — people like Louis Auchincloss and John Mortimer. In 1982 Mortimer’s wonderful autobiography Clinging to the Wreckage had just come out in the US, and Mortimer was doing interviews in NY, so I got to meet him. He decided that our interview should take place at Maxwell’s Plum, the legendary but by then (to me) tacky East Side cafe and singles bar, because he’d once set a scene in a story there but had never actually set foot inside. So the tape of my otherwise delightful interview with this drily charming subject was rendered nearly untranscribable by the loud chatter of the surrounding wannabe-socialite gaggle. At that stage of my career I was still sometimes intimidated by the prospect of interviewing writers I admired; Mortimer was the kind of conversationalist who got me over that generously and quickly. The American Lawyer piece from 1982 isn’t online but a second interview I did with him years later, at Salon, still is.

Here’s Charles McGrath’s Times appreciation. McGrath, like nearly every other obit writer, reminds us of Mortimer’s label as a “champagne socialist,” one that he embraced. He may have lived just long enough to see its utility return for a new era of cheerful crusading on the left.

I want to note new books released by two friends and former colleagues, both of which are just out, neither of which I have yet read, both of which I am fully expecting to delight in.

Readers of Salon and the New York Times Book Review know Laura Miller’s critical writing by its wisdom, range, power and clarity. Her new book, The Magician’s Book: A Skeptic’s Adventures in Narnia, is her first (she also edited Salon’s Reader’s Guide to Contemporary Authors). It’s an unusual combination of personal memoir and literary criticism that is about, among other things, Narnia, childhood imagination, memory and the power of stories. I was always more of a Tolkien guy than a Narnian; I think by the time I got to Lewis’s books their Christian subtexts did not look “sub” at all to me, and I found the whole thing an exercise in crude allegory. But if anyone can make me understand their power, I imagine it will be Laura. If you read this excerpt from the book recently posted at Salon, you’ll see why. (Here’s Laura’s website for the book.)

My professional path has crossed multiple times with Mike Sragow’s: He’s now the movie critic for the Baltimore Sun. When I met him he was the movie critic and editor for the Boston Phoenix, where he encouraged me to write about movies (I’d limited myself to theater and books). Then we worked together at the SF Examiner, and again at Salon. For me, he has always been the best kind of mentor; for his readers, he has always been an incisive, insightful and deeply knowledgeable critic.

Some of my favorite albums are the quartet of “pop” records Brian Eno made in the 1970s after he left Roxy Music: Here Come the Warm Jets, Taking Tiger Mountain (By Strategy), Another Green World, and Before and After Science. These albums live in my brain and will reside there until I’m dead.

Eno has had a long and storied career since as the creator of ambient music, a producer of wonderful albums by Talking Heads and U2 and many others, and a multimedia artist. But one of the things that he no longer seems to do, much, is sing. He did, some, on a collaboration with John Cale from 1990 titled “Wrong Way Up.” But mostly, these days, he doesn’t. Now, his voice is not a conventionally “good” voice, but I always enjoyed it, and I’ve missed hearing it in new music. [UPDATE — yeah, I forgot about the 2005 Another Day on Earth, probably because I never got that into it. Should try it again…]

All of which is by way of introduction to this delightful piece Eno recently contributed to the NPR series “This I Believe,” in which Eno declares that what he believes in is…singing. It’s a strange admission for him to make after all these years — like, I don’t know, Harpo Marx espousing the virtues of speech, or Greta Garbo expressing her love of crowds. But he makes a good case.

A recent long-term study conducted in Scandinavia sought to discover which activities related to a healthy and happy later life. Three stood out: camping, dancing and singing…. When you sing with a group of people, you learn how to subsume yourself into a group consciousness because a capella singing is all about the immersion of the self into the community. That’s one of the great feelings — to stop being me for a little while and to become us. That way lies empathy, the great social virtue.

I read the recent New York Times magazine profile of Lewis Hyde with some interest. As it happened, I wrote a review of Hyde’s 1983 book The Gift just about 25 years ago as one of my early assignments at the Boston Phoenix. My editor at the time, Kit Rachlis, thought I might find Hyde’s uncategorizable mixture of literary criticism, sociology and anthropology intriguing, and he was right. (As the profession of editing moves into eclipse, let’s not forget that this matching of writer and subject is one of the subtle arts that we do not yet know how to automate.)

At the time, Hyde’s effort to establish a language of value separate from the financial marketplace spoke hauntingly to me — as a disaffected young liberal stunned by the Reaganite rise of free-market, anti-government ideology. The book’s themes feel somehow timely again today, at the end of the arc of history that began a quarter-century ago, as we scrabble through the ruins that said ideology has left of our economy and try to imagine rebuilding along different lines.

I was fascinated to learn from the Times piece that in the years since, The Gift has become a volume of almost totemic stature to writers like David Foster Wallace, Jonathan Lethem and others whom I admire. I’d written that Hyde’s book would “probably be most read and appreciated by those who already grasp its lessons, the visionary writers and artists from whom Hyde draws so many examples.” It appears I was right. But I’m glad to know that the book has had such perennial success — and that Hyde, now a fellow at the Berkman Center, has moved on to studying the concept of the “commons,” newly relevant in the Web era. I’ll look forward to his work on that topic.

The funny thing about Nick Carr’s Atlantic cover piece, “Is Google Making Us Stupid?,” is that the piece itself has the truncated quality that it blames the Internet for imposing on our culture. When my copy of the magazine (yes, I actually subscribe on paper) arrived I saw the headline and looked forward to a really thorough, in-depth look at this question. Carr’s entirely capable of that; I disagree with much of his perspective in “The Big Switch,” but it’s one of the more cogent and sustained critiques of the Web 2.0 future, and anything but lightweight. So I figured the Atlantic had paid Carr to do what the Atlantic, and only a tiny handful of outlets, can still do: spend many thousands of words digging into the heart of an important issue.

Ah, well. You can still find such pieces in the Atlantic (like this one about rising crime rates in mid-sized American cities), but Carr’s isn’t one of them. At 4000 words, it’s barely longer than the kind of thing Salon does every day. It’s a provocative read scattered with tasty quotes and anecdotes; it asks a useful question but does little to answer it. Carr starts off describing a sense of alienation from old-fashioned reading that he shares with several other people he quotes:

I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

Like Carr, I’ve found myself reading fewer books over the past decade. I can’t tell whether it’s because I’m spending more time on the Web (certainly possible). In my case, if my attention span has shortened at all, I think it’s far more likely that, for instance, raising children has cut into both my available time and my reserve of repose (both actual physical sleep time and emotional reserve of patience). But when I do get the chance to sit back with a good book — like two I’ve recently finished, Faking It (with related blog by authors Yuval Taylor and Hugh Barker) and Clay Shirky’s Here Comes Everybody (also with author blog) — I don’t feel any less absorbed than when I was a teenager plowing my way through a shelf of Tolstoy and Dostoyevsky.

I don’t want to discourage you from reading Carr’s article and pondering the issues it raises. Does Google represent the digital apotheosis of Taylorism (the industrial-age science of labor measurement)? Does the Web crowd out the opportunity for leisurely contemplation or “slow, concentrated thought”? Those of us who use the Web constantly are probably experiencing changes in how we read and think; what are those changes?

These aren’t stupid questions. But they deserve deeper contemplation than Carr has provided. His piece is less like a thoroughly researched magazine piece than, say, the prospectus for a writing project. Perhaps the Atlantic has simply published Carr’s next book proposal. If so, I’d look forward to reading the resulting book — in a relaxed, contemplative way, of course.

The first thing to do is learn how to breathe — very deep breaths, slow. Then you stand in one position if you’re going to conduct, or sing, or whatever, for about a minute, and you deliberately relax every muscle in your body. You become aware of the fact that quite a few muscles are tense, so you relax them, all the way down to the calves of your legs. Then you take one more very slow breath.

And then you say to yourself, what I do here is of no importance whatsoever. I am here as a servant. And if I’m nervous, it means that I think what I’m doing is important. That is an egocentricity which no interpreter can allow himself the luxury of. You’re there to serve the music, and you have to be in the best position, psychological and physiological, to do so. Which means no tension, no nerves. Yes, exhilaration. Yes, enthusiasm. Yes, focused energy. But no nervousness. Because that’s counterproductive.

The death of Gary Gygax, co-inventor of Dungeons and Dragons, has occasioned an outpouring of writing on the place of D&D in our culture. Salon’s Andrew Leonard was fast out of the gate identifying the “genetic influence” of D&D on the world of the Internet.

Next came Seth Schiesel in the Times, with observations on how the game brought isolated devotees together socially. In a fine piece in the Journal, Brian Carney pointed out that the original, pre-computerized D&D was simply “structured, collaborative storytelling” — exactly what attracted me to the game in my youth. I cared very little for the encyclopedic rules and charts (which often made little sense in the earliest editions of the game) and frequently ignored them in my own gamemastering, which I viewed as closer to the role of a stage director. My job was to make sure my players had a great time and went home with great stories, which I would recap in a mimeographed magazine.

So Gygax’s passing occasioned a sort of distributed coming-out party for journalistic geeks. That seems fitting. For me it also served as a reminder of a question that always hovered in the back of my mind during the years I spent roaming others’ D&D worlds and crafting my own.

In D&D and its role-playing descendants, you play a character whose traits are quantified and typically assigned random starting values. This made perfect sense to me as applied to either physical or supernatural abilities — since you weren’t going to pull out a Sword +2 and charge the guy across the table from you, and fireballs were simply not going to fly across your basement room, you needed some sort of proxy system for evaluating individual abilities in these realms and resolving conflicts.

But other traits, like intelligence and charisma, present themselves naturally in the course of game play. The charismatic player was the one who could rally the gang to his side, and no roll of the dice was going to make the group schlub into a natural leader. So what did the randomly assigned values for these characteristics mean? How could a player who was himself a dim bulb play a character with 18 intelligence points? What was a smart player supposed to do with a character with a low brainpower score?

What ought to happen in D&D when the real-world qualities of a player were at odds with the game’s numerical dictates? Which ought to rule — free will or predestination? From that fateful day in 1975 or so that I first rolled the polyhedral dice, I never could resolve this Miltonic quandary. I don’t know whether today’s World of Warcraft clans face the same questions. But certainly part of the lasting fun that Gygax bequeathed us was the opportunity to grapple with them.