04/04/2015

Really smart article in tomorrow's New York Times Magazine on the subject of my favorite TV actress, the brilliant Tatiana Maslany, and her portrayal of multiple characters on BBC America's Orphan Black.

There's not much to say that hasn't been said about Maslany's amazing ability not only to inhabit all these clone characters, not only to make the physical action seamless, not only to make us forget entirely that the same person is playing these disparate individuals, but also to bring an amazing depth of emotion to scenes in which she is playing only against herself. Not much to say except it's long past time for the Emmys to notice this 29-year-old.

Writer Lili Loofbourow does less to illuminate Maslany's personality or the magic of her craft in the piece than she might have, perhaps due to Maslany's own reluctance to talk about herself -- although she gives me plenty of new reasons to adore Maslany for her curiosity and intelligence. How many 20-something actresses quote Martha Graham and Agnes de Mille, or comfortably and naturally describe things with phrases like "this is about volition and autonomy"?

Loofbourow seems more interested in her own ideas about Orphan Black than in really getting at what makes Maslany tick. But at least the writer has some fascinating things to say about the show, which revolves around a group of genetically identical but psychologically and culturally disparate women who discover that they're clones, part of a long-running corporate project, and band together to fight for their personal autonomy in the face of claims of ownership.

The description makes it sound like a dramatized feminist treatise, but it doesn't play that way -- though it does hammer hard on other gender and identity themes like nature vs. nurture in a sometimes heavy-handed way that not even a grad student could love. No, it plays like a wickedly funny action adventure romp full of broad, hilarious characters that would be nothing more than comic at best or corny stereotypes at worst if not for Maslany's ability to invest them with both symbolism and humanity.

But, as Loofbourow writes, the stereotypical qualities of the characters, and their very broadness, are part of the point and magic of Orphan Black:

In its subject matter, “Orphan Black” broods on the nature-nurture debate in human biology, but in its execution, the show cleverly extends the same question to matters of genre. What does the exact same woman look like if you grow her in the petri dish of “Desperate Housewives” or on a horror-film set in Eastern Europe? What about a police procedural? The result is a revelation: Instead of each archetype existing as the lone female character in her respective universe, these normally isolated tropes find one another, band together and seek to liberate themselves from the evil system that created them.

By structuring the story around the clones’ differences, “Orphan Black” seems to suggest that the dull sameness enforced by existing female archetypes needs to die. Early in the first season, there is a serial killer hunting down the clones ­— it turns out to be Helena, the Ukrainian — who ritualistically dismembers Barbie dolls after dyeing their hair to match that of her next victim. It’s a creepy touch, but one that can also be read as a metacriticism of how women are used on TV: the punishing beauty standards to which they’re held, the imposed uniformity. (Need a new sitcom wife? Grab the prototype and change the hairstyle.)

....

One of the most interesting things about the show, and its metacriticism of the genres it juggles, isn’t just how elegantly it addresses the solitude that the lone female character on many shows suffers in her particular TV universe. It’s also how resolutely the show refuses to place these genres in opposition to one another. There’s no condescension here; Alison’s suburbia gets as much visual and narrative respect as Rachel’s evil corporate empire. The characters find one another because the system that produced them and scattered them is breaking down. What emerges is a full, generative map of the possibilities that emerge when you let the Strong Female Character and her lonely sisters from other genres mix. By exploring the different directions that “genetic identicals” can take when differently nurtured, “Orphan Black” shows what a single actor can do when given the opportunity — and, by extension, reveals the interesting stories that emerge when women are afforded the chance to exist in rich narrative relation to one another.

Thanks Lili, you make me feel smart for loving Orphan Black, season three of which begins April 18 on BBC America.

03/30/2015

The tech echo chamber is ringing again -- this time about Periscope (and Meerkat, but that's so two weeks ago!).

Well, everybody's wrong. Or at least everybody is at the extremes -- where you have the predictable tech triumphalists and the predictable naysayers.

No, Periscope is not going to change the world. In fact, so often these days people proclaim that something is going to change everything or change the world that I think it's worth stepping back to consider what it means to invent something that changes the world.

In 1,000 or 2,000 years, when the last surviving pockets of humanity, subsistence farming on the small, heavily defended patches of fertile high ground that haven't been flooded out by rising seas or parched by drought, look back on human history, there are things they might point to that changed the world -- the discovery of fossil fuels and the invention of the internal combustion engine are probably going to be high on their list. The semiconductor and the vacuum tube before it. Robots (the ones running the assembly lines that put all those people out of work before the riots, the famines, and mass die-offs). Computers. Genetic therapy. Bacteriology. The Internet itself, yes, maybe; but any individual application built to exploit it? Periscope? Twitter? Facebook? Even the graphical web browser or email server-client systems? Doubt it. Changing the world means the course of life on the planet is altered forever, not just the course of leisure time of a handful of upper middle class white kids until the next cool buzzy app comes along.

Twenty years or more into the commercialization of the consumer Internet, we've long since passed the time when any individual Internet application -- or at least any one involving messaging or the transmission of data -- is going to change the world. We're tinkering around the edges, adding new functionality, new bells and whistles, moving functionality from device to device.

But the fundamental change -- humanity's interconnection through instantaneous, global, many-to-many messaging and data sharing -- has already taken place. Whether that messaging takes the form of 140 text characters or streaming video, on a PC or a tablet or a phone, is just a difference of mode or method, not a difference of kind. The revolution is over. Until the Internet can do something other than move data and information around (say maybe when it becomes fully and ubiquitously connected to universally accessible 3D printers and drones that can actually make and do things in the physical world), we're just dicking around with new toys that do more or less the same things.

Of course, you can pay for a one on one live stream on a cam site, but this isn’t about money, this is about the bleeding edge. And that’s what’s so exciting about Meerkat and Periscope, it’s all brand new.

Like I watched a sunrise in New Zealand. A cove in Australia. Someone making coffee in Amsterdam and a snowy spring in Siberia. Call me a voyeur, we’re all voyeurs, and right now regular people are letting you into their lives, just for the fun of it, and it’s strangely riveting.

They do it for the love. No one wants to be alone anymore. They want hearts and comments and interaction. They’ll perform if you show up and comment.....

I start having weird flashbacks, to Josh Harris' Pseudo, or to the breathless reaction to the first days of Twitter. But by now we all know that in the end no one is going to give a s**t about watching someone making coffee in Amsterdam, for the most part they're going to use Periscope for celebrity gossip, porn, peering into the lifestyles of the rich and famous and sharing personal details with their close personal circles, people they already know in the physical world.

Now, it doesn't mean that in the near term society won't become better or worse because of some new app, probably both -- for every Arab Spring political moment helped along by the likes of Twitter, there's the shameful acting out of celebrities like Chris Brown to suck our bandwidth. It also doesn't mean there's not money to be made or industries to be disrupted by any potential new, hot thing.

I read with interest Mic Wright's The Next Web piece "Periscope won’t change the world – but it appeals to journalists’ vanity." Clearly I agree with the first part of Wright's headline. And as to the second part, well, vanity and self-obsession may be the defining characteristics of the social media age. (I know, I know, since Tom Wolfe dubbed the Baby Boomers the "me generation," every successive American generation has been given some next iteration of the same appellation. But there's no way around the self-absorption of the selfie era.)

But when Wright says Periscope "won't change news," I think he's completely off the mark.

Wright's contention is that for the most part social media messages, even those eyewitness messages from the scenes of newsworthy events, aren't news but are the source material for news. He's just plain wrong about that. News has always first and foremost been about reportage. Go somewhere. Witness something that others need to know about but can't be present for. Report back so that the people can know.

Professional news organizations are never going to be able to do that again in a more timely fashion than the connected crowd. The occurrence of certain kinds of newsworthy events is just too unpredictable. Professional journalists aren't always everywhere when something newsworthy happens, but someone with a smartphone is, at least in major urban areas, whether it's Tahrir Square or the East Village.

In other physical areas -- away from places where relatively well-heeled, tablet-wielding folks congregate -- professional reportage may still be needed. And other kinds of stories -- frankly, more important stories than the latest fire or natural disaster -- like the story of the Iran-US nuclear talks, are never going to be told by citizen journalists.

But Wright's contention that, "We need analysis and thought to be introduced before something become news. Just being present is not enough," is wrong. For many kinds of news, for news at its most traditional and fundamental, being present and beaming back video is precisely enough.

It's tough to take a line like "The function of poetry is to obtain for everybody one kind of success at the limits of the autonomy of the will," and turn it into the stuff of song!

Goldberg works his magic by fracturing Grossman's words into small shards which the composer repeats, stretches and alters, like Sonny Rollins stretching and altering musical motives in his famous solos.

Helping make the trick work is the silky vocal texture of singer/violinist Carla Kihlstedt and the easy way she has with the difficult words.

Also key to making it all work is the incredible musicianship of the nine-piece ensemble -- in particular, trumpeter Ron Miles, guitarist Nels Cline and Goldberg himself (whose precision and crisp intonation reflect his background in klezmer music). These guys play with such a seeming effortlessness -- Goldberg flat out lovingly caressing melodies when called for -- that the music flows as naturally as clear water in a mountain stream.

Of course the compositions themselves -- arranged for violin, clarinet, trumpet, sax, piano, electric guitar, vibes, bass and drums -- are the real stars. After a 20+ year career playing avant garde jazz/klezmer fusion with his New Klezmer Trio and jazz flecked with Eastern European folk music with the Tin Hat trio, Goldberg has arrived at a wholly organic synthesis of all the music he seems to love and care for.

There's rock here, there's jazz here, there's klezmer and Broadway and chamber music, but all the influences and antecedents are so thoroughly synthesized that it doesn't feel like a fusion or a melange. It feels whole.

Back in 2010, the 55-year-old Goldberg told All About Jazz that he grew up loving the Beatles, eventually coming to jazz and klezmer and finally earning an MA in composition, but in the end he found himself less consciously trying to think about how to combine all his influences, approaching it all more intuitively:

"Something hits you about music when you're little. It hits you for a reason, for the right reason, and then you get involved in music. For me, a lot of other concerns then got pretty big, like complexity, sophistication, whatever. And at a certain point, I started asking myself, 'What happened to me? What became of all that music that I still like to listen to? Where is that in my own thing?' Not 'How come I'm not playing in a Beatles cover band,' but 'Those pleasures, where are those?' If there's something I love to hear when I put on a record, then when I make a record, am I giving somebody else something they love to hear? That's the real question.

Honestly, I don't just want to give people something that they can appreciate or understand, or that makes them think, or something like that. I used to kind of feel that that's what I wanted to do, but that's not what I want anymore. I want to give people something that they can love."

06/06/2011

Written last, read only posthumously (except by his wife), almost certainly unfinished, more than any other work of Herman Melville's brilliant, troubled, and mysterious career, Billy Budd, Sailor (An Inside Narrative) has served as a screen onto which successive generations of commentators have projected their personal obsessions.

The short work--at once deceptively simple (particularly stylistically, in contrast to the dizzying, soaring prose of Melville's younger years) and surprisingly dense for a piece of such un-Melvillian directness and proportions (only 130 pages in the definitive text prepared by Hayford and Sealts)--has variously been described as "Melville's testament of acceptance", as a Christian allegory, as a veiled portrait of the ambivalent way in which Herman's father-in-law, Massachusetts chief justice Lemuel Shaw, an abolitionist, mercilessly enforced the fugitive slave laws, and as a not so veiled story of self-loathing homosexuality.

There might be some truth to all of these things. Certainly it's an old man's book in which strangely inert characters submit with little protestation to the roles that the Fates, or at least the author, have prescribed for them in a manner that is unquestionably Greek in its relentless inevitability ("Fated boy" Vere says to Billy when Billy kills Claggart, giving Vere's later moral hand-wringing an air of powerless sophistry) and darn fatalistic. There's no doubt of the parallels to Biblical themes of the fall of man and the sacrifice of Jesus in Billy's more than preternatural innocence and the explicitly heavenly description of the beatified skies at the moment of Billy's death, as well as in Billy's final blessing of the man who condemns him to die. I can buy the notion that Shaw served as something of a model for Vere's almost self-righteous rush to impose martial law as if no other choice were before him, given not only what we know of Shaw's career but also what we can't help but read into what must have been Melville's resentment towards yet another towering family patriarch in whose shadow he lived and on whose financial support he relied. And there's no question that Claggart's attraction/repulsion to "the Handsome Sailor" is pregnant with sexual envy and conflict.

But most of all I think Billy Budd, which I've just read again for the first time in some years, is about the same things that all of Melville's greatest work is about--the malleability of truth, the sad ways in which the social conventions of the civilized world alienate people from their true natures, and the doomed hopelessness of all human striving to understand anything about the world, the universe and the meaning of life.

At the heart of the book is a kind of switcheroo, a moral Freaky Friday trading places act. Claggart, an almost supernaturally depraved bad actor who falsely accuses Billy of planning a mutiny, whose only motive--whatever psychological motives we modern readers ascribe to him--is described as arising from his being "depraved according to nature," becomes the hapless victim, while Billy--so naive he doesn't even understand that such depravity exists--becomes not only the necessary example of guilt to the rest of the crewmen (for whom his hanging is intended as a lesson by Vere) but a symbol of depravity, vindictiveness and alienation.

Melville tells his readers outright what has gone on at the moment that Billy mutely strikes Claggart dead:

In the jugglery of circumstances preceding and attending the event onboard the Bellipotent, and in the light of that martial code whereby it was formally to be judged, innocence and guilt personified in Claggart and Budd in effect changed places.

But Melville goes a step farther. In a book uncharacteristically (for Melville) devoid, or mostly devoid, of pre-modernist multitextuality, Melville offers only a few touches of such stuff near the end of his tale, one in the form of a news report from an "authorized" weekly naval chronicle, written in good faith, Melville assures us, but nonetheless gathered from rumor and innuendo. The result is an official account, an outside narrative, of the events that our author has just laid before us, so completely at odds with Melville's inside narrative as to represent an utter reversal. In the official account, Budd's no beloved innocent lashing out in mute passion against a man who has wronged him. No, he's an armed assassin and mutineer so vindictive and sneaky he must not even have been an Englishman, who has committed a crime of "extreme depravity" against the "respectable," "discrete," and "patriotic" Claggart.

Beyond all the narrator's declarations of the unknowability of Claggart's character, the conspicuous absence of any backstory for either Claggart or Budd, and the invitations to readers to decide for themselves whether Vere has done the right thing or the wrong thing in ensuring that Billy hangs and quickly, or even whether or not Vere has become unhinged, this news report undermines the sense that we as readers know whether or not anything is true.

It's a subtler handling of the inadequacy of human knowledge than Melville gives us in Moby Dick with Ahab's obsessive, fatal, failing compulsion to "pierce the veil," but in its modernity it's perhaps more disorienting.

Melville then pushes farther yet again into myth making, including in the narrative the short poem from which the whole piece sprang. (Melville first wrote the poem Billy in the Darbies, a kind of folk ballad of a mutineer in a brig at night awaiting his execution in the morning--an old tar, not a symbol of innocence like the Billy of the inside narrative. Then the novel grew up around the poem, springing from a headnote Melville began writing and continued to expand.) Read at the end of the inside narrative, and following the unreliable news report, the ballad offers another version of the tale--a myth told as a sea chanty, its authorship now ascribed to a shipmate of Billy's who would have known the truth but instead writes the song of an old tar, an actual mutineer, with a girl back home in Bristol.

Melville knew well sailors' stories from his years at sea. And he studied Anglo-American ballads in preparation for his great Civil War poetry. He knew how myths were made from the stuff of reality. But in many ways Billy in the Darbies feels more "true," or at least more real, than the "inside narrative" itself--which is full of impossibly pure and impossibly malignant characters with no backgrounds, just sailors' stories about their rumored backgrounds, archetypes more than flesh and blood characters. And by including--Rashomon-like--these alternatives to the narrator's own "inside narrative," both an utterly opposite news report and a very real feeling sea chanty, Melville leaves us wondering what his texts always leave us wondering about: is anything true? Is anything really knowable at all?

04/15/2011

Ashlee Vance has an excellent feature today on Businessweek.com. Although the piece is burdened by the snappy, ambiguous title, The Tech Bubble Is Different, it's not the typical cyberutopian sermon you'd expect.

The gist of Vance's piece is this: the business model that underpins the social media bubble is advertising. This model has turned Silicon Valley into a magnet for the pure mathematicians who once went to Wall Street to develop trading algorithms; now they go to Palo Alto and design algorithms that predict consumer behavior. The Valley, instead of attracting the kind of hardcore computer scientists who once drove innovation, now just attracts quants. The quants, so Vance says, don't possess the kind of world-altering idealism of earlier generations of tech innovators. They're not looking to change the world, just to get you to click on an ad. As a result, Vance's reasoning goes, the social media bubble, when it bursts as all bubbles do, will leave behind no game-changing technology infrastructure. The PC bubble put PCs into American households. The Internet bubble left behind the network infrastructure. The social media bubble will leave behind no such hard assets.

At least that's Vance's argument.

There's a lot of truth in what Vance has to say. The Internet--by connecting people and turning the computer, at least for consumer users, into, first and foremost, a communications device--has transformed the technology industry forever, turning it into something that looks a lot more like the media business than the tech industry of old.

The horror and disdain this reality evokes from old-timers in the Valley is palpable in Vance's piece. Says one such person Vance quotes: "My fear is that Silicon Valley has become more like Hollywood--an entertainment-oriented, hit-driven business that doesn't fundamentally increase American competitiveness."

The quote captures the self-important, even sometimes delusional conflation of private business interests, public concern, and faith in science that has long characterized Silicon Valley, and which turned a simple observation about the pace of change in transistor manufacturing into "Moore's Law"--something with the whiff of basic physics, as if inevitable planetary forces were involved. The quote also reveals the Goldwater conservatism that has always informed the basic presumptions in the Valley--increasing American competitiveness, there's a goal worth striving for; everything else is namby-pamby hippie stuff.

But there's nothing wrong, of course, with entertainment and communications businesses. And they do actually produce innovative technology that changes not only the way we work (as if that were the only thing that matters) but how we play and, perhaps even more importantly, how we relate to one another. And one could easily argue that the role that Facebook, Twitter, and Internet telecom more generally played in, say, the recent populist revolutions in Tunisia and Egypt suggests that the social media bubble has already played a more important role in changing the world--and left behind a more important legacy--than, say, PowerPoint. Vance fails to measure or take any note of the kind of social change Internet communications and social media have enabled, which, while not a legacy of hard assets, may in fact be no less important than PC penetration and the explosive growth of Internet points of presence.

But in the end there IS something disappointing about how the advertising model has swallowed the world of computer technology. Back in the 1990s, when the Internet first began unwinding the world of traditional media, the dream for media technologists was not only about new ways of producing and distributing media, not only about community and user-generated content, but also about a new business model that would support new media in a way that would allow for the continuing creation of not only pure entertainment media (like Farmville) but also new kinds of news products and popular arts. Instead what has happened is that peer-to-peer file sharing and DIY media have driven the consumer price tolerance for media arts to $0; the movement of classified advertising from newspapers to the likes of Craigslist and Monster.com has hastened the demise of the urban newspaper, leaving us with the least well informed American population in a century; and, as Vance points out, Groupon--a cyber-coupon circular--is the fastest growing "tech" start-up of the new bubble. Far from inventing a new business model for media, the Internet revolution has pushed the media business model into the world of technology, and the end result may well have been worse for the media sector than the tech sector.

03/20/2011

Steel guitar is one of American music's signature sounds. The weeping swells of the pedal steel, the swooping hokum of bluegrass dobro, the grinding rock riffage of, say, Ben Harper pushing his Dumble amp with his pickup-equipped Weissenborn--the round timbres, wobbly tremolos, and octave-wide portamentos--these sounds say "American music", the way the sound of the hammered cimbalom says "Hungarian music" or the sound of the charango says "South American music".

Of course, like many of the things we think of as characteristically American, the steel guitar is an import. In fact it is a Hawaiian approach to a Spanish instrument that took the continental US by storm after its appearance at the Hawaiian pavilion at the 1915 Panama-Pacific Exposition in San Francisco touched off a craze for Hawaiian music, steel guitar, and steel guitar instruction, a craze of a sort not seen again until the electric guitar boom touched off in the wake of the Beatles. (For more about how a Hawaiian instrument became a mainstay of country music see Episode 4 of my podcast, Down in the Flood.)

Lesser known, until recently at least, has been the tradition of sanctified gospel steel guitar. Mostly an East Coast phenomenon associated with the House of God church, so-called "sacred steel" guitar emerged along an Atlantic Coast sanctified circuit from New York to Florida, bursting into the public eye when Arhoolie Records licensed a collection of sacred steel performances from the Florida Folklife Program.

Soon one player in particular--Robert Randolph out of New Jersey--emerged as the genre's biggest star, a pop music darling, the go-to player when some rock star wants to add a "sacred steel" flavor to his or her records (he's recorded with Elton John, Dave Matthews, Ringo Starr, Santana, and Los Lobos among others).

Mostly these days Randolph records secular music, and to be sure he's a smoking player -- witness this rip through one of his signature show stoppers, "The March":

But in large measure the sacred steel style is descended from the work of one man, gospel steel pioneer Willie Eason, who, in the 1930s and 1940s, rocking an Epiphone lap steel, brought the sound of steel to the church.

No doubt the few records Eason made in the 1940s and 1950s were thought of by the labels that released them as novelties. Eason was never much of a singer or leader; witness his 1951 recording for Regent Records, "I Want to Live So God Can Use Me."

Perhaps Eason's best known recording is his 1947 pairing with The Soul Stirrers on the jubilee-style political song "Why I Like Roosevelt." That recording is, unfortunately, completely unavailable in any digital format, although Eason continued to perform the song until his death in 2005, including this warm performance Eason delivered for filmmaker Alan Govenar in 2004.

If Eason's few records were mostly novelties, the performances he continued to give up and down the East Coast until his death in 2005 were more influential, inspiring, among others, Eason's brother-in-law Henry Nelson, and Nelson's son Aubrey Ghent, to take up the sacred steel style.

Ghent continues to perform in church in the style Eason taught him:

Eason was never the virtuoso that Randolph is or that Chuck and Darick Campbell of The Campbell Brothers are.

But his few recordings remain an inspiration, especially the later-in-life Eason performances collected by folklorist Bob Stone for the influential sacred steel anthology, including this Eason performance of "Just A Closer Walk With Thee."

03/13/2011

The 30 sides recorded by Texan street singer Blind Willie Johnson between 1927 and 1930 comprise one of the great bodies of work in the history of American music.

The songs Johnson recorded in those sessions, many of them older even than Johnson himself (born in 1897), have become familiar--either through their adoption by COGIC performers or, after the folk revival of the 1960s, rock and rollers. I'm talking about "Motherless Children Have A Hard Time," "Jesus Make My Dying Bed" (also known as "In My Time of Dying"), "Nobody's Fault But Mine," "If I Had My Way I'd Tear the Building Down," and "You'll Need Somebody on Your Bond," all recorded not just by many a gospel performer but also by Eric Clapton, Led Zeppelin, The Grateful Dead, and Captain Beefheart.

Johnson's version of "Dark Was the Night and Cold Was the Ground On Which Our Lord was Laid"--which turned the hymn into the deepest of wordless blues moans--is so deep and perfect an example of African American song that it was included, alongside the music of Bach, Stravinsky, and Louis Armstrong on the recording that NASA scientists sent to space in the Voyager 1 space probe in 1977.

Not so much straddling as making a mockery of any line that might exist between blues and gospel, Johnson accompanied his own gravelly, froggy, cement mixer of a voice with bottleneck guitar propelled by a heavy thumb-picked bottom. His style was, formally, no different from the blues music of peers like Blind Lemon Jefferson. But every song Johnson sang was spiritual.

This mixture of sanctified religion and blues music style and energy was, in the early years of the twentieth century, something of a Texas specialty. Up around Dallas, Jefferson--who would become the first breakout star of the country blues--didn't so much mix the forms as freely alternate between them: straight blues, ragtime novelty numbers and spiritual music (Jefferson's first recordings were spiritual numbers like "I Want to Be Like Jesus in My Heart"). That kind of repertoire was fairly typical for a blind street singer like Jefferson--musical performance of this sort was a rare career path for the blind in pre-war rural America--who played what people wanted to hear.

But two other blind singers in Texas more directly blended the music. Working, like Jefferson, around Dallas (and later Oklahoma City), but in the COGIC churches not on street corners, pianist Juanita Arizona Dranes mixed barrelhouse piano with sanctified revival music in a style that influenced everyone from Rosetta Tharpe to Jerry Lee Lewis.

And then, down in Beaumont, near the gulf, there was Johnson.

Unlike Jefferson, who became a pop music star of a sort, and Dranes, who didn't record after 1929 but who performed in larger sanctified churches for years, where she was heard widely and passed along a significant influence, Johnson more or less disappeared from the public eye after making his last recordings in 1930, only to be rediscovered via his records during the folk revival thanks to the efforts of folk music researchers like Sam Charters, who took the first pass at unearthing Johnson's biography, and, later, Texas journalist Michael Corcoran, who located a living descendant and a death certificate.

After his final sessions in 1930, Johnson apparently lived and preached around Beaumont, Texas, operating a storefront church there until his death in 1944.

All of Johnson's recordings are essential listening for fans of gospel, blues, and American music generally. "Dark Was the Night" is, at this point, probably the best known, but Johnson's recording of "Jesus Make Up My Dying Bed," from his first recording session at a makeshift studio in Dallas in 1927, is something of a miracle too--not so much for its elliptical, swallowed vocal, which bites off the lyric and lets the guitar finish the key phrases, or for its deep, mournful feeling, but for its outrageous slide guitar playing.

03/06/2011

Gospel quartet singing--that male a cappella style in which background singers provide a rhythmic accompaniment in harmony for the flights of the lead singers--emerged simultaneously all over the South. The Soul Stirrers came originally from Houston, the Fairfield Four from Nashville, the Golden Gate Quartet from Virginia. And of course, after World War II, with the great migration of black Americans from the rural South to the urban North, quartet singing took root on the street corners and in the barber shops of northern cities, where it mutated into all sorts of secular forms.

But if there was a regional center at all for the early emergence of the quartet style, it was Jefferson County, Alabama, and the towns of Birmingham and Bessemer. Among others, the region produced the Bessemer Sunset Four, the Heavenly Gospel Singers, the Kings of Harmony, and the Birmingham Jubilee Singers, founded by influential local quartet instructor Charles Bridges, who created a school for quartet singers in Bessemer.

But most importantly, the Birmingham scene gave us Silas Steele and his Famous Blue Jay Singers, responsible as much as any group for popularizing a new style of quartet singing full of blues gestures and sanctified energy.

The early Birmingham style was characterized by tight, elaborate harmonies and a cleanly enunciated lead approach that owed more to the turn-of-the-century stage than to the new COGIC congregational singing. It was an approach quite deliberately built upon the sound and style of the late 19th-century singing groups, both quartets and larger ensembles, founded at the early black colleges--the Fisk Jubilee Singers and, closer to home, the Tuskegee Quartet and Tuskegee Institute Singers first organized by Booker T. Washington in 1884. The university groups had repertoires that consisted of pre-gospel spirituals. Their arrangements tended to alternate passages of rubato performance with passages of uptempo music with clearly stated rhythms. The harmonies at times can sound nearly Anglican. And the lead vocals cleave fairly closely to the melodies, with little of the melismas and bent notes that are hallmarks of blues and sanctified singing.

You can hear the influence by comparing this 1914 recording of the Tuskegee Institute Singers performing the spiritual "Live a-Humble":

with Bridges' Birmingham Jubilees a decade and a half later, in 1930, singing "I Want God's Bosom To Be Mine":

Then along came Silas Steele and his Famous Blue Jay Singers. Steele founded the Blue Jays around the same time Bridges founded the Birmingham Jubilees, circa 1925, and the two groups were local rivals. Both featured the elaborate, well-arranged harmonies of the Birmingham style, but the Blue Jays had a secret weapon in Steele--not just his rich, potent baritone but also his adoption of the sanctified singing/preaching approach to lead.

Here's a number from the Blue Jays' second record date, "I Declare My Mother Ought to Live Right," recorded in 1932. The contrast is clear: the background harmonies are tight and dense, but the entire accompaniment is rhythmic and propulsive; the singing is full of bent notes, melismas, and even falsetto--sanctified techniques absent from the early Birmingham-style records; and replacing the rubato sung sections are "breakdowns" for preaching-style verses. Gospel music, as distinct from the earlier jubilee style, is in full flower, albeit with a Birmingham flavor to the harmony.

Steele's Blue Jays were the first Birmingham-area group to take to the gospel highway and hit it big--at least big for the world of gospel in the 1930s. They relocated first to Dallas and later to Chicago, even recruiting former rival lead man Charles Bridges as a second lead/swing singer in the late 1940s, and became known in gospel circles as the "fathers of the quartet," although they hadn't so much pioneered the form as popularized the new, bluesy approach to it.

It was in this configuration, with two great Alabama lead singers--Steele and Bridges--that the Blue Jays cut their greatest records, like this 1947 recording of "I'm Bound For Canaan Land":

While the Blue Jays would continue on into the 1960s in various configurations, it would be without Steele, who, sometime in 1948 or 1949, left the group he founded to join the Spirit of Memphis Quartet, leaving the Blue Jays under the leadership of his one-time rival on the Birmingham/Bessemer scene.

03/01/2011

It's a popular rhetorical move among conservative pols to twist polling on issues they are concerned with into broad assertions that they, and not their political opponents, are the voice of the American people.

The list of things "the American people don't want"--determined by polls that inevitably proceed from one side of a debate or the other and are packed with polemically slanted questions--according to Republicans in the last election included the Obama health care reform package (continually mischaracterized as a government takeover of health care--if anything, it's a government subsidy for private industry), gay marriage, economic stimulus, deficit spending, and liberalism itself.

Well, here's another thing that, by the same criteria, "the American people don't want": public workers' unions stripped of their collective bargaining rights.

According to a New York Times/CBS News poll: "Americans oppose weakening the bargaining rights of public employee unions by a margin of nearly two to one: 60 percent to 33 percent."

A Pew poll more specifically focused on the situation in Wisconsin found a 41-32% split in favor of the public workers (with a full 21 percent either favoring neither side or saying they didn't know).

And among Wisconsin voters, a Public Policy Polling survey found that if November's election between Governor Scott Walker and Democratic opponent Tom Barrett were held today, Barrett would win big, 52 to 45%, mostly as a result of a shift among blue collar GOP voters.

It's actually Republicans, more so than Democrats or independents, whose shifting away from Walker would allow Barrett to win a rematch if there was one today. Only 3% of the Republicans we surveyed said they voted for Barrett last fall but now 10% say they would if they could do it over again. That's an instance of Republican union voters who might have voted for the GOP based on social issues or something else last fall trending back toward Democrats because they're putting pocketbook concerns back at the forefront and see their party as at odds with them on those because of what's happened in the last month.

Precisely the same kind of polemical polling--with the same sorts of leading questions, the same issues with the makeup of those polled, and the same underlying agendas of the pollsters--produced the polls GOP candidates cited in November to claim they were the standard bearers for the will of the people.

But this morning, with exactly the same sorts of polls clearly showing the public favoring preserving the power of public unions, conservatives are up in arms about polling methods, cohorts, and the like--complaints the same folks never raise when equally colored polls return the results that they favor.

American politics and journalism have each become far too reliant on polls. Not only are polls themselves packed with inherent flaws--the inevitable biases, the small sample sizes, the unreliability of respondents, the unwillingness of people to participate, the increasing difficulty of reaching respondents through traditional means--but they seem to serve, for journalists and politicians alike, as a new ersatz class of facts, illuminating little.

Sure, polls can work fairly well for simple yes-or-no questions, or even for elections between two candidates, where the choices are binary, with none of the ambiguity, complexity, or subtlety that arises when you try to reduce to a pollable question the issues and attitudes humans reflexively approach with nuance. If a respondent tries to answer a poll question with the real nuance and ambivalence most of us feel about anything more complicated than Coke or Pepsi, all that happens is the response is invalidated.

But polls and surveys have come to govern our public lives. Politicians make their decisions based on poll results. And increasingly, news organizations use them as substitutes for substantive coverage. The New York Times' off-lead story today is the poll it conducted with CBS News, a glorified man-in-the-street piece, a work of journalistic alchemy that reduces complex social attitudes to simple opinions and then presents those opinions as facts.

Look, I'm as fascinated by poll results as anyone. I write about them and use them to inform me on a range of topics. But I'm always aware of the limitations inherent in their methodology and would never be so bold as to proclaim, on the basis of a poll, that "the American people" want one thing or another.

As a side note, last night I happened to catch Jamila Wignot's documentary about the Triangle Shirtwaist factory fire on WGBH's American Experience series. The film is terrific, replete with period photography from the tragedy that I had never seen before. The famous fire--in which 146 mostly teenage girls perished in a sweatshop a year after a strike won concessions like a 52-hour work week and four paid holidays but left the workers unable to unionize--was a galvanizing moment in the history of American labor. That 100-year-old event is instructive in light of the current disputes in Wisconsin and elsewhere. Sure, public employees today don't work in dangerous sweatshops. But America is at an historic crossroads and needs to decide how it is going to support its aging population as it leaves, or is pushed out of, the workforce and is left to live on precisely those meagre income streams continually under GOP attack--Social Security and negotiated pension benefits. Let's hope we don't need the spectacle of starving, homeless retirees freezing to death in tent cities in the upper Midwest to realize the workers have a point.

02/27/2011

It's probably gospel music's most famous, most widely recorded song: Thomas A. Dorsey's 1932 composition "Take My Hand, Precious Lord." The tune is one of those rare non-traditional numbers that has found its place in both the black and white gospel traditions, recorded by everyone from Mahalia Jackson and the Soul Stirrers to Roy Acuff, Elvis Presley, and Merle Haggard.

The tale of Dorsey's transformation from Georgia Tom into the Reverend Thomas A. Dorsey is oft-told: how a piano playing songwriter and arranger who backed Ma Rainey on the black vaudeville circuit and composed salacious party tunes for his duo with guitarist Tampa Red became gospel music's greatest songwriter and founder of both the first black-owned gospel publishing company and of the National Convention of Gospel Choirs and Choruses.

Here's Dorsey, as Georgia Tom, with Tampa Red playing Dorsey's most famous blues, "It's Tight Like That," in a 1928 recording:

Dorsey's was a creeping conversion, at least in terms of his career as a songwriter. Born again in 1921, Dorsey continued to write blues, but he also began writing religious songs that mixed his blues sensibility with the black Baptist hymnal style of mid-Atlantic preacher C.A. Tindley, who in the early 20th century had composed many proto-gospel classics, including "I'll Overcome Someday," "By and By," and "Stand By Me."

It wasn't until Dorsey managed to sell a few copies of the sheet music for his first gospel tune, "If You See My Savior, Tell Him That You Saw Me," at the 1930 National Baptist Convention that he abandoned secular music altogether.

"Precious Lord," as it has come to be commonly known, grew from personal tragedy: Dorsey famously composed it in a depression following the deaths of his first wife in childbirth and of their newborn child in 1932, borrowing a melody from an 1844 hymn called "Maitland." The earliest known recordings of Dorsey's number date from 1937, waxed first by the Heavenly Gospel Singers in a pop-tinged vocal quartet arrangement, and then by Elder Charles Beck in a piano-backed bluesy version that one imagines is closest to the way Dorsey must have originally played it.

In the years since, the song has received every kind of treatment one can imagine, from the hard-driving sanctified quartet version by R.H. Harris and the Soul Stirrers that is my all-time favorite:

To Mahalia Jackson's slow devotional version:

To Tennessee Ernie Ford's fascinating 1950s version that mixes gospel and pop arrangements:

To this beautiful trombone sonata performance by Wycliffe Gordon:

But it doesn't get much better than the 14-year-old Aretha Franklin in 1956, singing in her father's church in Detroit. The recording picks up with the performance already under way but just about to take off. All I can say is: amen!