Category Archives: Memoriam

It’s been a long, long time since I’ve been punched in the stomach. But I remember what it feels like with this news: Dr. Jim Blake, my college advisor, died a few months ago in Oil City, PA. He was 68. That I’m hearing about it a few seasons late shows how life’s sturm und drang will have its way.

Dr. Blake was one of the two best teachers I ever had, and one of the most influential people in my life in many ways. It was he who passed on to me his love of analysis and of fairly relentless logic, yes – but also how to find joy, stimulation and meaning in how words were put together; the packing of meaning into every word of a great poem, the layers of symbols and meaning in a great book, the ruthless economy of a well-honed phrase. And he showed a lot of us how four years of studying literature could be a good, powerful and important force in one’s *real life* – which is, I’m afraid, a lost art in the modern college.

Beyond that? Incredible as it may seem in this age, it was Dr. Blake – an English professor who called himself a “monarchist” – who showed me that I really wasn’t the bobblehead I had been when I started college; “Mitch, you’re not a liberal”, he said in his Queens accent during our hours of talking about politics, philosophy and current events; he shook his head and made me read Solzhenitsyn, Paul Johnson, P.J. O’Rourke, Dostoevsky and Tolstoy. And by golly, he was right; once my brain turned on, I was a conservative after all. When I punched my ballot for Ronald Reagan in 1984 (albeit without telling my parents), and started my first conservative talk show in 1986, and every day I do the NARN or write my blog today, Dr. Blake was and is there.

I’ve thought a lot over the years: would the modern humanities academy know what to do with a Dr. Blake – an English prof with a fearsome BS detector and no patience for its PC fripperies?

Oh, it would be an epic battle indeed.

The only tragedy in his death is that not every college kid had or will have the opportunity to learn from him.

The defensive mastermind who – perhaps even more than Mike Ditka – was behind the greatest team in the history of NFL football, Ryan had a long, long career:

Beloved by his players and hated by opposing offenses (and sometimes hated even by his own offenses), Ryan masterminded Chicago’s 46 defense that won Super Bowl XX. He later served as head coach of an Eagles team that had a great defense in its own right, and ended his coaching career as head coach of the Cardinals in 1994 and 1995.

Ryan’s 35-year career as a football coach began in 1961 as a defensive line coach with the University at Buffalo Bulls, and in 1968 he moved to the Jets, helping them win Super Bowl III. He spent two years with the Vikings in 1976 and 1977 before George Halas hired him to coach the Bears’ defense in 1978.

He and his ’85 Bears were the subject of an ESPN biopic last year; he really wasn’t looking good (and neither was Jim McMahon).

The last surviving credited cast member from the movie “Casablanca”, French actress Madeleine LeBeau, has died. She was 92.

LeBeau, in a still from a scene cut from “Casablanca”

The cause was complications from a broken thigh bone, her stepson, documentary filmmaker and mountaineer Carlo Alberto Pinelli, told the Hollywood Reporter.

Ms. LeBeau (sometimes credited as Lebeau) was the last surviving credited cast member of “Casablanca” (1942), which the American Film Institute lists as the second greatest movie of all time. “Citizen Kane” is No. 1, according to the film preservation group.

She played Rick Blaine’s (Humphrey Bogart) jilted girlfriend in the early part of the movie…:

I have no idea where the subtitles come from. Bizarre.

…and then reappeared during the famous “La Marseillaise” scene:

(Along with her husband at the time, Emil the Croupier, who hands Major Renault his winnings at the end of the clip).

When I was a kid, country-western was trying its darnedest to cross over with pop music; the Nashville power-brokers were pushing to try to rake in some of that Top 40 money. From the early seventies to the mid-eighties, C&W was sodden with bloated pop pretenders – the Eddie Rabbits and Ronnie Milsaps and Lee Greenwoods and Barbara Mandrells that peaked during that lost 15 years, not to mention the legit country singers – Dolly Parton among others – who bottomed out during that woebegone stretch.

Standing athwart that current, yelling “stop” before Waylon and Willie, before the Highwaymen and Dwight Yoakam and all the Outlaws of Country, much less the “country roots” revival of the late eighties, was Merle Haggard.

Even before I worked my first country gig (KDAK in Carrington ND, in 1982), I was drawn to the fact that Merle was a legendary anti-hippie:

And while he was never a flashy player, he was no slouch on the guitar.

Anyway – if you’ve been under a rock or on a ballistic missile sub on patrol, Haggard passed away yesterday at 79, leaving behind a C&W scene dominated by American Idol winners and frauds like “Florida-Georgia Line”.

Rickman could have had a nice career playing villains. But 1990’s “Truly Madly Deeply”, directed by Anthony Minghella, upended expectations. Rickman played Jamie, the ghost of Juliet Stevenson’s dead lover. Stevenson’s character had been grieving the loss for a year, and one night she sits down to play the piano. As she plays, a cello suddenly starts up off-screen, and “Jamie,” who had played the cello in real life, is seen sitting behind her. The reunion that follows is one of such wrenching emotion that it puts “Ghost” to shame. It’s barely romantic. They clutch and hold, they weep and coo, they sob. As “Jamie,” Rickman is both hilarious (he’s always freezing, always cranky) and tragic (if she can’t let him go, then he really can’t let her go). An entire new world opened up for Alan Rickman, at least in terms of the audience who had only seen him in a gigantic blockbuster as a multinational terrorist-villain. When Jamie says to Nina, “Thank you for missing me,” his tone is quiet and thoughtful, but Rickman filled the line with a sense of almost humility: “This fabulous woman grieved ME this intensely? I have this much value?” His line-reading cracks open the heart of the film.

David Jones – who had to change his surname to “Bowie” after the Monkees debuted in the UK, almost fifty years ago – passed away yesterday, way too early, at age 69.

He’s been a longtime candidate for one of my “Things I’m Supposed To Love…” bits. I have always been ambivalent about Bowie’s music – and like a lot of music I started out as ambivalent about, it’s probably something I should look into further.

Historically? It probably doesn’t help that I first encountered Bowie at a time when he was at his most pretentious – and I was, personally, at my most pretentious in my disdain for pretense. And even some of his biggest fans will cop to the fact that, especially earlier in his career, a lot of style had to cover for not all that much substance; he started out as a pretty rudimentary lyricist. And, duh – rock and roll is more about style than substance; never let anyone tell you rock and roll is “poetry set to music”; it’s doggerel set to music slathered in style!

But it wasn’t my style.

So one way or another, Bowie had very little music that really, truly grabbed me where I lived, at least initially.

Lemmy was lead vocalist, bassist, principal songwriter and the founding, and the only constant member of Motörhead since the band’s formation in 1975. To date, Motörhead have released twenty studio albums and achieved 30 million in sales worldwide. Their last record, Bad Magic, was released in August 2015.

Over forty years, Kilmister was simultaneously one of the gödfathers of speed metäl and pünk.

Motörhead saw far more commercial success in the UK, though they achieved a cult status in the US. Their ferocious hard-rock style rejuvenated the metal genre in the late 1970s and inspired everyone from Metallica to Guns N’ Roses to Dave Grohl. Albums such as Ace of Spades, Orgasmatron, and Rock N’ Roll were critically lauded, though ironically the band’s only Grammy Award came via a cover of Metallica’s “Whiplash”, which they recorded for a tribute CD.

They were cult figures in the US – but I remember going to Europe in 1983. And while that was a great year for a lot of bands – U2, Little Steven, Duran Duran, Madness, Big Country and many others – what band did I see in the most graffiti, all over Europe, from Scotland to Switzerland?

Yep. Mötörhead.

Kilmister bragged of drinking a bottle of whiskey a day for the past forty years, and was a vocal advocate of amphetamines. As such, he makes Keith Richards look like Pat Boone.

And that’s the real kick in the teeth. Rock stars – in the romance of the genre – aren’t supposed to die of cancer at 70. They’re supposed to go out in a blaze of alcohol-and-drug-fueled glory at 29.

Music geeks over the weekend noted the passing of Chris Squire, longtime bassist for prog-rock icons Yes.

Now, as I’ve written innumerable times, I really listen to music on two levels; is the music technically adept in some way – singing, instrumental chops, production – and does it grab me in the liver and say “this song is something important to you”.

Much Noise, Signifying…: Speaking for me? Yes – of whom Squire was the only constant member from 1968 through his passing, as the band went through 18 other members over the years – was always plenty of the former, and only rarely any of the latter.

As to the former, the musical talent? It was always the band’s long suit. I, like a lot of guitar players of a certain age, grew up very pleased with myself for nailing the first part of “Roundabout”, and bobbing my head in awe at the rest of the song:

Admit it; if it weren’t for “I’ve Seen All Good People” and “Roundabout”, you don’t know the words to the chorus of a single “Yes” song before 1984. It’s not the most ornate Yes song of their first 16 years as a band – they frequently had songs that filled entire 20-minute album sides – and far from their least accessible.

But there’s no doubting the technical chops; Rick Wakeman’s virtuosic but gaseous keyboards, Jon Anderson’s fluid lead singing, and Steve Howe’s technically-impeccable and occasionally-brilliant guitar (why does he always look like he’s getting a prostate exam when he’s playing?).

But Squire’s bass is the most notable thing about the song; from the blazingly ornate yet reliable sixteenth-note runs during the verses, to the off-kilter pulse of the chorus, it’s really brilliant stuff.

Which, of course, made me nod my head and go “yeah, pretty brilliant – now where’s some music I actually feel?”

It was the first time I had actually felt some emotion besides admiration for their technical chops when listening to a Yes song. In this case, it was unbridled hatred for murdering a great song.

But it wasn’t the last.

So – wanna start an argument with a “Yes” fan? Tell him you didn’t hear a “Yes” song that you actually enjoyed until “Owner of a Lonely Heart”:

The band shed Howe (who went off to join the dull-as-dry-toast “GTR” for a few years) and added South African guitar whiz Trevor Rabin. They also did three albums in a row produced by Trevor Horn, the former lead singer of the “Buggles” (“Video Killed the Radio Star”), who’d sung lead for Yes for a year before becoming one of the defining producers of the 1980s.

The best way to get an old-school “Yes” fan to try to assassinate you is to say you prefer the song to their earlier work. But I do. Far and away. Assassinate me? Bring it.

No Respect: I wasn’t the only one who didn’t much care for Yes. The Rock and Roll Hall of Fame has been cool to them:

In February 2013, Rolling Stone spoke to Squire about Yes’ legacy and the fact that Rush, but not Yes, were inducted into the Rock and Roll Hall of Fame. “Logistically, it’s probably difficult for whoever the committee is to bring in Yes,” Squire said. “Rush is fairly simple. It’s the same three guys and always has been. They deserve to be there, no doubt about that. But there still seems to be a certain bias towards early-Seventies prog rock bands like Yes and King Crimson… In our case, we’re on our 18th member. If we ever do get inducted, it would be only fair to have all the members, old and new. So that may be a problem for the committee. I don’t know.”

Classical rockers with hearts of cold, Yes entered the Seventies as a creative example of post-Pepper’s artistic aspirations, a musicianly alternative to the growing metal monster rock was becoming. It left the decade as perhaps the epitome of uninvolved, pretentious and decidedly nonprogressive music, so flaccid and conservative that it became the symbol of uncaring platinum success, spawning more stylistic opponents than adherents. … On Tales from Topographic Oceans, the bottom fell out …

Now, I had that particular Record Buyer’s Guide. And I was as “rockist” as Marsh, who is most famous as the definitive biographer of The Who and Springsteen, and who has always compared all rock and roll to the MC5, and always will.

And it was via watching rock critics’ treatment of Yes during its various stylistic gyrations in the eighties – especially Marsh, my favorite as a teenager, and the single most promiscuous mixer of art and politics in the English language – that I finally realized something: that the real gaseous, bloated, self-important, pretentious, overblown, in-love-with-the-sounds-of-their-precious-creativity ones…

The whole thing is worth a read. But there was one part I’d never known about:

She was a Republican, always a surprising thing in show business, and in a New Yorker, but she was one because, as she would tell you, she worked hard, made her money with great effort, and didn’t feel her profits should be unduly taxed. She once said in an interview that if you have 19 children she will pay for the first four but no more. Mostly she just couldn’t tolerate cant and didn’t respond well to political manipulation. She believed in a strong defense because she was a grown-up and understood the world to be a tough house. She loved Margaret Thatcher, who said what Joan believed: The facts of life are conservative. She didn’t do a lot of politics in her shows—politics divides an audience—but she thought a lot about it and talked about it. She was socially liberal in the sense she wanted everyone to find as many available paths to happiness as possible.

I always enjoyed Rivers’ comedy – and like the little life lesson about politics dividing one’s audience.

As I discussed on the show on Saturday, there are really two sides to Memorial Day, to me.

The first part is the obvious part; remembering those who’ve died to keep this country free.

There are many of them; well over a million men and women have died in the service of this country, in wars big – the Civil War, World War 2 – and small (the Philippine Insurrection, Desert Storm).

And their memory – and the ones that lived, and are with us – deserve a world of thanks.

———-

A friend of this blog – a Navy veteran, as it happens – posted this on Facebook late last week:

Good morning all! It’s Memorial day weekend again.

Instead of exhorting patriotism and thankfulness from folks who don’t want to hear it I’d like to remind you that our government is keeping tabs on all of us. They are flying drones over our homes and collecting our communications. There are cameras *everywhere* taking our pictures, recording our movements. Our local police are now a military force, equipped with heavy weapons and armor. If you have made any firearm related purchases, or frequent arms related websites, your name is on a list. If you happen to belong to a conservative political group, the IRS has your number, but don’t feel left out Lefties, sooner or later they’ll get around to you too. If this situation is not OK with you, what have you done about it? Written anyone? Called anyone? Shown up in person anywhere to get in your legislators grill?

If you don’t care enough to protect the freedoms so many have died for, please don’t post a bunch of smarmy pictures & canned slogans; I don’t want to hear it.

There’s a place for the simple and the sentimental, of course…

…but the writer is correct; the real challenge facing those of us who haven’t died in the service of this country is to make sure that this country is worthy of their sacrifice. To make sure that those who died to preserve freedom didn’t die in vain.

Those who founded this country knew perfectly well that the greatest threats to this nation’s freedom weren’t from overseas.

The writer wrote the piece in honor of a comrade…:

CWO3 Mike Sheerin; missing you today brother. Not many left around to pick up the slack you left; nobody at all to fill the shoes.

We’ve been blessed with just the right people to pick up the slack when they’ve been needed.

It was ten years ago today that a roadside bomb in Anbar province killed two soldiers from the North Dakota Army National Guard’s 141st Engineer Battalion.

One of them, Specialist Brown, was the nephew of two of my high school classmates and of my seventh-grade history teacher. I remember him as a little kid, back in North Dakota in the eighties. His grandfather, as I recall, is a friend of my father’s.

Different people get different things out of remembering. If nothing else, I hope it prompts you to send a prayer to the Brown and Holmes families, and all the families who’ve lost loved ones in this past decade and a half.

In the late sixties, a justifiably obscure SCOTUS decision, “US v. Miller” (a Depression-era case involving a robber who was murdered before his case made it to the court, and for whom no attorney argued before the high court), was dragged out of the legal ether by a series of liberal, activist judges, and installed into a misbegotten place as binding precedent that led, by a tortuous “logical” route, to the Second Amendment being interpreted for four decades as a “collective right”. Just the way the Ku Klux Klan interpreted it until the 14th Amendment came along.

The Heller case began the process of flushing this noxious bit of authoritarian posturing down the latrine of history.

But it fell to Otis McDonald – a seventy-something black man who just wanted to defend his life and property against the crime that had overrun the neighborhood where he’d lived since 1971, in which he’d raised three of his children – to deliver the coup de grace against Chicago’s racist, classist gun ban.

Otis McDonald

It was merely the latest of several fights for McDonald, who was 76 when the SCOTUS upheld his demand to be allowed to defend himself, his family and his property, and not be treated like the government’s livestock.

It was one of many battles he fought in his long, full, unsung-but-productive life.

McDonald started life as one of 12 children of a Louisiana sharecropper; he left the land at 17, deep in the Jim Crow era. He worked for decades as a janitor at the University of Chicago, joined the union, earned a living, raised a family…

…and watched his neighborhood decay from a comfortable blue-collar area to a crime-ridden gang shooting gallery.

He sought “permission” to own a handgun – because as an older man, he couldn’t stand up in a fight against one predatory teen, much less the whole pack. The city of Chicago, adhering to the gun control movement’s orthodoxy that black people must only be seen and heard at the polls, and shouldn’t be getting all uppity in between elections, shut him down with, as it were, prejudice.

And so he, along with three other co-plaintiffs, filed suit – which duly led to the Supreme Court and, in 2010, victory in the case that bore his name, and incorporated the Second Amendment as law binding all lesser jurisdictions; the right to keep and bear arms was, as it has always been, a Right of The People, not the National Guard, not to be frittered away by self-appointed racist elitists out of the fear of armed brown men that motivates all gun control.

McDonald, on the day of his case’s epic victory.

McDonald, a humble man without even a high school education, accomplished more to secure freedom than many buildings full of Ivy-League-spawned pundits and lawyers ever will.

Otis McDonald passed away last week at age 79, after a long battle with cancer.

As a black man in America, he fought his way up from economic disadvantage to earning a good living for his family. He fought against violent crime in his adopted city of Chicago, and in so doing came to his most famous battle as the lead named plaintiff in McDonald, et al. v. City of Chicago. In the plaintiffs’ landmark victory in that case in 2010, the Supreme Court of the United States ruled that neither the Windy City nor any other city could ban law-abiding citizens from owning handguns for defense of self and family. The McDonald decision helped pave the way for the concealed carry permits now being issued throughout Illinois.

And the wages of McDonald’s victory are being felt – despite the media’s attempt to suppress them – today. More at noon. Oh, yes – oh, so much more at noon.

And so rest in peace, Otis McDonald. Your legacy – leaving your world a freer place than the one you came into – is one that shames those of a whole lot of people who came into this world with advantages you never dreamed of.

Of all of the films that have come out during my lifetime, all the huge important Oscar-winning serious films, all the weighty masterpieces, all the films about important topics, all of the “instant classics”, the beloved movies, the camp classics, the game-changers, the films draped in awards … of all of them, if I had to choose one film to be the #1 contender for “Film That Will Be Watched Regularly 150 Years From Now”, it would be Groundhog Day.

Leni Riefenstahl was the world’s first notable female filmmaker, and the greatest female filmmaker of the 20th century. She created innovations in the technique and aesthetics of film still used not only in cinema, but in the filming of crowds and athletic events; some of the techniques you see at the Super Bowl are evolutions of techniques Riefenstahl pioneered in filming the 1936 Olympics.

But it’s not considered polite to applaud Riefenstahl in public without an emphatic verbal “asterisk”, because of her association with the Nazi Party. Her best-known work, Triumph Des Willens (Triumph of the Will), is an epic documentary and one of the world’s best known and most influential pieces of propaganda.

And so Riefenstahl was ostracized for the rest of her long life (she died at age 101 in 2003) as a Nazi impresario, for her association with a regime that killed 11 million people directly and triggered a war that swallowed tens of millions.

I write a fair amount about music in this blog. And when a major musical figure passes away, I often try to write something.

And in his way, Pete Seeger was one of the most important figures in popular entertainment, ever.

Not necessarily because of his music. Oh, he had a few classics of American folk music, to be sure. And dozens of forgettable songs – but that’s true for any songwriter, or any artist in any genre for that matter.

Many conservatives writing about Seeger’s passing note that he was a committed Communist. It’s true – he was, and in a way that seems straight out of Orwell, as during this episode after Stalin and Hitler signed their non-aggression pact in 1939:

In the “John Doe” album, Mr. Seeger accused FDR of being a warmongering fascist working for J.P. Morgan. He sang, “I hate war, and so does Eleanor, and we won’t be safe till everybody’s dead.”…The film does not tell us what happened in 1941, when — two months after “John Doe” was released — Hitler broke his pact with Stalin and invaded the Soviet Union. As good communists, Mr. Seeger and his Almanac comrades withdrew the album from circulation, and asked those who had bought copies to return them. A little later, the Almanacs released a new album, with Mr. Seeger singing “Dear Mr. President,” in which he acknowledges they didn’t always agree in the past, but now says he is going to “turn in his banjo for something that makes more noise,” i.e., a machine gun. As he says in the film, we had to put aside causes like unionism and civil rights to unite against Hitler.

For years, Mr. Seeger used to sing a song with a Yiddish group called “Hey Zhankoye,” which helped spread the fiction that Stalin’s USSR freed the Russian Jews by establishing Jewish collective farms in the Crimea. Singing such a song at the same time as Stalin was planning the obliteration of Soviet Jewry was disgraceful. It is now decades later. Why doesn’t Mr. Seeger talk about this and offer an apology?

It’s impolite in polite society to laud Riefenstahl after her association with a regime that murdered over 10 million people. Fair enough.

So why does Seeger escape any questioning for doing so much to support a regime that may have killed five times as many?

But as Howard Husock noted in his classic essay on Seeger, his most lasting impact on American culture may have had little to do with music.

Adopted at the Seventh Congress of the Communist International in 1935, the Popular Front tasked communists in the West with building “progressive” coalitions with various institutions—including political parties and labor unions—that the party had previously denounced as bourgeois and corrupt. The front reflected fears haunting Stalinist Russia at that time. “Hitler had shown a strength that made Communist predictions about his imminent collapse seem grotesque,” observed left-wing historians Irving Howe and Lewis Coser… Following this new strategy, the American Communist Party suddenly asserted that it wanted to build upon, not destroy, American institutions. “Communism is 20th century Americanism,” Earl Browder, the American party’s general secretary, enthused, while extolling Abraham Lincoln in speeches.

This led to the creation of the “Popular Front”, whose mission was not so much to assault capitalism as to co-opt it. And one of the institutions it marked for co-option was the entertainment industry.

And Seeger was a key cog in that machine:

It took a while for the Popular Front’s strategy to get results in popular music—and Pete Seeger was the catalyst. Many critics mark Elvis Presley’s arrival in the 1950s as a turning point in postwar American popular culture, not just because he injected a more overt sexual energy into entertainment, but also, they claim, because his rebellious spirit anticipated the political upheavals of the 1960s. But neither Presley nor the newfangled thing called rock ‘n’ roll had any explicit politics at the time (and Elvis would one day endorse Richard Nixon). A better leading indicator of the politicization of pop was the first appearance of a Seeger composition on the hit parade.

It happened in early March 1962, when the clean-cut, stripe-shirted Kingston Trio released their recording of Seeger’s “Where Have All the Flowers Gone?” Seeger’s lament about the senselessness of war and the blindness of political leaders to its folly soared to Number Four on Billboard’s easy-listening chart, and it remained on the list for seven weeks. “Where Have All the Flowers Gone?” eventually became a standard, sung on college campuses and around campfires nationwide. At the time, the song proved one of the biggest successes yet of the folk-music revival then under way, and it marked a major improvement in Seeger’s fortunes. Not long before, his career had suffered from the fifties anti-communist blacklist. Now it was on a new trajectory—culminating in his 1993 Grammy Lifetime Achievement Award and his 1994 National Medal of Arts.

Seeger did not, himself, “make Hollywood leftist”. But he was a key part of that transition.

Rock and Roll, we are told, started as a blender-mix of rockabilly and R&B. Elvis put a rockabilly delivery onto a rhythm ‘n blues beat. Chuck Berry sped up the blues to rockabilly speed. Johnny Cash did rockabilly over a persona that could have made Howlin’ Wolf go “wow. That’s the blues”.

And the Everly Brothers brought the final piece of the “billy” half of rockabilly – the tight, keening vocal harmonies that characterized bluegrass music – out of the holler and onto pop radio.

Lou Reed died over the weekend, proving once and for all that only Keith Richards can ingest absolutely every recreational chemical known to modern science and live to tell the tale forever.

It took me a long time to really get into Lou Reed – which may seem really counterintuitive, if you know me and my taste in music (and if you read the “Music” category of this blog, you do, sort of; I haven’t written about everything, just yet). After all, everyone knows Lou Reed and the Velvet Underground were the godfathers of punk – right?

Sure.

But even though the Venn diagram among the different outbreaks of “punk” in New York in the seventies has tons of overlaps, there was a yawning gap between the joyful, garage-band-y noise of the Ramones and the New York Dolls (and their Cleveland descendants, the Dead Boys) and the Greenwich Village scene that spawned Reed, crawling as it was with high-art pretension. The likes of Andy Warhol and William S. Burroughs saw and were seen among the rat-bitten warrens of the Village, hobnobbing with and encouraging the likes of the largely unlistenably shrill Patti Smith, the campy “Stilettos” (featuring a young Debbie Harry, who’d form “Blondie” by the mid-seventies), and of course Reed and the Velvets.

It probably wasn’t until I moved to the Cities and started doing music here that I took a step back, at the urging of my band’s old drummer. “Forget all the BS”, he said, “and just focus on the fact that he’s a guy who loves doing basic rock and roll”.

And in one sense it was true – the classic Lou Reed was all about the joy of playing the most basic rock and roll, simple and unadorned and pared down to its most basic components, filtered through a layer of New York grime.

Usually.

Reed was also an experimenter. In “The Original Wrapper” (from 1986’s Mistrial), he wryly claimed the title of the original, well, rapper – since he never so much “sang” as “spoke in rhythm”. He delved through jazz, experimental music, screeching noise…

…even some pared-down pseudo-classical music – as in this very, very, very pre-MTV video for his classic “Street Hassle”, featuring a spoken-word coda by Bruce Springsteen around the eight-minute mark:

So I’m going to find my old copy of “Rock and Roll Animal” this week here, and give it another spin.

Son of a dairy farmer from Princeton, MN, Grams came up through broadcasting, working his way from small radio stations into the anchorman’s seat at Channel 9 by the mid-eighties.

From there, he went into politics – defeating Gerry Sikorski, who was hobbled by the House banking scandal that showed the door to not a few Congresspeople that year.

And in 1994, at the crest of the “Contract with America”, he took over Dave Durenberger’s Senate seat, after beating Ann Wynia by a squeaker in a race that showed both the nascent power of conservatives in the exceedingly moderate Minnesota Independent Republican party, and the rising power of the state’s Second Amendment lobby.

His term in the Senate was also a barometer for the slide of the Twin Cities media into outright partisanship; they lavished coverage on the twists and turns in Grams’ personal life, and breathless, wall-to-wall scrutiny on the travails of Grams’ son Morgan – of whom Grams’ ex-wife had had full custody – in a way that they never quite managed for DFLers.

But it is an objective fact that Grams accomplished more in his six years in DC than the celebrated Paul Wellstone did in 12, or than Amy Klobuchar likely will in her entire career.

After being defeated for re-election by future “Worst Senator in America” Mark Dayton in 2000, Grams went back to his first love, broadcasting; he owned a cluster of radio stations in Central Minnesota.

I had the pleasure of interviewing Senator Grams two or three times on the NARN. He had a broadcaster’s knack for being a great interview subject.

I urge you to direct your prayers – or whatever your worldview calls for – to his family.

“I censored myself for 50 years when I was a reporter. Now I wake up and ask myself, ‘Who do I hate today?’” – Helen Thomas

The grande dame of the Washington press corps files her last report. Will they regret giving her so much deference?

——

The memoriams to Helen Thomas have thus far ventured nowhere near hagiography, due largely to the anti-Semitic statements and acrimonious questions that defined her later years. But to follow Thomas’ career trajectory is to follow the style and influence of the mainstream media. Thomas admirably fought her way into the newsroom, asked probing questions with at least a veneer of respect (hence her concluding remark of “thank you, Mr. President” after every presidential press conference), and then devolved into a caricature of an angry, biased reporter holding some extremely ugly and racist views.

The daughter of Lebanese immigrants, Thomas worked as a reporter for the United Press in 1943 on “women’s topics” – essentially fluff articles on baking and clothing. It wasn’t until the mid-1950s, after having written the equivalent of Washington gossip columns, that Thomas was able to cover major federal agencies and far more noteworthy news items. From her post as the head of the Women’s National Press Club and later a White House correspondent during the Kennedy administration, Thomas was able to get women a greater role in journalism – having previously been denied access to organizations like the National Press Club and events like the White House Correspondents Dinner.

Worthwhile accomplishments, to be sure. But having spent most of her professional life fighting for acceptance, even once she was in the door, Thomas couldn’t abandon the role of endless antagonist to those with whom she personally disagreed. Thomas was most certainly not an “example for journalists,” although her pattern of biased reporting and lack of decorum has certainly been emulated by many current reporters.

Thomas’ defenders often claim she was a bitter pill to politicians of all stripes. Of course, Thomas’ White House harangues of Democrats typically involved criticizing them for not moving further left; she once famously declared that Barack Obama was not liberal. Bill Clinton “personified the human spirit” while George W. Bush was the “worst president in history.” When Thomas joined the Hearst Syndicate in 2000, whatever restraint she had once shown vanished – hence her quote above about being able to “hate” whomever she pleased.

From trail-blazer, to provocateur, to angry activist with a byline – does that not also describe the evolving role of the mainstream media in the past 60 years? Thomas was unfortunately another trendsetter in the end – a forerunner of the mixture between opinion and reporting; of a style of journalistic coverage that smears ideological opponents and debases politics regardless of facts. Stephen Colbert might recoil at the thought, but Helen Thomas was one of the originators of the “truthiness” that Comedy Central’s mock conservative loves to sling at others.

“I’m a liberal, I was born a liberal, and I will be a liberal till the day I die.” – Helen Thomas

A research chemist turned lawyer who became an elected representative of the people of Finchley, Margaret Thatcher changed the world.

Margaret Thatcher

As Great Britain’s longest-serving (and only female) Prime Minister, “The Iron Lady” fought liberalism and championed Conservative policies that won a war, rejuvenated the national economy and helped defeat the Soviets.

Margaret Thatcher passed away today, April 8, 2013. She was one of my heroes.

Bremer, the co-author of “The Madness of Michele Bachmann: A Broad-Minded Survey of a Small-Minded Candidate,” died Tuesday afternoon, Jan. 15, at his house in Stillwater Township, from complications related to pancreatic cancer. He was 60.

Bremer was a tenacious muckraker, an award-winning blogger and an avid photographer. His blog — Ripple in Stillwater — was named Best Local Blog by City Pages in 2012. He also received several Minnesota Society of Professional Journalists awards for best use of public records.

I’d never speak ill of the dead. Bremer had his friends and family. I’m sorry for everyone’s loss.