How Cutting the Cord Is Reinventing TV as Shared Cultural Experience

Like pretty much everyone else, I watched the Breaking Bad finale Sunday night (if you haven’t seen it yet, you should probably stop reading here, because there will be spoilers ahead). But for all that I thought it was excellent, I don’t really want to write about the show itself, because there are enough talented writers doing that already — including our own Jason Bailey right here — and, to be completely honest, I’m all recapped out. But that in itself is an interesting subject of discussion, because the last couple of weeks of this show really felt like a shared cultural experience of the sort that seems rarer and rarer these days.

I don’t have cable, so after the questionable decision to watch “Ozymandias” after midnight on the night it aired — not a great deal of sleep was had that night — I’ve spent the last couple of weeks heading down to the local bar, where they’ve been showing Breaking Bad on their oversize TV. Last Sunday the place was packed to the extent that it was impossible to find a seat, but that was nothing compared to last night, when it literally took half an hour to get a drink and plenty of desperate fans ended up peering through the window from the street outside to see what was gonna happen.

And yet, despite the fact that there were at least a couple hundred people crammed into a bar that’s normally lucky to entertain 20 patrons on a Sunday night, the place was entirely silent while the show was playing… with a few exceptions, of course. It erupted into wild cheering last week when Jesse made his escape, then lapsed back into appalled silence that lasted a good couple of minutes into the ad break after Andrea got shot. And it’s been a long time since I’ve been part of a crowd that’s gone as completely batshit as when Jesse finally got his revenge on Todd. It felt more like being at a football game than like watching a harrowing TV show about the meth trade.

As I stood there jammed up against the bar while a total stranger whooped and cheered and hugged me, I got to thinking that this really felt like being part of something. One of the recurrent complaints about the 21st century is the fragmentation of culture, the idea that the sheer variety of entertainment and art available makes a single, coherent, monolithic culture based on geography impossible. I think this is correct to an extent, although I don’t think it’s necessarily a bad thing — the Internet means that even if you’re the only teenager in Asshole, Iowa who’s into obscure indie pop, you can find and interact with like-minded souls.

This does also mean, however, that there are fewer unifying cultural experiences — the sheer wealth of choice means that you’re likely to define the parameters of the culture you consume, which may bear little relation to the diets of those around you. Curiously enough, it was TV that started this. In its early days, with only three or four channels to choose from, it seemed to promise a kind of cultural unification: everybody watched the same thing, and the next day, everybody talked about it. It’s no coincidence that the baby boomer generation was raised during what’s often described as a golden age of TV. Steve Gillon, author of Boomer Nation, argues here that “If you grew up in the ’50s and ’60s, you came of age at the same time that national culture first developed… The assassination of J.F.K., for instance, was the first event the nation experienced in real time at the same time.”

By the 1980s, though, there was also the sense that TV undermined high culture. There were suddenly a bazillion channels to choose from, and most of them were rubbish. One of the complaints about TV when I was growing up, before the Internet took its place as the subject of popular older-generation angst, was that it was a fundamentally alienating medium — that by sitting and staring into the box, you shut yourself off from the world.

Instead of interacting with your fellow humans, the complaint went, you were simply staring at a screen. In the 1980s and ’90s, there was hand-wringing aplenty about the number of hours of TV kids watched daily/weekly/yearly, and a sense that these hours meant a disconnection from real-world, capital-C Culture in favor of pop culture, a shunning of your real-life peers for a virtual world.

You might think that the advent of the Internet has made this worse — after all, these days it’s trivial to download shows and watch them whenever you want, and if you’re a fan of Netflix-based shows like House of Cards and Orange Is the New Black, you can binge on an entire season at your convenience. There’s certainly been much fretting over the idea that the Internet signals the end of TV as shared cultural experience — the death of the water-cooler, as the Independent called it last month.

But now, it feels like things are coming full circle. The fact that fewer and fewer of us have cable these days means that we’re more likely to seek out public spaces to watch episodes that absolutely have to be watched at the time they’re aired, or to congregate in the living room of that one person you know who does get AMC or HBO.

Away from the physical world, the Internet has actually brought people closer together on occasions like last night. Literally my entire Twitter feed concerned Breaking Bad, tweets from both friends and people I’ve never met echoing the real-life reactions in the bar — lots of “OMFG YESSSSSSSS”-type tweets when Jesse finally had his revenge, lots of tearful “#goodbyebreakingbad” reflections once the show was over. It was the end of the best TV show of the current decade, and it felt like being part of the sort of shared cultural experience that’s supposed to be in decline. Maybe culture isn’t dead quite yet.