Even in the days before information overload, contextual links to other interesting sites and articles were the norm. Now it seems that unless it’s part of a “strategic partnership” or is otherwise monetized, stories on the web are less about helping the user by providing useful context. This concept, among others, is well explored by Anil Dash in his post “The Web We Lost”:

Ten years ago, you could allow people to post links on your site, or to show a list of links which were driving inbound traffic to your site. Because Google hadn’t yet broadly introduced AdWords and AdSense, links weren’t about generating revenue, they were just a tool for expression or editorializing. The web was an interesting and different place before links got monetized, but by 2007 it was clear that Google had changed the web forever, and for the worse, by corrupting links.

As Dash points out, “This isn’t our web today.” I maintain that if startup founders and VCs funded solutions to the problems faced by media, instead of the latest location-based social check-in app or redundant e-commerce site, we could find solutions to help rebuild the industry.

My stint with the Broadway.com Word of Mouth Panel continues. In recent months, I’ve had the pleasure of taking in the fun but forgettable family fare that was the musical Elf; the entirely unforgettable The Other Place, starring sure-to-be Tony winner Laurie Metcalf in a heartrending performance; the delightful (if occasionally slow) biographical performance piece Ann, starring one of my favorites, Holland Taylor, as former Texas governor Ann Richards; and Vanya and Sonia and Masha and Spike, a Chekhov-inspired comedy (really!) that stars the surprisingly hilarious Sigourney Weaver. Click on my face to play the videos!


The Mayans were wrong, the holiday season has ended, New Year’s has come and gone, and we’re all settling in to 2013. It may be a new year, but it’s the same old problems for the future of journalism…or is it? Below, five of the most interesting nuggets I read this week about the state of print media, advertising and marketing.

1.

Andrew Sullivan, late of the Daily Beast, announced in a post called “New Year, New Dish, New Media” that he’s taking his site to the people. He’s leaving the advertiser-based media world entirely, as well as the venture-backed one:

We want to help build a new media environment that is not solely about advertising or profit above everything, but that is dedicated first to content and quality.

We want to create a place where readers — and readers alone — sustain the site. No bigger media companies will be subsidizing us; no venture capital will be sought to cushion our transition (unless my savings count as venture capital); and, most critically, no advertising will be getting in the way…. Hence the purest, simplest model for online journalism: you, us, and a meter. Period. No corporate ownership, no advertising demands, no pressure for pageviews.

2.

From an essay in yesterday’s NYT magazine called “Can Social Media Sell Soap?,” by Stephen Baker, on the value, or perceived value, of data-driven and social media-based marketing today compared with the so-called heyday of advertising depicted on Mad Men.

In the “Mad Men” depiction of an advertising firm in the ’60s, the big stars don’t sweat the numbers. They’re gut followers. Don Draper pours himself a finger or two of rye and flops on a couch in his corner office. He thinks…. Fellow humanists dominate Don Draper’s rarefied world, while the numbers people, two or three of them crammed into dingier offices, pore over Nielsen reports and audience profiles.

In the last decade however, those numbers people have rocketed to the top. They build and operate the search engines. They’re flexing their quantitative muscles at agencies and starting new ones. And the rise of social networks, which stream a global gabfest into their servers, catapults these quants ever higher. Their most powerful pitches aren’t ideas but rather algorithms. This sends many of today’s Don Drapers into early retirement.

While the rise of search battered the humanists, it also laid a trap that the quants are falling into now. It led to the belief that with enough data, all of advertising could turn into quantifiable science. This came with a punishing downside. It banished faith from the advertising equation. For generations, Mad Men had thrived on widespread trust that their jingles and slogans altered consumers’ behavior. Thankfully for them, there was little data to prove them wrong. But in an industry run remorselessly by numbers, the expectations have flipped. Advertising companies now face pressure to deliver statistical evidence of their success. When they come up short, offering anecdotes in place of numbers, the markets punish them. Faith has given way to doubt.

This leads to exasperation, because in a server farm packed with social data, it’s hard to know what to count. What’s the value of a Facebook “like” or a Twitter follower? What do you measure to find out?

3.

“We see a real shift going on from traditional advertising to a content-driven strategy,” Dan Kortick, managing partner at Wicks, said in a phone interview on Friday. “It’s more about engagement than exposure,” Mr. Kortick said, as content marketing offers “real engagement with your customer base.”

4.

Derek Thompson of The Atlantic weighs in on why web advertising sucks and which of the models described in the quotes above will work going forward (spoiler alert: it’s probably a combination of both, depending on the scale and the goal).

It’s commonly understood that Web advertising stinks, quarantined as it is in miserable banners and squares around article pages. BuzzFeed’s approach is different: It designs ads for companies that aim to be as funny and sharable as their other stories. Jonah Peretti, the CEO of BuzzFeed, told the Guardian’s Heidi Moore that he attributed nearly all the company’s revenues to this sort of “social” advertising. “We work with brands to help them speak the language of the web,” Peretti said. “I think there’s an opportunity to create a golden age of advertising, like another Mad Men age of advertising, where people are really creative and take it seriously.”

The online reaction to the Dish [striking out on its own, without advertising] and BuzzFeed [getting $20 million in funding] seems to be that what Andrew’s doing is sort of quaint and old-fashioned and what BuzzFeed is doing is weird and revolutionary. The opposite is true. Funding a journalistic enterprise without advertising is weird and revolutionary and experimenting with ads that are suitable to their medium is a clear echo of history. Just as the first radio ads were essentially newspaper ads read aloud, and the first television ads were little more than radio spots over static images, many on the Web are fighting the last war rather than building ads that work for the Internet, journalism history professor Michael Schudson explained to me.

Banners and pop-up ads are so awful they practically sulk in their acknowledged awfulness, fully aware that they are interruptions rather than attempts to compete with editorial content for the readers’ attention. BuzzFeed (and other companies experimenting with designing advertising for their advertisers) gets that and tries to fix it. Just as TV ads are successful precisely because they try to be as evocative, funny, arresting, and memorable as actual TV, there’s no reason why advertising content shouldn’t aim to be as informative or delightful as an original online piece.

Even as Sullivan’s Dish is pushing the boundaries of subscriptions, testing how much a dedicated audience is willing to pay for online journalism that is supposedly free, BuzzFeed is pushing the boundaries of advertorial — advertising content that looks like editorial content — testing how far each side of their two-sided market (readers and companies) is willing to go. The future of paid journalism — if we can even try to guess at it — will probably be a blend of the two strategies celebrated this week: Ads that are less useless and ignorable, and readers who are asked to show a little more love than they’re used to.

5.

Finally, let’s wrap up with yet another Pollyanna-ish piece from David Carr, titled “Old Media’s Stalwarts Persevered in 2012.” He postulates that “old media,” by which he means broadcast networks, are “raining green” because they’ve learned from what happened to music and print.

The worries about insurgent threats [to broadcasters] from tech-oriented players like Netflix, Amazon and Apple turned out to be overstated. Those digital enterprises were supposed to be trouncing media companies; not only is that not happening, but they are writing checks to buy content…. “As it turns out, the traditional television business is far stickier than people thought, and audience behavior is not changing as rapidly as people thought it might,” said Richard Greenfield, an analyst at BTIG Research.

Perhaps the numbers support this for now — this quarter, this year — but I think that’s a temporary glitch of the awful economy, not a harbinger of the future. As Carr reports, these giant corporations, instead of spending money, paid out dividends and financed stock buybacks. So sure, the numbers are up…but stuffing your savings under the mattress is not a long-term strategy. And it’s certainly not one that will work for all “old media,” which Carr eventually acknowledges:

Another thing about those dinosaurs is that they aren’t really old media in the sense of, um, newspapers. When their content is digitized, it is generally monetized, not aggregated.

I’ll ignore the irony of having aggregated the thoughts above. And I won’t even comment on five white guys having written them in the first place, and the stories themselves being about other white guys, and what these facts say about the future (or is it past?) of media and advertising. Happy 2013.

The New York Times turned the February avalanche at Tunnel Creek in Washington State into a completely absorbing multimedia experience. I was both spellbound and delighted by the video, audio, maps, photos, GIFs and most of all words, which all added up to an engaging, vital storytelling experience.

The gripping tale of the exciting lead-up to, feelings of dread about, and inevitable tragic end to the ski outing could have been told only by the Times. Only the Times (or a news organization of similar stature) could spend six months reporting a story that, according to the end credits, “involved interviews with every survivor, the families of the deceased, first responders at Tunnel Creek, officials at Stevens Pass and snow-science experts,” as well as reports from police, the medical examiner and 911 calls. Sixteen names in addition to John Branch’s (the writer) are listed in the credits (byline seems an even more outdated term than usual on this piece).

The article honors the victims and their families, approaches the survivors gracefully and tactfully, and serves as a cautionary tale to adventurers. And it fires up journalists and others who admire the well-reported, well-structured feature, a story form that has fallen out of favor in the era of pageviews, soundbites and 140-character updates. It’s as well written as anything I’ve read in the genre, including Jon Krakauer’s stuff, and it sets a new bar for multiformat journalism.

And it might even make money: Notice at the end, there’s a call-out to buy an e-book version of the article on Byliner.

As I mentioned in a previous post, many of my recent freelance gigs have involved reading printed materials on various electronic devices. For several distinct projects, I read the same material on no fewer than four devices at a time, and each had a different layout, different size, different coding language and different interactive elements. This was the case because Apple, Amazon and the rest render their materials in different, proprietary programming languages, and the hardware they’ve created boasts proprietary specs. It has been a major shock to learn how much work and money must go into optimizing the same printed material for all these devices. And it’s abundantly clear that as publishing professionals, we must do much more work, and soon, in establishing standards for print-to-digital conversion.

“Technology is always destroying jobs and always creating jobs, but in recent years the destruction has been happening faster than the creation.”
—Erik Brynjolfsson, an economist and director of the M.I.T. Center for Digital Business (via)

Arguably, this convoluted process is employing me. The technology has, in this case, created a new job: There’s a need for someone to read each article of each issue (or each page of each chapter of each book) on each device. I don’t want to sound ungrateful, because I’m developing quite a little niche for myself as an expert on print-to-digital conversions. But I wonder how long it can last, considering that print media is undergoing huge change at the moment. Momentous, disruptive, industry-wide change that’s happening at a rapid pace, particularly with regard to technology.

We might be powerhouse publishers, but in the tech world we’re just like every other Joe App Maker, 96 percent of whom do not make significant money on their apps. According to a recent article in the New York Times, 25 percent of Apple game app makers made less than $200, with only 4 percent making upwards of $1 million. Granted, random game app makers don’t have the brand recognition or cachet of major publishing houses; neither do they have an overarching, Apple-endorsed app that features their stuff (Newsstand for Apple, if you’re still following me).

But make no mistake, the field has been leveled, and instead of competing only with each other, even the biggest content publishers now also compete with Angry Birds, Twitter, Facebook, travel apps, e-commerce apps, dining apps, coupon apps…the list is endless.

The difference? Unlike many apps, the media’s brand relevance and reputation absolutely hinge on an amazing user experience across devices at all times. In short, it has to be perfect. And in order for that to happen, the same material must be reconceived by its creators multiple times. It seems impossible to believe, but publishers optimize the same product over and over again, incurring all sorts of real costs from designers, editors, producers and programmers with each iteration. (And this isn’t even counting the web producers who conceive it all over again for the online version!) Once you account for these costs, in addition to the so-called legacy costs of creating the print product in the first place, it hardly makes sense even to enter into the realm of app creation for many print products. That’s even if you can get your app sponsored or otherwise monetized, and even if you use Adobe to help you create it.

I realize that the common line of thought is that, like websites, if you don’t have an app presence, you don’t exist. Half a decade ago, this principle propelled the creation of a million new half-assed websites (websites: another print-distribution model without a standard!). But I’d counter that without apps — without content — these devices would be useless. So unless we want to bankrupt the already struggling print media industry further, we must stop playing by the device makers’ rules and rewrite them to benefit our business. We must invent technology that adapts our product (i.e., content) to any device at any orientation. We must create or help market forces create a standard we can implement and follow; we must negotiate a better rate than giving away 30 percent of our revenue; we must not “throw in” digital access with print subscriptions.

I know, I know: Nature abhors a vacuum. If we don’t follow suit, we’re nothing. But following hardware makers blindly down dark passageways as our pockets get picked around every corner isn’t a smart strategy, either. In one big way, we are not like Joe App Maker: We possess a hugely powerful medium. We must harness our strengths and lead ourselves forward. A nice start might be to begin taking a stand against having to endlessly tinker with every article in every issue of every magazine, every book, every design.

As Shawn Grimes, the app developer profiled by the Times, said: “People used to expect companies to take care of them. Now you’re in charge of your own destiny, for better or worse.” Let’s be in charge of our own destiny.

Many of my recent freelance gigs have involved reading printed materials on various electronic devices, so I’ve basically become a one-woman control group for determining the best device-reading experience. I’ve had the opportunity to directly compare the following devices: Kindle E-Ink, Third-Generation Kindle (“Keyboard Kindle”), Fourth-Generation Kindle, Kindle Fire, Kindle Fire HD, Samsung Galaxy, iPad 2 and iPad 3.

Ready for the results? The winner is…the iPad 3 with retina display!

The result is perhaps not surprising, but the gap in performance and readability between the iPad 3 and all of these other devices really is shocking. The iPad 3, in addition to offering a sleeker and more elegant experience overall for the user, is also far, far easier to read. The display is better than even the original printed product to which I was comparing it, believe it or not. The words are clearer and crisper; the photos are deeper and livelier.

When evaluating tablets, we must start with the premise that every six months a new one is released, and that the newer versions are superior to the previous generations. That leaves truly valid comparisons, at the moment, between only the iPad Mini, the iPad 3 and the Kindle Fire HD. Setting aside the iPad Mini for the moment because it doesn’t (for some stupid reason) yet have retina display, that leaves the latter two. Perhaps to casual users, the gap between the iPad 3 and the Kindle Fire HD isn’t noticeable, but having spent many weeks putting down one device and picking up the other, I can tell you with certainty that the Apple product blows the Amazon one out of the water.

I acknowledge that I am an Apple person. I have an iMac, an iPad 2 and an iPhone, and when I had a Droid phone for about six weeks last year, I wanted to throw it out the window. (Except Swype. I love Swype! Why doesn’t Apple have Swype?!) So for me, the Apple experience — gestures that just seem to make sense, buttons where they should be, seamless navigation among apps, access to hundreds of thousands of other amazing and useful apps — in addition to the reading experience put the device in a field of its own.

Is the difference in quality worth $200 ($499 for iPad 3 versus $299 for Kindle), especially if you aren’t already living the Apple lifestyle? It depends on what you want to use it for and how much weight you want to tote around town, but for my money, even if — or maybe especially if — you only use it to read books and magazines, the retina display is such a game changer that I absolutely think so.

Separately from work, I recently test-drove a Microsoft Surface briefly, and my initial thoughts were that it might be nice if you already live in the Windows universe — native Outlook and Excel apps, for example — but it really doesn’t do anything better than the iPad does. And that includes the weird add-on cover keyboards, which are either nontactile (in other words, no better than the virtual keyboard) or just small enough compared to a normal keyboard as to be aggravating. (And this is coming from someone who loathes Apple’s virtual keyboard.)

I’ve also had the opportunity to play with the seven-inch Nexus, which has a nice hand-feel and is extremely portable. I don’t think this makes up for its lack of sensible navigation or access to trusted apps, but it’s an OK alternative to the real game-changing device, which will be the next-generation iPad Mini, with retina display. (True story: I’ve never even laid eyes on a real-life Nook.)

It’s a safe bet that when the iPad Mini with retina display — small enough to feel good in the hands and fit in the bag, but with the text clarity of the iPad 3 — comes to market, I’ll be first in line.

The Daily, News Corp.’s general-interest iPad news product, shut down this week. Media experts (or perhaps I should say “observers”—I’m not sure the media has any experts anymore) disagree on the specific reasons it failed, but they do seem to agree that it was doomed. The columns I’ve read and rounded up from around the web cite the following three conclusions:

1. Making it available only via iPad and without access to the open social web (readers couldn’t share links) made it a walled garden.

“The Daily’s device-bound nature limited its potential…. Locking into a single platform and not having a web front door limiting sharing and social promotion.” —Joshua Benton

“Publishing for a single platform, whether print, web, or the iPad, is a foolish move, and I think we knew that before The Daily was excised from News Corp.’s balance sheet.” —Ben Jackson

“The product, its content and the conversation around it should have been porous, able to flow in and out of social media platforms and be informed by them. Content should have been unlocked, and made available to subscribers on all platforms.” —Jordan Kurzweil

“More than 54 million people in the U.S. use an iPad at least once a month, but they remain just 16.8% of the population and 22.2% of people on the internet, according to eMarketer. That put a hard cap on the number of subscribers The Daily could acquire no matter how solid its product.” —Nat Ives

“Simply put, The Daily never attracted the revenue required to support a team of 120 people. Launching what amounted to a digital daily newspaper with many of the legacy costs and structures of print wasn’t the best idea.” —Hamish McKenzie

“The Daily should have been run like a startup, a digital business, not a division within a division in a corporation.” —Jordan Kurzweil

“Though it looked quite nice and its content was competent, that content was all-in-all just news and news is a commodity available for free in many other places.” —Jeff Jarvis

“[The term general reader means] a media executive is imagining himself and his friends (you know, normal guys) and intending to produce a bundle of content for that hyperspecific DC-to-Boston-went-to-a-good-college-polo-shirts-and-grilling demographic…. This is not to say that media properties cannot be built with the goal of reaching the mainstream [but successful] sites have been built up like sedimentary rock from a bunch of smaller microaudiences. Layers of audience stack on top one another to reach high up the trafficometer.” —Alexis Madrigal

Whatever the reasons it was closed down, I’m glad someone at least experimented with new ways to produce news. Trying stuff really is the only way to learn. My condolences to those journalists who were laid off. They should consider the no doubt multitude of lessons they’ve learned and call themselves, rather than out-of-work journos, technicians in the lab of digital journalism — scientists who can take the knowledge they’ve gleaned and apply it to the next experiment.

As I mentioned last month, I’ve recently had the opportunity to be part of the Broadway.com Word of Mouth Panel for the 2012–13 theater season. I get amazing seats at new shows for free, and my only obligation is to have an opinion on it afterward — not a big challenge! Here are my reviews of Annie (loved it!) and Dead Accounts (didn’t love it!). Click on my face to play!

It’s human nature to compare things. We put things in context for better understanding. “This thing [business/weather/process/person/event] that is happening is like this other thing that happened, and that thing turned out [good/bad/different/better/worse].”

I’ve been doing a lot of that lately surrounding the media. Specifically, I’ve spent time contemplating how to reconcile journalism’s value to society with the actual monetary value it generates. As I’ve written about before, no one knows what’s going to happen to this business: whether it will go the way of the steamship and the telegraph, reinvent itself a la Apple, or something in between.

I’m not the only observer who’s searching for an appropriate comparison from the past in order to predict the media’s future, but I do find that some insights are better than others; does anyone really think that the envelope business, of all things, is a good model for the Random House-Penguin merger? (Does anyone think of “the envelope business” at all?)

Watching the Ken Burns PBS documentary The Dust Bowl recently, however, opened my eyes to a new analogy for the media of the present day: farming a century ago. (And why not — we did recently learn that there are far more software app engineers than farmers.) According to Burns, farmers in the Great Plains around 100 years ago sold their goods, wheat in particular, in enough volume and at a fair enough price that they kept their families fed, happy and productive before the Great Depression. Prior to that calamity, they faced periodic yet persistent droughts and occasional technological breakthroughs (gas-powered plowing, for example). But year after year, they found a way to keep going, even increasing volume to make up for the deficits caused by off years. That is, until the permanently landscape-altering Dust Bowl.

Compare this to journalists and media today. For decades we plied our trade, not making big money but making enough to support our families. We changed with the times, moving from copy boys and paste-ups to computers. But the past decade has seen such a huge acceleration of technology (and a hugely inverse deceleration of jobs) that our worth is now, to put it mildly, in question. Like the farmers, we’ve tried doing more: You’re now not only a reporter but also a videographer, photographer and blogger — and you will hereafter be known as a “content creator.” You’re now responsible not only for reporting your usual one-story-by-deadline allotment but also for writing six additional posts a day (and you need to know how to produce them, tag them and upload them).

But as the farmers discovered, doing more not only didn’t help them, it actually created its own set of problems. In their case, they unknowingly caused the largest man-made ecological disaster to date (you’re well on your way, though, global climate change: hang in there). In ours, the huge volume of posts was churned through by disloyal consumers, the glut and pace belittled the value of the news, and the business changed from creating newsworthy, relevant content to attracting eyeballs and lowering bounce rates and counting click-throughs and measuring social engagement and Tweeting viral videos.

Other, larger factors were also at play, including the rapid pace of technological development. The ease of use of technology meant that anyone could be a creator of content — so the process of journalism was democratized, but it was also dumbed down and its worth devalued.

“But of all our losses, the most distressing is our loss of self-respect. How can we feel that our work has any dignity or importance when the world places so low a value on the products of our toil?”

—Caroline Henderson, Oklahoma farmer, writing during the 1932 drought, in the depths of the Depression and just prior to the Dust Bowl’s worst

Now, I’m not saying it’s a perfect comparison. We haven’t had to bury cattle that suffocated during “black blizzards” or children who caught “dust pneumonia.” But I think it’s a decent metaphor, because the media is going through its version of the Dust Bowl. Newspapers and magazines are closing up shop at an unprecedented pace; media businesses are losing money quarter after quarter and year after year, with no end in sight; those workers who are able (and I count myself among this number) are learning new skills and moving into new areas. (All of this can be said for other industries as well, by the way, particularly music.)

Somewhat brazenly, and I think disrespectfully, we’ve taken to calling tech and business shakeups, events and new models “disruptions.” Of course, since the beginning of time businesses have striven to disrupt other, existing businesses, but it seems much more ruthless to start your business with the sole intent of creating wreckage. I think it’s fair to cast our historical eye onto the Depression and the Dust Bowl and deem them disruptions, at the very least. And it’s easy to forget, but disruptions have a cost — a monetary one and a human one.

Years from now, I wondered while watching the documentary, how will journalism be perceived? Who will be the talking heads and what will they say? Which commentators will highlight which historical implications that, in retrospect, seem clear? How will the people generations from now — even one or two — talk about the media? Will we have adapted with the times and made a new reality for ourselves (and somehow have figured out a way to feed our families along the way)? Is journalism like the family farm in the Oklahoma panhandle of the 1930s, and are we farmers, continuing to plow the fields that we’ve yet to learn will never again yield crops? Is it like kerosene lighting, steam-powered train engines, millinery, fax machines, answering services, 8-tracks, the luncheonette, and the endless list of other businesses throughout history that litter the shoulders of the road toward the future? I want to believe that it’s not. I hope upon hope that it’s not.

“Hope kept them going, but hope also meant that they were being constantly disappointed.”