Rex Smith: A breakfast revolution lesson

Lately I’ve taken to examining the breakfast cereal industry for clues to the future of journalism. Nothing in my research would bowl you over — sorry — but it’s thought-provoking.
There was consternation a few years ago when marketers reported a sharp decline in sales of breakfast cereal, a $10-billion-a-year American industry. Since people have been eating in the morning for quite some time, researchers viewed this as an ominous development.
But it turns out folks were still hungry; they just didn’t feel as though they had time to sit down and dig into a bowl of Cheerios. Since the drive for profit makes a species quite adaptable, breakfast brokers rushed to market dozens of new brands of cereal bars. No mess! Eat ’em on the run!
This helped the bottom line of General Mills and other high-ranking cereal makers, but it was clearly not a permanent solution for the no-time-for-breakfast crowd. For one thing, it’s hard to juggle a breakfast bar in one hand and a cellphone in the other as you negotiate the Northway, and a person who doesn’t have time to eat breakfast is surely not going to let something as mundane as driving interfere with the need to maximize minutes of cell usage.
Now we’re seeing a maturing of the breakfast market. Evidence of this: Doughnut sales are reported to be once again on the upswing (Americans’ favorite: glazed). Oatmeal has surged in popularity in fast-food restaurants, right alongside breakfast burritos. Kellogg’s, a venerable cereal maker, is offering fruit pizza you can microwave in a minute.
Culinary purists may view a mixed berry granola pizza ready for your home microwave oven as an abomination, but in contemporary business practice it could be respected as a “disruptive innovation.” This is a term coined in the mid-1990s by Clayton Christensen, a Harvard business professor, who suggested markets could be changed by simple, cheap and accessible products that aim at new consumers and that don’t need to be great in quality — just good enough to get the job done.
You may consider “just good enough” to be a fairly low standard for a product, especially for something you depend upon for nutrition, but sometimes so-called disruptors morph into successors, rising in quality and value.
Consider Wikipedia, initially a wholly untrustworthy information source that has become, like its print predecessors, a useful starting point for all sorts of research. How do you think I know so much about breakfast?
Anyway, the lessons of the breakfast revolution were on display in another field last week, during a forum at the University at Albany about contemporary movie-making. It featured Craig Hatkoff, who along with his wife, Jane Rosenthal, and the actor Robert De Niro created the Tribeca Film Festival in 2002.
Hatkoff, a 1972 Albany Academy graduate, is a disciple of Christensen. He says disruptive innovation has opened filmmaking to people who might never have made a movie in an earlier era. It has affected stars, too. Ed Burns, who has acted in 27 movies, including at least one that grossed a half-billion dollars, produced a film recently for $9,000, thanks to disruption enabled by inexpensive editing software and improvements in camera technology. “The truth is you can make a movie for no money in New York … and have a blast,” Burns has said.
But is it something you’d pay ten bucks to see in a theater if it weren’t created by a star with a track record of interesting work?
In fact, absent the big name, would you even find it? Will the quality be sufficient to lure you back for the filmmaker’s next creation?
This notion of disruption changing the face of an industry is quite familiar to someone in my line of work. Everybody with a smartphone can be a publisher now, and for a lot of consumers reporting is good enough if it confirms what they assume is likely to be true.
What’s right about this digital disruption is that it offers news consumers a chance to see far more than the limited range of information that traditionally has emerged from our legacy newsrooms. What’s uncertain, though, is whether people will find journalism that follows a discipline of verification, with standards aimed at giving readers and viewers a true view of what lies beyond their own experience. It’s easy to distinguish a lousy movie from a good one, but the absence of fair journalistic storytelling may not be apparent for a long time, until people belatedly recognize what they wish they had known.
Consider it a question of the nutritional value of your information consumption — something for you to mull over as you sip a morning coffee. If you have the time, that is.

Rex Smith

One Response

Rex,
Though I read this column in print, it has taken me a while to write a response here on this blog. Thanks for the opportunity. I am glad for those journalists who diligently verify the information they present, and weave a well-told tale at the same time.
However, I would submit to you that a bigger reason for the decline of print journalism, along with that of many other long-standing news networks, is as simple as one word: Bias, as in the book by Bernard Goldberg. The biggest factor that goes unexplained is the vast number of stories that are never reported by any “traditional” media organizations, regardless of how many opportunities there are to research and verify them.
Of the many examples I could offer, I will choose only one: From Jan 20, 2006 to Jan 20, 2009, how many days can you count where an article about Iraq and Afghanistan war casualties did not appear on the front page? And then if you compare Jan 21, 2009 to Jan 21, 2012, how many days? I would suggest the coverage was overwhelming in the first period, and nearly invisible in the second. How do you explain the difference?