Post Industrial Journalism: Adapting to the Present

Introduction: The Transformation of American Journalism Is Unavoidable

This essay is part survey and part manifesto, one that concerns itself with the practice of journalism and the practices of journalists in the United States. It is not, however, about “the future of the news industry,” both because much of that future is already here and because there is no such thing as the news industry anymore.

There used to be one, held together by the usual things that hold an industry together: similarity of methods among a relatively small and coherent group of businesses, and an inability for anyone outside that group to produce a competitive product. Those conditions no longer hold true.

If you wanted to sum up the past decade of the news ecosystem in a single phrase, it might be this: Everybody suddenly got a lot more freedom. The newsmakers, the advertisers, the startups, and, especially, the people formerly known as the audience have all been given new freedom to communicate, narrowly and broadly, outside the old strictures of the broadcast and publishing models. The past 15 years have seen an explosion of new tools and techniques, and, more importantly, new assumptions and expectations, and these changes have wrecked the old clarity.

There’s no way to look at organizations as various as the Texas Tribune, SCOTUSblog and Front Porch Forum or such platforms as Facebook, YouTube and Storify and see anything like coherence. There’s no way to look at new experiments in nonprofit journalism like Andy Carvin’s work at NPR during the Arab Spring and convince yourself that journalism is securely in the hands of for-profit businesses. And there’s no way to look at experiments in funding journalism via Kickstarter, or the coverage of protest movements via mobile phone, and convince yourself that making information public can be done only by professionals and institutions.

Many of the changes talked about in the last decade as part of the future landscape of journalism have already taken place; much of journalism’s imagined future is now its lived-in present. (As William Gibson noted long ago, “The future is already here. It’s just unevenly distributed.”) Our goal is to write about what has already happened and what is happening today, and what we can learn from it, rather than engaging in much speculation.

The effect of the current changes in the news ecosystem has already been a reduction in the quality of news in the United States. On present evidence, we are convinced that journalism in this country will get worse before it gets better, and, in some places (principally midsize and small cities with no daily paper) it will get markedly worse. Our hope is to limit the scope, depth and duration of that decay by pointing to ways to create useful journalism using tools, techniques and assumptions that weren’t even possible 10 years ago.

We also highlight the ways new possibilities for journalism require new forms of organization. Traditional news organizations have tended to conserve both working methods and hierarchy, even as the old business models are collapsing, and even when new opportunities do not fit in those old patterns. In interview after interview with digitally focused members of the traditional press, the theme of being thwarted by process came up. Adapting to a world where the people formerly known as the audience are not readers and viewers but users and publishers will mean changing not just tactics but also self-conception. Merely bolting on a few new techniques will not be enough to adapt to the changing ecosystem; taking advantage of access to individuals, crowds and machines will mean changing organizational structure as well. (We recognize that many existing organizations will regard these recommendations as anathema.)

This essay is written for multiple audiences—traditional news organizations interested in adapting as well as new entrants (whether individual journalists, news startups or organizations not previously part of the journalistic ecosystem)—and those organizations and entities that affect the news ecosystem, particularly governments and journalism schools, but also businesses and nonprofits. We start with five core beliefs:

Journalism matters.

Good journalism has always been subsidized.

The internet wrecks advertising subsidy.

Restructuring is, therefore, a forced move.

There are many opportunities for doing good work in new ways.

Journalism Matters

Journalism exposes corruption, draws attention to injustice, holds politicians and businesses accountable for their promises and duties. It informs citizens and consumers, helps organize public opinion, explains complex issues and clarifies essential disagreements. Journalism plays an irreplaceable role in both democratic politics and market economies.

The current crisis for the institutions of American journalism convinces us of two things. First, there is no way to preserve or restore the shape of journalism as it has been practiced for the past 50 years, and, second, it is imperative that we collectively find new ways to do the kind of journalism needed to keep the United States from sliding into casual self-dealing and venality.

Not all journalism matters, of course. Much of what is produced today is simply entertainment or diversion, but here, we grapple only with what has variously been called “hard news,” “accountability journalism” or “the iron core of news.” Hard news is what matters in the current crisis. Rather than try to list or define the elements that separate hard news from the fluff, we have simply adopted Lord Northcliffe’s famous litmus test: “News is something someone somewhere doesn’t want printed. Everything else is advertising.”

This does not mean that the output of news organizations can be cleanly divided into two categories, hard news and fluff. Sometimes a business section will run stories on tie colors; sometimes the lifestyle section will break business news in the fashion world. As we write this, the New York Daily News home page features one story on Miley Cyrus’ new haircut and another on the city’s stubbornly high unemployment rate.

Even with that spectrum recognized, however, hard news is what distinguishes journalism from just another commercial activity. There will always be a public appetite for reporting on baseball, movie stars, gardening and cooking, but it would be of no great moment for the country if all of that work were taken over by amateurs or done by machine. What is of great moment is reporting on important and true stories that can change society. The reports on the Catholic Church’s persistent harboring of child rapists, Enron’s fraudulent accounting and the scandal over the Justice Department’s Operation Fast and Furious are all such stories.

Because telling true stories is vital, the value of journalism can’t be reduced to other, ancillary needs. Journalism performs multiple overlapping functions, and there never used to be much urgency in defining those functions. In the period in which public speech was scarce (which is to say, all of history until now), journalism was simply what journalists did, journalists were just people hired by publishers, and publishers were the relative handful of people who had access to the means of making speech public.

We believe that the role of the journalist—as truth-teller, sense-maker, explainer—cannot be reduced to a replaceable input for other social systems; journalists are not merely purveyors of facts. Now and for the foreseeable future, we need a cadre of full-time workers who report the things someone somewhere doesn’t want reported, and who do it in a way that doesn’t just make information available (a commodity we are currently awash in), but frames that information so that it reaches and affects the public.

An increasing amount of firsthand reporting is done by citizens—much of our sense of the Fukushima Daiichi nuclear disaster in Japan and the Pearl Roundabout massacre in Bahrain came from individuals on the ground—but this does not mean that all professional journalists will, can or should be replaced. Instead it means that their roles will change, overlapping with the individuals (and crowds and machines) whose presence characterizes the new news environment.

Good Journalism Has Always Been Subsidized

The question of subsidies for news has been a hot issue for some time now. Observers of the news environment such as Steve Coll, David Swensen and Michael Schmidt, and Michael Schudson and Len Downie have suggested that the U.S. press should move toward a more explicitly subsidized model, a suggestion that generated heated responses from other observers—Jeff Jarvis, Jack Shafer, Alan Mutter—who insist that only a commercial press produces the necessary resources and freedom that the U.S. press requires.

We believe that this is a false dichotomy. Subsidies are often regarded as synonymous with direct government funding, which would raise obvious and serious concerns, but subsidy, in the sense of support granted to work seen to be in the public good, comes in many flavors. It can be direct or indirect, and it can come from public or private sources. Citizen donations are as much a subsidy as government grants.

Good journalism has always been subsidized; markets have never supplied as much news as democracy demands. The most obvious form is indirect public subsidy: Radio and TV enjoy free access to the airwaves, in return for which fielding a credible news operation is (or was) the quid pro quo. Businesses are forced to pay for legal notices in newspapers. Print publications are given favorable postage rates.

There has been some good news in the form of direct reader fees for digital properties, using the “payment after a page-view threshold” model. These fees are obviously welcome; however, few large publications implementing them have managed to get to even 5 percent adoption by their web users, and the page threshold virtually guarantees that most such users will never be asked to pay. As a result, though the new income serves to slow the reduction of revenue, it does not stop it, much less reverse it.

The biggest source of subsidy in the news environment has always been indirect and private, coming from advertisers. As Henry Luce put it 75 years ago, “If we have to be subsidized by anybody, we think that the advertiser presents extremely interesting possibilities.”

There are a few publications in the news business whose audience pays directly for the journalists’ work, but they are a tiny fraction of the news ecosystem, clustered around professional practices (finance, law, medicine), with a handful of outliers, such as Ms. magazine, selling freedom from advertising. Most outlets for news aren’t in the news business but the advertising business.

The most important thing about the relationship between advertising and journalism is that there isn’t one. The link between advertiser and publisher isn’t a partnership, it’s a sales transaction, one in which the publisher has (or had) the upper hand. The essential source of advertiser subsidy is lack of choice; so long as businesses have to rely on publishers to get seen, publishers can use the proceeds to pay for journalism, regardless of advertiser preference. Nine West doesn’t care about keeping the Washington bureau open; it just wants to sell shoes. But in order to reach potential Nine West customers, it has to pay an organization that does care about the Washington bureau.

In addition to advertising, many other forms of private subsidy exist. For most of U.S. history, some owners have been willing to publish newspapers and magazines at a loss, in return for prestige or influence. Both the New Yorker and the New York Post bleed red ink; their continued existence in their current form involves a decision by their wealthy owners that they should not be completely exposed to the market. These kinds of publications are de facto nonprofits. Similarly, family ownership of newspapers provided a shield from demands for short-term profits, in part because the publisher was typically willing to take some compensation in status goods (salary aside, it was good to be the publisher of the local paper) and in part because family ownership meant managing for long-term viability, as opposed to immediate revenue extraction, another form of being in the market but not of it.

Though recent conversation about subsidy and journalism has mainly focused on governmental rather than private provision, the various forms of subsidy are quite entangled. General Motors and Diageo spend significant sums annually on 30-second spots or full-page ads because they are legally stuck with brand advertising. GM might want to sell directly from the factory, as Dell does, and Diageo might be happy to offer a click-to-buy button, as Ghirardelli does, but state law forbids them to use direct marketing. Brand advertising for cars and trucks and beer and booze is propped up by government-mandated subsidy that prevents the affected businesses from investing in the alternatives.

The American public has never paid full freight for the news gathering done in our name. It has always been underwritten by sources other than the readers, listeners or viewers. This essay does not concern itself with where future subsidy can or should come from or how it should be directed. Income can come from advertisers, sponsors, users, donors, patrons or philanthropies; cost reductions can come from partnerships, outsourcing, crowdsourcing or automation. There is no one answer: Any way of keeping costs below revenue is a good way, whether an organization is large or small, niche or general, for-profit or nonprofit. What is clear is that the model long adopted by the majority of news outlets—a commercial entity that subsidizes the newsroom with advertising dollars—is in trouble.

The Internet Wrecks Advertising Subsidy

This report is concerned with the way journalists do their jobs rather than the business practices of the institutions that support them. However, the business practices intersect journalistic practices in one critical way: Advertiser support, the key source of subsidy for American journalism since the 1830s, is evaporating. (Indeed, for newspapers, much of it is already gone, but more bad news is coming for newspapers, and for magazines, radio and TV as well.)

Advertisers have never had any interest in supporting news outlets per se; the link between advertising revenue and journalists’ salaries was always a function of the publishers’ ability to extract the revenue. This worked well in the 20th century, when the media business was a seller’s market. But it does not work well today.

The disruption began in earnest in the 1990s, with the launch of the commercial web, though it was masked for a decade by rising ad revenue for traditional publishers and by the dot-com bust, which convinced many publishers that they had overestimated the threat from the internet. Traditional ad revenue began to fall in 2006, but by that time the alteration of the underlying advertising market was already well along; lost income was a trailing indicator of a transformed environment.

Legacy publishers don’t sell content as a product. They are in the service business, with vertical integration of content, reproduction and delivery. A newspaper company owns or contracts for its presses and delivery trucks; a news station similarly maintains the capabilities to send out its material over cable or satellite; a magazine runs or contracts for both printing services and distribution networks. Vertical integration carries high capital costs, reducing competition and sometimes creating a bottleneck where the public could be induced to pay.

The internet wrecks vertical integration, because everyone pays for the infrastructure, then everyone gets to use it. The audience remains more than willing to pay for reproduction and distribution, but now we pay Dell for computers, Canon for printers, and Verizon for delivery, rather than paying Conde Nast, Hearst or Tribune Co. for all those services in a bundle. When people want to read on paper, we are increasingly printing it ourselves, at a miniature press three feet away, on demand, rather than paying someone else to print it, 20 miles away, yesterday. When we want to listen to audio or watch video, we increasingly use the commodity infrastructure of the internet, rather than purpose-built (and -funded) infrastructure of broadcast towers and cable networks.

Publishers also typically engage in horizontal integration, bundling hard news with horoscopes, gossip, recipes, sports. Simple inertia meant anyone who had tuned into a broadcast or picked up a publication for one particular story would keep watching or reading whatever else was in the bundle. Though this was often called loyalty, in most cases it was just laziness—the path of least resistance meant that reading another good-enough story in the local paper was easier than seeking out an excellent story in a separate publication.

The web wrecks horizontal integration. Prior to the web, a dozen good-but-not-great stories in one bundle was enough to keep someone from hunting for the dozen best stories in a dozen different publications. In a world of links and feeds, however, it is often easier to find the next thing you read, watch or listen to from your friends than it is to stick with any given publication. Laziness now favors unbundling; for many general-interest news sites, the most common category of reader is one who views a single article in a month.

On top of all this, of course, is heightened competition. As Nicholas Carr noted in 2009, a Google search for stories about the U.S. Navy rescue of a U.S. cargo ship captain held hostage by Somali pirates returned 11,264 possible outlets for the story, the vast majority of them simply running the same syndicated copy. The web significantly erodes the value of running identical wire service stories in St. Louis and San Luis Obispo.

In addition to the changes wrought by technology, the spread of social media has created a new category of ads that are tied to media without subsidizing the creation of content. In the 1990s, many websites had discussion boards that generated enormous user interest but little revenue, because advertisers didn’t regard user-created material as “brand-safe.”

MySpace was the first big site to overcome that obstacle. Like the junk-bond transformation of the 1980s, MySpace made the argument that low-quality ad inventory was a good buy if enough of it was aggregated at a low enough price. The pitch to advertisers was essentially “Even at minuscule click-through rates, there is a price at which MySpace page views are worth it to you.”

This opened the floodgates. Once enough businesses decided that social networks were acceptable venues, the available media inventory became a function of people’s (limitless) interest in one another, rather than being a function of publishers’ ability to create interesting content or maintain an audience. When demand creates supply at a cost barely above zero, it has a predictable effect on price.

The past 15 years have also seen the rise of advertising as a stand-alone service. The loss of classified ads to superior services like Craigslist, HotJobs and OkCupid has been widely commented on; less noticed is the rise of user-to-user recommendations in a transactional environment, as on Salesforce or Amazon. These recommendations take on some of the functions of business-to-business or business-to-consumer advertising, while involving no subsidy of content (or even a payment to anyone who looks like a publisher). Those services themselves also provide little or no subsidy for media outlets: after a 15-month test of television advertising, Amazon abandoned TV for most products, concluding that the ads would be less effective in driving sales than spending the same amount of money to provide free shipping.

Even publishers who understand that the lost revenue will not be replaced, and that print revenue (and production) will continue to wane, hold out hope that the change in advertising subsidy can somehow be reversed.

The fact that the web, a visually flexible medium, has nevertheless been more readily adapted to direct marketing than brand advertising was a disappointment to publishers, who have always benefited disproportionately from brand advertising. Over the past decade, there have been periodic assertions that the direct marketing version of web advertising is a phase and that someone will reinvent brand advertising online. This is essentially an assertion that advertisers will start handing over significant sums of money for animated graphics or time in the video stream, while expecting little in return but the assurance that they have somehow built awareness.

This seems unlikely. The shift from the logic of brand advertising to the logic of direct marketing is just a symptom of the larger change driven by the web, which is the victory, everywhere, of measurement. What made brand advertising profitable was that no one really knew how it worked, so no one really knew how to optimize it—making a TV commercial was more like making a tiny Hollywood film than it was like running a giant psych experiment.

Online, businesses increasingly expect even brand advertising to have measurable results, and measurable ad spending disrupts the high margins of the good years. John Wanamaker’s endlessly quoted line about not knowing which half of his advertising budget was wasted explains why: once advertisers can identify the wasted half, they stop paying for it, and measurability in advertising puts further pressure on revenue.

Another source of hope for restoration of ad revenue was the internet’s improved specificity (“You can target only real estate lawyers in Montana!”). It was widely assumed that this narrow targeting would lead to defensibly high advertising rates for at least some websites; better targeting would yield better results, and this would make a higher premium worth the cost.

The shift to cheap advertising with measurable outcomes, however, wrecks much of the logic of targeting as well. To take a simplified example, it costs about 60 cents to reach a thousand people with untargeted web advertising. Ad space that costs $12 per thousand viewers (a widely discussed estimate in 2010 for certain niche sites) may well be more efficient because of targeting, but to make economic sense, the targeted ad would have to be 20 times—2,000 percent—as efficient. Any less, and the junk inventory is more cost-effective.
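The break-even arithmetic in that example can be sketched in a few lines; the figures are the illustrative ones from the text, and the variable names are ours:

```python
# Break-even comparison of targeted vs. untargeted ad inventory,
# using the illustrative prices from the text (dollars per thousand
# impressions, i.e., CPM).
untargeted_cpm = 0.60   # junk inventory, e.g. social media pages
targeted_cpm = 12.00    # niche-site inventory with audience targeting

# A targeted impression pays off only if it is at least this many times
# as likely to produce a sale as an untargeted one.
break_even_multiple = targeted_cpm / untargeted_cpm

print(round(break_even_multiple))  # 20, i.e., 2,000 percent as efficient
```

Any efficiency gain below that multiple, and the advertiser does better buying the cheap inventory in bulk.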

Because ads can now appear on social media, the junk end of the cost curve is very low indeed, low enough to exert continued pull on the higher prices for targeted ads. Businesses don’t care about reaching people with ads. Businesses care about selling things. The ability to understand who actually buys their products or services online means that many advertisers can arbitrage expensive and cheap ads at will.

There may yet be some undiscovered source of advertising revenue, but to restore the fortunes of ad-supported journalism, this philosopher’s stone must be available to publishers but not to social media or advertising-as-service sites. To justify the return to formerly high rates, it must be dramatically more effective than any current advertising method. And it must generate revenue immune to price pressure from large-scale competition.

On current evidence, these conditions seem unlikely. Publishers’ power over advertisers is ebbing away; since the appearance of the web, net value per advertising dollar has shifted dramatically from the publisher back to the advertiser, and more signs point to that trend accelerating than reversing. Even publishers willing to bet their businesses on this kind of salvation should consider alternative plans for continuing to produce good journalism should the advertising subsidy continue to decline.

Restructuring Is a Forced Move

The broadly negative turn in the fortunes of legacy news businesses leads us to two conclusions: News has to become cheaper to produce, and cost reduction must be accompanied by a restructuring of organizational models and processes. Many factors point to further reductions in ad revenue over the next few years, and few point to increases. Though the most precipitous revenue collapse is over, we are nevertheless writing this in the 23rd consecutive quarter of year-on-year revenue decline. The past three years of decline have taken place during a period of economic growth; in addition to the cumulative effects of revenue loss, the inability to raise revenue even in a growing economy suggests that legacy media firms will suffer disproportionately when the next recession begins, as it doubtless will within a few years.

Web advertising has never generated anything like the same revenue per reader, mobile looks even worse, and the continuing rise in online advertising generally is now often bypassing traditional news properties altogether. Meanwhile, hoped-for sources of direct fees—pay walls, micropayments, mobile apps, digital subscriptions—have either failed or underperformed.

Of these, digital subscriptions, as practiced at the Los Angeles Times, Minneapolis Star Tribune, New York Times and others, have done best, but even here the net effect of subscriptions has not made up the print shortfall. Furthermore, because most digital subscriptions are designed to increase print circulation, their immediate effect is to make the papers more reliant on print revenue, despite the long-term deterioration of print.

We do not believe the continued erosion of traditional ad revenue will be made up on other platforms over the next three to five years. For the vast majority of news organizations, the next phase of their existence will resemble the last one—cost reduction as a forced move, albeit in a less urgent (and, we hope, more strategic) way, one that takes into account new news techniques and organizational models.

In the 1980s, much academic ink was spilled over the “productivity paradox,” where businesses had invested heavily in information technology over the preceding two decades, but, despite the capital outlay, had very little to show for their efforts. A few firms, however, did show strong early productivity gains from their embrace of IT. The companies that benefited didn’t just computerize existing processes; they altered those processes at the same time that they brought computers into the business and became a different kind of organization. By contrast, companies that simply added computers to their existing processes produced no obvious gains in output or efficiency.

We believe that a similar dynamic is at work today, one we’re calling post-industrial journalism, a phrase first used by Doc Searls in 2001, to mean “journalism no longer organized around the norms of proximity to the machinery of production.” (The original rationale of the newsroom was not managerial but practical—the people producing the words had to be close to the machine, often in the basement, that would reproduce their words.)

Observers of the news industry such as David Simon have noted, correctly, that “doing more with less” is the mantra of every publisher who’s just laid off a dozen reporters and editors. However, because the “with less” part is a forced move, we have to try to make the “doing more” part work, which means less cynical press-release-speak about layoffs and more restructuring to take advantage of new ways of doing journalism.

Post-industrial journalism assumes that the existing institutions are going to lose revenue and market share, and that if they hope to retain or even increase their relevance, they will have to take advantage of new working methods and processes afforded by digital media.

This restructuring will mean rethinking every organizational aspect of news production—increased openness to partnerships; increased reliance on publicly available data; increased use of individuals, crowds and machines to produce raw material; even increased reliance on machines to produce some of the output. These kinds of changes will be wrenching, as they will affect both the daily routine and self-conception of everyone involved in creating and distributing news. But without them, the reduction in the money available for the production of journalism will mean that the future holds nothing but doing less with less. No solution to the present crisis will preserve the old models.

There Are Many Opportunities for Doing Good Work in New Ways

If you believe that journalism matters, and that there is no solution to the crisis, then the only way to get the journalism we need in the current environment is to take advantage of new possibilities.

Journalists now have access to far more information than previously, as a result of everything from the transparency movement to the spread of sensor networks. They have new tools for creating visual and interactive forms of explanation. They have far more varied ways for their work to reach the public—the ubiquity of search, the rise of stream-like sources (Facebook’s timeline, all of Twitter), the wiki as a format for incorporating new information. All these developments have expanded how the public can get and process the news.

Superdistribution—the forwarding of media through social networks—means that a tiny publication with an important article can reach a huge audience quickly and at no additional cost. The presence of networked video cameras in people’s pockets means that an increasing amount of visual reporting comes from citizens.

As new possibilities for information gathering, sense-making and distribution proliferate, organizations are taking advantage of working methods unavailable even 10 years ago: Narrative Science automating the production of data-driven news; ProPublica making data sets and templates available so a story can be repeated elsewhere, as with Dollars for Docs; or searching through existing data to discover new insights, as independent financial fraud investigator Harry Markopolos did with Bernie Madoff (one of the greatest missed journalistic opportunities of the past decade).

The commonality among enterprising digital members of traditional organizations—Anjali Mullany, formerly of the Daily News; John Keefe of WNYC; Gabriel Dance at the Guardian in the U.S.—and digital news startups such as WyoFile, Technically Philly and Poligraft is that they organize their assumptions and processes around the newly possible: making graphics interactive, providing the audience with direct access to a database, soliciting photos and information from the audience, or circulating a story via the social graph. It’s not clear that Poligraft will be around in a decade (nor the Daily News, for that matter), but the experimentation being done at these organizations exemplifies good use of new tools to pursue journalistic goals.

The most exciting and transformative aspect of the current news environment is taking advantage of new forms of collaboration, new analytic tools and sources of data, and new ways of communicating what matters to the public. The bulk of our recommendations later in this essay will focus on these opportunities.

Defining “Public” and “Audience,” and the Special Case of the New York Times

Before presenting the body of the report, we need to engage in a little throat clearing about two contentious words—public and audience—and to discuss the special case of the New York Times as a uniquely poor proxy for the general state of American journalism.

Public first. The concept of “the public,” the group of people on whose behalf hard news is produced, is the “god term” of journalism, as James Carey put it:

… the final term, the term without which nothing counts, and journalists justify their actions, defend the craft, plead their case in terms of the public’s right to know, their role as the representative of the public, and their capacity to speak both to and for the public.

The public is the group whose interests are to be served by the news ecosystem. It is also very difficult to define cleanly.

The idea of “the public” has been core to American theorizing about news since John Dewey’s famous response to Walter Lippmann in the 1920s. Lippmann despaired that the average person in a mass society with complex economic and technical workings could ever become the kind of informed citizen that most democratic theory seemed to assume. Dewey, in response, argued that there were multiple, overlapping publics that could be “activated” by the emergence of particular issues. This notion of news outlets serving disparate but overlapping publics has remained core to their organizational logic.

Since the emergence of Lippmann’s and Dewey’s competing views of mass media and mass society, philosophers such as Jurgen Habermas, Nancy Fraser, James Carey, Michael Schudson and Yochai Benkler have all made some conception of the public sphere core to their work, enriching but complicating any account of media whose role serves a (or the) public.

We will adopt the coward’s strategy of noting but not solving the dilemma. We do not propose to provide a definition any more rigorous than this one:

The public is that group of consumers or citizens who care about the forces that shape their lives and want someone to monitor and report on those forces so that they can act on that knowledge.

This is an unsatisfying, question-begging definition, but it is at least respectful of the welter of opinions about what actually constitutes a “public.”

The word “audience” has become similarly problematic. When the media landscape was cleanly divided into publishing (print, broadcast) vs. communication (telegraph, then telephone), the concept of an audience was equally clean—the mass of recipients of content produced and distributed by a publisher. Movies, music, newspapers, books—all these had obvious audiences.

One of the most disruptive effects of the internet is to combine publishing and communications models into a single medium. When someone on Twitter shares a story with a couple of friends, it feels like a water cooler conversation of old. When that same person shares that same story with a couple thousand people, it feels like publishing, even though it’s the same tool and the same activity used to send the story to just a few. Furthermore, every one of those recipients can forward the story still further. The privileged position of the original publisher has shrunk dramatically.

Observing a world where the members of the audience had become more than recipients of information, the scholar Jay Rosen of New York University coined the phrase “The People Formerly Known as the Audience” to describe the ways in which previously quiescent groups of consumers had become creators and annotators and judges and conduits for information. We adopt Rosen’s view of this transformation here; however, writing out his formulation (or TPFKATA) is too unwieldy.

We will therefore talk throughout about “the audience”—keep in mind that we mean by that the people formerly known as the audience, newly endowed with an unprecedented degree of communicative agency.

Finally, a note about why we will not be concentrating very much on the fate of the New York Times. A remarkable amount of what has been written about the fortunes of American journalism over the past decade has centered on the question of what will happen to the Times. We believe this focus has been distracting.

In the last generation, the Times has gone from being a great daily paper, in competition with several other such papers, to being a cultural institution of unique and global importance, even as those papers—the Washington Post, Chicago Tribune, Los Angeles Times, Miami Herald, among others—have shrunk their coverage and their ambitions. This puts the Times in a category of one. Any sentence that begins “Let’s take the New York Times as an example …” is thus liable to explain or describe little about the rest of the landscape.

The Times newsroom is a source of much interesting experimentation—data visualizations, novel partnerships, integration of blogs—and we have talked to many of our friends and colleagues there in an effort to learn from their experiences and make recommendations for other news organizations. However, because the Times is in a category of one, the choices its management can make, and the outcomes of those choices, are not illustrative or predictive for most other news organizations, large or small, old or new. We will therefore spend comparatively little time discussing its fate. While the Times serves as an inspiration for news organizations everywhere, it is less useful as a model or bellwether for other institutions.

Organization

This essay is written with several audiences in mind—startups, traditional organizations trying to adapt, journalism schools, and organizations that support or shape the ecosystem, from the Pulitzer Prize Board to the U.S. government. After this introduction are three main sections: Journalists, Institutions and Ecosystem.

We start by asking what individual journalists can and should do today, because their work matters most, and because the obsessive focus on institutional survival in recent years has hidden an obvious truth—institutions matter because they support the work of journalists, not vice versa.

We next ask what institutions can do to support the work of journalists. We are not using the word “institution” in its colloquial sense of “legacy news organization,” but rather in its sociological sense of “a group of people and assets with relatively stable patterns of behavior.” Huffington Post is as much of an institution as Harper’s; we are as interested in the institutionalization of current news startups as we are in the adaptation of old institutions to new realities.

Finally, we examine the news ecosystem, by which we mean those aspects of news production not under the direct control of any one institution. The current ecosystem contains new assets, such as an explosion in digital data and computational power. It also contains new opportunities, such as the ability to form low-cost partnerships and consortia, and new forces that affect news organizations, from the assumptions, support and obstacles produced by schools, businesses and governments.

In our brief conclusion, we extrapolate several of the current forces out to the end of the decade and describe what we believe some of the salient features of the news environment of 2020 will be.

We do not imagine that any one organization can act on all or even a majority of our recommendations; the recommendations are too various and directed at too many different kinds of actors. We also don’t imagine that these recommendations add up to a complete strategic direction. We are plainly in an era where what doesn’t work is clearer than what does, and where the formerly stable beliefs and behaviors of what we used to call the news industry are giving way to a far more variable set of entities than anything we saw in the 20th century.

We do imagine (or at least hope) that these recommendations will be useful for organizations that want to avoid the worst of the anachronism between traditional process and contemporary opportunity and want to take advantage of the possibilities that exist today.

Section 1: Journalists

On June 28, 2012, the Supreme Court handed down its decision on the legality of the individual health care mandate contained in President Barack Obama’s Affordable Care Act (ACA). Coming in an election year, with the incumbent president faced with having his policy centerpiece ruled unconstitutional, the significance of the decision went beyond health care: It was a major political story.

CNN’s embarrassment at erroneously reporting the decision was exceeded only by the breakthrough moment for what had until then been a little-known specialist website whose sole beat is the Supreme Court. On that day, SCOTUSblog became the key source for must-read breaking context and analysis of the court’s ACA opinion. The Atlantic later broke down the progress of SCOTUSblog’s coverage, reporting that by 10:22, 15 minutes after the decision was delivered, the site had close to a million visitors; it had to install extra server capacity to handle the surge in traffic.

SCOTUSblog was founded in 2003 by the husband-and-wife team of Tom Goldstein and Amy Howe. Neither was a journalist; they were partners in a law practice and lectured at Harvard and Stanford law schools. On the morning of the decision, Goldstein covered the whole process live; this live-blogging became C-SPAN 3’s source of coverage. Goldstein described the ruling as “our Super Bowl” and said his goal was to provide the best analysis of the ruling at the most appropriate moment for the audience.

SCOTUSblog demonstrates that journalism can be done outside traditional newsrooms, by individuals free of traditional demands of both commerce and process. In an environment of what journalism professor Jeff Jarvis describes as “do what you do best and link to the rest,” the SCOTUSblog model delivers the most consistent coverage of the Supreme Court and aims to deliver the best coverage as well. SCOTUSblog will not rush 25 journalists into Haiti in the event of an earthquake (or assign any to Lindsay Lohan’s DUI hearing), so it is not replacing CNN. But it doesn’t have to. SCOTUSblog has found its niche and knows what its role is.

Journalists exist because people need to know what has happened and why. The way news is most effectively and reliably relayed is by those with a combination of deep knowledge of the subject and a responsiveness to audience requirements. On this occasion, SCOTUSblog managed to achieve both goals. While CNN corrected its erroneous reporting after several critical minutes, it was initially deficient on the most basic metric: reporting what the court had actually decided.

The SCOTUSblog breakthrough is just one example of how the customary territory of traditional journalists is being eroded. Surveying the new news ecosystem brings up examples far more radical than SCOTUSblog, which employs reporters alongside its lawyer-blogger founders. In some cases, nonprofessional journalists have proven they can do journalism at as high a level as professionals, and sometimes higher. Experts, whether economist Nouriel Roubini on the housing bubble, sociologist Zeynep Tufekci on riots in the Middle East, or financial analyst Susan Webber at Naked Capitalism, are producing contextual pieces that outstrip many of the efforts produced by traditional journalists. This is more than just individuals being able to publish their views directly; the Lance Armstrong doping case was covered better and far earlier by NY Velocity, a specialist bike-racing blog, than it was by the professional (and decidedly unskeptical) sports press.

An interesting question about direct access to the public by experts arose after the exposure of Bernie Madoff’s Ponzi scheme. The most notable aspect of that fraud was the failure of the Securities and Exchange Commission to heed the prescient, detailed and accurate warnings of wrongdoing provided by the investor Harry Markopolos. Ray Pellecchia at the investment blog Seeking Alpha asked, “Could a Markopolos Blog Have Stopped Madoff?” Could the SEC have remained inattentive if, instead of going to the agency, Markopolos had gone public with a blog posting over the improbability of Madoff’s trades? It’s impossible to run this experiment, of course, but it’s easy to imagine that public analysis of Madoff’s trades would have had greater effect than leaving the matter to the professionals.

We have also reached a point where the “crowd” publishes its own information in real time, to each other and to the world. Data on any type of measurable change are more cheaply gathered today than ever, and algorithms are being developed that can reassemble this information in fractions of a second, producing accounts of events that have passed the Turing test of being indistinguishable from those written by humans. All of this is done without any intervention from a journalist.

The changes in the news ecosystem are not just a story about erosion, however. Even as the old monopolies vanish, there is an increase in the amount of journalistically useful work to be achieved through collaboration with amateurs, crowds and machines. Commodities traders, for example, do not need a reporter to stand by a wheat field and interview a farmer. Satellites can take real-time images of the crops and interpret the visual data, turning it into useful market information in the blink of an eye. Narrative Science generates reports on quarterly results for Forbes.com. Journatic causes both intrigue and distress with remotely compiled “local” reporting. Verification of ordnance dropped in market squares in the Middle East occurs through networks of witnesses with mobile phones and military experts on Twitter, publishing firsthand accounts and their analyses in real time.

The list of what a journalist can do grows daily, as the plasticity of communications technology changes both reporting capabilities and audience behaviors. AP journalist and news innovator Jonathan Stray noted in a post:

Each of the acts that make up journalism might best be done inside or outside the newsroom, by professionals or amateurs or partners or specialists. It all depends upon the economics of the ecosystem and, ultimately, the needs of the users.

Understanding the disruption to news production and journalism, and deciding where human effort can be most effectively applied, will be vital for all journalists. Figuring out the most useful role a journalist can play in the new news ecosystem requires asking two related questions: What can new entrants in the news ecosystem now do better than journalists could do under the old model, and what roles can journalists themselves best play?

What Social Media Does Better

Amateurs

The journalistic value of social media exists on a spectrum, from the individual person with a key piece of information—the eyewitness, the inside observer—all the way through the large collective. Bradley Manning, the private in Army intelligence charged with divulging hundreds of thousands of State Department documents to the whistle-blowing website WikiLeaks, occupied a position of singular importance, while the BBC’s documentation of debris scattered after the space shuttle Columbia broke apart required multiple independent observers. Huffington Post’s Off the Bus project in 2008 occupied a similar spectrum; blogger Mayhill Fowler’s coverage of Obama’s remarks at a San Francisco fundraiser about people who “cling to guns and religion” came from a sole source, while coverage of the Iowa caucuses relied on a crowd.

When the Navy SEALs took down Osama bin Laden, the first public “report” came from Sohaib Athar (Twitter name @reallyvirtual) or, in his own words, “uh oh I’m the guy who live blogged the Osama raid without knowing it.” Sohaib Athar is not a journalist (he’s an IT consultant in Abbottabad, Pakistan, where the raid took place) and might not even have known he was practicing journalism, but as Steve Myers, then at the Poynter Institute, said, “he acted like a journalist.” Athar tweeted about hearing a helicopter and a blast, then responded to inquiries, added information when he thought he had it, followed the thread of the story and created context for it. Athar became a resource for journalists who were reconstructing a timeline of the events—a part of the verification system that could be compared in real time against the official version.

For many newsworthy events, it is increasingly likely that the first available description will be produced by a connected citizen rather than by a professional journalist. For some kinds of events—natural disasters, mass murders—the transition is complete.

In that sense, as with so many of the changes in journalism, the erosion of the old way of doing things is accompanied by an increase in new opportunities and new needs for journalistically important work. The journalist has not been replaced but displaced, moved higher up the editorial chain from the production of initial observations to a role that emphasizes verification and interpretation, bringing sense to the streams of text, audio, photos and video produced by the public.

“Original reporting” occupies pride of place within journalistic self-conception—it is at the core of what journalists do that they say cannot be done by others; it is the aspect of their work that requires the most tacit skill; it is the function that most directly serves the public good. The importance of original reporting is reflected in many of the more perennial battles that have been fought around journalism over the past decade and a half, from the seemingly endless struggle of “bloggers vs. journalists” to the conflict over news aggregation vs. original reporting.

Because original reporting is so often perceived as simplistic or methodologically naive, it is frequently misunderstood by outside observers. Getting key bits of descriptive information out of an eyewitness, aggressively challenging the verbal responses of a seasoned government bureaucrat, knowing exactly where to find a key document, or navigating the routines and idiosyncrasies of complex modern organizations is a non-trivial intellectual endeavor, and a public good to boot. In many instances, the most important aspects of individual journalistic work remain what they’ve always been at their best: interviewing, making direct observations and analyzing individual documents.

And yet, many of the strategies we advocate do not easily map onto the original reporting paradigm. Most journalists, and journalistic institutions, have failed to take advantage of the explosion in potentially newsworthy content facilitated by the growth in digital communication. The reality is that most journalists at most newspapers do not spend most of their time conducting anything like empirically robust forms of evidence gathering. Like the historical fallacy of a journalistic “golden age,” the belief in the value of original reporting often exceeds the volume at which it is actually produced.

Too many reporters remain locked into a mindset in which a relatively limited list of sources is still relied on to gather evidence for most important stories, with the occasional rewritten press release or direct observation thrown in. This insider-centric idea of original reporting excludes social media, the explosion of digital data, algorithmically generated sources of information, and many other new strategies of information gathering that we emphasize here.

There should be more original reporting, not less, and this original reporting should learn to live alongside newer forms of journalistic evidence gathering. We acknowledge the very real threat to original reporting posed by the economic collapse of newspapers; solving this dilemma requires new attention to journalistic institutions, which we will address more fully in the next section, on institutions.

Crowds

When you aggregate enough individual participants, you get a crowd. One thing that crowds do better than journalists is collect data. When Japan was hit by an earthquake in March 2011, and the Fukushima Daiichi nuclear plant suffered a leak, frustration around the lack of availability of up-to-date data on radiation levels led to individuals with Geiger counters filming the readings and streaming them to UStream.

Platforms for sharing real-time data, such as Cosm, rely on activist groups, businesses or simply interested individuals gathering whatever information they are interested in—air quality, traffic speed, energy efficiency—and sharing it through low-cost sensors. These platforms provide a range, depth and accuracy of data that simply cannot be matched by individual reporters.
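To make the idea concrete, here is a minimal sketch of how pooled citizen sensor readings can surface anomalies no single reporter could spot. The sensor names, units and the simple outlier rule are our own invented illustration, not any platform’s real API:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Reading:
    sensor_id: str
    microsieverts_per_hour: float  # level reported by one citizen's Geiger counter

def flag_anomalies(readings, threshold_factor=3.0):
    """Return readings that exceed a multiple of the network-wide average.

    Individually, low-cost sensors are noisy; pooled together, outliers
    (faulty devices or genuine hotspots) stand out against the shared baseline.
    """
    baseline = mean(r.microsieverts_per_hour for r in readings)
    return [r for r in readings if r.microsieverts_per_hour > threshold_factor * baseline]

feed = [
    Reading("tokyo-01", 0.08),
    Reading("tokyo-02", 0.11),
    Reading("fukushima-03", 2.4),  # hypothetical elevated reading
    Reading("sendai-04", 0.09),
]
print([r.sensor_id for r in flag_anomalies(feed)])  # ['fukushima-03']
```

The design point is that the crowd supplies breadth of measurement while a simple aggregate supplies the editorial judgment of “what stands out”; the journalist’s role begins where the flagged reading needs verification and context.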

Citizens also photograph events, film important pieces of news, and sometimes, as Off the Bus did for the Huffington Post in 2008, get political scoops. Social platforms such as Facebook and Twitter recognize that gathering all information now available and interpreting it is a task beyond human scale. Built in to all social platforms and search engines are algorithmic capabilities helping to analyze what subjects are being shared, which topics are most discussed by whom, and when information emerges and how it moves.

The availability of resources like citizen photos doesn’t obviate the need for journalism or journalists, but it does change the job from being the source of the initial capture of an image or observation to being the person who can make relevant requests, and then filter and contextualize the results. The word “crowdsourcing” itself implies a “one to many” relationship for the journalist: asking a question or deriving an answer from a large group of people. But the “crowd” is also a series of individuals performing networked activities, which can be interrogated and used for a more complete version of events or to discover things that were not easily or quickly obtained through traditional shoe-leather reporting.

What Machines Do Better

One thing machines do better is create value from large amounts of data at high speed. Automation of process and content is the most under-explored territory for reducing the costs of journalism and improving editorial output. Within five to 10 years, we will see cheaply produced information gathered and monitored by networks of wireless devices. Their output, from telling people the optimum time to use water to avoid polluting rivers to when to cross the road, raises questions of data ethics, ownership and use.

In the technology industry, startups like Palantir, Kaggle and Narrative Science are exciting investors with the infinite capabilities offered by data gathering and organization through algorithms.

With a staff of 30, two-thirds engineers and one-third editorial, Narrative Science “takes raw numerical data and generates full narratives,” as chief technology officer Kris Hammond describes it. He and his team of computer scientists work on identifying what constitute the key elements of a story, and how this might vary for a recap from a baseball game or a financial earnings report. They then write code that allows streams of data to be turned into words. Clients for the low-cost content range from commercial businesses to traditional media outlets.

Narrative Science proposes to automate production of standard stories such as ordinary financial statements and capsule game summaries. This approach reduces the human inputs required for repetitious work, freeing up labor for more complex or interpretive tasks, rather than describing basically uneventful occurrences. And, as always, commodification expands the number of participants beyond the traditional professional cadre. If your child plays in a Little League baseball game and you use an iPhone app called GameChanger to record the scores, Narrative Science will process those data instantaneously into a written description of the game. More than a million such game reports will be generated this year.
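The basic move Hammond describes—identify the key elements of a story type, then write code that turns a stream of structured data into words—can be sketched in miniature. The field names and phrasing rules below are invented for illustration; Narrative Science’s actual system is vastly more sophisticated:

```python
def game_recap(game: dict) -> str:
    """Turn a structured box score into a one-sentence recap."""
    if game["home_score"] >= game["away_score"]:
        winner, loser = game["home"], game["away"]
        w, l = game["home_score"], game["away_score"]
    else:
        winner, loser = game["away"], game["home"]
        w, l = game["away_score"], game["home_score"]
    margin = w - l
    # Vary the verb with the margin of victory, the way a human stringer might.
    verb = "edged" if margin <= 1 else "beat" if margin <= 4 else "routed"
    sentence = f"{winner} {verb} {loser} {w}-{l}"
    if game.get("star"):
        sentence += f", led by {game['star']}"
    return sentence + "."

box = {"home": "Cubs", "away": "Mets", "home_score": 2, "away_score": 9,
       "star": "a three-hit night from the Mets' shortstop"}
print(game_recap(box))
```

Even this toy version shows why the approach scales: once the mapping from data to prose is written, producing the millionth Little League recap costs no more than the first.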

Hammond said in an interview with Wired that he anticipated that 80 to 90 percent of stories in the future will be algorithmically generated. We asked him for his rationale, and he explained that the high levels of localized and personal data likely to be collected and made available online will greatly expand the type of “story” that can be generated. The 90 percent figure thus assumes not just more granular data, but a much larger universe of stories or content being published, by a much larger collection of reporters, most of whom are amateurs. Anywhere data are available in this digital format will be suitable for this type of reporting, and anywhere there are no such data, like the local town hall meeting, will need a reporter to record the data.

Hammond says the machines his team builds must “think like journalists”; his interest is looking at what journalists do, and then replicating it through programming. “We want the machine to come to people—humanize the machine and make human insight at tremendous and outrageous scale.”

Reporters and editors find this scenario terrifying. Journalists and programmers (or journalists who are computer scientists) very rarely work on this kind of replication process. As Reg Chua, head of data and innovation for Thomson Reuters, commented, “We don’t have the understanding in place, there are only a handful of news organizations that have the capabilities at the moment.”

If the answer to the question “what do algorithms do better?” is that they produce stories that come from structured data, and if the world of structured data of a personal, local, national and international nature is exponentially increasing, then an estimate of 90 percent of the universe of “stories” being automated is not farfetched.

What Journalists Do Better

Prior to the spread of the steam engine, all cloth was “artisanal,” in the sense of being made by artisans. It was not, however, very well made; humans were making cloth not because of their superior skill but because there was no alternative. The steam engine displaced production of low-end woven materials, which ended the use of human beings for most of the raw production of cloth but created a raft of new jobs for high-quality artisans, as well as designers of new patterns and managers of mills.

We believe that something similar is happening to journalism—the rise of what we think of as “the press” coincided with the industrialization of reproduction and distribution of printed matter. When the cost of sending a column inch of writing to a thousand people began falling, news organizations could swing more of their resources to the daily production of content. Now we are witnessing a related change—the gathering and dissemination of facts, and even of basic analysis, is being automated. This obviously disrupts those jobs that employed journalists, not as artisans but simply as bodies, people who did the work because no machine could. It also allows news organizations, traditional and new, to swing more of their resources to the kind of investigative and interpretive work that only humans, not algorithms, can do.

Accountability

A recurring question that society asks and demands to have answered—usually when things go wrong—is: “Who is responsible?” If journalism has an impact, and part of its role is to force accountability in other institutions, then it must be able to produce accountability of its own. The two government inquiries, one police inquiry and a series of criminal charges arising from the News of the World’s widely publicized phone hacking scandal in the United Kingdom demonstrate rather vividly that while journalists should have the freedom to publish, they also have to account personally for their actions.

Identifying who bears publishing risk is legally important and will become more so, both in the field of prosecutions and protections.

Programs and algorithms that replace human reporting are built through a series of decisions, and those decisions need to be explicable and accountable to those affected. Journalists write algorithms at Narrative Science; at Google News, engineers have to understand what makes a story “better” in order to improve an algorithm. Data and algorithms are as political as cartoons and op-ed pieces, but seldom carry the same transparency.

New areas of accountability are emerging. One question journalists and news institutions need to engage with is: “What are you doing with my data?” It might not matter who is a journalist, except to the person disclosing information to one.

Equally, protections and defenses afforded to journalists must be made available to everyone who is making information available in the public interest. If a journalist or news organization holds your data, you might reasonably expect that data not to be handed over to the police.

We know what happens when sensitive information, such as the diplomatic cables published by WikiLeaks, is hosted on a platform that is inherently commercial but not inherently journalistic. Those services can be withdrawn; both an arm of Amazon that provided web services to WikiLeaks, and PayPal, the online payment mechanism, severed their ties to the organization. Platforms that engage in censorship for commercial expediency are often less easy to spot. Rebecca MacKinnon, a New America Foundation fellow and author of “Consent of the Networked,” points out that Apple’s approval process for its popular app store is opaque and arbitrary, and that the rejection of some material amounts to censorship, as with its rejection of developer Joshua Begley’s interactive map of drone strikes. Therefore, just by choosing an Apple product to use, journalists participate in shaping a future of the internet that engages in censorship.

Self-evident as it may seem, journalists can be much more efficient than machines at obtaining and disseminating certain types of information. Access and “exclusivity” or “ownership” of a story are created through interviewing people. Making phone calls to the White House or the school board, showing up at meetings and being receptive to feedback, sharing views and expressing doubt all make news more of the “drama” that James Carey identified as central to the concept of a newspaper. These very personal and human activities mark journalism as a form of information performance rather than simply a dissemination of facts.

Originality

The origination of ideas, algorithms, the formation of movements, and innovations to practices all require originality of thought. Journalists should be provoking change, initiating experimentation and instigating activity. Recognizing what is important about those credit default swaps or why Mitt Romney’s tax affairs needed to be pursued relies on a complexity of understanding about the world for which it is still difficult to build and maintain machines. Cultural literacy skills distinguish reporters, editors, designers and other journalists from other systems of data gathering and dissemination.

Charisma

People follow people, and therefore just by “being human” journalists create a more powerful role for themselves. It is a device personality-driven television has long relied on, but only in a one-way medium. In a networked world, the ability to inform, entertain and respond to feedback intelligently is a journalistic skill. As Paul Berry, the former chief technology officer at Huffington Post, said, “There is really only one question for a journalist at an interview now: How many followers?” Influence being a better metric than sheer mass, one might refine this to “Who are your followers?” But the point remains: A journalist’s individual agency, which is to say the journalist’s means and freedom, is growing outside the brand and the audience of the newsroom.

Working between the crowd and the algorithm in the information ecosystem is where a journalist is able to have the most effect, by serving as an investigator, a translator, a storyteller. Without leveraging the possibilities of either the crowd or the algorithm, some kinds of journalism become unsustainable, falling behind the real-time world of data and networks available to audiences through everything from the sensor on their waste bin to the trending list on their Twitter stream. The journalism layer within the ecosystem thus becomes about humanizing the data rather than mechanizing the process.

Adapting to this environment is a stretch for journalists who developed their skills in newsrooms where precision and security were the key demands of the product, and where there was unity and clarity around a small set of processes—talking, writing, editing. The ability to recognize, find and tell a story, in the most appropriate format and for a specific audience, remains a constant requirement, but the number of formats and the variability of the audiences have grown. Beyond this, the craft skills that will help journalists define and redefine their future roles, and the business in which they work, are changing.

What Does a Journalist Need to Know?

When Laura and Chris Amico moved to Washington, D.C., from California as a result of Chris’ getting a job as a news developer at NPR, they did not know their neighborhood, they did not know the community, and they did not know where Laura, a crime beat reporter, was likely to find work.

“People were just not hiring,” says Laura. The boredom of unemployment and a shared interest in public service journalism led the Amicos to kick ideas around about what to do. “We thought a lot about what was not being covered,” says Laura, who is the kind of reporter who keeps a police scanner where most would have an alarm clock.

What was not being covered in the crime pages of the local metro papers, and even the Washington Post, they realized, was every homicide in the city. The Amicos responded to this gap in coverage with a startup online site they called Homicide Watch D.C. “We deliberately thought about doing things that others would not,” says Chris. In effect, the most radical act was to put every part of the reporter’s notebook online, using “the whole pig”: every aspect of the available data. Homicide Watch D.C. is built around “objects”—incident, victim, suspect, case—and uses structured information about location, age and race to build a very detailed picture of this one type of crime in one city. The comprehensive nature of the service to citizens helps the reporting process: If someone visits the site looking for an unfamiliar name, it is a cue to Laura to investigate whether the person being sought is a victim. In one case, this is how the site reported a killing and the victim’s identity before the police had even confirmed the incident.
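The “objects” the Amicos describe can be sketched as structured records. What follows is a minimal, hypothetical illustration in Python—the field names and data are invented, not Homicide Watch’s actual schema—showing how both an aggregate home-page statistic and the “unfamiliar name” reporting cue fall out of the same structure:

```python
from dataclasses import dataclass, field

@dataclass
class Victim:
    name: str
    age: int
    race: str

@dataclass
class Incident:
    incident_id: int
    location: str
    victim: Victim
    suspects: list = field(default_factory=list)

# A tiny in-memory "notebook": every incident is a structured record,
# so the same data can drive maps, statistics and victim pages.
incidents = [
    Incident(1, "Anacostia", Victim("J. Doe", 19, "black")),
    Incident(2, "Petworth", Victim("R. Roe", 24, "black")),
]

# Aggregate view: the home-page statistic is a query over the records.
average_age = sum(i.victim.age for i in incidents) / len(incidents)

# Reporting cue: a search for a name not yet in the data is a signal
# that a visitor may know about an unreported victim.
def is_known_victim(name):
    return any(i.victim.name == name for i in incidents)
```

Because every fact lives in a structured record rather than buried in prose, the same notebook can feed statistics, victim pages and reporting leads without re-reporting.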

The site has no authorial “voice”: everything is written in AP style, and the comments of the victims’ families and others in the community are given high prominence against the very factual accounts of the homicides. But by recording and making visible every homicide in D.C., the site serves a very clear and distinct journalistic purpose: A glance at the home page is enough to deduce that homicide victims are overwhelmingly male, black and often young. Within a few clicks, a visitor can see detailed statistics that confirm this impression.

Homicide Watch is an example of what Chris and Laura were sure they would not have been allowed to do in a newsroom. Their statistics-driven reporting approach and a site that prioritizes victims and incidents over stories are alien to many newsroom priorities.

Reporting is the heart of journalism, but the tools for reporting, as we see from Homicide Watch, can be used in very different ways. A database that takes every part of the reporter’s notebook and turns it into structured information with the intention of producing more stories is a good example of this. A commenting system that allows users to better highlight and filter useful comments is another. Not every journalist will be skilled in every area of work. We assume the centrality of reporting, so we are focusing more here on the new capabilities that are already required for better reporting but that exist in too short a supply. It is undoubtedly the case that the hard skills Laura and Chris possess are the bedrock of the site’s success; she is a crime reporter, and he has developer skills. If there is one lesson to be learned, however, it is that it was not just the “hard” skills that made Homicide Watch viable, but the “soft” skills that made their application possible.

The “Soft Skills” of Journalism

Mindset

What Laura and Chris Amico possess, alongside skills as a crime reporter and a news developer, is a mindset that wants to improve journalism, not simply replicate or salvage it. As Shazna Nessa, head of the AP’s interactive newsroom, notes: “We need to get young journalists to understand that they can change organizations. Indeed, they are often expected to be the people who change things.”

The institutional appeal for those possessing this mindset is limited. Very few companies follow the example of John Paton at Digital First Media, inviting disruption, expecting change and making no guarantees.

So talent like the Amicos, or Leela de Kretser of DNAinfo, or Lissa Harris of the Watershed Post, or Burt Herman of Storify, or Pete Cashmore of Mashable and hundreds of others like them, struck out in a direction charted by Nick Denton, Arianna Huffington and Josh Marshall before them, to try to do better by forming a new institution.

Having the desire and motivation to exercise personal influence over journalism at the level of both the story and the institution requires a mix of awareness, confidence, imagination and ability.

Not all of these qualities may be teachable, but none is optional. It is important to recruit and develop journalists, whether in newsrooms or through journalism schools, who engage with persistent change. For some of these institutions, which by their very nature represent stability, a substantial readjustment will be required.

The idea of the “entrepreneurial” journalist is becoming a familiar one and is increasingly encouraged both in teaching programs and within certain news institutions. Its association with judging the quality of innovation by the profit it creates is not always helpful, as the pursuit of profit has to be preceded by the creation of relevance. Individual journalists, whatever their area of expertise, need to think of experimentation with the aim of innovation as something they practice rather than endure.

Being “Networked”

All journalists carry with them a network and always have, whether it is a network of sources and contacts, or a network of those with similar professional knowledge, or a network of a community that follows and helps them. As the individual connectedness of each member of their network increases, journalists with effective network skills can leverage more help or efficiency. Editing, assigning and reporting all become tasks wholly or partially delegated to the network. Creating and maintaining an effective network is a soft skill, with hard edges. It requires time, thought and process. It requires judgment, not least because networks imply proximity and journalism requires distance, so building for both is hard.

In the notorious leaked 2011 strategy document “The AOL Way,” the blunt assumption made by the portal business was that journalists with larger followings or networks were of higher value. Much about the document was thought to be crass or wrongheaded, but the impact that a large and visible following has on a journalist’s career is undeniable. When a writer such as Andrew Sullivan moves from the Atlantic to the Daily Beast, the recruitment happens with the expectation that his readership moves, too. The credibility of individual reporters and their reliability and expertise are already judged through the composition of their network.

Every individual, subject or location has the potential for a visible network around it. Services such as Facebook, YouTube, Twitter, Orkut and Weibo publish vastly more every day than the aggregate output of the world’s professional media, so mining the relationships, conversations and stories they carry will only become more central to information gathering. The aggregational tool Storify and the Irish journalism startup Storyful, which extracts stories and verification from social media streams, are forms of social news agency, offering more journalistic protection and filtering than the platforms that host them, but reliant on making sense of scattered and often confusing information.

Paul Lewis, a reporter for the Guardian, used the techniques of a networked reporter to break a number of significant stories, including by analyzing user-captured footage of the protests around the G-20 summit in London in 2009. Ian Tomlinson, a man with pre-existing health problems, collapsed and died at the G-20 march, but the police version of the incident seemed inconsistent to Lewis, who continued to interview march participants and tried to establish the order of events. Days after Tomlinson’s death, video footage taken by a bystander with a camera phone was sent to the Guardian, which claims “openness” as a central tenet of its journalism. The video showed conclusively that a police officer had struck Tomlinson and pushed him to the ground shortly before his death. The importance of the story, the impulse of the witness and the techniques of the journalist led to an outcome that might be seen as the exemplification of accountability journalism.

Persona

Personal presence, accessibility and accountability are important components of journalism. So, too, is narrative ability. We can all look at figures documenting the decline of the press, but we can also read David Carr telling us in the New York Times what he thinks the important factors are. In fact, we want to read Carr because he is a talented prose stylist. The more we feel engaged with a journalist through his persona, the more we want to hear what he has to say about the world.

Public persona was once the exclusive territory of the high-profile columnist. Now it is part of the job of every journalist; editors and reporters, designers, photographers, videographers, data scientists and social media specialists all have their own perspectives and accountability for storytelling. This requires judgment exercised consistently and publicly; whatever the medium of publication, information is now instantly shared, discussed, annotated, criticized and praised in a live, uncontrolled environment.

Integrity and judgment are attributes that journalists carry with them, as part of their public persona. These are not so much soft skills as values. The nature of search and continual publication means these attributes can be established more easily, but once lost are hard to regain. Plagiarism, dishonesty and covert bias are harder to conceal, while factual inaccuracies, self-copying of material and rudeness can erode reputation quickly and irreparably. By contrast, good journalism, in whatever realm, can gain authority without institutional endorsement.

How a journalist constructs a good reputation—by maintaining integrity, adding value to information for an audience, demonstrating knowledge, linking to sources and explaining methodologies—now has to be done in a public, real-time realm. The old model of a handshake around source protection is no longer enough; journalists who want to work with confidential sources must be able to provide enough information security to prevent their sources from being identified by determined attackers, both governmental and non-governmental.

News institutions need to balance the needs of the individual journalists with the default mechanisms set to safeguard institutional reputation. These mechanisms are not necessarily inimical to building individual reputations, but the requirements of publishing securely, accurately, coherently and to a schedule or within a product can be in tension with how journalists work most effectively.

This is something we will look at more closely under the process part of this section.

Hard Skills

Specialist Knowledge

The extent to which a journalist now needs in-depth knowledge about something other than journalism is increasing. A deficit of skills in professional journalism is made all the more obvious by the wider availability and quality of specialist commentary and knowledge. In areas such as economics, science, international affairs and business, the complexity of information and the speed at which people wish to have it explained and contextualized leave little room for the average generalist.

The cost of employing highly knowledgeable specialists means more expert journalism is likely to come from those who see journalism as only part of what they do—whether it is the SCOTUSblog founders, through their law firm, or the economists Nouriel Roubini and Brad DeLong through consultancy and teaching. Knowledge can be geographic, linguistic, or in a certain discipline or area of study.

The value of specialization can reside in communication and presentational techniques or skills; outstanding writers or photographers, audio or video specialists or social media editors will create audiences for their work through an ability to identify and address a market.

The Guardian’s head of digital engagement, Meg Pickard, describes the phenomenon of individuals creating niche communities of interest around areas of knowledge as generating “contextual micro fame.” Journalists need to know how to create communities of knowledge and interest that serve their own specialization.

Sara Ganim, the Pulitzer Prize-winning journalist who investigated a story about child sexual abuse by retired Penn State assistant football coach Jerry Sandusky, was able to produce such astonishing results because of her journalistic skills, central to which was her understanding of the college community she was investigating.

Data and Statistics

The basic data literacy of those engaged in journalism will have to improve for the field to maintain relevance. As individuals, corporations and governments create and release data in increasing quantities, it becomes clear that the availability and the accessibility of data are not the same thing. Understanding what large-scale data sets offer, how to write stories from them and how to extract meaning from information that can be flawed or partial is important work. Just as journalism needs those with a more profound understanding of communications technologies and information science, so it needs data science and statistics as core competencies within the field.

There is a tight and symbiotic relationship among networks of users, journalists and data. Journalists should be able to analyze the data and metrics that accompany their own work, and be familiar with the idea that metrics represent human activity. They should also know how to interpret this feedback sensibly, so that they can improve the reach and content of their stories.

In 1979, the security expert Susan Landau drew a distinction between secrets and mysteries. Surveying the way that the Iranian Revolution had taken the United States completely by surprise, she noted that the intelligence community was focused on secrets—trying to understand the things the shah’s government was hiding—rather than on mysteries—what was happening in various public but not widely visible groups loyal to Ayatollah Ruhollah Khomeini.

In journalistic terms, the most famous news story in living memory—Watergate—was based on the acquisition of secrets. Mark Felt, a high-ranking FBI official, delivered insider information to the Washington Post’s Bob Woodward, essential to the reporting he did with Carl Bernstein on the Nixon White House. Watergate’s hold on the self-conception of the traditional U.S. press remains significant, even as many of the stories of the last decade have hinged on mysteries instead of secrets. The faked business dealings of Enron and Madoff, and Barclays’ manipulation of LIBOR rates, were all detected by outsiders. (Indeed, one of the reasons that Bethany McLean, who broke the Enron story in Fortune magazine, has not been widely lionized is that offering her accolades for correctly interpreting and following up on publicly available data would mean admitting how few members of the business press operate that way.)

Even as the world itself has become more complex, the volume of available data on many of the important actors—businesses, politicians, priests, criminals—has grown dramatically. One of the key tools for understanding mysteries is the ability to examine data for patterns that may be hiding in plain sight.
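One simple version of that kind of pattern-finding is screening a public data series for anomalies. The following is a hypothetical sketch—the figures are invented—of the sort of crude outlier check an outsider can run on published numbers:

```python
from statistics import mean, stdev

# Hypothetical quarterly figures reported by a company: public data in
# which a pattern may be hiding in plain sight.
reported = [102, 98, 105, 101, 99, 240, 103]

m, s = mean(reported), stdev(reported)

# Flag figures more than two standard deviations from the mean --
# a crude screen, but the kind of check anyone with the data can run.
outliers = [x for x in reported if abs(x - m) > 2 * s]
# outliers -> [240], the quarter that deserves a reporter's attention
```

A flagged number is not a story in itself, but it tells the reporter where to start asking questions.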

Understanding Metrics and Audiences

A startling number of newsrooms we studied still do not implement “live” metric dashboards such as Chartbeat or Google Analytics or, more often, do not create access to these tools for all journalists. Understanding how journalism is received, understanding what causes virality in content, and being able to see what is read, heard or viewed by whom is an important aspect of journalism. It can, but does not necessarily, entail increasing page views or unique visitors by manipulating content, although there is something to be said for the approach of Gawker editor A.J. Daulerio, who circulated a memo making clear that “traffic whoring” would be a rotational part of staff members’ duties. An honesty in identifying targets and goals, a sensitivity to relevant and irrelevant data, and a willingness to respond to feedback are not anathema to sustainable journalism but part of it.
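The feedback such dashboards provide can be approximated from nothing more than a log of view events. The following is a hypothetical sketch—the story names, referrers and counts are invented:

```python
from collections import Counter

# Hypothetical pageview log: (story, referrer) pairs of the kind a
# live dashboard such as Chartbeat summarizes in real time.
views = [
    ("council-vote", "twitter"), ("council-vote", "search"),
    ("council-vote", "twitter"), ("school-budget", "search"),
    ("school-budget", "direct"),
]

# How much was each story read, and where did readers come from?
views_per_story = Counter(story for story, _ in views)
referrers = Counter(ref for _, ref in views)

top_story, top_count = views_per_story.most_common(1)[0]
```

Knowing that a story travels on social networks rather than through search is exactly the kind of feedback a reporter can act on.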

The following of technical trends and traffic leads to tedious practices that do not necessarily burnish the brand of journalism, such as search engine optimization (the practice of making pieces perform as well as they can on Google through testing links and headlines). At the same time, making journalism easily discoverable by audiences who are faced with a filter problem is a service. The fact that audiences increasingly reach news stories through links shared on social networks rather than through news aggregators has implications for reporters and editors. General ignorance of how people consume information was not an issue when the industrial model prevailed, but in today’s fragmented and fraying world, knowledge of how audiences consume information, and whether what you write, record, or shoot reaches the people whom you want to see it, becomes critical.

Coding

If there are two significant language barriers that journalism needs to traverse, one is statistics and data skills and the other is technical aptitude. Journalists should learn to code. It’s true that fluency in many programming languages requires highly developed skills; not every journalist will be able to do this, and not every journalist should. But every journalist needs to understand, at a basic level of literacy, what code is, what it can do, and how to communicate with those who are more proficient. John Keefe, who leads a small team of newsroom developers at WNYC, makes the point that the entry-level skill requirements for the tools and applications of code are falling all the time.
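As an illustration of Keefe’s point, the kind of entry-level task now within reach is turning a public-records file into a countable answer. A hypothetical sketch—the offense data and column names are invented:

```python
import csv
import io
from collections import Counter

# A stand-in for a downloaded public-records CSV.
raw = """offense,ward
theft,1
assault,2
theft,2
theft,1
"""

rows = csv.DictReader(io.StringIO(raw))
by_offense = Counter(row["offense"] for row in rows)
# by_offense now answers a basic reporting question:
# which offense appears most often in the records?
```

A handful of lines like these is not software engineering, but it is enough code literacy to interrogate a data set and to talk sensibly with the developers who build the rest.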

One journalist, who works in a more technical environment than most, identified the lack of engineering expertise as a major barrier to news organizations’ ability to make progress: “Even the most well-resourced newsroom has a ratio of developers that is no more than one developer to 10 journalists, [and] that ratio is far too low. And the quality of many newsroom developers is also far below that of people working for engineering companies like Facebook and Twitter.”

Leadership at the board level in most institutions is biased toward business and editorial skills and light on engineering knowledge. This is a cause for some concern as we see an increasing reliance on third-party platforms that might provide an excellent set of tools for journalists (Twitter is arguably the most useful tool for journalism since the telephone), but which are not inherently journalistic. Even for journalists who never end up writing a line of code meant for daily use, basic technology literacy is as important a skill as basic business literacy.

Storytelling

Writing, filming, editing, recording, interviewing, designing and producing remain the bedrock of what journalists do. We focus less on these skills because we do not expect the basic ability to identify and report a story to change; it remains central to a journalist’s skill set. As part of technical literacy, journalists need to understand how each of these skills might be affected by a development in technology or a shift in human behavior. One aspect of working with networks and crowds is the journalistic skill of aggregation: narrative can now be created by aggregating disparate material, which implies understanding sources and verifying what they provide.

Even though it is an example to make many journalists wince, when Jonah Peretti of BuzzFeed talks about the “disappointed animal” slideshows that power the site’s traffic, he makes the point that a great deal of skill is directed toward what makes a piece of content appealing for others to share. More cerebral curatorial and aggregational exercises, such as Maria Popova’s Brain Pickings, may prove the point in a more highbrow way, surfacing essays on the nature of beauty rather than dogs that look like world leaders, but the underlying skills are analogous.

Project Management

As we see more effective models of journalism emerge from a remaking of the existing process, one widely held observation is that journalists are having to move from a world where the sole focus of their activity was their own stories to a host of different concerns. Steve Buttry, who blogs frequently on newsroom change and leads the training and skills program for Digital First Media, identifies this as “project management skills—being able to keep across all parts of the process and understand how they can be brought together to produce something that works.” An editorial idea no longer has the dominance it once had in a fixed product like a newspaper or news bulletin. Now the idea must also work according to a large number of variables, often with the input of others, and in a way that is technologically viable and responsive to audiences. The story format becomes less like a unit and more like a stream of activity. As human resources in the newsroom continue to be reduced, planning how a story “scales,” or why a piece of code is being written, or what the imagined outcome, goal or impact of a piece of journalism is, becomes important, and finding metrics to accompany that internal target becomes equally important. The diminution of newsroom resources accompanied by the increase in coverage of already well-covered events, such as the U.S. presidential primaries or the Olympic Games, creates an inequality in coverage and wastes money on duplicated efforts.

One of the central themes in this essay is how journalists have to become more skilled at collaboration, with technologies, crowds and partnerships, to help scale the considerable task of reporting events. Working collaboratively and across disciplines should start in the newsroom, which is where this organizational skill set should flow from. This in itself requires journalists to be freer to think about and improve the overall processes of journalism.

Hamster Wheels and Flat Earth News

The process of journalism is being so radically remade by the forces of technology and economics that there is no longer anything that might be described as “an industry” for the individual journalist to enter.

There is no standardized career path, set of tools and templates for production, or category of employers that are stable and predictable. A job at the Washington Post used to carry with it a certain set of career assumptions, in the same way that a job at General Motors might. An entry-level job on a copy desk or as a junior beat reporter could be plotted in a trajectory that mirrored the product itself. What a journalist did in the industrial age was defined by the product: a headline writer, a reporter, a desk editor, a columnist, an editor. As deadlines melt, and the story’s status as the “atomic unit of news” comes into question, what journalists do all day is defined more by the requirements of unfolding events and the audiences consuming them.

In both car manufacturing and legacy news organizations, the available jobs are markedly fewer and often different. While sharing many of the same characteristics of disrupted industries like car manufacturing, news journalism has undergone a much more profound shift in its constitution. General Motors still produces cars, and for the moment they still have four wheels, an engine and a chassis. But what journalism can be and what the output of a working journalist might look like is far more fluid, by the very nature of information and distribution technologies.

As we see a migration from journalism as an activity that required industrial machinery and resulted in a fixed product to one where individual freedom and means increase and respond to user needs, how will individual journalists influence the process of their work? The key differences in processes are clear:

Deadlines and formats for journalism become unrestricted.

Geography becomes less relevant for information gathering and the creation and consumption of journalism.

Live streams of data and social activity provide new and unfiltered source material.

Real-time feedback influences stories.

Individuals become more significant than the brand.

These technologies have, as we know, also undermined the existing business models for journalism. The conditions within the news industry have led to individual journalists feeling disempowered rather than having more influence over their working lives. What Dean Starkman describes as the spinning of the hamster wheel and what investigative journalist Nick Davies outlines in his book “Flat Earth News” are both descriptions of this phenomenon.

The recycling of press releases and the production of more with less, without a fundamental change to process, are, we would agree, the enemy of good journalism. We would, however, contend that this is unlikely to be the dominant model for journalism in the future, as the economics of paying journalists to produce low-value information will not hold. If there is a role and a business model for hastily assembled duplicate material, it is likely to be most successfully pursued by companies such as Demand Media or Journatic that employ algorithms and cheap, outsourced labor.

Individual journalists who create high-quality journalism, regardless of how it is supported, will exercise more autonomy and creative control over their work. Larger and more diverse audiences will be available to them at low or no cost.

Perhaps the best recent example of how a journalist exploited the opportunities of technology outside the process of the newsroom is that of Andy Carvin at NPR. Carvin’s voracious tweeting from Washington about the events of the Arab Spring in 2011 put him at the center of a network for U.S. audiences and other journalists following the narratives. The essence of what Carvin achieved was not to replicate the reporting secondhand, like a reporter sucking up wire copy and spitting out stories, but to make public the kind of behind-the-scenes processes that specialist desk editors bring to stories. Instead of this knowledge and process remaining between the editor and journalists at NPR, feeding their stories, it was published in real time on social media. One reason Carvin thinks he was able to pursue a new avenue of activity is that his day job, as head of NPR’s social media strategy, was not seen as journalistic in the first place.

Examples of individuals who have disrupted the processes of journalism are numerous, but it is rare that the best exponents have, like Andy Carvin, found adequate freedom within their own institutions to develop their work. Burt Herman left the Associated Press to develop Storify. Ory Okolloh put together the team that built Ushahidi, and subsequently licensed crowd-mapping software back to newsrooms, when her weblog, Kenyan Pundit, proved inadequate as a platform to convey to the outside world the ethnic violence that occurred in the aftermath of the 2007 Kenyan election.

It is worth noting that in 2012, during a presidential election year, several of the most keenly watched journalists at the most traditional news outlets emerged not through reporting ranks but through relatively experimental routes of self-publication. Nate Silver’s early career focused on economic consultancy and the modeling of baseball statistics. As a part-time, and largely anonymous, political blogger, Silver developed his FiveThirtyEight.com blog, which, in 2010, was licensed by the New York Times.

Parallels exist with Ezra Klein, the economics and political commentator who started his first blog at 19, moving the eponymous Ezra Klein platform to the American Prospect and later to the Washington Post. In both instances, the experimental risk and the laborious business of building audiences and figuring out a unique position were undertaken by individuals with free blogging software, to be bought out by news brands that, for all their superior resources and journalistic luster, had failed to incubate such bright stars.

The next phase of development will see similar bursts of individual brilliance and enterprise in emerging areas, perhaps of visualizations, of data creation, sharing and aggregation. Newsrooms no longer treat blogging, or tweeting, or live coverage with the same caution and bafflement they once did (“once” being just five years ago).

In five years’ time, owning live data feeds from distributed sensor networks, developing automated content, choosing or building technologies that reflect journalistic values, holding partnerships with different specialists and institutions, and experimenting with superstar aggregators, animators and performers could be as commonplace as licensing a blog.

How Will a Journalist’s Work Change?

It is difficult to say exactly what the smaller newsroom will look like, but there are ways the average journalist’s work will change over the next few years. Again, there are gradations of change: the role of a copy editor at the New Yorker and the process of production there might change less over the next few years than that of a community manager or data reporter at Nola.com.

Journalists will still be working in immersive environments, adapting their working patterns to a world of continual live conversation and information, which can be both exhausting and distracting. The ultimate goal of continual engagement, however, is to produce journalism of high quality, significant insight and impact. Measures of the aims and outcomes of journalism will be routine, and public.

The presence of metrics and data, relating to both the outside world and their own work, will become a daily reality. Feeds of information delivered in real time—a Twitter of data—will play a greater part in shaping editorial decisions and stories. Defining the ownership of these data, and deciding what can be outsourced to other commercial technologies and what needs to be kept, will be the job of journalists. So will writing algorithms. Specialist journalists, whether they are animators, interactive cartoonists, writers, videographers, psephologists or engagement specialists, will spend time understanding the technological changes to their field of practice and experimenting with new tools and techniques. Publishing developments will move at the speed of the web, not at the speed of online newsrooms.

Individual journalists will spend more time in collaborative relationships. This might be with technologists, working out better systems; with specialists or academics in their field; or with other journalists to develop stories or software and in editing and aggregating the output of others. Although journalists should already spend time following up stories and engaging in public discussion on social networks or in comment threads, their ability to add value for users with these techniques will increasingly become part of their value.

Every journalist can now be a publisher. One very obvious side effect of newsroom automation is the lowering in value and utility of the role of editors. Visionaries at the top of organizations will still set the tone and editorial direction for brands, and perhaps each topic will have a specialist editor. Time saved by the automatic organization and editing of pieces, however, dramatically reduces the need for editors to oversee every part of the process. Newsrooms can no longer afford senior staff who do not produce content. Every new desk editor should at least be aggregating and linking to work both inside and outside his or her organization, providing meta-analysis of the process and sources, following stories through cultivating and recommending sources in public.

Section 2: Institutions

Two leading publications chronicling the journalism profession are the venerable Columbia Journalism Review, founded in 1961, and the upstart Nieman Journalism Lab, based since 2008 at Harvard University’s Nieman Foundation. Both represent high peaks in the often barren landscape of newsroom shoptalk and media criticism. Reading them, however, you may start to wonder whether they are even chronicling the same industry.

With articles documenting the sad decline of a range of traditional newspapers and journalism institutions (from the Philadelphia Inquirer to the San Jose Mercury News to everywhere in between), CJR can often read as an elegy for a vanished world. The Lab, by contrast, overflows with news about the latest journalism experiments, chronicling a range of media organizations, many of them barely a week old and some not even launched yet. While there is some doom and gloom at the Lab, and some future-oriented thinking at CJR, the contrast is unmistakable to anyone trying to stay current with developments in the news business.

The problem with talking about journalism institutions, and one reason that discussion of them tends to be so polarized, is that both CJR and the Lab are telling honest stories. It is a moment of both catastrophe and rebirth for institutions that house journalistic work.

The story we tell ourselves about news institutions, in short, is really three stories, all occurring more or less simultaneously. There is a story of institutional decline and collapse, a story of institutional rebirth, and, perhaps most importantly for our purposes, a story of institutional adaptation. Where death ends and rebirth begins, the degree to which new institutions bear some responsibility for the decline of the old, whether more is being lost or gained, and how we can possibly tip the scales toward “gain”—all these are tangled arguments that arise from the fact that we are not observing a single story unfold. We are watching three.

A Story of Institutional Decline and Collapse: In Michigan, Louisiana and Alabama, Advance Publications is moving out of the daily newspaper business, cutting the number of days it prints a traditional hard-copy paper. From Chicago to Boston to San Francisco, news organizations are struggling with ethical and logistical questions as they increasingly outsource their local coverage to content farms (and the Philippines). The venerable Philadelphia Inquirer finds itself with its fifth owner in six years. Even the New York Times, although buoyed by its digital subscription model, is locked in a struggle with its union over plans to freeze pensions, cut health care benefits and increase hours worked. And these are only the headlines from this week. Two years ago, we were discussing the closure of some newspapers in Denver and Seattle. And two years from now? As we argued in the opening section, even if the news business stabilizes, it is unlikely that it will ever experience the kind of profitability it maintained prior to 2005.

A Story of Institutional Rebirth: But decline isn’t the only story here. While reports on the future of news often tout Talking Points Memo and ProPublica as emblematic of the institutional rebirth simultaneously being experienced by the news industry, these sites are, by digital standards, graybeards. Websites like SCOTUSblog can exist for several years before an event like the Supreme Court’s health care decision propels them toward wider visibility, and the same holds true for Nate Silver’s FiveThirtyEight.com blog covering national elections, now part of the New York Times. A quick look at the Knight Foundation News Challenge awards in June 2012 reveals a half-dozen new and not-so-new institutions—Behavio, Signalnoi.se, Recovers.org, the Tor Project and others—working to provide journalistic information to communities. And these are merely the organizations mentioned in one round of the challenge; there have been many others.

The conventional wisdom about these emerging institutions, appearing in a multitude of studies ranging from the 2011 Federal Communications Commission (FCC) report “The Information Needs of Communities” to a case study of Baltimore carried out by the Project for Excellence in Journalism, is that none of them will replace the original reporting being produced by traditional (and declining) news outlets. Insofar as sheer volume of news is concerned, we do not dispute this claim. But we also think the story is more complicated, and we’ll turn to some of the reasons for that later on.

A Story of Institutional Adaptation: The focus on decline and emergence also obscures a third story, one that may be, in the end, the most important of all. How do new entrants in the field of journalism ever reach organizational stability? How do they move from being a precarious startup to being a fully fledged member of the journalism community? As we will discuss below, one of the strengths of institutions lies in their ability to change personnel without risking organizational extinction. How that happens, and how an emerging news organization becomes an institution, is one of the central questions confronting journalism as it moves into the digital age.

We also need to ask how traditional news organizations are reshaping their processes to adapt to a changing information environment. A forthcoming case study of the New York Times by Nikki Usher, an assistant professor at George Washington University, will hopefully go some distance toward answering this question, but we also need to start synthesizing the ways that creative news organizations are adapting to the digital age. Researchers need to build on a basic sociological insight—the fact that most news institutions try to routinize disruption with as little change to their work processes and ideological self-image as possible—and begin to ask how creative institutions work around these systemic and self-imposed constraints.

When it comes to news institutions, we’re telling ourselves a lot of stories at once. While the stories of decline and rebirth make up the majority of discussion about the “future of news,” there is a relative gap when it comes to understanding the third story, that of institutional adaptation. Though the effect of the internet on the American journalism ecosystem has often been portrayed as anti-institutional, serving mainly to erode or even destroy institutional viability, its effect is actually more complex. While the internet has indeed disrupted many existing institutions, it has also helped usher in many new ones. Much of the fate of the news business will be decided not by what is going away, and not by what is exciting and new, but by how new institutions become old and stable and how old institutions become new and flexible.

At this point, it is important to keep two things in mind. First, while we will stress the relative inflexibility of large-scale institutions, we are not claiming that all institutions everywhere are incapable of change. What we are instead arguing is this: Changing the institutions of news is not impossible, but it is hard, and harder than it might logically appear to those on the outside. Arguments about the economic efficiency of change, the normative value of change, and the managerial imperative of change are often both true and, from an institutional point of view, irrelevant.

Second, news institutions that are able to adapt represent one of the most valuable potential sources for growth and evolution within the larger news ecosystem. Adaptation has a powerful impact no matter where it occurs, of course, but larger news institutions are somewhat like battleships: while it takes them a long time to turn around, once the turn is complete, they can move forward with an impressive amount of power and speed. Newsroom executives, editors and managers should keep in mind that much potential ecosystem change depends on their ability to think differently.

What Are Institutions, Anyway?

So just what are institutions, anyway? Economist Geoffrey M. Hodgson has argued that institutions are “the kinds of structures that matter most in the social realm: they make up the stuff of social life.” Institutions, Hodgson writes, can be defined as “systems of established and prevalent social rules that structure social interactions.” Sociologist Jonathan Turner offers a somewhat more wordy analysis; institutions, he argues, are “a complex of positions, roles, norms and values lodged in particular types of social structures and organizing relatively stable patterns of human activity.”

Without a doubt, some complicated stuff. But what matters for our purposes here is the embedded argument that institutions need to be understood as something that can, at least in theory, be located outside of a particular physical structure. Office buildings and even payroll invoices don’t serve as the bedrock of institutional material; rather, institutions are fundamentally a series of social rules that create stable patterns of behavior. Of course, working together every day in a newsroom or getting paid to perform a certain kind of work doesn’t hurt the establishment and reinforcement of these social rules, but money and physical proximity aren’t always the essential thing.

It would also be a mistake to think about institutions as simple agglomerations of rational individuals, each making a calculated choice that entering into institutional arrangements is the best way to maximize his or her self-interest. In the words of Walter Powell and Paul DiMaggio, two leading sociologists,

“while institutions are certainly the result of human activity, they are not necessarily the products of conscious design … the new institutionalism in organization theory and sociology comprises a rejection of rational-actor models, a turn toward cognitive and cultural explanations, and an interest in properties of supra-individual units of analysis that cannot be reduced to aggregations or direct consequences of individual motives.”

In other words, while understanding individuals is an important part of understanding institutions, there is an accumulated detritus within institutions that makes them irreducible to individual behavior. All of this boils down to a third argument, one that we think can shed some light on the crisis plaguing journalism today. We quoted a scholar above who noted that institutions organize “relatively stable patterns of human activity.” Stability has its advantages, and we’ll discuss some of them below, but, as Powell and DiMaggio put it, “behaviors and structures that are institutionalized are ordinarily slower to change than those that are not … institutional arrangements are reproduced because individuals often cannot even conceive of appropriate alternatives.”

Why Institutions Matter

During our interviews with journalists in a variety of institutional settings, we were struck by the contrast between the pride they expressed in the organizations they worked for and the frustration many of them felt when talking about the slowness of organizational adaptation. As one reporter put it, “I don’t think there is a lack of will to change at these huge organizations, but the cost and the risk are really high. It could be a financial disaster, yes, but it could also be a cultural disaster in the newsroom. And no one knows what this [new newsroom] is supposed to look like. At every iteration, when you look at something, you only know how it works when it’s broken.”

We’d sum up the general lament this way: The presence of process is a bigger obstacle to change than the absence of money. This conundrum isn’t surprising; as we noted in our definition of institutions, the entire purpose of institutional arrangements is actually to ingrain and rationalize standardized patterns of behavior—in other words, to make change hard.

Occasionally, this frustration with the slowness of institutional change spills into a general organizational nihilism: If institutional arrangements are failing, the thinking goes, and if these failing organizations won’t face reality and change, then blow them up and start from scratch! The problem with anti-institutional thinking of this sort is that, paradoxically, the very qualities that make organizations conservative are the same ones that occasionally make them such powerful producers of the “iron core” of news.

So what kinds of journalism do news institutions make possible, and is there a way to preserve their positive affordances while simultaneously opening them up to evolution and change? Is there any way out of this institutional paradox? Institutions add the ingredients of leverage, symbolic capital, continuity and slack to the news production recipe. More broadly, institutions use these ingredients to produce two different kinds of democratically relevant news—generic information about public events and more specialized information designed to have an “impact” upon other social institutions. Confusion about the purpose of journalism, and the journalistic tendency to deliberately conflate these two types of information production, make it harder to come to grips with how best to preserve leverage, symbolic capital, continuity and slack under changing technological conditions.

News, Bureaucracies and Beats

Modern American journalism can trace its origins to the 1830s, when a growing crop of “penny press” editors sought to standardize and rationalize the production of regular news. Rather than relying on letters from abroad, the news brought to colonial harbors by trans-Atlantic passengers, or stories clipped from other, circulating newspapers, the reporters employed by the penny press sought out specific “beats”—most often the courthouse, the police station and the society party. They did so, in part, because each of these locations could be relied upon to be regular, predictable generators of the kind of news valued by the growing mass of literate news consumers. The story of early journalism, in short, is the story of an emergent institution seeking out more established institutions in order to feed the 19th-century “hamster wheel.” Journalism studies scholar Matthew Carlson generalizes the historical argument, invoking the earlier research of Mark Fishman (1980), who proposes that “bureaucratic affinity” propels bureaucratically organized news organizations to seek out other bureaucracies to provide information.

Sociologists of news often focus on the negative consequences of this bureaucratic affinity. “While journalists do not purposively seek to bolster those with power, the news legitimates ‘institutions of social control by disseminating to the public institutional rationales as facts of the world,’” Carlson continues. Journalists, meanwhile, usually focus on the accountability function embedded in this institutional monitoring; “eye always on bureaucracies,” as reporter David Burnham put it in a 1998 article in Nieman Reports.

But why are news institutions particularly suited to covering large bureaucracies and governmental and corporate organizations? As David Simon argues:

It’s hard enough to hold agencies and political leadership accountable in a culture that no longer has the patience or inclination to engage with the actual dynamics of actual institutions. At this point, we are having trouble as a society recognizing our problems, much less solving any of them. But absent a properly funded professional press—one that covers the civic bureaucracies with constancy and tenacity, we’re going to have even less of a shot going forward.

The new organizations emerging in the digital age, Simon further contends, are ill-suited for this kind of work:

As for the blogosphere, it just isn’t a factor for this kind of reporting. Most of those who argue that new-media journalism is growing, exploding even, in a democratic burst of egalitarian, from-all-points-on-the-compass reportage are simply never talking about beat reporting of a kind that includes qualitative judgment and analysis. There’s more raw information, sure. And more commentary. And there are, for what it’s worth, more fledgling sites to look for that kind of halfway-there stuff. … [But] beat reporting—and the beat structure of a metropolitan daily—is what is dying here.

Simon’s argument is a powerful one, but it is largely anecdotal. Can we define with any more specificity exactly what it is that institutions do? And once we specify, can we figure out a way that their core functions can be preserved, even in a period of transition? Here are four factors that define the value added of a news institution when compared with a random assortment of individual journalists.

Leverage

If journalism is, at its root, designed to provide the public with the information it needs to be self-governing, and if part of that information is the insight that emerges from the aggressive and often hostile monitoring of a variety of social institutions, why would anyone in power ever talk to a journalist? Why would the subjects of monitorial scrutiny not simply communicate with each other and with the public directly, avoiding all dealings with news reporters? In part, for self-interested reasons: government officials and other powerful people know that talking to the press is always an opportunity, however limited, to “get your side of the story out,” even if the results will ultimately be damning. In part, however, officials engage with the press because these officials fear the consequences of non-response.

Journalism institutions, at least in their 20th-century incarnation, had a few qualities that allowed them to increase their power vis-a-vis other structures of public governance. The first was their claim that their authority was directly proportional to their mass audience—the notion of leverage. A large audience, in this case, was the guarantor of power insofar as readers and “public opinion” were perceived as being shaped by journalism on a large scale. Ironically, the roots of this equivalence between audience and power lie not in the penny press era but the party press era that preceded it, where there was a more direct correlation between the size of circulation lists and the strength of a party in a particular area. Nonetheless, the era of “mass” media included the notion that a mass audience was responsive to, and influenced by, the conduct of journalism.

Today, the notion of leverage, at least insofar as it is guaranteed by audience size, is undergoing a shift. While no one denies that today’s journalism institutions remain uniquely powerful in their ability to mobilize public opinion and punish wayward politicians, the fragmenting of the news audience has upended the traditional notion of the audience as a mass. Once again, this is not to deny that traditional news institutions possess large online audiences, as the managers of these websites never tire of pointing out when they compare their number of unique visitors and page views to those of tiny local blogs. What has changed is not the size of the audience, per se, but the way the relationship between institution and audience is understood—between journalism and its image of the audience. Changes in this image of the audience are deeply connected with a second set of shifts: the decline in traditional news institutions’ symbolic capital.

Symbolic Capital

Along with their levels of financial capital, news institutions have witnessed the decline of a second form of capital—reputational capital. Part of the historical authority of news institutions cannot be reduced to such easily quantifiable metrics as audience size, revenue, or even Pulitzer Prizes. In the long-term sweep of history, the 20th century saw news institutions move from being exciting, muckraking and often scandalous conveyors of useful information and advertisements to the sober guardians of democracy itself. This is an exaggeration, of course, but it is not an entirely unfair one. The reasons for this change lie outside the scope of this paper, but they are as much cultural and sociological as they are economic, and the myth of Watergate marked the culmination of a long-term reputational upswing more than it did the emergence of one. Between roughly 1908 and 1968, news institutions became the “Fourth Estate.”

Reputational capital primarily attached itself to journalism as a profession and as a set of institutions, rather than to individual journalists. What this meant was that, at least in part, the levels of symbolic capital possessed by individual reporters were as much a function of where they worked as who they were. Although there are exceptions (I.F. Stone being a particularly prominent example), the symbolic capital that individual journalists possessed in the minds of the public and in the minds of politicians was largely a product of their institutional and professional affiliations.

In short, a second advantage news institutions provided to reporters and to journalism as a whole was a remarkably powerful brand. While it’s hard to sort out the chicken-and-egg problems of the 21st-century news industry (did journalism’s reputational decline lead to its economic difficulties, or did economic difficulties lead to reputational decline?) the fact remains that the trends in this area have been in largely one direction: down. Much like the economics of monetary capital, the economics of journalism’s symbolic capital appears caught in a structural, not cyclical, downturn. In the 21st century, reporters and newsroom managers and executives are going to need to think hard about these institutional shifts.

Continuity

News institutions exist in time as well as in space, and we can helpfully think of continuity as “accrued leverage, distributed over time.” This, perhaps, is the most essential of the four ingredients that make up the institutional stew, although it is often the one most under-theorized. Continuity means being able to decide to cover a certain story, beat or section of society persistently and over the long term, even as the individual reporters come and go. The Philadelphia Inquirer has covered crime in the city of Philadelphia since the paper was founded, and this coverage does not stop when the lead crime reporter retires. In theory, at least, it is the institution that monitors crime in Philadelphia. This is the essential function of those “stable patterns of behavior” mentioned above when we were defining institutions—the idea that the process carries on at a level beyond the individual.

Drawing on an analogy proposed by Len Downie and Michael Schudson in their 2009 report “The Reconstruction of American Journalism,” we might say that institutional continuity provides support for journalism’s watchdog function and also its scarecrow function. Both a watchdog and a scarecrow stand guard. Only the watchdog actually barks, but that difference does not always matter. Though the scarecrow “does nothing,” its very existence, the very fact that the crows know it is out there, “watching,” is often enough to constrain bad crow-like behavior. And the same goes for journalism. The watchdog press, it must be admitted, barks only rarely. But the continuity of that press, the fact that it is “out there,” is often enough to constrain bad behavior on the part of powerful institutions.

Most discussion of how news institutions might be affected by diminishing institutional capacity, whether those institutions entirely go away or simply cover fewer topics, focuses on the watchdog function—the fact that fewer stories will be covered than before and that the watchdog will bark less. We think the real institutional function at risk in this case, however, is the scarecrow function. Both functions are related, of course, and success in actively keeping corporations and politicians honest leads to a stronger sense that journalism is out there, keeping watch. The real dilemma for the news business, however, is how to convince people that it still matters.

Slack

News institutions, or at least those organizations we have traditionally thought of as news institutions, do more than cover a single issue. They do more than manage beat coverage, and they do more than mount long-term, resource-intensive special investigations. They do all three. And they have been able to do so because of their ability to rapidly deploy excess capacity. This institutional slack means that news organizations, traditionally, have been able to adapt to uncertain and rapidly changing world events in short order. Paradoxically, their operational conservatism has given organizations the ability to be quite nimble at the one thing all those conservative processes are designed to facilitate: reporting the news.

Many emerging news institutions, highly focused enterprises living permanently close to the bone, lack this excess capacity. Technically Philly is a website with a single mission—to cover business stories related to Philadelphia’s high-tech businesses. Similarly, the Texas Tribune, the Voice of San Diego, the Smoking Gun—the common characteristic of most news startups is to avoid trying to be all things to all people. Andrew Donohue, editor of the Voice of San Diego, summarized it this way: “[More] than beats, people here specialize in particular narratives within beats. We’re not going to cover something unless we can do it better than anyone, or if no one else is doing it.” There is nothing wrong with being focused, of course. Nor do we think the massive duplication of labor currently at work in the news industry (sending hundreds of reporters to cover the Super Bowl, for instance) is either healthy or sustainable. We simply want to point out that the removal of excess slack from the arsenal of news institutions is a genuinely new development, one whose full implications remain unclear.

Recommendation: Form Partnerships

As institutional capacity declines, news organizations need not sacrifice the depth of their offerings given the resources available elsewhere in the ecosystem. In other words: make journalistic partnerships a more regular part of the institutional repertoire.

In our opinion, there is a stark difference between institutions that see partnerships as a genuine part of their DNA and those that do not. A genuine embrace of partnerships does not ultimately hinge on the benefit of that partnership to the institution; rather, it hangs on the ability of that partnership to bring value to the ecosystem as a whole.

News institutions, to conclude, have provided added public value to the political and journalistic spheres by leveraging the work of many people, by accumulating symbolic capital, by laying down stable patterns of behavior that can guarantee continuity over time, by being able to focus on many things at once, and by generally fulfilling the scarecrow function of the press as much as they do the watchdog function. Many of these institutions are under significant threat because of the economic, social, political and cultural changes in the larger media ecosystem. And it is in this moment of crisis that the liabilities of institutions—liabilities that paradoxically arise from the very same sources of strength that served them so well in moments of stability—rear their heads.

The Dilemma of Institutional Change

Over and over again the working journalists we interviewed, across a variety of publications and media types, lamented the inherent difficulty in shifting the directions of their legacy media organizations to meet the challenges of the digital age. Zach Seward, the former editor of outreach and social media for the Wall Street Journal and now a senior editor at the Atlantic Media business publication Quartz, told us that the very success of newspapers at doing what they do makes changing them difficult:

The notion of adjusting course for an organization that is either still obligated to put out a daily print product or is otherwise very good and well oiled at a particular process feels as though the best an organization in that situation can do is make slight adjustments, if they are obligated to a production process that already exists. It’s truly no small miracle that daily news organizations are able to produce what they do already, so 100 percent of effort is expended on existing processes.

This “presence of process,” as we’ve called it, doesn’t manifest itself only when big decisions are being made. Indeed, the nature of institutional processes is that they are enacted on a daily, even hourly, basis. Process shapes what is and isn’t possible, not just in conversations between reporters, editors and publishers, but in the very technological infrastructures that make the production of journalism possible. Tools put in place to manage process also embed the assumptions that went into their design.

Take newsrooms’ content management systems (CMS). A CMS has a built-in idea of workflow—when and how content gets created, edited, checked and published. As a result, a CMS doesn’t just help an organization manage its content in a particular way; it also deflects or even prevents it from managing it in ways that aren’t built into the CMS.
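The point can be made concrete with a small sketch. The states and transitions below are hypothetical, not drawn from any real CMS; they simply illustrate how a workflow encoded in software makes unanticipated paths not merely discouraged but impossible:

```python
# Hypothetical sketch of a CMS's built-in workflow. The state names and
# transitions are illustrative only, not taken from any actual product.

ALLOWED_TRANSITIONS = {
    "draft":     {"submitted"},
    "submitted": {"edited", "draft"},  # an editor can send a piece back
    "edited":    {"published"},
    "published": set(),                # no path out: revision after publication
                                       # was never anticipated by the designers
}

def advance(current_state: str, new_state: str) -> str:
    """Move a story to a new state, but only along the encoded paths."""
    if new_state not in ALLOWED_TRANSITIONS[current_state]:
        # Any workflow the designers did not build in simply cannot happen.
        raise ValueError(f"cannot go from {current_state!r} to {new_state!r}")
    return new_state

# The print-era path works smoothly:
state = "draft"
for step in ("submitted", "edited", "published"):
    state = advance(state, step)

# But a continuously updated live story (published back to edited)
# requires changing the software, not just the newsroom's habits.
```

In this toy model, shifting to a web-native rhythm of constant updates means rewriting `ALLOWED_TRANSITIONS`: a change to code, owned by a technical department, rather than a change to habit, owned by the newsroom. That is the sense in which process lives inside the tools.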

The point is general, of course; all process exists to forestall alternatives, but CMSes are often at an extreme, because the requirements and assumptions are encoded in software and are difficult to argue with, or to override. As Anjali Mullany, a former online editor with the New York Daily News and now a social media editor with Fast Company, put it:

The CMS and the project management systems are the crux of a lot of these [process] problems. Maybe 90 percent. Sometimes workflow and CMS aren’t even compatible, or the CMS is inconsistent with the workflow. Or the workflow destroys the CMS. Look at any major organization, where it’s multiplatform. It’s not uncommon to see the same version [of a story] a few times. Or several reporters did the same story because they weren’t communicating. The great, flexible CMS that will allow you to change your process over time does not exist. You should do this: try to find the one reporter in NYC who likes their CMS. This is a huge problem. If your CMS restricts you, it’s going to restrict everything about the newsroom. The technology you’re using is going to change what you produce.

The dilemma here is clear. We already noted that institutions can be defined as stable patterns and processes that allow collections of people and technology to accomplish more than they would as a mere aggregation of individuals. These institutional processes provide news organizations with many advantages vis-a-vis other political, social and corporate institutions they monitor. But these stable patterns, particularly when geared to particular production cycles that are themselves wrapped around particular technologies, can constrain news organizations as much as they empower them to report the news.

Matt Waite notes that the problem with large, hierarchical organizations is not that they discourage creative thought but that they make acting on it difficult—a subtle and important distinction: “When working in a newsroom, [process is] a huge problem. But often in rigid hierarchies, working within constraints, we could have the greatest creativity. The problem was just getting someone to say ‘yes.’ Getting it to happen.” He also noted that organizations with highly refined processes can make trying novel approaches politically difficult: “Newsrooms are still structured like the military. That makes it hard to do anything without stepping on someone’s toes.”

We can also catch a glimpse of the difficulty of institutional change by looking at how startup news organizations, though largely made up of veteran journalists and editors, navigate changes in process. Andrew Donohue recalls that when the Voice of San Diego began, “we were just doing what we did at newspapers, but online. Report through the day, wrap up at 7, then put it up on the site. We were not worried about constant updates.”

We heard a similar story from a senior editor at the New York Times: “We were told effectively that the cuts meant doing more with less, one less person, no letup in the coverage. At no point were we ever asked by someone who had the technical capabilities or authority to actually change the tools or the ways we might use them: ‘Let’s look at what you have to do in a day and see how we can change processes.’ This is what was so maddening.”

At a smaller, nimbler organization like the Voice of San Diego, however, it was easier to shift this legacy process toward one that made more sense in the current technological era. In Donohue’s words, the site had “a structured routine that slowly unwound as we got more people, and as social media came upon us. Now, our routine [is] very different in that we both get our stories the traditional way, through sources and observing, but we have to decide how to present the story—a blog post, a daily, a three-month thing, a crowdsourced form. So that’s the biggest question these days.”

The “process gap” is often most visible in work patterns tied to content management systems, because those systems exhibit a double conservatism. First, deploying a CMS represents such an enormous effort that the design of the technology typically reflects managerial choices about how employee workflow should work. Second, as in Donohue’s account of process at Voice of San Diego, CMSes are typically updated incrementally; products built around print-centric daily rhythms and then adapted for the internet often feel like web- and mobile-centric features are an afterthought, because they often are an afterthought.

It is possible to get a sense of how ill-fitting many existing production processes are by seeing what “digital native” CMSes and their attendant processes look like. To take one recent example, Vox, the publisher of several niche media sites, including SB Nation and the Verge, designed its own CMS from scratch. As Trei Brundrett, Vox’s vice president of product and technology, put it in a public interview, “We map our development plan around the tools that our editorial and advertising teams tell us they need.” This seems an obvious way to work, but it actually involves essential and rare skills: an editorial staff that can correctly characterize its needs; management that encourages editorial and technical collaboration; editorial and technical departments able to talk to one another; and a technical staff talented enough to create a working product that is simple and stable enough to be usable. The point here is not that every news organization should build its own CMS—that would be wasteful even if it were possible—but rather to illustrate how far print-centric tools are from fitting the new realities of news production.

The units of journalism are often tied to the logic of daily updates, a logic that does not always exist under conditions of digitization. In response to changing user expectations of time and timeliness, organizations need to rethink everything about how stories are organized and accumulate in the queue of news work. The newsroom assembly line is almost entirely anachronistic as a way of producing content for digital use, and it must be rethought.

Recommendation: Manage the Internet’s Technological Demands

A failure to rethink workflow under conditions of digitization can lead news organizations to suffer all the drawbacks of digital processes while achieving none of the benefits. Some commentators have referred to this worst-case scenario as the “hamster wheel”: increasing demands on journalists’ time and a loss of professional autonomy.

The hamster wheel is real, but many who discuss it mistake its cause. We are not technological determinists who blame “the internet” for the hamster wheel effect. Rather, we blame news organizations themselves for adhering slavishly to old processes under new technological conditions. In other words, the internet’s technological demands must be managed if the hamster wheel is to be avoided. Examples of how to manage the internet might include a focus on intelligent linking rather than constant aggregation and rewrites of already existing news, rotating “link whoring” duty, as Gawker does, and many other process changes.

Recommendation: Be Able to Override Your CMS

Content management systems often embody ossified newsroom processes. To the degree this is the case, the ability to subvert a content management system can be a powerful strike against the casual tyranny of impractical process. Journalists should prepare, individually or in teams, to be able to override every step of their CMS. With luck and persistence, these hacks and workarounds can lay the groundwork for a more rational process in the future.

There is an analogy here with the design of medical information systems. As hospital records have become digitized, there is tension, as always, between security and access. A system that is secure enough to prevent all misuse would end up preventing at least some good but unpredictable uses as well. However, a system that allowed all potential uses would do too little to secure its contents.

The usual compromise is a “break the glass” function (analogous to breaking the glass covering of an alarm bell). A doctor who requests files that the system, for whatever reasons, says are not accessible to her, can override the security, saying, in essence, “My need for these files trumps the system’s security model.” If she does this, she then gets access to the files.

However, to do so, she must be logged in so the system knows her identity, she must provide a rationale for why she is overriding the system, and she is told that her override will be audited within 24 hours. If her reasons for doing so are spurious, she will be disciplined.

What we are recommending is the journalistic equivalent of “break the glass” for overriding the assumptions a CMS makes about process and control. If a journalist wants to bypass or override a particular step, for reasons that seem justified and urgent, she should be able to do so, provided she is sufficiently senior to have internalized the local version of news judgment; that she is identified to the system and willing to provide the rationale for the override; and that she is willing to vouch for this rationale when reviewed by management.
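The override mechanism described above can be sketched in a few lines of code. This is a minimal illustration, not drawn from any real CMS; the function name, user fields and audit structure are all hypothetical, and a production system would use durable storage and real authentication rather than a list and a dictionary.

```python
import datetime

# Hypothetical sketch of a "break the glass" override for a newsroom CMS.
# All names and fields here are illustrative assumptions, not a real API.

AUDIT_LOG = []  # in practice, durable storage reviewed by management


def break_the_glass(user, step, rationale):
    """Let a sufficiently senior, identified user bypass a CMS step,
    recording who did it, why, and when, for review within 24 hours."""
    if not user.get("id"):
        raise PermissionError("override requires an identified user")
    if not user.get("senior"):
        raise PermissionError("override restricted to senior staff")
    if not rationale or not rationale.strip():
        raise ValueError("a rationale for the override is required")
    AUDIT_LOG.append({
        "user": user["id"],
        "step": step,
        "rationale": rationale,
        "time": datetime.datetime.utcnow().isoformat(),
        "review_due_hours": 24,  # management audits the override later
    })
    return True  # the CMS step is now bypassed


# Usage: an editor skips the print-layout step to publish breaking news.
editor = {"id": "jdoe", "senior": True}
break_the_glass(editor, "print_layout", "breaking story, web-first")
```

The point of the sketch is the shape of the compromise: the override always succeeds for qualified staff, but never silently—identity, rationale and timing are recorded so that spurious use can be disciplined after the fact rather than prevented in advance.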

This opens the door to the possibility of errors of commission, of course, errors that come from journalists doing something they should not have done, but far too many CMSes force errors of omission, which is to say errors that prevent journalists from taking advantage of an obvious opportunity. By allowing journalists to override their own processes as needed and with review, news organizations can keep their desire for predictable workflow from crushing the opportunity for novelty and initiative on the part of their staff.

Recommendation: Embrace Transparency

As a counterpart to the power of hacking your process and working around your CMS, news institutions should also make the new processes they are using to generate quality journalism transparent and systematizable by other organizations. In other words, when you invent a process that works, you should “show your work” so the same process can be used by other news outlets. ProPublica has been an industry leader in this regard. While some news organizations might fear that this kind of transparency will “aid the competition,” the fact remains that, for a century, news processes were an open book. We see no reason that organizations cannot continue to make money and get scoops in this new era, even when they show their work.

Why Engage in Journalistic Work? Motivation and Institutional Impact

The fact that an increasing number of individuals contribute to the information ecosystem for free, or do so for reasons that do not strictly boil down to making money, has caused almost as much consternation in the media industry as has the question of pay walls. Early optimism about the ability of “citizen journalists” to transform the news business was quickly overtaken by both professional defensiveness and the economic crisis that enveloped the newspaper business (a crisis that had nothing to do with amateur production of content, but which was often lumped in with arguments about citizen reporting).

We will discuss the role amateurs and interested citizens play in the larger news ecosystem in the next section. For now, it is enough to argue that we think both sides of what is now a very sterile debate are missing the point. The role of everyday people in news production is an institutional question as much as it is an economic one. In general terms, the fact that at least some news producers contribute their labor for free means that a world of limited information has now become a world of overwhelming, often unprocessed, information. This poses a general challenge for news institutions: how to come up with new institutional processes and procedures to go from an information-scarce environment to one that is information-rich.

In more specific terms, one of the major dilemmas of amateur production is how to organize, rationalize, and systematize that production. It is not a coincidence that Amanda Michel, the former head of the Huffington Post Off the Bus project, began her career as an organizer rather than a journalist. As an organizer, Michel was well trained in understanding what amateurs and volunteers can do, what they can’t do, and how to get them to work together for the benefit of a larger institution. How to manage amateur production can thus be tied to larger questions of how new entrants to the journalistic ecosystem might turn themselves from ad hoc networks to institutions. We now turn to that larger question.

Information and Impact (or, What Is Journalism For?)

Institutions provide certain key advantages when it comes to reporting news in the public interest: the kind of leverage, symbolic power, continuity and slack necessary to go toe-to-toe with other institutions: politicians, governmental agencies, businesses, schools, nonprofits, religious organizations. Yet the very same “systems of established and prevalent social rules” that help give institutions their heft also, in their inertia, serve to block necessary and needed change.

The solution to this paradox is not to abandon institutions. Nor is it to blindly stick with the institutions that have traditionally provided the best journalism in the past. Institutions are needed to do certain kinds of important things—but we need to reinvent the existing ones and to invent new ones. We need to focus on the way formerly ad hoc social arrangements become institutionalized, the barriers to such institutionalization, and the lessons and strategies for reporting the news that can be gleaned by watching this institutionalization take place.

There are two dilemmas of institutionalization at the heart of 21st-century journalism. The first, obvious and widely discussed since the 1990s, is the requirement for traditional news organizations to adapt to the internet, and the attendant difficulties they are having in doing so. The second, however, is less widely discussed: New forms of news production, from Andy Carvin’s curated Twitter feeds to MapLight’s database journalism to the stabilization of nonprofit web publishers like the Voice of San Diego or the Texas Tribune, have to become institutionalized, because without the virtues of institutions, albeit ones fitted to digital production, these new efforts will not be able to survive or to become persistent or powerful enough to discipline other institutional actors.

An example of a new, loosely structured digital journalism organization achieving some level of institutional stability can be found in the paradigmatic case of Talking Points Memo. We focus on TPM here, not because it has not dealt with its share of struggles and institutional challenges, but precisely because it has. Understanding the dynamic interplay between organizational challenge and institutional evolution is key to understanding the ways that the news media ecosystem is changing. Launched in 2000 by Ph.D. student and journalist Josh Marshall, the site was largely indistinguishable from the numerous “single-person” political blogs that were launched during the early days of the blogging revolution.

In 2002, the architecture of the site was fairly typical of the blogging genre at this early stage, with a “personalizing” photo of Marshall himself and a two-column setup (links in a narrow column on the left, major content in the middle of the page). Four years later, in 2006, the look and feel of the site illustrated the emergence of a very different organization. The picture of Marshall remained, but a far more structured page greeted readers.

Most importantly, by 2006 TPM was employing journalists, a process that began in 2005 when Marshall solicited money from readers to hire two full-time staffers; he raised $100,000 directly from the public. The right-hand column also linked to TPMMuckraker, an affiliated project that aims to do more original reporting and “muckraking.”

By 2007, the architectural transition of Talking Points Memo was complete. The web page had come to resemble a full-time journalism operation, with boxes, links and different size fonts indicating different branches of the project and various editorial judgments about important news. The growth in staff continues apace; in 2010 it had 16 employees, and by 2012 it had 28. The site also received a significant financial investment in 2009 from the venture capital fund Andreessen Horowitz.

By looking at the arc of Talking Points Memo over time, we see the emergence of a non-institutional website in 2000, followed by an increasingly complex level of organizational structuring, staff growth and symbolic capital accumulation (the site won a Polk Award in 2008 for its coverage of the politically motivated firings of U.S. attorneys). While TPM is, by now, an “old” project in digital terms, it is useful for precisely that reason. Only by looking at the history of digital organizations on the web can we see how the story of journalism in the digital age is more than simply one of decline and birth. There is institutional stabilization as well.

Just as important, the story of Talking Points Memo represents the stabilization of a hybrid series of old and new journalistic practices, not simply the adoption of traditional reporting methods for the digital age. TPM was a pioneer in what is now known as iterative journalism, which it defines as the “using of tips, reporting, and explanatory writing from readers alongside original reporting to piece together wide-ranging stories.” Although less is known about how TPM incorporates these practices into the 2012 iteration of its organizational structure, there is little doubt that the solidification of Talking Points Memo’s institutional capacity represents the mainstreaming of a certain set of organizational practices.

A more proximate example unfolded over the summer of 2012, when Homicide Watch D.C. was threatened with shutdown. Homicide Watch, as described in Section 1, represents a fusion of traditional court reporting and novel technical infrastructure; it operates on a tiny budget; and the founders, Laura and Chris Amico, offer licenses for their platform to other news organizations. It is an ideal case for creating high value at low cost by rethinking process. Nevertheless, by the summer of 2012, after two years of operation, Homicide Watch was threatened with shutdown, for two reasons. The first was that, despite the Amicos’ offering the platform for licensing, few news organizations bit.

Homicide Watch is so different from the “story-driven/should we report this?” model of the traditional crime desk that no existing organization could use the platform without altering its internal assumptions and processes as a side effect. The process gap made it far harder than the Amicos imagined to license their platform.

Despite this persistent difficulty, they kept the site going, running on a shoestring. Then came the second problem. Laura Amico, the reporter in the duo, got a Nieman Fellowship at Harvard. Faced with even the temporary departure of the founder, Homicide Watch had none of the advantages of large institutions—a deep bench of talent, employees with overlapping responsibilities who can pick up the slack, and so on.

Only a last-minute Kickstarter campaign, which enabled the hiring of D.C.-based staff for Amico to work with remotely, saved the site. This delays but does not solve the problem—small organizations like Homicide Watch are marvels of low-budget leverage, but they are also perennially threatened. To survive and spread their model, they will need to acquire more secure sources of funding, a larger and more varied staff, and more complex processes for managing that staff. They need, in other words, to become an institution.

Recommendation: Create “Startup Guides”

Starting a new news organization isn’t as hard as stabilizing these startups over the medium to long term. Because of this, successful startups (such as Talking Points Memo, the Texas Tribune, West Seattle Blog, Baristanet) should create publicly accessible “startup guides” that can be used by emerging news organizations.

We also need to keep in mind that, because these organizations are successful, their founders might have little time or interest in devoting resources to explaining their success. They, after all, have journalism to produce! For this reason, these organizations and others like them should receive foundation money that will allow them to engage in this “meta-reflection.”

Understanding how new journalistic organizations stabilize themselves, and how in so doing they make a particular set of institutional behaviors seem like common sense, is a missing link in our attempts to understand the emerging news ecosystem. It is a financing gray area as well. Most foundation dollars are directed toward projects that can demonstrate a tangible “impact,” which makes them less likely to help organizations engage in the boring, out-of-sight practices of institutional stabilization (things like setting up payroll systems, purchasing office space and providing employee health care, as well as training new employees and hardening institutional norms). Now that large national foundations like the Ford Foundation are increasingly investing in traditional media outlets such as the Washington Post and Los Angeles Times, investments in smaller, not-quite-new-but-not-yet-legacy outlets seem even less likely. The Washington Post received $500,000 from the Ford Foundation; it is not hard to imagine what Homicide Watch might be able to accomplish with a fraction of that money.

Recommendation: Rethink How to Deploy Funding

“Public” or noncommercial resources (including government and foundation money) should be used primarily to help organizations institutionalize. Paradoxically, this is what these foundations and the public sector appear the least comfortable doing, focused as they are on demonstrating impact. Given the importance and fragility of new players, there must be a rethinking of this funding strategy in the foundation world.

When all is said and done, how are we to understand if news institutions—whether old, new, or somewhere in between—are doing what they are supposed to do? How do we measure the success of these organizations? When success is primarily defined as “business success,” the answer is simple—although by that metric, the news industry has been in a tailspin for at least half a decade. Once we no longer define success as simply “making money” but rather as “making an impact on the world,” however, our calculations change. There are many more ways of defining impact than there used to be, although the complexity of the question has correspondingly increased. To understand if institutions are working, we need to understand their purpose, and we need to measure the impact they are having on the institutions they monitor.

The question of “impact” has only recently become a topic of conversation within news organizations and in the “future of news” conversation space. ProPublica has long been a leader in thinking about the actual impact of journalism, writing in its “about” page that “in the best traditions of American journalism in the public service, we seek to stimulate positive change. We uncover unsavory practices in order to stimulate reform.” ProPublica adds that it does this “in an entirely non-partisan and non-ideological manner, adhering to the strictest standards of journalistic impartiality.” It concludes by noting that “each story we publish is distributed in a manner designed to maximize its impact.”

This would appear to be a noncontroversial mission. Surprisingly, however, it is one that is not publicly echoed by more traditional media organizations, although a desire for “impact” does undergird journalistic belief structures more generally. Often, news institutions will argue that they are there simply to “present the facts” and that questions of what those facts will do lie outside their purview. Journalistic institutions usually view the news consumer as an empty receptacle for public information who, when well-filled with the proper knowledge, will act in a variety of democratic ways. The impact of the news, in other words, comes not from the news producers but the news consumers, from the democratic citizens themselves.

It should be clear by now that we lend little credence to this empty-receptacle account of, to quote NYU journalism professor Jay Rosen, “what journalism is for.” Instead, we believe that it is news institutions themselves that often do the most to advance positive democratic outcomes. Given this, it has become essential to understand exactly how news organizations make an impact, and for news companies to admit that they are in the impact business.

We are heartened by the announcement, in the summer of 2012, that the Knight-Mozilla Foundation will be placing one of its fellows at the New York Times specifically for the purpose of designing ways for news organizations to measure impact. “What we do not have are ways of measuring how a piece of journalism changes the way people think or act. We don’t have a metric for impact,” Aron Pilhofer, the newspaper’s editor of interactive news, wrote on his blog.

This is not a new problem. The metrics newsrooms have traditionally used tended to be fairly imprecise: Did a law change? Did the bad guy go to jail? Were dangers revealed? Were lives saved? Or least significant of all, did it win an award? But the math changes in the digital environment. We are awash in metrics, and we have the ability to engage with readers at scale in ways that would have been impossible (or impossibly expensive) in an analog world. The problem now is figuring out which data to pay attention to and which to ignore. It is about setting up frameworks for testing, analysis and interpretation that are both scalable and replicable. It’s about finding that clear signal among the white noise that tells us whether our journalism is resonating or not, whether it is having the impact we believe it should. Helping us clear away the noise is the goal of our proposal to host a Knight-Mozilla fellow.

We hope that this step by the New York Times and the Knight-Mozilla Foundation will open the door to other news organizations thinking hard about what they do, and why it matters. Only if they start to think of themselves as organizations that “do things” in the world can we ever hope to understand the value of news institutions, and the ways we can replace the institutional value currently being lost in the digital tsunami of the early 21st century.

Recommendation: Assess and Value Impact

Make assessing impact part of organizational culture, including in job assignments and promotions. Consider partnerships with organizations that can provide information or insight into areas of desired impact.

What New News Institutions Will Look Like

We’ve now discussed why institutions are essential to secure the proper functioning of a healthy journalistic ecosystem. We’ve also discussed an institutional paradox: The traits that make organizations successful during times of relative social stability can be the very traits that leave them unable to adapt to a rapidly changing organizational reality. Given all this, what would healthy news institutions look like in the 21st century? What kind of institutional arrangements should newsroom editors, corporate CEOs, rank-and-file journalists and future of news commentators demand?

We should note, right off the bat, that the news institutions of the future will be smaller than they are today; drawing from our earlier arguments, we acknowledge that staffing reductions, lowered budgets and a need to “do more with less” have become the “new normal” for journalistic organizations. We also think that news organizations will probably get new forms of funding from a number of sources, including some form of digital subscription, website advertising, social media-driven sales strategies (such as those adopted by BuzzFeed), foundation grants and governmental subsidies. It is not our intent to recommend any of these revenue sources over others, although we do note that certain forms of revenue generation make the institutional strategies we envision below easier, while other choices make the transition harder.

We want to argue that news institutions of the future, apart from simply being smaller and revenue agnostic, should have three defining characteristics. First, they will have a hackable workflow. Second, they will embrace a form of what we call “networked institutionalism,” under which many of the largest, national journalism organizations pursue local accountability journalism in partnership with local news outlets. Finally, news institutions will have to dramatically rethink what counts as “valid journalistic evidence,” find new ways to evaluate this new evidence, and build these collection and evaluation processes into their hackable workflows.

The Hackable Workflow

Currently, news production processes are designed around two imperatives. The first is that they rationally manage the generation, transmission, editing and production of content, and do so for as many simultaneous platforms as possible. The second imperative, related to the first and largely a legacy of the print/broadcast production process, is that this workflow management is designed to produce a single finished product that will be “consumed” once and then disposed of. Thinking about workflow this way (and, more importantly, managing the production and dissemination of content this way) makes sense only so long as this “create once/consume once” model holds.

Online, journalistic content can be produced, added to, altered and reused forever. To take advantage of this change, workflow will have to be altered to support these new technological and cultural affordances. Creating a workflow that reflects the more flexible production of digital content will have the secondary consequence of making rigid newsroom routines more “hackable.”

The organizational breakthrough of the hacker-journalist lies not in being up to speed on the latest social media tools or even in being able to manage a thousand-column Google Fusion Table. Rather, the key insight of journalists raised on the rhythms of digital production and programming languages is the understanding that “content” is not used once and then discarded; it is endlessly reusable and should be designed for perpetual iteration. In our interviews with working journalists, we were struck by the degree to which all news organizations remain trapped in a basic newsroom workflow that sees the ultimate goal of journalistic production as a singular, finished product. Rebuilt news institutions will design their workflow around a new, basic fact: News is never a finished product, and there is never a daily paper or evening newscast that sums up the work of the entire day.

This implies that news content, and the production of that content, will take iteration as its starting point. News products will have to be made as reusable as possible: on other platforms, on other devices, in new news stories, and even by other news organizations.

It also has another consequence: Newsroom CMSes will have to be designed to allow them to be broken. The obvious corollaries are that the act of choosing (or, in rare cases, designing) CMSes will have to include questions of who can override the expectations embedded in the CMS, and how, and that the processes put in place around the CMSes will have to emphasize the ability of at least some employees to exit the expected process in order to make novel decisions in response to novel circumstances.

In other words, they need to be flexible and adaptable to particular organizational needs. The focus of news production management should not be the creation of a final product within a one-size-fits-all workflow; rather, the focus should be upon the creation of endlessly iterable content through a highly hackable CMS.

The Networked Institution

Much ink has been spilled over the question of organizational partnerships in the news business, and many arguments have been advanced as to how institutions need to be more open to collaboration with other members of the digital news ecosystem. To date, however, the verdict on existing collaborative projects is mixed. A number of the New York Times’ most highly touted collaborations (with the Chicago News Cooperative, the Bay Citizen, and the CUNY-sponsored Local, for instance) have come to a rather inglorious end; at the same time, many Times partner organizations have noted how working with a powerful organization has the potential to distort their own organizational priorities. The notion of institutional collaboration, while intellectually powerful, is in need of some rethinking.

We want to argue that the news organization of the future will probably not be an entirely open institution whose primary purpose is collaboration, nor should it pursue collaboration only on a project-by-project basis. Instead, we’d recommend a strategy much like that pursued by ProPublica in its “Free the Files” project.

In Free the Files, ProPublica sought to crowdsource the collection of FCC political advertising buys. And because the media markets in question are inherently local, ProPublica essentially engaged in an act of local accountability journalism, even as it coordinated this journalism on a national scale. The final step for a project like Free the Files would be to collaborate with local news organizations to publish the data in relevant, journalistically interesting ways. This is neither permanent collaboration, nor is it based around a onetime event. Instead, it is using smart, targeted networked institutionalism to fill a gap opening up in local accountability reporting. Not surprisingly, this new collaboration is also based around the existence of new forms of journalistic evidence, specifically large data sets.

New Forms of Evidence

In Section 1, we discussed new skills that will be required of the post-industrial journalist. In many respects, these skills can be summarized as an ability to recognize, evaluate and display new forms of journalistic evidence. What do social media conversations, large data sets, and on-the-scene, first-person media production all have in common? In essence, they present the 21st-century journalist with a plethora of new sources that can be integrated into the journalistic production process. As we already argued, these changes in the larger media ecosystem present the individual journalist with new challenges and a need to master new skills. Every individual working in the news business thus needs to take this requirement seriously. At the same time, the institutions in which these journalists are embedded need to create organizational structures and newsroom workflow patterns that support individual journalists in this regard. In other words, we cannot continually require reporters to master new skills and evaluative procedures without simultaneously providing them with a workflow and an organizational structure that shows them that such skill mastery is valued and rewarded. Such a workflow will need to be both hackable and networked in smart, labor-enhancing ways.

Conclusion: Journalism, Institutions and Democracy

In a 1995 essay, the late communications theorist James Carey writes eloquently about what he calls the Fourth Estate view of journalism, a view of the relationship between the media and democracy that did not emerge fully until the 1960s and the Watergate era:

In this view, journalists would serve as agents of the public in checking an inherently abusive government. To empower it to fulfill such a role, the press had to possess special rights to gather news. Thus, under a fourth estate model a free press essentially was equated with a powerful press possessing special privileges of newsgathering.

Under the Fourth Estate view, according to Carey, the press increasingly began to see itself as the public’s representative within the political arena. For this notion of representation to resonate, however, the public not only had to see the press as its authentic political stand-in, but also had to believe that this representative press was capable of accurately understanding and portraying the basic empirical reality of the world. It is fair to say that, if surveys of trust in journalism have any validity at all, neither of these conditions holds true in 2012.

What Carey did not consider—what almost no one considered in the world of 20 to 30 percent newspaper profit margins that still prevailed less than a decade ago—was that the press might also become incapable of fulfilling its end of the newsgathering bargain. From the 1960s on, most media criticism consisted of the argument that journalism was capable of far more powerful, in-depth, aggressive newsgathering than it decided to undertake. As Downie and Schudson argue in their analysis of “accountability journalism,” and as the 2011 FCC report on community information ecosystems reiterates, the problem with news today is as much one of incapacity as it is of purposeful neglect. We have also analyzed the connection among institutional capacity, the problem of time, and beat reporting in our discussion of David Simon’s arguments: In short, much of journalism’s value added lies in the operation of its daily routines; this monitorial beat system is best facilitated by healthy institutions; and institutional decline is leading to the evisceration of this unique journalistic resource.

At this point, a brief discussion of the economics of the news business is unavoidable, because it is at this moment that the future-of-news consensus breaks down. According to at least two camps in this debate, better market mechanisms will lead to revived institutional health, although the two camps define “better market” in directly opposed ways. A third perspective despairs that a market-based solution to the problem of institutional news industry decline can be found.

The first cluster of thought, represented by future-of-news thinkers like Jeff Jarvis, believes that the digital news ecosystem itself represents a more transparent, accurate marketplace than the monopoly news market of the previous regime. The contention here is that the funding for public interest journalism will emerge from a combination of transparency, increased public sharing and improvements in the ability of the advertising industry to micro-target consumers. Pointing to the monopoly status enjoyed by the most powerful news institutions for nearly a quarter-century, these thinkers see the current moment of informational abundance, the ability to tailor content to consumers, and “frictionless sharing” as remarkable steps forward from an earlier, less free model of media production.

David Simon, in comments on the blog post discussed above, nicely articulates a second understanding of what a “better” market means, one apparently shared by an increasing number of news industry executives. “I believe that local news can be sustained through an online revenue stream,” Simon argues. “But it requires that institutional journalism value and protect its own copyright and act as an industry to protect that copyright. And further, it requires a real reinvestment in that product.” To this list Simon adds the imposition of pay walls, which have, he contends, already demonstrated their success at the New York Times. In short, Simon and those like him argue that unified action to crack down on aggregators and charge for news will address the causes of newsroom decline via industry-based solutions. To maintain news organizations’ position as the dominant provider of news, speed bumps should be installed on the internet.

A third perspective despairs that either of these market-based solutions can be easily conjured up. Thinkers and writers in this camp point out just how unusual the confluence between wealthy capitalist institutions and the public-minded journalism they produced actually was. They argue that digital market dynamics actually punish institutional players that seek to create broad-based, monitorial media content. Unlike thinkers in the second camp, however, they do not believe that the current dynamics of the digital news system can be easily undone, nor do they think the dynamics necessarily should be undone even if such an option were possible. Some thinkers within this perspective move from here to an argument that the public goods produced by news institutions (particularly beat reporting) can be funded only via non-market forms of subsidy, whether philanthropic or proceeding more directly from the state.

The three authors of this essay would place themselves in this third category, a standpoint that also informs our transition from institutions, in this section, to the news ecosystem that immediately follows in Section 3. We must move away, in other words, from pinning our democratic hopes entirely on the Fourth Estate conception of the press. Public accountability must come, in part, from the networked news ecosystem itself. Let us be clear: This is not to argue that these news networks exist in some sort of institution-free vacuum. Indeed, journalism institutions turn out to be some of the most important nodes within this new digital environment. Nevertheless, they must coexist in new ways, alongside and in concert with more groups and institutions than ever before—not simply for economic reasons but also for democratic ones. They must lean on these new groups and networks in new ways. We are echoing here our opening argument that the journalism industry is dead but that journalism exists in many places.

In the essay quoted earlier, James Carey contends that the “watchdog notion of the press, a press independent of all institutions, a press that represents the public, a press that unmasks interest and privilege, a press that shines the hot glare of publicity on all corners of the republic, a press that searches out expert knowledge among the welter of opinion, a press that seeks to inform the private citizen, these are ideals and roles that have served us well through some dark times.” But, he continues, “as the century progresses, the weaknesses of modern journalism have become increasingly apparent and debilitating.”

Carey’s thoughts on the benefits and weaknesses of the Fourth Estate are as true now as they have ever been. The crisis, however, is even more acute than it was when he wrote those words in 1995. The communicative universe, moreover, has changed radically. If the democratic accountability fostered by the institutional press is to survive in a post-Fourth Estate world, democratic accountability must itself become a networked property.

Section 3: Ecosystem

The only reason to talk about something as abstract as a news ecosystem is as a way of understanding what’s changed. The most significant recent change, of course, is the spread of the internet, connecting our computers and phones in a grid that is global, social, ubiquitous and cheap. As new capabilities go, the ability for any connected citizen to make, copy, alter, share and discuss digital content is a lulu, upending many existing assumptions about news and about media in general.

The news business in the 20th century was a fairly linear process, where reporters and editors would gather facts and observations and turn them into stories, which were then committed to ink on paper or waves in the air, and finally consumed, at the far end of those various modes of transport, by the audience.

A pipeline is the simplest metaphor for that process, whether distribution of news was organized around the printing press or the broadcast tower. Part of the conceptual simplicity of traditional media came from the clarity provided by the near-total division of roles between professionals and amateurs. Reporters and editors (and producers and engineers) worked “upstream,” which is to say, as the source of the news. They created and refined the product, decided when it was ready for consumption, and sent it out when it was.

Meanwhile, the audience was “downstream.” We were the recipients of this product, seeing it only in its final, packaged form. We could consume it, of course (our principal job), and we could talk about it around the dinner table or the water cooler, but little more. News was something we got, not something we used. If we wanted to put our own observations out in public, we needed permission from the pros, who had to be persuaded to print our letters to the editor, or to give us a few moments of airtime on a call-in show.

That pipeline model is still central to the self-conception of many institutions in the news business, but the gap between that model and the real world has grown large and is growing larger, because the formerly separate worlds of the professionals and the amateurs are intersecting more dramatically, and more unpredictably, by the day.

The main effect of digital media is that there is no main effect. The changes wrought by the internet and mobile phones, and the applications built on top of them, are so various and pervasive as to defeat any attempt to understand the current transition as a single force or factor. To understand this as a change to the ecosystem, it helps to have a sense of where the changes are showing up, and how they interact.

Here are a few surprises in our little corner of the 21st century:

In 2002, after Senate Minority Leader Trent Lott praised Strom Thurmond’s segregationist 1948 campaign, one of the people who did Lott in was Ed Sebesta, a historian who had been tracking racist statements made by American politicians to segregationist groups. Shortly after Lott said his praise had been an uncharacteristic slip, Sebesta contacted Josh Marshall, who ran the blog Talking Points Memo, to share similar (and similarly racist) comments made by Lott dating back to the 1980s. These comments undermined Lott’s ability to characterize his comments as a slip and led to his losing his Republican leadership position. Sebesta had built the database of racist speech on his own, without institutional support; Marshall was an amateur blogger (not yet having incorporated); and the source contacted the news outlet, 1,500 miles away, rather than vice versa. Indeed, as mentioned in Section 2, Talking Points Memo became the institution it is today because of what Marshall was able to do as an amateur (another example of institutional stabilization).

In 2005, the London transit system was bombed. Sir Ian Blair, the head of London’s Metropolitan police, went on radio and TV to announce that the cause had been an electrical failure in the underground. Within minutes of Blair’s statements, citizens began posting and analyzing pictures of a bombed double-decker bus in Tavistock Square, and in less than two hours, hundreds of blog posts were analyzing this evidence. These posts reached hundreds of thousands of readers and explicitly contradicted Blair’s characterization. Seeing this, and overriding the advice of his own communications staff, Blair went on air again less than two hours later to say that it had indeed been a bombing, that the police didn’t have all the answers yet, and that he would continue reporting as they knew more. When he spoke to the public, Blair had the power of all the traditional media behind him, but it was clear that merely having a consistent message on every broadcast channel in existence was no longer the same as having control.

Starting in 2010, in a series of reports called Dollars for Docs, ProPublica covered the flow of payments between the pharmaceutical industry and prescribing physicians. It was a story that had been covered before in bits and pieces, but ProPublica brought several things to its investigation not previously seen, including a database it assembled from data the pharmaceuticals were required to make public, along with the ability and journalistic will to mine that database. Dollars for Docs was not just a new report. It was a new kind of reporting. Though much of the data used were publicly available, they had not been centralized or standardized in a form that could make them useful; armed with this database, ProPublica has been able to report on a national story, while also providing tools for other organizations to cover the same issue as a local story; as of this writing, it has helped spark stories in 125 other publications. (As a nonprofit, ProPublica can be both a news retailer and wholesaler.) In addition, it has been able to make its database as local as any news story can ever get: individual users can type the name of their doctor into the database and get a customized report. The harvesting and organizing of publicly available data thus became a platform for national, local and personal reporting.

Better access to individuals, as with Ed Sebesta; crowds, as with the London bloggers; and machines, as in Dollars for Docs, are driving working models that would have been both unthinkable and unworkable even 10 years ago: Huffington Post’s Off the Bus project, covering every Iowa caucus in 2008 with citizen journalists, would have bankrupted the organization had it been done with stringers. The Guardian decided to crowdsource the tracking of expenses by UK members of Parliament, because the job, done by employees, would not just have cost too much but taken too long.

Journalists have always used tip lines and man-in-the-street interviews, and members of the audience have always clipped and forwarded favorite articles. What’s new here isn’t the possibility of occasional citizen involvement. What’s new is the speed and scale and leverage of that involvement, the possibility of persistent, dramatic amounts of participation by people previously relegated to largely invisible consumption. What’s new is that making public statements no longer requires pre-existing outlets or professional publishers.

Tip lines worked well only in geographically local areas, but NY Velocity was able to reach halfway around the world to get its critical interview in the Lance Armstrong doping case. Man-in-the-street interviews were random because the professionals controlled the mode and tempo of public utterances; with Flickr and weblogs, British bloggers could discuss the London bombings in public, at will, and with no professionals anywhere in sight. Dollars for Docs took disparate data and turned it into a database, which gave ProPublica an ongoing resource that has been reused by ProPublica itself, other organizations, and millions of users over the course of two years and counting.

This is a change in degree so large, in other words, that it amounts to a change in kind. As Steven Levy observed, writing about the iPod, when you make something 10 percent better, you’ve made an improvement, but when you make something 10 times better, you’ve created a new thing. New digital tools can accelerate existing patterns of news gathering, shaping and publishing so dramatically that they become new things.

We are living through a shock of inclusion, where the former audience is becoming increasingly intertwined with all aspects of news, as sources who can go public on their own, as groups that can both create and comb through data in ways the professionals can’t, as disseminators and syndicators and users of the news.

This shock of inclusion is coming from the outside in, driven not by the professionals formerly in charge, but by the former audience. It is also being driven by new news entrepreneurs, the men and women who want to build new kinds of sites and services that assume, rather than ignore, the free time and talents of the public.

The importance of news isn’t going away. The importance of dedicated professionals isn’t going away. What’s going away are the linearity of the process and the passivity of the audience. What’s going away is a world where the news was made only by professionals, and consumed only by amateurs who couldn’t do much to produce news on their own, or distribute it, or act on it en bloc.

This is a change so varied and robust that we need to consider retiring the word “consumer” altogether and treating consumption as simply one behavior of many that citizens can now engage in. The kinds of changes that are coming will dwarf those we’ve already seen, as citizen involvement stops being a set of special cases and becomes core to our conception of how the news ecosystem can and should function.

Ecosystems and Control

To talk about a “news ecosystem” is to recognize that no news organization is now, or has ever been, absolute master of its own destiny. Relationships elsewhere in the ecosystem set the context for any given organization; changes in the ecosystem alter that context.

This essay began with a focus on the individual journalist, and on the various ways she can gather, process and make sense of information and events vital to public life. Most journalists do their work inside institutions; those institutions are shaped by everything from the size and makeup of the staff they employ to their self-conception and source of revenue. These institutions in turn shape the work of the journalist: which stories she can and can’t pursue, what is considered good or bad work, who her collaborators can be, and what resources are at her disposal.

Those institutions are themselves in an analogous position, operating in the media environment that covers the news (and sometimes even the part that doesn’t). This news ecosystem (hereafter just “ecosystem”) is made up of other institutions—competitors, collaborators, vendors and suppliers—but it is also made up of the ways other actors affect those institutions. The audience’s preference for news about Hollywood over Washington, the presence of the competition just a click away, the Supreme Court’s current interpretation of the First Amendment, and the proliferation of high-quality cameras on mobile phones are all part of the news ecosystem of the early 21st century, the effects of the ancient and modern all mixed together.

The ecosystem also shapes institutional capability: the kinds of stories that do and don’t get pursued are affected by everything from audience and advertiser desires to narrative frames. Everyone knows how to tell the story of a cheating athlete or a business gone bankrupt, but there is no obvious narrative frame for the tension between monetary and fiscal union in the EU, even though the latter story is by far the more important. Similarly, the facts and assumptions around things like access to data, validity of sources, the nature and limits of acceptable partnerships, and so on affect what institutions believe they can and can’t do, and should and shouldn’t do.

In the pipeline model of news, the existing institutions could be thought of as a series of production bottlenecks, owned and operated by media firms, and from which they captured income from both advertisers and audience. These bottlenecks were a byproduct of the incredible cost and difficulty of reproducing and distributing information, whether via printing press or broadcast tower. As noted in the last section, this was an ecosystem in which the institutions themselves had a high degree of control over their own fates.

A large, competent staff was required to print and deliver a daily paper; an even larger one was required to make and broadcast a news program. These costs and difficulties limited competition, as did the geographic range of delivery trucks and broadcast signals. Within the small numbers of organizations that could create and distribute news, whole professional structures arose.

Newspapers and magazines saw this institutionalization first, of course; the printing press preceded not just radio and movies but also steam engines and telegraphs. The entire professional edifice of writers and editors and publishers and, later, illustrators and layout artists and fact checkers and all the rest of the apparatus that went into creating a newspaper were built around—and often quite literally on top of—the giant machines that put the ink on the paper. Radio and TV news departments followed the same pattern, inventing professional categories and practices to subdivide and systematize both the work and the categories of employment that went into making broadcast news.

Then came the internet, whose basic logic—digital replication, universally available, with no division of participants into producers and consumers—is at odds with the organizing principles of news production as it has existed since the 1600s. Abundance creates more disruption than scarcity; when everyone suddenly got a lot more freedom, every relationship in the old “charge for operating the bottleneck” model was up for grabs.

The arrival of the internet did not herald a new entrant in the news ecosystem. It heralded a new ecosystem, full stop. Advertisers could reach consumers directly, without paying a toll, and it turned out many consumers preferred it that way. Amateurs could be reporters, in the most literal sense of the word—stories from the Szechuan quake to Sullenberger’s Hudson River landing to Syrian massacres were broken by firsthand accounts. The doctrine of “fair use,” previously an escape valve for orderly reuse of small amounts of content among a small group of publishers, suddenly became the sort of opportunity that whole new businesses of aggregation and re-blogging could be built on top of. And so on.

When changes are small or localized and existing institutions are well adapted to those conditions, it doesn’t make much sense to think about things as an “ecosystem”—simply responding to competitive pressures and adapting to small and obvious changes is enough. For institutions that produce news, however, the changes of the past decade have not been small or localized.

A common theme in writing about the response to those changes by traditional news outlets is the failure of newspaper management to recognize the problems they would face. This, in our view, misdiagnoses the problem: The transition to digital production and distribution of information has so dramatically altered the relations among publishers and citizens that “stay the course” has never been an option, and, for the majority of the press that was ad-supported, there was never an option that didn’t involve painful restructuring.

A similar theme has been unpredictability and surprise, explaining the current crisis with the rationale that recent changes were so unforeseeable and have transpired so rapidly that traditional organizations were unable to adapt. This view is also wrong: There were coherent predictions of the trouble the internet would cause for the news industry going back to the late 1980s, and despite frequent invocations of “internet time,” the pace of this change has been glacial; dated from 1994 (the first year of the broadly commercial web), management has had 75 consecutive quarters to adapt.

Individual accounts of even successful adaptation to the current ecosystem make it clear how hard such adaptation is. To take one example, in August 2011, the New York Daily News launched innovative live coverage of Hurricane Irene, replacing the front page of its website with a live blog, Storm Tracker.

The News then dispatched reporters out into the city, armed with cameras and phones (often the same device) to document everything from the evacuation efforts, to residents’ struggles to shelter in place, to the effects of the wind and water itself. These live reports were interspersed with messages from weather services, emergency services and city government, all unfolding along with the storm.

The News’ effort in live disaster blogging was a triumph, for which the News rightly won considerable praise. Also, it almost didn’t happen. The precipitating event for Storm Tracker was not a new web strategy but the failure of an old one. The News building is on Water Street, in a Class A flood plain, so the police severely limited the number of workers who could go there on the weekend Irene blew in. This would seem to be no problem for filing digital copy, except that the News’ content management system had been engineered to be difficult to log into if you weren’t in the building.

As noted earlier by Anjali Mullany, who pioneered live blogging at the News and oversaw Storm Tracker, the need to establish a production process around a CMS creates a large but often hidden tax on attempts at innovation. In this particular case, the Daily News had taken a tool that could have been accessible to anyone working for the paper anywhere in the world, and added security constraints so that it instead behaved like a steam-driven printing press—workers had to be near the machine to operate it, even though the machine was a networked computer.

The defining need that drove the launch of Storm Tracker, in other words, wasn’t to find new ways to inform the residents of New York City during a big storm, but simply to find a way to keep the website up when terrible engineering decisions collided with terrible weather.

This was one essential factor in the launch of Storm Tracker. There was one other. In interviews with Mullany about Storm Tracker’s success, she noted that it was fortunate that Irene had hit in late August instead of early September, because in late August, most senior management were on vacation and thus could not override the decision of the News’ junior but more web-savvy staff to try something new.

As noted in Section 2, institutions are designed to resist change—that is their core competence, in the language of management consultants. The risk, of course, is that too much success in that department can preserve an institution’s internal logic right up to the moment it collapses. If what it takes to innovate in the manner of Storm Tracker is brain-dead technology management, the fear that your newsroom will be washed out to sea, and senior management gone fishing, then the prospects for orderly innovation among legacy organizations are grim. (As a dreadful coda, Hurricane Sandy flooded the Daily News building, and the users of the CMS suffered the same issue as during Irene. Even a year after the original crisis, no one had adapted the system to allow for a distributed workforce.)

Given this, the old news industry’s collective fabulation about restoring the status quo ante has itself been harmful. News organizations should obviously do what they can to improve their income, but the reliable revenue, high profits and cultural norms of the news business in the 20th century are gone, and the ecosystem that reliably produced such effects is gone as well. For individual journalists and for the institutions that serve them, cost containment, plus restructuring in the direction of more impact per hour or dollar invested, is the new norm of effective news organizations, the pattern we’ve taken to calling post-industrial journalism.

Post-Industrial Ecosystem

What does post-industrial journalism look like? It starts with the assumption, introduced in Section 2, that news organizations are no longer in control of the news, as it has traditionally been understood, and that the heightened degree of public agency by citizens, governments, businesses and even loosely affiliated networks is a permanent change, to which news organizations must adapt.

As one example of this change, the ejection of the Occupy Wall Street movement from New York’s Zuccotti Park in November 2011 was broken not by the traditional press, but by the occupiers themselves, who sent word of the police action via SMS, Twitter and Facebook. More pictures and video of the event were generated by the participants than by the traditional media, in part because the overwhelming majority of available cameras were in the pockets of the occupiers and in part because the police closed the airspace above the park to news helicopters. Reporters on the scene hid their press badges because ordinary citizens had better access to the events in question than credentialed members of the press.

Similarly, the news organizations that ran leaked documents from WikiLeaks often described WikiLeaks as a source rather than as a publisher, on the rationale that WikiLeaks provided the material they were working from. This makes sense in a world where holders of important information can’t publish it on their own and where publishers don’t share source materials with one another. But there is no longer a right answer to the question, “Who is a publisher and who is a source?” WikiLeaks is a source that can publish globally; it is a publisher that collaborates on delivery of raw material with other publishers.

Coverage of events like #Occupy and Cablegate (as well as Tunisian uprisings, Syrian massacres, Indonesian tsunamis, Chinese train crashes and Chilean protests) simply cannot be described or explained using the old language of the pipeline. The best argument for thinking of news as an ecosystem is to help reexamine the roles institutions can play in that ecosystem.

Imagine dividing the new entities in the news ecosystem into three broad categories—individuals, crowds and machines (which is to say, both new sources of data and new ways of processing it). Individuals are newly powerful because each of them has access to a button that reads “Publish”; material can now appear and spread, borne on nothing but the wings of newly dense social networks. Crowds are powerful because media have become social, providing a substrate not just for individual consumption but also for group conversation. Kate Hanni was able to use newspaper comment sections to drive her “Airline Passengers Bill of Rights” because she had a better sense of those papers as watering holes than they had themselves. And machines are newly powerful because the explosion of data and analytic methods opens whole new vistas of analysis, as with lexical and social network analyses that followed the release of State Department cables.

As with the inability to make WikiLeaks stay firmly in the category of either source or publisher, there is no stable attitude that a news outlet can take toward the new agency of individuals, the spread of ridiculously easy group-forming, or the increase in the volume of raw data and the new power of analytic tools. As the Daily News’ unwitting experiment with disaster blogging demonstrates, these are not resources that can be added to the old system to improve it. These are resources that change any institution that adopts them.

Now imagine dividing up the core operation of a news organization into three overlapping phases—gathering information about a story, shaping it into something ready to publish, and then publishing it. This taxonomy of a news pipeline into getting, making, and telling is of course simplistic, but it captures the basic logic of news production—take material broadly from the outside world, shape it into whatever your organization considers a story or a segment or a post, and then send the newly fashioned material back out into the world.

Armed with these two triads, we can ask, “How do individuals, crowds and machines affect the work of getting, making and telling?”

One example of how the “getting” phase of a news story has changed is the cycling blog NY Velocity, founded in 2004 by cycling enthusiasts Andy Shen, Alex Ostroy and Dan Schmalz. Though the site existed mostly to cover bike racing in New York, the people running it grew increasingly alarmed at what they saw as a culture of willful blindness around the possibility that Lance Armstrong, seven-time winner of the Tour de France, had been doping with erythropoietin, or EPO, a blood-boosting drug. NY Velocity interviewed Michael Ashenden, the Australian scientist who had helped develop a test for EPO; in the interview, Ashenden went on the record as saying he believed, after testing a sample of Armstrong’s blood from his 1999 Tour de France win, that Armstrong had been doping. This was original, long-form reporting, and the resulting 13,000-word interview became a public rallying point for cyclists who believed not merely that Armstrong had cheated his way to those wins, but that the professional sports journalism world was far too willing to look the other way. NY Velocity’s founders were willing to pursue a lead tenaciously and publicly; not only were they completely vindicated in their suspicions, but they also demonstrated that professional journalists may simply not be covering a story well enough and that dedicated and knowledgeable insiders can sometimes fill this gap.

To take another intersection of traditional practice and new capability, consider the way the ability to assemble groups has changed the making of a story. The Huffington Post’s 2008 Off the Bus project was able to cover every site of the Iowa caucuses because it could dispatch a volunteer to each site for an hour or two, something that would have been too expensive with freelancers and too travel-intensive for full-time staff. The volunteers for Off the Bus were not the people creating the eventual report on the caucuses—the project was instead a hybrid of distributed reporting and centralized writing; it was, in a way, a return to the old separation of reporters in the field and rewrite men in offices close to the machine.

Still another cross-tab of existing jobs and new resources is the way a story can be told by letting machines do some of the telling. Several projects using Ushahidi, the “crisis mapping” tool, have crossed over from “resource for recovering from a crisis” to “resource for understanding the crisis as it happens.” Ushahidi has been used to create real-time maps of voter intimidation, street harassment, radiation levels and snow removal—every instance of Ushahidi for newsworthy events is an example of machines altering how data are collected, collated and displayed.

Every one of the core activities of getting, making and telling is being altered by new ways of involving individuals, groups and machines. As noted in Section 2, the significance and pervasiveness of these alterations is likely to defeat institutions’ ability to integrate change slowly. Many of the recommendations in this section are thus echoes of those from the section on institutions; when they are repeated here, it is with greater emphasis on the way that using these new resources and capabilities means adaptation to an altered ecosystem.

News as an Import-Export Business

One way to think about an ecosystem is to ask what flows between its participants. As noted, flows in the 20th century were relatively linear and predictable; where there was significant complexity in flows of information, it tended to be embedded in highly specified business dealings, as with the use of syndicated or wire service copy.

The value of an Associated Press story to an individual newspaper was reflected in the interest of the locals; a subscription to the AP was justified when the value of that interest helped the paper generate more in ad revenue than the feed cost them.

This was a system where flows of business value were specified in bilateral agreements and priced in dollars—a newspaper signed an agreement with the AP in return for access to its feed. Compare that to the Huffington Post’s original model: the realization that some of HuffPo’s published material could excerpt existing stories, add commentary, and produce an economically viable new product. Fair use has existed in this form for decades; what changed were the conditions of the ecosystem. HuffPo management realized that fair use, as applied on the web, meant that, in essence, everything is a wire service and that excerpting and commenting on unique content from the Washington Post or the New York Times was actually more valuable to readers than contracting with the AP or Thomson Reuters.

The Huffington Post has often been criticized for this stance, but this is shooting the messenger—what it did was to understand how existing law and new technology intersected. The AP itself is experimenting with holding back key stories from its subscribers, in a bid to acquire more direct traffic. Similarly, the AP’s case against Shepard Fairey, an artist who created an iconic image of Barack Obama as a derivative work from an AP photograph, hinged on the idea that the AP had the right to photograph Obama without his permission but that Fairey couldn’t use that likeness to create a related work. In the Fairey case, there was no objective reality on which the case could be adjudicated—there was simply a set of legal doctrines.

The old ethic was described by Terry Heaton in a post entitled “Why don’t we trust the press?”:

Nobody ever mentions anybody else in the world of news gathering unless a copyright claim forces it. Before the Web, this was understandable, because as far as anybody knew, our reporters had all the angles on everything. The idea that the guy across town had it first was irrelevant, so why mention it? As far as our viewers or readers were concerned, we were the font of all knowledge. Besides, we had the time to gather everything we needed anyway. It was the world of the “finished” news product. But now, with news in real time, everybody can clearly see stories develop across all sources. We know who got it first. We know when something is exclusive. Our hype is just nonsense.

It has become obvious, in the new news ecosystem, that the notion of everyone producing a finished product from scratch is simply not the normal case. We are each other’s externalities. This has always been the case to some degree—newspapers famously helped set the agenda for broadcast media in the 20th century—but it was often hidden in the manner Heaton describes. The explosion of sources and the collapse in the cost of access have made the networked aspect of news more salient. The tech site Slashdot was clearly a source of story ideas for the New York Times’ Science Times section; Boing Boing sends traffic to obscure but interesting websites, which often become fodder for stories elsewhere; and so on.

In some ways, the ecosystemic aggregation, inspiration, excerpting and even wholesale ripping-off of journalistic content marks a return to earlier ages of newsgathering, in which country newspapers often consisted of little more than week-old stories copied from metropolitan dailies. The ability to aggregate news, 18th-century style, was due in part to a lack of institutional norms (was reprinting news “illegal”? Few editors probably thought of it in those terms) and in part to technology (few people in New York City would ever see a newspaper in rural Kentucky). The idea that news could be syndicated, for a fee, is a relatively new concept in journalistic history.

The syndication model that existed under the 20th-century news production regime thus isn’t coming under pressure because of bad actors, but because the basic configuration of the media landscape has changed dramatically. In the old model, reuse of material was either contractual (freelancers, wire services) or hidden. In the new model (old models, really), there are many forms of reuse; some are contractual, but most are not. The AP is a particularly visible case, but every news institution is going to have to position or reposition itself relative to new externalities in the ecosystem.

The spectrum of the exchange of value between individuals and organizations is enormous and finely gradated—there is now an institutional imperative to get good at developing partnerships, formal and informal, that have become possible in the new ecosystem. To take one recent example, important both in itself and for what it says about the changing world, the ability to translate written and spoken material has become dramatically easier and cheaper.

Automated translation tools are far better today than they were even five years ago, as with the use of Google Translate by English speakers to read Arabic tweets; crowdsourced translation, as with dotSub or the TED Talks translators, can convert astonishing amounts of material in short periods; and institutions dedicated to consistently bridging linguistic and cultural gaps, such as Meedan or ChinaSmack, are on the rise. Every institution in the world now faces two strategic choices: when, and out of what languages, do we begin translating primary source material or existing reporting to present to our audience; and, second, when, and into what languages, do we translate our own material to attempt to reach new audiences. Imagining news as a linguistic import-export business, investing in importing from Arabic into English, at potentially all levels of the cost-quality trade-off, could be valuable for any U.S. newsroom that wants to cover geopolitics, while, given the demographic trends in the United States, investment in exporting from English to Spanish could add huge value in audience acquisition and retention.

Recommendation: Get Good at Working with Partners

There is a famous photo, from the 2008 Olympics, of a phalanx of sports photographers on a platform, all angling to get what is essentially the identical shot of Michael Phelps. The redundancy pictured is staggering. There is something like half a million dollars’ worth of gear committed to capturing a single point of view, and worse is the human cost of dozens of talented photojournalists competing for minimal incremental value.

This sort of competition, where every institution has to cover the same thing in only slightly different ways, was absurd even when those organizations were flush. Now, with many resources gone and more going, it is also damaging.

News institutions need to get better at partnering with individuals, organizations, even loose networks, both to expand their purview and to reduce their costs. Successful examples range from the New York Times/WNYC SchoolBook partnership, designed to improve education coverage for both participants, to the aforementioned WikiLeaks and Dollars for Docs examples, to arm’s-length use of online data hosted by the Sunlight Foundation or Data.gov. In particular, finding ways to use and acknowledge the work of such partners without needing to fit everything into a “source or vendor” category would expand the range of possible collaborations.

Recommendation: Figure Out How to Use Work Systematized by Others

This is a subset of the previous recommendation. We are seeing a huge increase in structured data (data that come in a highly ordered and well-described fashion, such as a database), and a related increase in APIs (application programming interfaces, a systematic form of machine-to-machine conversation). Taken together, this means a potential rise in collaboration without cooperation, where a news outlet builds on data or interfaces made available elsewhere, without needing to ask the institution hosting the data for help or permission.

This is obviously valuable, as it provides low-cost, high-quality access to previously unavailable source material. As with so many new capabilities in the current environment, however, structured data and API access are not new tools for doing things the old way. These are tools whose adoption alters the organization that uses them.
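The idea of collaboration without cooperation can be made concrete with a short sketch. Assume, hypothetically, that a city open-data portal publishes complaint records as JSON (the field names and figures below are invented for illustration); a newsroom script can build on that data without ever contacting the publishing institution:

```python
import json

# Invented structured data, as a hypothetical open-data portal might publish it.
raw = """
[
  {"district": "North", "complaints": 42, "category": "noise"},
  {"district": "South", "complaints": 17, "category": "noise"},
  {"district": "North", "complaints": 8,  "category": "parking"},
  {"district": "East",  "complaints": 63, "category": "noise"}
]
"""

records = json.loads(raw)

# "Collaboration without cooperation": the publisher never hears from us,
# but we can still filter and rank its records -- here, districts by
# noise complaints.
noise = [r for r in records if r["category"] == "noise"]
ranked = sorted(noise, key=lambda r: r["complaints"], reverse=True)

for r in ranked:
    print(r["district"], r["complaints"])
```

The point of the sketch is the absence of any negotiation step: because the data arrive in a well-described format, the entire exchange is mediated by the format itself rather than by a bilateral agreement.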

The most obvious obstacles to taking advantage of work systematized by others are the technical skills and outlook required to use it. This problem, fortunately, is getting somewhat better, as tools like Many Eyes and Fusion Tables are making it easier for less tech-savvy people to explore large data sets looking for patterns. Even with this improvement, however, there is a need for basic numeracy among journalists, something we’ve taken to calling the “Final Cut vs. Excel” problem, where journalism schools are more likely to teach tools related to basic video production than to basic data exploration.
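The “Excel” side of that divide requires nothing more exotic than summary statistics. A minimal sketch, using invented payroll figures, of the kind of basic numeracy meant here: when the mean sits far above the median, a few large values are skewing the data, and that skew is often where the story is.

```python
import statistics

# Invented figures: annual salaries (in dollars) from a hypothetical
# public payroll release.
salaries = [41_000, 43_500, 39_800, 44_200, 250_000, 42_700]

mean = statistics.mean(salaries)
median = statistics.median(salaries)

# A mean far above the median signals that a few large values are
# pulling the average up -- a prompt to look for outliers.
print(f"mean={mean:.0f} median={median:.0f}")
outliers = [s for s in salaries if s > 3 * median]
print("possible outliers:", outliers)
```

Nothing here is beyond a first tutorial in any spreadsheet or scripting tool; the obstacle is curricular emphasis, not difficulty.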

This emphasis on tools for presentation over investigation is most acutely a problem in the nation’s journalism schools, of course, but it is widespread in the industry. (As Bethany McLean of Vanity Fair said to us, “Anyone who’s good at understanding corporate balance sheets is likelier to work on Wall Street than cover it.”)

The subtler obstacles are cultural: using work systematized by others requires overcoming Not Invented Here syndrome and accepting that a higher degree of integration with outside organizations will be necessary to take advantage of new sources of data. A related obstacle concerns credit: data and APIs are often freely available, but the hosting organizations want acknowledgment for helping to create something of value. That imperative pushes against the aforementioned tendency not to credit others publicly.

This logic is not just about using others’ work, of course. News organizations should do a better job of making their work systematically available to other organizations for reuse, whether by sharing data or by sharing tools and techniques. There will always be a tension between competitive and cooperative logic in the news ecosystem, but in the current environment, the cost of not undertaking shared effort has gone up, the cost of lightweight collaboration has gone down considerably, and the value of working alone has fallen.

As noted in Section 2, presence of process is often a greater obstacle to change than absence of resources. Taking advantage of work systematized by others and figuring out ways of making your work systematically useful to others are ways to do higher quality work at lower cost, but doing so requires an organization to start treating the newsroom like an import-export business, rather than an industrial shop floor.

Self-definition as Competitive Advantage

There is no solution to the present crisis. One corollary is that there is no stable state coming to the practice of news any time soon. We are not living through a transition from A to B (Walter Cronkite to Baratunde Thurston, say) but a transition from one to many, from a world where Cronkite could represent some central focus to a world with a riot of competing voices—Thurston and Rachel Maddow and Juan Cole and Andy Carvin and Solana Larsen as a few members of a cast of millions. We’ve seen this in microcosm—the transition from broadcast to cable networks on TV, or, as a less popular example, from terrestrial to satellite radio, led to a shift from networks that catered to a broad swath of people to highly specific niches (Comedy Central, the Food Network, and, on satellite radio, not just blues music but Delta blues or Chicago blues).

Linking is the basic technological affordance of the web, the feature that sets it apart from other forms of publishing, because it says to the user: “If you want to see more on the topic being discussed, you can find related material here.” It is a way of respecting the users’ interests and ability to follow the story on their own.

In the practice of news, the most basic form of linking is to source materials. A discussion of a recent indictment should link to the text of that indictment. A discussion of a scientific article should link to that article. A piece about a funny video should link to that video (or, better, embed it).

This is not sophisticated digital strategy—it is core communicative ethics, yet it is disturbing that so many journalistic outlets fail this basic test. At fault are the usual cultural obstacles (as with Terry Heaton’s observations about not giving credit), ingrained habits (news desks used to be limited by space or time constraints to excerpting source materials), and commercial concern about sending readers elsewhere.

None of these obstacles, though, merits much sympathy. The habit of not giving credit, while widely practiced, is plainly unethical. The web no longer feels novel to the audience; it’s well past time for its core practice to be internalized by journalists. And refusing to link for commercial reasons may make sense to the ad sales department, but it should horrify anyone whose job involves public service.

The public value of linking to source materials is so obvious, and so easy, that organizations that refuse to do it are announcing little more than contempt for the audience and for the ethical norms of public communication.

The internet, of course, provides infinite potential variety, making the argument in favor of niche audiences (and niche loyalty) strong here as well. In addition, the old logic of geographic segmentation of local coverage allowed news outlets to buy wire service news or syndicated packages, secure in the knowledge that their audience wouldn’t see the same content published or aired in a neighboring town. With the rise of search as an essential form of finding content, however, the average user now has access to thousands of sources for the story of the Somali pirates, the vast majority of which are drawn from the same wire service copy.

This creates a new imperative for news organizations, for which the strategy of “We are all things to all people in a 30-mile radius” is no longer effective. There are useful services to be rendered by hyperlocal organizations (the St. Louis Beacon, the Broward Bulldog), others by hyperglobal ones (the New York Times, the BBC), others still by highly specialized niche sites of analysis (Naked Capitalism, ScienceBlogs), and so on.

This is a breadth vs. depth trade-off. The web creates a huge increase in diversity over a world dominated by broadcast and print media. More recently, an increasing amount of news is flowing through social media sites, and especially Twitter and Facebook; the growing dominance of the social spread of news and commentary further erodes the ability for any one site to produce an omnibus news package.

There is a place for rapidly produced, short pieces of breaking news. There is a place for moderately quickly produced analysis of moderate length (the first draft of history). There is a place for careful, detailed analysis by insiders, for insiders. There is a place for impressionistic, long-form looks at the world far away from the daily confusion of breaking news. And so on. Not many organizations, however, can pursue more than a few of these modes effectively, and none can do all of them for all the subjects its audience cares about.

This is partly because institutions always face breadth vs. depth trade-offs, but the internet has made them considerably worse—masses are bigger, as with the spread of the news of Michael Jackson’s death. Niches are nichier—coverage of mortgage issues at Lenderama, or Latino youth issues at Borderzine. The fastest news can be faster—the White House announcement of Osama bin Laden’s death was prefigured on Twitter more than once by independent sources.

Recommendation: Give Up on Trying to Keep Brand Imprimatur while Hollowing Out Product

This is principally a negative recommendation.

Two things that have changed dramatically in the past decade are the value of reputation (higher) and the cost of production (lower). So many sources of news are now available that any publication with a reputation for accuracy, probity or rigor has an advantage over the run-of-the-mill competition. However, digital tools have also dramatically reduced the cost of finding and publishing information, leading to a profusion of outlets that publish by the ton.

It is tempting for the publications with the good reputations to combine these two changes, to find some way to extend their reputation for high quality over new low-cost, high-volume efforts. This was the rationale that led to the creation of the Washington Post’s blogPost aggregation and commentary feature, made famous by the resignation of Elizabeth Flock after being reprimanded for not having attributed some of the material she was aggregating.

It’s worth quoting from the column the Post’s ombudsman, Patrick B. Pexton, wrote after she resigned:

Flock resigned voluntarily. She said that the [two] mistakes were hers. She said it was only a matter of time before she made a third one; the pressures were just too great.

But The Post failed her as much as she failed The Post. I spoke with several young bloggers at The Post this week, and some who have left in recent months, and they had the same critique.

They said that they felt as if they were out there alone in digital land, under high pressure to get web hits, with no training, little guidance or mentoring and sparse editing. Guidelines for aggregating stories are almost nonexistent, they said.

Flock and her fellow aggregators were caught between the commodity news logic of an aggregation site and the Post’s brand, a tension that also showed up in the New Yorker providing a platform for Jonah Lehrer’s recycled content; as Julie Bosman noted in the New York Times, the magazine’s “famed fact-checking department is geared toward print, not the web.” It also appeared in the Journatic scandal, where fake bylines were added to stories written by overseas freelancers.

In all of these cases, the temptation is to place a low-cost process under a high-value brand. It’s clear that rapid commodification of ordinary news is not just inevitable but desirable, to free up resources for more complex work elsewhere. It’s also clear that the temptation to make commodity news look like its non-commodified counterpart is also significant, even for institutions as august as the Post and the New Yorker.

Basic respect for the journalistic effort demands that you give people doing commodity work clear guidelines about what is and isn’t permissible. Basic respect for your audience demands that it be given clear guidelines about the source and process of news.

Offering “breaking news from around the web” and asking people in the Philippines to write what is essentially standard copy, given a particular set of facts, are both useful strategies. But presenting them as no different from more aggressively researched, composed and checked stories creates both short- and long-term risks that are not worth the momentary arbitrage opportunity of marrying a good brand with cheap content.

The change in the ecosystem here is that functions previously executed among competitive news organizations, and especially scoops and breaking news, are now taken over by platforms. Any given news organization may set itself up to be faster at breaking sports news than Deadspin, say, or faster at breaking tech news than Scobleizer, but no organization today can consistently beat Facebook or Twitter on speed or spread.

One final observation: A core thesis of this essay is that the country’s news organizations are no longer adequate to ensuring coverage of the news on their own. This puts existing institutions in the awkward spot of needing to defend or even improve parts of the current ecosystem from which they may not profit, and which may benefit their competitors.

Were news organizations merely commercial entities, this would be impossible— Best Buy has little interest in improving the electronic ecosystem in ways that might benefit Amazon or Wal-Mart. News organizations, however, are not merely commercial entities. They are instead constituted to shield newsroom employees from most of the business questions a paper faces (however imperfect such “Chinese walls” turn out to be in practice). Indeed, if news organizations were not sources of such tremendous civic value, separate from the logic of the market, their commercial senescence would make no more difference than the closing of the local travel agent’s office.

Given this, and given the need for post-industrial journalism that makes considerably better use of an hour of a journalist’s time or a dollar of an institution’s money, news institutions large and small, commercial and nonprofit, executional and educational, should commit themselves to two changes in the current ecosystem.

Recommendation: Demand that Businesses and Governments Release Their Data Cleanly

The most valuable dollar a news organization can make is the dollar it doesn’t have to spend, and in the 21st century, the easiest dollar not to spend is the dollar spent gathering data. In keeping with our recommendation that news organizations should shift some of their priorities from covering secrets to covering mysteries, anyone who deals with governments or businesses should demand that publicly relevant data be released in a timely, interpretable and accessible way.

Timely means that the data should be made available soon after being created. It is of far less value to know what committee recommendations were after a piece of legislation has gone up for a vote. Interpretable data come in a structured and usable format. Data should be made available in flexible formats, such as XML, and not inflexible ones, like PDF. (Indeed, using a format like PDF for publishing is often a clue that an organization has something to hide.) Accessible means that the data are made readily available over the public internet, instead of being kept on paper or made available by request only. The FCC’s ruling that broadcast outlets had to publish their political ad records online, rather than keeping them available “for inspection” at the station, was a big improvement in accessibility.
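The gap between interpretable and uninterpretable release formats can be shown in a few lines. Records released as CSV are trivially machine-readable; the same records locked in a PDF would each require manual re-keying. A sketch, with invented political-ad fields of the kind the FCC ruling points toward (the field names and amounts are hypothetical):

```python
import csv
import io

# Invented ad-buy records, released in a structured, machine-readable form.
released = """station,advertiser,amount_usd,air_date
WXYZ,Committee A,12000,2012-09-01
WXYZ,Committee B,8500,2012-09-03
"""

# Because the format is structured, totaling the spending takes two lines;
# a PDF release would turn the same step into hours of transcription.
rows = list(csv.DictReader(io.StringIO(released)))
total = sum(int(r["amount_usd"]) for r in rows)
print(f"{len(rows)} ad buys, ${total} total")
```

The format choice, in other words, is not a neutral technical detail: it determines whether analysis costs minutes or staff-days.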

Every news outlet should commit some resources, however small, to taking an activist stance on this issue. Better access to better data is one of the few things that would be an obvious improvement for the news ecosystem, one where the principal obstacle is not cost but inertia, and one where the news organizations’ advantage in creating improvement is not expenditure of resources but moral suasion.

Recommendation: Recognize and Reward Collaboration

Organizations that offer grants and rewards provide a signaling mechanism for how practitioners of journalism should regard themselves and their peers.

These organizations should begin offering grants or create award criteria or categories that reward collaboration, either explicit, as in the case of SchoolBook, or implicit, as with organizations that provide access to their data for reuse by other organizations, as with Dollars for Docs. Similarly, awards for successful reuse of a reporting template—for example, other news organizations ferreting out Bell, Calif.-style corruption—would help alter the current valorization of handcrafted work that tends not to be repeatable, even when the reporting uncovers a potentially widespread problem. It was a huge loss for the nation that no organization undertook a systematic look at other states’ nursing boards after the California scandal or a harder look for off-balance-sheet vehicles after Bethany McLean wrote about Enron.

McLean noted, in an interview for this report, that a key part of her ability to study Enron was cultivating skeptics as sources—her initial interest came after a short seller characterized Enron’s financial statements as incomprehensible. This might seem like an obvious strategy, but few in the business press followed it, either before the fall of Enron or, far more alarmingly, even afterward.

Organizations that shape assumed community norms among journalists and editors should highlight efforts that build on previous work. As with all grants and awards, these changes will reach only a few institutions directly but will reach many indirectly, by communicating the kinds of work that might reap either commercially unconstrained funds, the admiration of one’s peers, or both.

Conclusion: Tectonic Shifts

It was a memo from the future, an astonishing look at the dawn of the public digital landscape, from senior newspaper management. In 1992, Robert Kaiser, the managing editor of the Washington Post, attended a meeting in Japan populated by visionary technology leaders who introduced him to the future of “multimedia” and to the idea of personal computers and digital networks as alternate methods of delivery for media businesses.

Kaiser then wrote a 2,700-word memo for Post Co. CEO Don Graham and the newspaper’s senior management that opened with the (inaccurate but evocative) metaphor of the frog in boiling water:

Alan Kay, sometimes described as the intellectual forefather of the personal computer, offered a cautionary analogy that seemed to apply to us. It involves the common frog. You can put a frog in a pot of water and slowly raise the temperature under the pot until it boils, but the frog will never jump. Its nervous system cannot detect slight changes in temperature.

The Post is not in a pot of water, and we’re smarter than the average frog. But we do find ourselves swimming in an electronic sea where we could eventually be devoured—or ignored as an unnecessary anachronism. Our goal, obviously, is to avoid getting boiled as the electronic revolution continues.

Kaiser goes on to describe what he learned at that meeting, about a world where electronic distribution and consumption reshape the media landscape. Kaiser not only warns his fellow executives about the risks of being devoured or, worse, ignored, but he also goes on to propose that the Post immediately undertake two R&D projects: “1) Design the electronic classifieds now” and “2) Design the world’s first electronic newspaper.”

When the full copy of Kaiser’s memo circulated among the news cognoscenti in the summer of 2012, it kicked off a flurry of public discussion about how prescient Kaiser had been, and how unfortunate it was that this incredible preview of what was coming—written before the public launch of the web—didn’t get acted on.

Much of that conversation about what might have been, however, missed a second, critical aspect of the memo: Even if the Post had executed swiftly on everything Kaiser had proposed, it wouldn’t have worked. Despite Kaiser’s brilliance in laying out the great forces then only barely visible, his memo also contains hints of the difficulties of adapting to a world where the internet was normal.

Kaiser assures his fellow executives that since people will need filters for all the new information, they will therefore need professional editors:

Confronted by the information glut of the modern world, I suspect even the computer-comfortable citizens of the 21st Century will still be eager to take advantage of reporters and editors who offer to sort through the glut intelligently and seek to make sense of it for them. Interestingly, when I asked a number of people at the conference what they’d like to be able to do in the electronic future, many spoke of finding all the extant journalism on subjects of interest to them. (CompuServe now offers a rather primitive grazing tool to permit this sort of thing.)

Kaiser looked right at the “rather primitive” capability—search—that would eventually float first Yahoo and then Google and assumed that it would remain marginal, because he assumed the business he was in—editorial judgment— couldn’t be displaced. Likewise, his pair of proposed R&D efforts contained the very thinking that would sidetrack a thousand attempts at innovation; Kaiser said, of his electronic classifieds, that the Post should also

… reserve the right to postpone implementation until a moment when we’re confident we’ll make more money (or deter a competitor) by launching the electronic product.

Even someone who had seen far into the future still missed the crucial lesson, one that Alan Kay and his cohort had clearly tried to impart: No one could reserve the right to postpone implementation of the future. The big but hidden mistake was the assumption that the Post, or indeed any institution, could opt out of the coming changes. This mistake was made more punishing because Kaiser’s assumptions didn’t allow for the possibility that new distribution channels for news and advertising might generate less money per user, rather than more.

This was the real issue, impossible to recognize then, but obvious in hindsight: The problem legacy news organizations faced over the two decades since Kaiser’s trip wasn’t competition but revolution. They assumed that new technology would raise rather than lower ad revenue, or that it would deliver more control to the publisher rather than to the reader. This was consonant with everything that had happened up to 1992, but it wasn’t what was about to happen as the internet started giving everyone a lot more freedom.

Tectonic Shifts

In the 1990s, those of us thinking about the relationship between the internet and news organizations wrongly assumed that the core problem those organizations faced was understanding the future. This turned out to be a merely ancillary problem. The core problem was adapting to that future.

The story of journalism in 2012 is still often told as the story of the breakdown of the old world, the end of the period when “the news” was whatever an enumerable collection of institutionally stable actors chose to publish. This assumption ran so deep that even someone who had seen decades into the future could still believe that the digital turn in the newspaper business would favor traditional virtues of editorial choice over the new ones of user empowerment, and that the business case for electronic media rested on revenue generation rather than cost reduction.

That “End of an Era” story, though, is itself ending. We are living in the least diverse, least inclusive media environment we will inhabit for the foreseeable future, which is to say that the ecosystem forming around us will include more actors and actions than even today’s environment does.

It’s easy to equate this increase in public speech with an increase in chaos, but chaos is a wasting asset—what seems hopelessly confusing today will be tomorrow’s new normal. The old order won’t be restored, but people will get used to the new one that’s emerging.

Though we have generally concentrated on the question “What does the production of news look like today?”, in this section we will ask a related question: Given the forces already at work, what will the production of news look like in 2020, seven years from now? This is as far from today as today is from 2006, when YouTube, Twitter and Facebook were all in their infancy.

As with any exercise in prediction, we will get this at least partly wrong, overestimating some changes, underestimating others, and, most significantly, failing to predict new forces that will appear in the next seven years. Our goal here is for accuracy in direction, not endpoint; we believe that many of the forces that will shape the news environment in 2020 are already visible today, in the same way that social networking and user-distributed video were visible seven years ago.

In 2020, there will be considerable surface continuity with the news environment of the 20th century. There will still be a Los Angeles Times and a CNN. Yet this continuity of institutions will be accompanied by a reconfiguration of almost every bit of the media world in which they operate. As George W.S. Trow put it in “Within the Context of No Context,” his wonderful and strange musing on the changed social landscape in the United States:

Everyone knows, or ought to know, that there has happened under us a Tectonic Plate Shift […] the political parties still have the same names; we still have a CBS, an NBC, and a New York Times; but we are not the same nation that had those things before.

Trow was discussing the loss of any obvious core of civic culture in the aftermath of the late 1960s, but the figure of a tectonic shift can also serve as a metaphor for the media environment today. The label “CBS News” still describes the journalistic arm of a U.S. broadcaster, but it no longer stands for “Tiffany” standards in news, and it no longer occupies a position of unquestioned centrality in the news environment. This is partly because CBS itself approaches news differently, but mostly because the competitive and consumptive landscape of news has shifted so dramatically that even if CBS News’ sole goal in the last two decades had been to preserve its former position, it would have failed.

The news ecosystem of 2020 will be a study in expansion, with heightened contrasts between extremes. More people will consume more news from more sources. More of these sources will have a clear sense of their audience, their particular beats or their core capabilities. Fewer of these sources will be “general interest”; even when an organization aims to produce an omnibus collection of news of the day, the readers, viewers and listeners will disassemble it and distribute the parts that interest them to their various networks. More news will arrive via these ad hoc networks than via audiences loyal to any particular publication.

Almost every aspect of the news environment will be more variable than it is today. We’re not shifting from big news organizations to small ones, or from slow reporting to fast. The dynamic range of journalism is increasing along several axes at once. The internet has unleashed demand for more narrative and more data-driven news, for a wider range of real-time sources and wider distribution of long-form pieces.

A few organizations will have larger newsrooms than today, mostly subsidized by media services sold to professionals (as with Thomson supporting Reuters, or Bloomberg’s purchase of Business Week). Most news outlets, though, will have smaller newsrooms, measured by full-time headcount. At the same time, there will be many more niche players than today, with smaller and more narrowly tailored operations (the Outer Banks Voice; Hechinger Report).

There will be more nonprofit news organizations, driven by several kinds of donation—direct cash subsidy by philanthropies and other donor organizations (Ford Foundation funding Los Angeles Times reporters; William Penn Foundation funding PennPraxis), user donations of cash (NPR; TPM), and in-kind donations of the time and talents of a particular community (as with the creation of Wikipedia disaster articles, or Twitter hashtag streams).

The obvious benefit of increased subsidy for news is its increased availability. The equally obvious downside is that it risks further blurring the boundary between public relations and journalism. The growing number of news outlets, and their varying motivations and funding sources, increases the need for self-policing, as independent news outlets learn to better identify, label and publicly rebuke “churnalism.” (As David Weinberger has noted, transparency is the new objectivity.)

The decay of the traditional agenda-setting function of the press will continue, and with it the idea of “the public” as a large, interconnected mass of news-consuming citizens. Choice in available media outlets will continue to expand, leading not so much to echo chambers as to a world of many overlapping publics of varying sizes. Seen in this light, the long-term collapse of trust in the press is less a function of changing attitudes toward mainstream media outlets than a side effect of the continuing fragmentation of the American media landscape. (It is probably time to retire the idea that there is something called “the press” that enjoys a reputation among some group called “the public.”)

The shift in control of distribution will also continue. The old model, where most users visited a home page or used a mobile application tied to a single organization, will continue to lose ground to superdistribution, users forwarding relevant materials to one another. We already live in a world where the most widely circulated stories acquire audiences that dwarf the median headcount. Adapting to this increasingly unequal distribution will require that most organizations get better at working with their users to filter and pass on relevant material.

This superdistribution won’t just be about the spread of new material; one of the great surprises of Twitter, a medium built around “short” and “now,” is how much demand it has exposed for long-form writing and video. News.me, a recent startup, filters through people’s Twitter feeds and recommends the most widely viewed links from the previous 24 hours; a remarkable amount of what gets surfaced is not singing cats but long, careful pieces of reporting or opinion.

Though the “hamster wheel” (chasing transient viewers with rapid publication of sensational stories) is an obvious effect of the internet’s colonization of the news landscape, the increase in the dynamic range of the news environment is taking place at both ends of the distribution; the hamster wheel has been accompanied by an increase in large-scale reporting and analysis.

More techniques will be deployed in the production of news—algorithmic data analysis, information visualization, solicitation of amateur input, feedback loops with crowd reaction, automated production of data-driven stories. More generalists will be working in niche subjects: interviewers on particular topics who create, edit and distribute photos, audio or video as a newsroom of one. Narrower and deeper specialization will occur among the newsrooms that have staffs large enough to allow collaborative units to work together: By 2020, the most expert data miner, information visualizer or interactive experience designer will have a far more refined set of tools and experience than any of those people do today.

Each newsroom will become more specialized, with less simple replaceability of employees and functions from one newsroom to the next. Each newsroom will have a better sense of who its partners are, among institutions and the general public, and will have customized its sense of how best to work with them. Many producers of the kind of material we used to regard as news won’t be news organizations, in any familiar sense of the word. The police blotter will come from the police. Environmental data will be presented via interactive tools hosted by the Sierra Club. Wikipedia and Twitter will strengthen their roles as key sources of information for breaking news.

As Kaiser and the Washington Post eventually found out, there’s no reserving the right to postpone implementation for the kinds of changes we are witnessing. There is only the struggle to adapt and to secure a niche in the ecosystem that allows for stable creation of long-term value.

What Should Journalists Do?

As with a Necker cube, it’s possible to look at the journalism landscape and see either of two sets of relationships—the work of individual journalists supporting institutions, or the work of institutions supporting individual journalists. There is some truth to both views, of course, but we have concentrated on the latter, for several reasons.

First, the work of journalists takes both logical and temporal precedence over the work of institutions. Second, the act of witnessing, discovering or understanding what is important, and then conveying that in a way that various publics can understand, is the sacred task; concern for journalistic institutions takes on public urgency only to the degree that they help the people engaged in those tasks. And third, far too much of the conversation of the past decade has assumed that the survival of existing institutions is more important than the ability of anyone to take on that sacred task, however it’s done.

Though the concept has been somewhat tainted by the cheesiness of “Brand You!” boosterism, we live in an age where the experiments of individual journalists and small groups are ideal for identifying possible new sources of value—process is a response to group dynamics, so the smaller the group, the easier it is to balance process and innovation (though later, of course, those innovations will have to be rendered boringly repeatable).

If you were looking for an ideal mantra for a journalist, writer, analyst, media artist, data miner or any of the other roles and tasks that matter today, “Proceed until apprehended” is a good one. As an NPR executive said to Andy Carvin during his invention of the curated Twitter news feed, “I don’t understand what you’re doing, but please keep doing it.”

In this essay, we’ve offered a description—several, actually—of the skills and values an individual journalist can bring to bear. The range of these descriptions exists because journalism is not moving from A to B, from one stable state in postwar America to some new, alternate state today. Journalism is instead moving from one to many, from a set of roles whose description and daily patterns were coherent enough to merit one label to one where the gap between what makes Nate Silver a journalist and what makes Kevin Sites a journalist continues to widen. With the coming increase in possible modes and tempos of journalism, our overall recommendations for journalists are these:

Know yourself. Know what you are good at and what you are not good at, and know how to explain those things to others. Know your areas of expertise, both for content (North African politics; civil engineering; historical weather patterns) and skills (are you an interviewing journalist? A researching journalist? A Final Cut journalist? An Excel journalist? A Hadoop journalist?).

Know when the tools you need are algorithms or crowds. Know when a person you need to talk to is more likely to be found via Twitter than directory assistance. Know when your network can help; know when someone in your network’s network can help, and get good at asking for that help (and also get good at rewarding people who help).

Know when process is aiding your work and when it’s not, and, to the degree you can, know when to break the glass in the latter case. Know when to work alone, when to call for help, when to partner outside your usual sphere.

Much of this is about specialization of one sort or another. It’s possible to specialize in content: in the kind of material you cover, or the kind of background you master, or the kind of people you interview. It’s also possible to specialize in technique: you can get good at mining databases, reading investment documents, traveling in distressed zones, or engaging users, and each of those skills will be transferable to many areas of inquiry. You can specialize in content and be a generalist about technique, you can specialize in technique but be a generalist in content, or you can specialize in both. (Specializing in neither used to be a fine answer; less so today.)

Journalism schools will have to adapt to these changing models as well. Already, journalism schools are more like film schools than law schools, which is to say that the relative success or failure of a J-School grad is going to be far more variable than it used to be. There are fewer entry-level jobs—the jobs that used to serve as unofficial proving grounds and apprenticeships—in metropolitan dailies and local TV than there used to be. Furthermore, the careers students head into will be more variable, and more dependent on their ability to create their own structure, as opposed to simply fitting into a position in a well-known collection of rich and stable institutions.

Schools should respond by helping students both understand what sorts of specializations they’d like to undertake, and how to go about them, a task that has much less to do with fitting them to particular institutions, as with the old (and now harmful) broadcast vs. print split, and much more to do with fitting them to particular forms of inquiry, wherever and however they practice it.

The fate of journalism in the United States is now far more squarely in the hands of individual journalists than it is of the institutions that support them. To get the kind of journalism that a complex, technocratic democracy requires, we need the individual practitioners to take on the hardest part of the task of working out what constitutes good journalism in a world with no publishing scarcity.

What Should Legacy News Organizations Do in This Environment?

Though many existing institutions still regard the principal effect of the current changes as continued loss of revenue, the restructuring of American journalism is far more influenced today by organizational models than by income (or lack thereof). With a handful of exceptions, for-profit news organizations will have to continue to reduce expenses to below still-falling revenue, but simply cutting will leave us with legacy institutions that do less with less.

Existing institutions must adapt their journalistic operations, not just their bottom line, to the internet. Doing more with less is always easier said than done, but as Homicide Watch or Narrative Science demonstrate, it is not impossible.

Though we put forward several recommendations in the body of this essay, our overall recommendations for legacy institutions are essentially these:

Decide what part of the news you want to report and how.

Get out of any activity that doesn’t support those goals.

Look for partnerships or collaborations with other organizations that advance those goals at lower cost than you could manage in-house.

Work to make the remaining activities either excellent or cheap (or, ideally, both).

Some legacy news organizations will simply shrink the cost of filling the news hole with no other reorganization, a move that will amount to getting out of the hard news business by degrees. Some of these organizations may be able to survive with their newly lowered expenses, but the reason to care about the continued health of legacy news organizations has always been about the public service they provide; organizations that shrink without trying to take on new, cheaper capabilities are abandoning at least part of that public service mission. They will also attract fewer competent journalists.

Keeping expenses below revenue remains a problem, of course. Advertising declines—six years and counting—have left the nation’s newsrooms, subsidized by that money, in a parlous state. Given advertisers’ continued decampment to alternate platforms and the dreadful logic of a declining print audience—income compresses faster than costs for running the presses—many legacy organizations will have to operate with an expanded sense of where revenue can come from: running events, applying for grants for specific beats, digital membership dollars from the most committed 5 percent of readers. Continued reduction of costs, however, remains the most obvious strategy.

There is no way to support the old “one-stop shop” model for supplying all (or even most) of a user’s news and information, because, without geographic barriers to entry, there is very little defensible advantage in running commodity news that’s the same as in the next town or state over. Like the principle of subsidiarity for the U.S. government (that the federal government should ideally run only those services not better run by the states, the states only those not better run by cities, and so on), news should be produced and distributed by the people best able to cover it. This suggests a shift to dramatically increased specialization and partnership.

In practice, many legacy newspapers have followed this advice by loading their front pages with reams of AP content and the occasional “big blowout” news story, a prime example of adapting to loss of income rather than adapting to the internet. A digitally minded news organization would dispense with running commodity content online entirely, perhaps linking to important news, or even excerpting the content of smart bloggers or other aggregators. No matter what specific decisions get made in this regard, however, news institutions that see the “front page” as their primary organizational concern will miss many opportunities for reinvention.

The wastefulness of pack journalism and the empty calories of unimproved wire service news are both bad fits for most institutions in the current environment. The organizations that set out to provide a public with a large part of the news will more often be aggregators, in the manner of Huffington Post or BuzzFeed, than traditional news organizations, if only because the cost and quality curve favors that form of aggregation over expensive improvements of syndicated content or, further up that curve, creation of custom material that lacks either a passionate audience or a long shelf life.

Similarly, newsrooms will have to decide what parts of their operations to commodify. Much checklist reporting (e.g., brief pieces on last night’s game or this quarter’s sales figures that have to be present, but don’t have to be long or excellent) can be replaced by aggregation, or by machine production. For most organizations, anything that is high touch but low value (and by high touch we mean anything that involves more than 10 minutes of paid human attention) should be automated, outsourced to partners or users, or removed entirely.

Newsrooms that have mixed reporting goals—breaking news and long-form analysis—will have to get better at understanding the sharpening trade-offs between speed and depth. There is no right answer here, or even right mix: Coverage of slow-moving beats with a relative handful of relevant participants—the mining industry, or automotive design—will have a different mix than fast-moving, surprise-driven ones—electoral politics, civil wars.

Similarly, newsrooms will have to understand the sharpening trade-offs between aggregation and original reporting, and optimize for each activity differently, or the trade-offs between translating first-person accounts vs. putting journalists in between those sources and the audience, to contextualize and interpret.

Existing organizations will also have to get better at managing both relationships and data as new resources. The ability of an institution to ask its own users to participate in the creation, vetting and distribution of news, or to find firsthand witnesses or knowledgeable insiders for a particular story, will become a key source of differentiation. Similarly, the ability to master certain sorts of data and reliably create value from it over time is an increasingly essential skill. (The irony of U.S. News and World Report’s long competition with Newsweek and Time is that the U.S. News college listings database may soon become more valuable than those other two publications combined.)

On the procedural front, organizations will have to get better at understanding when process is a help and when it is a hindrance, and they will have to get better at making their process “hackable.” They will likewise have to decide which employees or volunteers have the ability to violate or alter the standard institutional processes, in order to pursue unforeseeable but valuable opportunities. Of all our recommendations, this one may be the most difficult for legacy institutions to follow. But in other ways, the success or failure of many of these companies will be determined by their ability to embrace flexibility.

What Should New News Organizations Do?

The range of new models and ideas being tried by journalism startups is wide, but most of the groups working on these ideas are not yet either robust or stable. This is partly because, as in any revolution, the old things break long before the new things are put in place, but it is also because the business model in the last several decades has created a news monoculture, where advertising subsidy has been the default revenue source even for those organizations that also charged fees directly to their users.

New news organizations will have to do everything legacy organizations do in terms of mastering the trade-offs between speed and depth, aggregation vs. origination, or solo creation vs. partnership. In general, however, understanding and managing these trade-offs is easier for new organizations, simply because individual employees don’t have to unlearn previous assumptions in order to adapt to present realities. As always, the advantage young people and organizations have over older ones isn’t that they know more things. They don’t. The advantage is that they know fewer things that aren’t true any longer. Without carrying the weight of accumulated but maladaptive assumptions, they have to spend less time and energy unlearning things before they can see and respond to the present world.

Our overall recommendation for new news organizations is even simpler than for journalists or for legacy organizations:

Survive.

The visible crisis of news institutions is the shrinking of their traditional functions, but the second, less discussed, crisis is the need for institutional stability, predictability and slack among the nation’s news startups.

Much of the question of institutionalization of startups concerns the way these organizations manage income and expenses, a conversation outside the scope of what 21st-century journalism looks like. (To reiterate our position: Most of the for-profit vs. nonprofit discussion is useless. Any way of keeping expenses below revenue is a good way.) Some of it, however, has to do with organizational assumptions and capabilities built into new organizations from the start.

New organizations should assume that cost control is the central discipline and that many sources of subsidy available to startups have a limited lifespan. They should master working with amateurs, crowds, machines or other partners to keep cost low and leverage high. To survive, new news startups will need to take on some of the routinization of work and stabilization of process of the older institutions they are challenging. They should not fear becoming a little boring.

There is a certain blitheness to the conversations around the current disruption, a kind of “great cycle of life” belief that the old institutions will be weakened and that the new institutions will then automatically take their place.

That is one possible scenario, of course. Another is that the old institutions become weakened but that the new ones don’t take their place, because they lack the institutional stability to act as a counterweight to large, bureaucratic organizations. Of all the terrible scenarios it is possible to imagine, this is the worst one—the legacy organizations continue to diminish in force and function, but the new entities arising simply can’t be as effective a check on bureaucratic power.

The End of Solidarity

Perhaps the most salient change in the next seven years will be the continued weakening of the very idea of what constitutes news, and thus what constitutes a news organization. This change, long since begun by Jon Stewart and MTV election coverage, is still at work today; to the question “Is Facebook a news organization?” neither “yes” nor “no” is a satisfactory answer. (A better reply is “Mu,” programmer-speak for “The question as asked has no sensible answer.”) Facebook is critical to the news ecosystem, yet it is organized along lines that are out of sync with anything we would recognize as a journalistic organization; its presence alters the context of that question.

There will also be fading clarity as to what constitutes “the news,” full stop. Institutions persistently mistake shallow continuity for deep structure; news isn’t a coherent or ontologically robust category; it is a constantly negotiated set of public utterances by a shifting set of actors, one that happened to go through a period of relative stability in 20th-century America. We are seeing the end of that stability, the end of the curious bookkeeping that says that the St. Louis Post-Dispatch is a news organization, even though it runs Annie’s Mailbox and the funny pages, while Little Green Footballs is not, even though Charles Johnson did a better job than CBS in vetting the phony National Guard memos involving George W. Bush’s service.

The production of news has moved from being a set of jobs to a set of activities; there will always be a core of full-time practitioners, but there will be an increasing amount of participation by people working part time, often as volunteers, and distributed by people who will concentrate less on questions of what is news and what isn’t than on questions like “Will my friends or followers like this?” Increasing overlap and collaboration between the full- and part-time practitioners, and between the employees and the volunteers, will be a core challenge over the rest of the decade.

This will be a world where the biggest changes have come in the roles not of full-time journalists but of the public, where atomized consumption and private discussion in small groups have given way to a riot of alternate ways of sharing, commenting on and even helping shape or produce the news.

All of us are adapting to this changed environment: the existing institutions and the new ones, the full-time shapers of the news and the part-time ones, the generalists and the specialists. And perhaps the single most important adaptive trait is to recognize that we are in a revolution, in the sense of a change so large that the existing structure of society can’t contain it without being altered by it.

In a revolution, strategies that worked for decades may simply stop working (as many already have). Strategies that seemed impossible or insane a few years ago may now be perfectly suited to the current environment. This period is not over, and the end is not even in sight; the near future will hold more such reversals, so that even up-to-the-minute strategies of a few years ago (RSS feeds and staff blogs) may fade into prosaic capabilities, while new ones (the ability to hunt for mysteries instead of secrets, the ability to bring surprising new voices to public attention) may become newly important.

More than any one strategy or capability, the core virtue in this environment is a commitment to adapting as the old certainties break, to adopting the new capabilities we can still only partially understand, and to remembering that the only reason any of this matters to more than the current employees of what we used to call the news industry is that journalism—real reporting, about whatever someone somewhere doesn’t want published—is an essential public good.

Methods Used in This Report

Though this report is more an essay than a piece of testable scholarship, we nonetheless drew on a variety of methods while formulating our analysis, recommendations and conclusions. Primarily, the research was based on qualitative interviews, conducted one-on-one, on location, over email or telephone, and at the offices of the Columbia University Graduate School of Journalism. A significant amount of data was gathered at a closed-door conference at the journalism school on April 17-18, 2012, that involved 21 people.

For the most part, however, this essay draws on the industry experience and previous scholarship of its authors. It attempts to combine more traditional academic theory with current developments in the worlds of journalism and digital media—always a fraught task. To the degree we have succeeded, we hope that the report is neither superficial to those coming to it as scholars, nor overly dense to working journalists who may work their way through its pages.

Ultimately, we believe this report should also serve as a call for further, more traditional academic research. Many of its conclusions can be tested through a variety of methods and with a variety of goals in mind. Insofar as the authors work at different schools of journalism in New York City, and each is engaged in a different aspect of scholarly production at their respective home institutions, the future for “useful journalism research” appears bright. In the end, the conclusions and provocations of this essay will rise or fall on changes within journalism itself.

Acknowledgments

In keeping with the spirit and theme of this essay, this has been a collaborative effort, inadequately accounted for by the authors’ names on the cover. All of us have benefited from observations, conversations and counsel from colleagues who have taken time to support this effort in one way or another. Our first thanks go to Charles Berret, a Ph.D. candidate at Columbia Journalism School, who has been with us all along and helped both coordinate and think through the various aspects of this work. We couldn’t have done this without Charles.

We thank Dean Nicholas Lemann of Columbia Journalism School, whose vision for this examination of the journalism landscape gave the project its start. None of this would have or could have happened without him. Others in the Columbia administration who aided us greatly in the process were Sue Radmer, Stephen Barbour and Anna Codrea-Rado. We thank Marcia Kramer for her patient copy editing and suggestions.

We are indebted to the Carnegie Corporation, which funded this work. We would also like to thank the Tow Foundation for its ongoing support of all our work at Columbia through the Tow Center for Digital Journalism. The largest collection of voices represented here, and the people who gave the most time, participated in a two-day meeting and workshop held in New York City on April 17-18, 2012. Attendees included Chris Amico, Laura Amico, Josh Benton, Will Bunch, Julian Burgess, John Keefe, Jessica Lee, Anjali Mullany, Shazna Nessa, Jim O’Shea, Maria Popova, Nadja Popovich, Anton Root, Callie Schweitzer, Zach Seward, Daniel Victor and Christopher Wink. It is no exaggeration to say that we began that meeting with a disparate set of observations and ended it with the outline of this work.

Throughout the process, we have relied on the observations of our peers, either as interviewees on the present and future state of journalism, or as respondents to early drafts of the work. For that, we thank Erica Anderson, John Borthwick, Steve Buttry, David Carr, Andy Carvin, Susan Chira, Reg Chua, Jonathan Cooper, Janine Gibson, Kristian Hammond, Mark Hansen, Andrew Heyward, Alex Howard, Vadim Lavrusik, Hilary Mason, Bethany McLean, Javaun Moradi, Dick Tofel, Matt Waite and Claire Wardle. University faculty, working inside and outside traditional journalism schools, proved to be vital sources of intellectual provocation and nourishment; we thank, in particular, Rasmus Kleis Nielsen of the Reuters Institute for the Study of Journalism at the University of Oxford and Michael Schudson and Robert Shapiro of Columbia University.

Last but not least, thank you to our families for their tolerance, support and advice throughout.