Thinking Critically about Mass Media

Perhaps the one thing that unites most Americans is their disgust with, and distrust of, journalism: Everyone hates the mass media. Surveys show that less than a third of Americans say that news organizations generally get the facts straight, and the level of trust is dropping.[1] Much of this distrust is expressed as a belief that journalists are not objective and, therefore, have become a vehicle for propaganda.

As is often the case, these critiques are made with no clear definition of “objectivity” or “propaganda.” In this essay I will offer some suggestions about definitions, in the hopes not that everyone will come to agreement about journalism, but that disagreements will be more productive.

Objectivity

Like all terms, “objective” and “objectivity” are used in different ways in different contexts. In everyday conversation, if someone is making an argument that seems to be unfairly biased or unnecessarily argumentative, we often counsel the person to “try to be objective.” What we typically mean is that the other person’s passion might be impeding their ability to see things clearly. Being objective, in this case, means something like this: Try to understand your preconceived ideas about the subject and recognize how those preconceptions might skew your perspective, even to the point that you may be tempted to fudge the facts or make claims that aren’t true. When we ask each other to be objective, we are reminding ourselves to keep an open mind and not make things up just because they bolster our argument.

In that everyday sense of the term, objectivity is a good thing—for me, for you, for journalists, for everyone. Objectivity is just another way of reminding ourselves what good intellectual practice looks like. To be objective, we need not pretend we don’t have a point of view, that we aren’t passionate about our ideas and commitments. Rather, the reminder to be objective is a corrective if our passion leads us to get sloppy in our critical thinking.

Objectivity also has a more specific meaning in the context of a scientific laboratory. Because science is an enterprise carried out by humans, scientists don’t claim to have developed a method that brackets out all subjective decisions. But the scientific method offers a way to generate knowledge that can be rigorously tested and verified. Scientists develop protocols for measuring aspects of the world they wish to study and devising experiments to test hypotheses. These methods are not foolproof, but they have been extremely successful in expanding our understanding of the world.

This scientific sense of objectivity may guide our intellectual practices—we adapt ideas about measurement and experimentation in rough fashion to our everyday life. For example, if we want to know whether a dish we’ve cooked tastes better with or without hot peppers, we might conduct an ad hoc experiment by preparing the food both ways and asking our dinner guests which they prefer. Journalists do this as well, but not with the kind of rigor that one sees in a laboratory. Scientific objectivity, in the strict sense, isn’t possible in journalism.

Neither of those definitions captures what objectivity means in mainstream corporate-commercial journalism in the United States today.[2] Yes, journalists strive to be objective in the everyday sense, and when possible journalists mimic the method of scientists. But “objectivity” in practice in mainstream journalism defines a set of professional practices that are most concerned with who and what is a trustworthy source. This practice of objective journalism favors what are typically called “official sources,” which actually undermines the ability of journalists to do their job responsibly.

While journalists move about in the world and sometimes directly observe events they write about, much of journalism is based on other people’s accounts of what happens. Journalists get this information through interviewing people or reviewing documents that others have produced. The crucial question for journalists is this: Which people and which written accounts are most authoritative? When there are conflicting accounts of the world, which can be trusted? The research on this subject,[3] and my own experience as a working journalist, point to a simple conclusion: Official sources dominate the mainstream news. An official source, in journalistic practice, is one who is associated with a reputable organization that has some credibility and status in the culture. In the contemporary United States, that means the government and the corporation, and a few other institutions that are seen as producing trustworthy knowledge, such as universities and think tanks. These become the “authorized knowers” on whom journalists rely.

Here’s an example of how journalists rely on these sources. After many of the claims made to justify the U.S. invasion of Iraq in 2003 were demonstrated to be false, journalists were challenged to explain their failure to provide a critical and independent evaluation of those claims. One such exchange took place on “The Daily Show,” with Jon Stewart questioning CNN anchor Wolf Blitzer. After acknowledging the failure, Blitzer explained that he and other journalists had done the necessary reporting but still were unable to learn the truth:

So, I remember going off. I had all the briefings. I went over, got the briefings from the CIA, the Pentagon, spoke to all the members of Congress, the intelligence committees, the House side, the Senate side. Everybody said the same thing: There is no doubt there are stockpiles of chemical and biological weapons, and it’s only a matter of time before he has a nuclear bomb.[4]

Note the sources that Blitzer includes in his list of “all the briefings” that were important in reporting the story: His sources were all officials from the U.S. government. Those officials don’t really constitute “all” of the potential sources, of course; Blitzer is suggesting that they are all the relevant sources. But might there have been others who would provide information and analysis that questioned the U.S. claims about Iraq’s weapons programs? What about sources in the anti-war movement in the United States, including former government officials who were warning that the WMD claims were overblown? Or sources in the Middle East who might have first-hand knowledge? Or sources who could speak to past cases where government officials lied about a foreign threat to justify war?[5]

Blitzer’s reflexive defense of his reporting is common in mainstream journalism. This reliance on official sources may not always produce good journalism—and sometimes may produce truly reprehensible journalism—but it’s easy to understand why the practice continues. Using official sources takes less time; government and corporate officials have large public relations operations that churn out information in a form journalists can easily use. That information is presumed credible, and journalists don’t have to defend their reporting techniques to news managers since that’s the way it’s always been done. This means the news managers can hire fewer reporters, saving on labor costs and increasing profits. And because most journalists think of themselves as working in a profession, in the same kind of position as lawyers, there is a subtle class allegiance at play. When evaluating sources, it’s not surprising that journalists favor folks who they view as being similar to them in education, social class, and worldview.

While we shouldn’t accept the claim that journalists’ professional practices produce objectivity, we also shouldn’t assume that the production of news is a totally subjective enterprise open to the whims of individuals. Journalists work within a system, interacting with political actors also working within systems, all responding to the people reading and watching the news. Rather than ask whether any one person in these systems is objective or subjective, we should understand news—like all human knowledge—as the product of an intersubjective process. The relevant questions are about the power each group has to affect the direction, framing, and content of the news. Officials are not only sources for news stories but also news shapers; they play a key role in defining what counts as news.[6] When representatives of the wealthy and powerful have a disproportionate influence in that intersubjective process, the news is skewed toward the perspective of those forces and tends to marginalize dissident voices. That skew reinforces the ideology of the powerful and, through constant repetition, helps make it the “common sense” of the culture. These conventional reporting practices absorb a particular ideology but do not make it explicit.

One last warning about how words are used: As “objectivity” became increasingly suspect to more and more news consumers, some journalists abandoned the term and began describing their goal as “fairness.” While it’s healthy for journalists to recognize that naïve notions of objectivity are counterproductive, what’s needed is not just a shift in the term but in the underlying practices. If the professional practices that were described by “objectivity” don’t change, then relabeling them as “fairness” changes nothing. The problem isn’t the label we use to describe the practices, it’s the practices themselves.

Critiquing these professional routines is central to any sensible analysis of journalism, just as an evaluation of the practices of other professionals such as lawyers is part of understanding the role of law in society. Assessing journalism also requires that we look at the effects of the corporate-commercial structure of the mainstream news media and the larger ideological framework within which journalists work. Two important critics of the news media have argued that these forces create a journalism that often serves a propaganda function for the powerful.[7] To make sense of that claim, we need to think clearly about what we mean when we label a communication as “propaganda.”

Propaganda and persuasion

Much like objectivity, propaganda is a term frequently used and infrequently defined clearly. The term originates in the 17th century as part of the Catholic Counter-Reformation, when the Sacred Congregation for the Propagation of the Faith was charged with spreading doctrine in response to the Protestant challenge. Until World War I, the term was used to mean any attempt to spread information and was not generally seen as a pejorative term. During that war the United States created its first official state propaganda agency to move public opinion toward support for a generally unpopular war, and the term began to acquire a negative connotation. By the end of World War II, the successful—but ugly and destructive—propaganda efforts of the Nazis solidified that association of propaganda with communication strategies designed to undermine people’s ability to participate in the honest and open dialogue essential to democracy. Today, to label someone’s communication effort as propaganda is understood as criticism.

But because democracy is based on people engaging one another to persuade each other to support their proposals, it’s not enough to define propaganda as a systematic attempt to convince others to get on board with a political project. How will we distinguish between attempts to persuade that are consistent with good intellectual practice and democracy, and attempts to manipulate people that are inconsistent with both, what we might call propaganda? The distinction is not as easy to make as we may wish it were.

For example, this is the definition offered in a widely used textbook by two contemporary scholars: “Propaganda is the deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist.”[8] I often give public talks about political subjects. In those speeches, I engage in a deliberate, systematic attempt to affect my audience’s perceptions, cognitions, and behaviors. If they don’t agree with me, I want them to change their minds. If they do agree with me, I want to solidify their position. In doing this, I don’t mention every possibly relevant fact or put forward every interpretation of those facts; I select the evidence and arguments that I think most important. Inevitably, I shape and, in some sense, manipulate. Is that propaganda? Working from that definition, it’s hard to tell.

When I pose this question to my students—is there a principled way to distinguish persuasion from propaganda?—some common answers emerge. First, they say, “propaganda is lying,” the knowing use of false statements to support a position. Certainly some of what we intuitively understand to be propaganda includes false statements, but much propaganda isn’t about claims that are clearly true or false, but about interpretation and impressions. Second, “propaganda uses emotion to manipulate people.” Again, that’s often the case, but is emotion not part of how we understand the world? If any appeal to emotion to influence people is propaganda, then there would be no role for our emotional reactions in public life, making for a sterile and inhuman public discourse. Third, “propaganda exploits powerful images to override rational thought.” But does that mean photography and film are not legitimate ways to present information about the world? If images always override our critical capacities, then we’re in real trouble.

I have not found, nor been able to construct, a definition that can precisely distinguish propaganda from persuasion; such are the limits of language when dealing with the messiness of human affairs. But the attempt to clarify these concepts matters, because democracy is based on deliberation that, at least in theory, can produce resolutions of policy disagreements that are acceptable to every individual. In a democratic system we don’t hold out for the unrealistic goal of everyone agreeing about everything, but rather ask that everyone commit to an honest and transparent process that produces a fair resolution. If propaganda is a useful term to mark the communication techniques that derail that process, then we should struggle to deepen our understanding.

Rather than searching for a legalistic definition, I will offer a list of features of systems that intuitively we think of as healthy persuasion and unhealthy propaganda.

Democratic persuasion involves:

· a serious effort to create background conditions that give each person access to the resources needed to fully participate in discussion; and

· a serious effort to create forums in which access to the discussion is based not on power or money but on a principle of equality; and

· a commitment of all participants to intellectual honesty in presenting arguments and a willingness to respond to the arguments of others.

Undemocratic propaganda involves deliberate:

· falsification of accounts of the world to support one’s interests, and/or

· attempts to ignore or bury accurate accounts of the world that are in conflict with one’s interests, and/or

· diversion of discussion away from questions that would produce accounts of the world in conflict with one’s interests.

There is one disturbing implication of this framework: It suggests that virtually all commercial advertising and a significant portion of our political discourse are propaganda, or at least propagandistic at some level. From this perspective, the advertising, marketing, and public relations industries would be described collectively as the propaganda industries. When we consider how much of our environment is constructed by those industries, we would hesitate to speak glibly about living in a democratic political system and a free society. When journalists become the transmission vehicle for much of this material, we might hesitate to speak glibly about a free press.

What do we say about the state of our political discourse when a presidential campaign can win the advertising industry’s “marketer of the year” award, as the Obama campaign did in 2008?[9] What do we say about a democracy in which a president’s chief of staff, when asked why the Bush administration waited until after Labor Day to launch its campaign to convince the American public that military action against Iraq was necessary, says, “From a marketing point of view, you don’t introduce new products in August”?[10]

We are left to ponder the degree to which deception, distortion, and distraction have become not perversions of an otherwise healthy public discourse but the perverse norm of that discourse. That question is disturbing, but rather than undermine our commitment to critical thinking, it should spur us to be more creative, soulful, and courageous.

-----------------------

Robert Jensen is a journalism professor at the University of Texas at Austin and board member of the Third Coast Activist Resource Center in Austin, one of the partners in the community center “5604 Manor,” http://5604manor.org/.

He is the author of All My Bones Shake: Seeking a Progressive Path to the Prophetic Voice (Soft Skull Press, 2009); Getting Off: Pornography and the End of Masculinity (South End Press, 2007); The Heart of Whiteness: Confronting Race, Racism and White Privilege (City Lights, 2005); Citizens of the Empire: The Struggle to Claim Our Humanity (City Lights, 2004); and Writing Dissent: Taking Radical Ideas from the Margins to the Mainstream (Peter Lang, 2002).

Jensen is also co-producer of the documentary film “Abe Osheroff: One Foot in the Grave, the Other Still Dancing,” which chronicles the life and philosophy of the longtime radical activist. Information about the film, distributed by the Media Education Foundation, and an extended interview Jensen conducted with Osheroff are online at http://thirdcoastactivist.org/osheroff.html.