Applicant(s)

Size of team/newsroom: large

Categories:

Description

There was no bigger story in the United States than the 2016 presidential campaign. Our challenge was to bring light to it at a time of deep distrust of both politics and the media. For years, NPR and other news organizations fact checked candidate statements, declaring them true, nearly true, or outright false. But despite our commitment to contributing to a more informed public, our work was often weaponized by campaigns, thrusting us into echo chambers that further undercut our credibility. Aggressive fact checking often seared falsehoods into our collective memory rather than wiping them out.
So in 2016, NPR approached the task with an eye toward building trust with our audience through immediacy and transparency. We eventually landed on the old-school idea of annotating candidates' words. After important speeches, beat reporters from across the newsroom checked the facts and provided context and background. We inserted their work into the raw transcript for readers, complete with bylines and photos.
For the conventions, we refined the design. While watching the acceptance speeches, our audience followed along on NPR.org, reading transcripts and annotations with a delay of only a few minutes. The audience responded positively to our coverage.
By the fall, we had designed an entirely new workflow that supported true real-time annotation, with dozens of reporters contributing simultaneously. Our reporters were primed, and so was the audience. Approximately 3.5 million people watched the first presidential debate with us, for an average of almost 9 minutes each. In total, roughly 17 million people viewed NPR's annotations in 2016 and engaged with them at least 30% longer than with the average NPR story.

What makes this project innovative? What was its impact?

Prior to NPR’s annotation project, fact checking generally involved taking a quote out of context and assessing it. To deepen audience trust, we showed the full context and allowed readers to see the politicians’ words and our work, side-by-side in real time.
Typically, viral successes pair shallow content with fleeting engagement. This project combined substantive reporting with strong engagement and still became more popular than anything NPR had ever done on the web. It is impossible to fully quantify the impact of bringing millions of new audience members to our content, but it has expanded the audience for our careful, thoughtful reporting in significant ways. It's not every day that famous rappers tweet out our content (https://twitter.com/talibkweli/status/780980021741641728)!
We are happy to say that we have inspired many news organizations to join our annotation efforts. Vox Media had a live annotation of President Trump’s inaugural address, and The New York Times added a robust annotation later that day. Both projects used code and design ideas developed at NPR. And NPR member stations have implemented the tool to cover local and regional issues in compelling ways (http://nprillinois.org/post/governor-rauners-budget-address-annotated#stream/0).

Technologies used for this project:

As David Eads tweeted (https://twitter.com/eads/status/780578980957151232?ref_src=twsrc%5Etfw&ref_url=https%3A%2F%2Fsource.opennews.org%2Farticles%2Fhow-npr-transcribes-and-fact-checks-debates-live%2F), the system looks a bit like this:
transcription service ←→ google app script → google doc (+18 factcheckers) ←→ server → s3 → embedded widget
Tyler Fisher has a detailed writeup here: https://source.opennews.org/articles/how-npr-transcribes-and-fact-checks-debates-live/
The stack is an unholy mashup of Google Apps Script, Python, Amazon S3 and CloudFront, and virtual-dom frontend code.
As the project has evolved, we have also integrated Dan Schultz's OpenedCaptions system and Pietro Passarelli's OpenedCaptions buffer server to use publicly available closed-caption feeds.
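At the heart of the pipeline above, a server periodically reads the shared Google Doc, where the transcript and the reporters' annotations live side by side, and publishes structured JSON for the embedded widget. As a rough sketch of that parsing step (the inline marker syntax and field names here are assumptions for illustration, not NPR's actual format), it might look like:

```python
import json
import re

# Hypothetical marker syntax: reporters insert fact checks inline as
# [[initials: annotation text]]. NPR's real doc format differed.
ANNOTATION_RE = re.compile(r"\[\[(\w+):\s*(.*?)\]\]", re.DOTALL)
SPEAKER_RE = re.compile(r"^([A-Z]+):\s*(.*)$", re.DOTALL)

def parse_transcript(doc_text):
    """Turn raw doc text into a list of speaker turns, each carrying
    any inline annotations as structured data for the widget."""
    turns = []
    for para in doc_text.split("\n\n"):
        para = para.strip()
        if not para:
            continue
        m = SPEAKER_RE.match(para)
        speaker, body = (m.group(1), m.group(2)) if m else (None, para)
        annotations = [
            {"author": author, "text": text.strip()}
            for author, text in ANNOTATION_RE.findall(body)
        ]
        turns.append({
            "speaker": speaker,
            # Strip the markers so readers see clean transcript text.
            "text": re.sub(r"\s{2,}", " ", ANNOTATION_RE.sub("", body)).strip(),
            "annotations": annotations,
        })
    return turns

doc = """CLINTON: The economy [[dk: Jobs data supports this.]] is growing.

TRUMP: Nobody knows the system better than me."""
print(json.dumps(parse_transcript(doc), indent=2))
```

In the real system this JSON would then be pushed to S3 behind CloudFront, and the virtual-dom widget would poll it and re-render only the turns that changed.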