Methodology

“Hello,” wrote the anonymous source to a German newspaper, “this is John Doe. Interested in data?” Thus began what would soon become an international financial investigation into what are being called the Panama Papers—an investigation so massive that even whistleblower Edward Snowden, on Twitter, called it the “biggest leak in the history of data journalism.”

As journalists we are used to saving information. We securely store documents, keep meticulous notes, save and back up important emails on our computers. But what about the information we find online during the course of our investigations?

Confiscati Bene, released in mid-December in Europe, is a pioneering data journalism collaboration that digs into the $4 billion in goods confiscated from criminals by authorities across the EU. An international team of journalists and their allies sought to create a European database of seized assets and answer troubling questions about the accountability of the process. Confiscati Bene (literally, Well Confiscated) received support from GIJN member JournalismFund.eu; the main project can be seen at http://eu.confiscatibene.it.

By Gary Price | November 13, 2015

We’re back with another Research Desk post. We’ve curated a collection of two new research databases and thirteen new or updated research reports. All of the resources listed and linked below are free to access and use.

Web scraping is a way to extract information presented on websites. As I explained in the first installment of this article, web scraping is used by many companies. It’s also a great tool for reporters who know how to code, since more and more public institutions publish their data on their websites.
With web scrapers, which are also called “bots,” it’s possible to gather large amounts of data for stories. But what are the ethical rules that reporters have to follow while web scraping?

$8 billion in just a few hours earlier this year? It was because of a web scraper, a tool used by companies and by many data reporters. A web scraper is simply a computer program that reads the HTML code of webpages and analyzes it. With such a program, or “bot,” it’s possible to extract data and information from websites.
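To make the idea concrete, here is a minimal sketch of such a bot using only Python’s standard library. The HTML table below is a made-up stand-in for the kind of data a public institution might publish on its website; in a real scraper you would first download the page, for example with `urllib.request.urlopen(url)`, before feeding it to the parser.

```python
from html.parser import HTMLParser

# Hypothetical sample page; a real bot would fetch this over the network.
SAMPLE_HTML = """
<table>
  <tr><td>Asset A</td><td>1200000</td></tr>
  <tr><td>Asset B</td><td>350000</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collects the text of every <td> cell, grouped row by row."""

    def __init__(self):
        super().__init__()
        self.rows = []        # finished rows of cell text
        self._row = None      # cells of the row being read, if any
        self._in_cell = False  # are we currently inside a <td>?

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(SAMPLE_HTML)
print(scraper.rows)  # [['Asset A', '1200000'], ['Asset B', '350000']]
```

In practice most reporters reach for libraries like Requests and Beautiful Soup, which handle messy real-world HTML far more gracefully, but the principle is the same: fetch the page, walk its markup, and keep the cells you care about.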

In this just-released video, investigative reporter Mark Schapiro goes in-depth on how to use investigative techniques in probing often complex environmental issues. Schapiro, a veteran of the original Center for Investigative Reporting, gave this talk in Hamburg at NR15, the July 2015 annual conference of Netzwerk Recherche, Germany’s investigative journalism association.

The latest tools and resources from the Research Desk: new world of drones databases available, reports from the European Parliament Research Service, Top Green Companies in the World 2015, a handy free extension to download entire pages or individual files, and more.

We’re already seeing the use of drones proliferate across a whole variety of stories — from incredible imagery of the vastness of the natural world, to investigative stories that couldn’t be told with conventional cameras, to views of the inaccessible right under our noses. So how are the drone journalists of the future being trained for their work?