This is the first post in our Curate for Justice series, profiling and sharing lessons from innovative documentation and advocacy initiatives, specifically those that curate eyewitness media to monitor a human rights issue. If you work on such a platform yourself, we’d love to hear from you. Click here to learn more.

A year ago, WITNESS collaborated with the FiSahara International Film Festival to develop Watching Western Sahara, a platform for curated and contextualized footage of social movements and human rights abuses in the occupied territory.

In that process, we identified critical questions to ask when designing a platform for curated citizen footage. We hope that these questions will help peers embarking on similar initiatives identify stakeholders and determine appropriate tools, team, process, and workflows to curate citizen reports.

For the purposes of this guidance, we’ll use the following terms:

Report – An individual citizen report (e.g. a photo of a weapon, a video of a protest, a first-person description of a polling station, or an eyewitness video of a police encounter)

Open-Source Research – The use of publicly available documentation and data, often recorded and shared by non-professionals (such as YouTube videos and social media posts), in an investigation

Who creates the reports?

A trained team of staff or volunteers? Ordinary people taking pictures or videos for their own purposes? Human rights activists? Perpetrators of abuse? The profile of the people behind the reports will shape many aspects of the workflow.

Footage recorded by a trained team of professionals, for instance, may not require the same level of verification as footage found online from unknown sources. Footage from activists, government propagandists, or anonymous groups may be disseminated to further a political agenda or to sow chaos and confusion. Footage from at-risk activists could be used against them by repressive regimes if their identities are revealed.

Are the creators at risk, or will they be if their reporting is exposed and/or amplified?

How are the reports discovered and included?

Will the reports be submitted to a database by creators? Or will they be found by curators through open-source research?

If reports are to be submitted, what is the outreach plan to ensure that those creating the reports are aware of the database or app, are motivated to contribute to it, and have the technical capacity to submit reports while minimizing risk?

Tools such as ProofMode, Eyewitness for Human Rights, Ushahidi, and the ACLU Mobile Justice App have been created to facilitate the process of submitting reports from the ground to a database. In some contexts, using a specialized app may be preferred, while in others, having a human rights-related app on a cell phone could put a person at risk if their phone is confiscated.

Who curates and/or verifies the reports?

Are they trained professionals? Trained activists? Will they be identified as a cohort, or can anyone participate anonymously? Will they be dispersed or a part of a team? Are they also the creators of reports, or do curators conduct open-source research to find and verify online reports?

If the curators are responsible for finding reports, they may need to be trained in open-source curation, as is the team at the Digital Verification Corps, an initiative of Amnesty International and several universities. First Draft is an initiative dedicated to sharing resources on verifying and contextualizing open-source media.

What are their technical skills and capacity? What languages do they speak? What tools do they use?

The capacity of the curators will determine the tools used, the training required, and the resources needed. For instance, if the curation process will be conducted by a team, a curation platform such as Check, designed for transparency and collaboration, may be valuable. If the process will be conducted by a small trained group, you can design a workflow that fits the team’s needs and capabilities.

Do the reports contain sensitive identifying information?

If the reports identify political activists in a repressive environment, document someone being sexually harassed, or otherwise expose an individual in a compromising situation, what steps will be taken to assess and minimize harm caused by the curation and amplification of such reports?

Are the reports at risk of being taken down from social networks and websites?

In many cases around the world, activists have found that valuable and sensitive reports of human rights abuses are targeted by online takedown requests, resulting in the reports being removed by the hosting website. In other cases, the uploader may choose to take down a report or restrict access to it due to unanticipated attention.

Will amplifying the reports increase the risk that they are removed from public access?

If there is a significant risk that valuable footage may be removed, you may want to consider developing an archive to preserve the curated reports, like the Syrian Archive does in its process of collecting and analyzing online reports of the war in Syria. To get started, see this blog by WITNESS’s archivist for considerations around archiving online media.
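One small, concrete step in preserving at-risk footage is recording a cryptographic fingerprint of each file at the moment it is collected, so the archive can later show the copy has not been altered. Below is a minimal Python sketch of that step; the function name and file path are illustrative, not part of any specific archiving tool mentioned above.

```python
import hashlib


def fingerprint_report(path: str, chunk_size: int = 65536) -> str:
    """Compute a SHA-256 hash of an archived report file.

    Storing this hash alongside the file (with the collection date and
    source URL) lets an archive later demonstrate that the preserved
    copy is identical to what was originally collected.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files don't need to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

In practice the hash would be stored in the archive's catalog next to the file, and recomputed periodically to detect corruption or tampering.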

Who is the target audience?

This will determine the language of the content, as well as whether the reports must be explained in a way that is easily understood by those unfamiliar with the local context. For instance, many of the videos we curated on Watching Western Sahara document protests. To ensure that international viewers could understand what the protests were about, we often translated chants and protest signs from Arabic into English.

What standards must the curated reports meet for their intended use? Lawyers, for instance, will require more detail about the verification process and the source of a report than news consumers will. What information will be valuable to them? Will they be interested in seeing patterns by analyzing reports by geography, time, and other factors? Consider the needs of your audience so that when you process reports into a database, you can add appropriate context and metadata, and determine the best way to publish the curated reports, from a map or a raw database to an interactive visualization.
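To make the idea of "appropriate context and metadata" concrete, here is a minimal Python sketch of the kind of record a curation database might keep for each report. The field names are illustrative assumptions, not a standard schema or the structure used by Watching Western Sahara.

```python
from dataclasses import asdict, dataclass, field
from typing import Optional


@dataclass
class CuratedReport:
    """Illustrative metadata record for one curated citizen report."""

    source_url: str                    # where the report was found or submitted
    media_type: str                    # e.g. "video", "photo", "text"
    captured_at: Optional[str] = None  # date/time of the documented event, if known
    location: Optional[str] = None     # place the report documents
    language: Optional[str] = None     # original language of the content
    translation: Optional[str] = None  # translated chants, signs, or narration
    verification_notes: str = ""       # how and by whom the report was verified
    tags: list = field(default_factory=list)  # themes for pattern analysis


# A hypothetical record for a protest video:
report = CuratedReport(
    source_url="https://example.org/video123",
    media_type="video",
    language="Arabic",
    translation="Chant: 'No alternative to self-determination'",
    tags=["protest"],
)
record = asdict(report)  # ready to store or export as JSON
```

Fields like `captured_at`, `location`, and `tags` are what make later analysis by geography, time, and theme possible, so it helps to capture them consistently at curation time rather than retrofitting them later.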

Is the content graphic or disturbing?

If so, you may want to consider providing vicarious trauma resources and resiliency training to the group of curators, and including a warning for the audience. Check out this resource for guidance from WITNESS on handling graphic footage, and this guide by First Draft on how to mitigate the impact of viewing distressing images.

Have you created or worked with a platform of curated citizen reports? What questions and guidance would you add to this list? Write them in the comments, or fill out our questionnaire to help us gather resources and strategies from diverse perspectives and experiences.