Webinar Recap: Newest Lessons in using Open Data Effectively

Author:

Seth Otto

September 7, 2017

This past July, NetHope’s Crisis Informatics team produced a three-part webinar series dedicated to open-source tools that are readily available to the humanitarian sector. Topics included open imagery, crowdsourcing solutions, and the topic of this recap: open data.

NetHope’s Newest Lessons in using Open Data Effectively webinar hosted two of our industry’s leading open data providers: Humanitarian Data Exchange and Humanitarian OpenStreetMap Team. Representatives from both organizations shared the context and efficacy of their tools in the humanitarian space, and how new users can participate.

The Humanitarian OpenStreetMap Team (HOT) is a small nonprofit that applies the principles of open source and open data sharing to disaster response and economic development through its unique use of the free geospatial database OpenStreetMap (OSM).

Mhairi O’Hara, a GIS Project Manager at HOT, provided an overview of their OSM export tool and some of the important work that has been done with the extracted data. O’Hara suggested that OSM is the Wikipedia version of Google Maps, since its geo-data is open and created through crowdsourcing and small-scale funded projects. With the help of skilled humanitarian organizations, OSM is an invaluable tool for resource-poor governments and communities that lack the capacity, capability, and funding to produce accessible geo-data.

In some places, the geospatial information in OSM can be much more detailed than in Google Maps. Often down to the building level, its information can be vital for humanitarian response efforts. After the Nepal earthquake, OSM’s global network of online volunteers mapped over 6,000 square miles in only four days.

A small nonprofit with only 40 staff members, HOT is 99 percent volunteer-based, with over 3,000 online and offline volunteers worldwide. HOT is engaged in three kinds of work: disaster mapping, community development, and technical projects. Much of their work takes place in the wake of humanitarian crises and natural disasters, training volunteers in affected areas to contribute data to OSM. They also train users in the humanitarian space to use the tools necessary to access and refine the data, and to apply it to achieve desired outcomes.

HOT’s community development work cultivates partnerships across multiple sectors to provide training in open-source digital mapping concepts and tools. Their work in this area emphasizes the inclusion of women and people with disabilities. Once trained and equipped, community members can participate in efforts like HOT’s Missing Maps project, which focuses on mapping areas whose vulnerable populations are often neglected in humanitarian response campaigns.

O’Hara explained that some of the many technical projects that HOT is engaged with are based on tools that they have created to help users access and understand OSM’s data. This video tutorial describes that process.

Among HOT’s many projects are initiatives in Turkey and Uganda focused on supporting refugee self-reliance by training community leaders to map data, such as possible vulnerabilities and potential assets for refugees in the areas where they live. The #MappingAgainstMalaria initiative trains volunteers who work remotely to map settlement areas in seven countries. Organizations and local governments working to slow the spread of malaria use this data in targeted intervention campaigns to reach those living in sparsely populated areas.

The Humanitarian Data Exchange (HDX) is the United Nations Office for the Coordination of Humanitarian Affairs’ (OCHA) platform for open data sharing. The objective of HDX is to make humanitarian data easy to find, share, and use in one place. Godfrey Takavarasha, an Information Management Officer with OCHA, presented a detailed tour of the HDX platform. With close to 300 member organizations signed on, HDX has become the most popular open platform for sharing humanitarian data. Over 4,900 datasets have been shared on HDX since it was launched in July 2014. It has over 5,000 registered users and has recorded over a quarter of a million unique visitors, and those numbers are growing.

With HDX quickly becoming the default source for global data in the humanitarian space, Takavarasha explained that OCHA vets and approves every humanitarian organization that shares its data through the platform. Each and every dataset is put through a quality assurance process that requires it to meet minimum quality standards, as established in the HDX terms of service. Special attention is paid to ensure that data containing personally identifiable information and other sensitive personal and demographic information is not publicly shared on HDX.

Not everyone who visits HDX is a data-savvy user. Data for a given geographical location are often wide-ranging and can feel overwhelming. Users are permitted to download data sets for use in their preferred visualization software, but HDX also provides multiple data visualization tools that allow users at any experience level to parse through large amounts of data with the ease of a few clicks on any given map.

Before data can be used, it must first be shared.

Describing the best practices of humanitarian open data that HDX is working to establish, Takavarasha emphasized several key factors. In his view, sharing-as-a-mindset should be on par with attitudes around distributing annual reports in the nonprofit sector: Before data can be used, it must first be shared. To keep shared data current in an efficient way, Takavarasha suggested that organizations automate the process of sharing data to save time and keep data relevant. Finally, the data must be well documented so that users never need to wonder what the data is measuring or how.

Takavarasha tied both presentations together with an example of best practice with humanitarian open data. HOT has built a data export tool (see video above) that integrates with HDX and automatically pushes weekly updates using the HDX Python library. As more organizations follow this model of data sharing, the picture on the ground becomes more accurate for local communities and front-line field workers, allowing for more targeted and effective intervention efforts.
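HOT’s actual integration is built on the HDX Python library; as a simple illustration of the automation pattern Takavarasha recommends (push updates on a schedule, but only when the data has actually changed), here is a minimal sketch. The function names, dataset name, and `push_to_hdx` stand-in are all hypothetical, not the real HDX API.

```python
import hashlib
import json


def push_to_hdx(dataset_name, payload):
    """Hypothetical stand-in for the real upload call in the HDX Python library."""
    print(f"Uploading {len(payload)} bytes to dataset '{dataset_name}'")


def weekly_sync(dataset_name, export_rows, last_uploaded_digest):
    """Push a fresh export only when its contents have changed.

    Returns the digest of the current export so the caller (e.g. a weekly
    cron job) can store it and pass it back on the next scheduled run.
    """
    # Serialize deterministically so identical data always hashes the same.
    blob = json.dumps(export_rows, sort_keys=True).encode("utf-8")
    digest = hashlib.sha256(blob).hexdigest()

    if digest != last_uploaded_digest:
        push_to_hdx(dataset_name, blob)  # data changed: share the update
    return digest


# Example: two consecutive weekly runs with unchanged data.
rows = [{"area": "settlement-7", "buildings": 412}]
d1 = weekly_sync("example-osm-export", rows, last_uploaded_digest=None)
d2 = weekly_sync("example-osm-export", rows, last_uploaded_digest=d1)
```

The second run computes the same digest and skips the upload, which is the property that keeps scheduled sharing cheap enough to run unattended.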

HDX will continue its work under the umbrella of the Center for Humanitarian Data, which was established by OCHA in August 2017. Located in The Hague, the Center will focus its activities across four areas: 1) data services; 2) data policy; 3) data literacy; and 4) network engagement.

To access the presentation and collateral material, and to listen to the Newest Lessons in using Open Data Effectively webinar recording, which includes an extensive Q&A session not covered in this recap, visit the webinar landing page.