Challenges Holding Back Open Data In Nonprofits

Public discourse around the many applications of big data has generally revolved around its uses in the private sector, whether to target advertising to consumers or to increase operational efficiency. However, one area has hitherto been neglected, and it has the potential to do substantial social good: the nonprofit sector.

In recent years, nonprofits have made great strides towards collecting and exploiting data, finding ever more innovative ways to help solve some of the world's oldest problems. However, impactful data projects are not cheap. To join the big data revolution that has proven so beneficial in other areas, many nonprofits rely on open data. Fortunately, there is also motivation to share such data.

One nonprofit leading the way in open data is the World Bank. We spoke to Eric Anderson, a senior disaster risk manager and ICT specialist with the World Bank, about how open data was helping the organization bring relief to millions and help achieve its ultimate goal of ending world poverty.

How important is open data for disaster management?

Oh, super important! One of the biggest challenges during an emergency is that you simply don't have the time to clean up, organize, and begin the data discovery process. Having the data already open means it's more user-friendly: easier to discover, easier to access, and easier to use and reuse. It also means you don't have to worry about licensing or permissions. So I think open data as a philosophy and as a standard has really helped solve that data problem during emergencies.

Have you seen more organizations making their data open in recent years? Is there more that people could be doing?

In international development, I think recognition has increased that the underlying data of risk assessments, damage maps, and other data-heavy products should, as much as possible, be made open. I think the World Bank, certainly many UN agencies, and governments are beginning to treat these input datasets as public goods themselves. It is a little bit of a tricky situation because many of these datasets are expensive to collect and maintain, and there are a lot of data gaps. So I think one area that needs improvement is making government data openly available where there is a humanitarian case for it.

Do commercial obligations hold back open data?

I think that's really the forefront of the challenge: figuring out the business model for maintaining datasets. On one hand, there's a good case that some datasets should be viewed as public goods and made freely and openly available. But on the other hand, someone has to pay to collect and maintain them.

You see this with mapping data, particularly elevation models, and with constant time-series data like weather data. I don't think anyone has really cracked it yet; there are different ways forward, but a business model and a strategy certainly have to be built around the data.

How is technology helping you?

I think technology is driving a real demand for full raw data. You know, there are more devices connected to the internet now than people, so users are more likely to interact with datasets from their mobile phones than by downloading a PDF from a website. If you think about day-to-day activities, things like looking up navigation or checking for the next train, you look it up on your phone; you're not going to download and print out an itinerary or schedule. That's really pushing agencies to make the underlying datasets machine readable and to define standard APIs for different applications.
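The shift described above, from static PDFs to machine-readable data, can be illustrated with a minimal sketch. The station data below is entirely invented; the point is only the transformation from a tabular source into JSON, the kind of payload a standard API would serve to a mobile application.

```python
# Minimal sketch: converting a tabular dataset into machine-readable JSON.
# The stations and times below are hypothetical illustration data.
import csv
import io
import json

# A tiny CSV "dataset", standing in for a file an agency might publish.
raw = """station,next_departure
Central,08:05
Riverside,08:12
"""

# Parse the CSV into structured records, then serialize as JSON,
# which any phone app can consume directly.
records = list(csv.DictReader(io.StringIO(raw)))
payload = json.dumps(records)
print(payload)
```

Once data is in this form, the same records can back a timetable app, an SMS service, or a third-party mashup without anyone scraping a PDF.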

As you mentioned, IoT devices are becoming more prevalent and making more real-time data available. What are the challenges of collecting all that data? Do you think governments are prepared for the future?

The challenge is not just collecting the data but using it. I think in the developing world, there's obviously been a shortage of resources, skills, and people. Hence, we have to design data services for that bandwidth-constrained environment, and I mean "bandwidth" in many senses of the word: the human capital as well as the skills, resources, and budgets these governments have.

I think IoT can help in other areas, but in some, there may be a deluge of data that isn't being actioned. Sources that are low complexity and low cost are actually part of the innovation today. For example, with OpenStreetMap, you know you don't have to worry about hosting as there are over three million volunteer editors and a community there. If a developing country wants to engage with OpenStreetMap, it's a relatively low barrier to entry.

When we think about IoT in the developing world, we're thinking about simple SIM-card-connected devices like weather stations, flow meters, and other sensor-driven datasets. However, we also consider mobile phone transactions, as mobile money is actually very big in East Africa. When people buy water with their phones, they digitize a payment that yields a lot of by-product information. You can begin to see rural water points working through mobile money transactions, so if there are no sales, presumably there's no water.

This means you might be able to see the signals for drought and climate change early. When those surface waters dry up, people come to the formal water points so the number of users goes up tenfold. However, are governments really set up to see those flags and to take action on those flags? That's part of the organizational change that needs to accompany any kind of new investment in technology and data.
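The signal-spotting described above can be sketched in a few lines. All sales figures, water-point IDs, and thresholds here are invented for illustration; a real system would tune them against historical transaction data.

```python
# Hedged sketch: flagging possible drought stress from mobile-money
# water sales, per the interview's two signals: a point going silent
# (possibly dry) or a sales surge (users crowding a formal point as
# surface water dries up). All numbers below are hypothetical.

def flag_water_points(weekly_sales, surge=3.0):
    """Compare each water point's latest week to its earlier average.

    weekly_sales: dict of water-point id -> list of weekly sale counts.
    Returns a dict of flagged ids with a short reason string.
    """
    flags = {}
    for point, sales in weekly_sales.items():
        baseline = sum(sales[:-1]) / max(len(sales) - 1, 1)
        latest = sales[-1]
        if latest == 0:
            flags[point] = "no sales: point may be dry"
        elif baseline > 0 and latest >= surge * baseline:
            flags[point] = "sales surge: possible influx from dried surface water"
    return flags

# Example with made-up transaction counts:
sales = {
    "WP-001": [40, 42, 38, 160],  # sudden surge in the latest week
    "WP-002": [25, 30, 28, 0],    # went silent
    "WP-003": [50, 48, 52, 55],   # normal variation
}
print(flag_water_points(sales))
```

The interesting design question, as the interview notes, is less the detection itself than whether an agency is organized to act when such a flag is raised.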

If you want to hear more about how data is rapidly transforming the world, attend our Big Data Innovation Summit in London, March 21-22.
