O'Reilly Radar » Greg Whisenant
Insight, analysis, and research about emerging technologies
http://radar.oreilly.com

How do we get government to share data?
http://radar.oreilly.com/2010/09/showstoppers-in-local-governme.html
Thu, 23 Sep 2010

On Tuesday we wrapped up the Manor Makeover in Manor, Texas, population 6,500. In some ways, this is ground zero for Gov 2.0 at the local level. The City of Manor has done some very innovative things on a shoestring, gaining attention ranging from the blogosphere to the national press and all the way up to the White House. In fact, keynote speaker Beth Noveck, Deputy CTO in the Office of Science and Technology Policy at the White House, wrote up a blog post just last night. The makeover is pretty impressive — they even followed my blog post from earlier this year about how to embrace Gov 2.0. (Not that they’ve seen it — it’s probably just obvious if you think hard enough about prioritizing limited resources.)

The champions of the Manor event (which “made over” a different city, De Leon, Texas) are Dustin Haisler (Manor CIO) and Luke Fretwell (GovFresh). They are both well ahead of the open government curve, and I think they’re absolutely on the mark: with a little elbow grease, the tools exist to make government more collaborative, transparent and participatory on a limited budget. With the right political backing, connections, knowledge and personal drive, a lot can be accomplished toward making government act more like a platform, as Tim O’Reilly and others have envisioned.

But how do we get this “perfect storm” of talent and resources to take off elsewhere? There are more than 18,000 cities and counties in the United States. Are any others doing it this well? Actually, yes. But I can pretty much count them on one hand. Watching (and giving) some of the Manor presentations, I shared the group’s excitement about the promise of government as a platform. Bonded by our collective vision, we all bring something to the table: code, software, political/administrative expertise, or perhaps just a dose of unbridled enthusiasm. However, something is “rotten in the state of Denmark.” Just yesterday, Luke posted a blog entry announcing the “end of a GovFresh era” — and this, one day after its most successful event.

As I listened, my mind kept wandering back to one common showstopper: without laws to compel the sharing of local data, it’s never going to happen at scale the way people are hoping. Even if we can get past the biggest issues — inertia and resistance to change — municipalities still don’t have the budgets and technical know-how to make it happen. Oh I know, I know — there’s always the argument that “the ROI on giving out data is a no-brainer”…sorry, but the evidence isn’t there yet. If it were so compelling, more governments would be doing it.

How, then, does the public get access to data, and ideally, to raw data streams? The typical hurdles to sharing information — technology, budgets, politics, and the absence of laws requiring its disclosure — are particularly acute in law enforcement, so it makes a good case study. Tight budgets are forcing many agencies to cut personnel — and budgets were already tight before the recession. Not to mention that crime data is at best controversial and at worst vulnerable to statistical manipulation by whoever can get their hands on it. So what can we do? As far as I can tell, there are three basic tactics:

1. Force: change the laws or create new regulations requiring agencies to disclose data and provide it as a machine-readable feed. (Note: some sunshine laws give access to limited data, but not as an ongoing feed or through an API.) This will take forever, so it’s not a great option.

2. Intimidation: enlist the news media to pressure the agency, or hire lawyers to threaten lawsuits that have no basis in law or fact (caveat: I am not a lawyer, but I have hired some to help me be more informed). Although some agencies might capitulate, most won’t. And then they’ll be put on the defensive.

3. Value creation: create value for agencies to entice them to share the data. Sometimes it’s as simple as asking; other times, they need to see how they will benefit or they won’t share anything. Still other times, it’s a non-starter.

Let’s talk about crime data

So what, exactly, is public data? Is crime data actually public data, and do agencies have to provide it? If so, why don’t they provide it as a raw data feed? The answer is this: while crime data concerns the general public, it is not exactly public data per se, and individual agencies have their own rules. Most won’t — or at least don’t — release it. And I don’t know of any agency of any size, anywhere in the country, that feels it is legally obligated to disclose any crime data to anyone as a machine-readable feed. Period, end of story. Please, prove me wrong.

So, given that #1 and #2 above are not going to be effective, and that there are no requirements to disclose data in an open format, let’s assume we have to use option #3 above, and create value to entice agencies to participate. We still have to deal with the barriers: budgets, technology and politics (fear, apprehension and skepticism about what will happen if they share it). While the barriers may not be completely justified, they are real.

The CrimeReports approach

At CrimeReports, there’s no question about it: we work for law enforcement. We create solutions for them. Because the laws don’t compel them to share information at all, they hold all the cards, so we draw them into voluntarily sharing crime data by showing them how it creates value for their agency. We solve their technology hurdles, and we make it affordable. Let’s face it, they have other things to worry about, like fighting crime.

One challenge with “making it affordable” is that the sweet-spot price averages just $100/month, and some agencies still can’t afford it. No surprise that at that price we lose money, and a lot of it, since it’s not easy to extract the data, standardize it, and do all of the work to make it available and easy to use. If it were easy, believe me, it would already be done.

The bottom line is that I think it’s good to have incident-level data available to the public in near real time. It’s good for communities. It’s good for the agency that provides it. And it’s good for individuals. But we need a business model that works. If we want that data out in any form, we need to be able to recover the cost of performing that service.

To keep us in check, the invisible hand will work its magic. If we price it too high, someone else can move in and replace us, which has already happened (we have viable competitors in the space). If you don’t think that will happen, think again. Our contracts are cancelable with 30 days’ notice, and we don’t have exclusive access to the data. If you, your company, the Department of Justice, or the local newspaper wants the raw data, nothing is preventing that from happening. (In fact, some of our customers publish crime data to more than one entity.) You just have to make it worthwhile to the agency, since the other two options listed above are unrealistic.

In a perfect world, every government agency in the country would provide an open and free data catalog. The early adopters have done it. More will continue to do so. We’ve come up with a process that works: it gets data out into the hands of the public, and that’s a step in the right direction. So let’s not let the perfect be the enemy of the good.

The discussion is far from over, and I think there’s more we can do. I’d love to hear what you think and keep the conversation running.

Four Steps to Gov 2.0: A Guide for Agencies
http://radar.oreilly.com/2010/02/four-steps-to-gov-20-a-guide-f.html
Mon, 08 Feb 2010

What Does the World Look Like When the Work of Government is Driven by the People?

Gov 2.0 has a lot of definitions, but in observing the exciting breadth of projects currently being built, it feels a little like the Blind Men and the Elephant, where everyone defines it based on their firsthand experience, but not from a holistic view. In its essence, Tim O’Reilly’s definition of Gov 2.0 is one where government acts as the catalyst to let others build upon its work — and most importantly, to multiply its impact.

For the first time in history, we’re really at a point where this is technologically feasible. Even if you have no specific tie to government, Gov 2.0 envisions a world in which — just by having experience and interests — ordinary members of the public willingly contribute to the knowledge, facts and policies that comprise our government. It might be as easy as carrying your cell phone. And it might take just 30 seconds.

In December, the Obama Administration released its long-awaited Open Government Directive, which was met with enthusiasm from some and an underwhelmed “meh” from others. The Administration has asked state and local governments to adopt the Directive, but that still raises the question:

If I am an agency head and want to embrace Gov 2.0, what should I do first?

Right now it’s a confusing whirlwind of options: Create raw datafeeds in machine-readable formats? Create iPhone apps? Use a wiki internally? Create a Facebook group or a Facebook page? Start posting to Twitter? The choices are infinite, but the resources are most definitely limited.

Below is a starting discussion, a “Four Steps to Gov 2.0,” designed to align the various Gov 2.0 stakeholders – individuals, governments, private companies, elected officials – toward the same goal in pursuit of open and participatory government. It applies to government at the federal, state, and local levels. It attempts to structure an agency’s actions as prioritized, consecutive steps, in a way that will reward those that adhere to it with more power, better engagement, and future compatibility with other government agencies, private companies, experts, and the general public. Even a few years ago, this would have been technologically impossible, or at least prohibitively expensive. Now, the biggest obstacles are simply a plan and the political will.

1. First and foremost, “convene the conversation.” Governments that want to win should first maximize the free contributions of the general public and experts for issues handled by that agency. Focus on creating the systems to foster self-organization and moderation (think user voting, forum moderation, and social reputation).

Before all else, this should be the first — and, initially, the only — goal of agencies at every level. The original Obama campaign site and Peer to Patent are great examples, and several other early examples are starting to emerge.

2. Next, examine your agency’s data and put it into three “buckets”. If you have not completed #1, go back and do that first because you’re leaving a valuable resource on the table. The buckets are:

Define high-value data sets that can be shared in machine-readable format. This is data that is not updated frequently, is unlikely to need improvement from the public, and is generally referential in nature. Examples might include historical spending, infrastructure details, and census-like data.

Define high-value data sets that can be interacted with via an API. This is data that anticipates improvement from the public, and/or which regularly needs to stay updated by the agency. Examples might include permits, locations of buildings, and crime data.

Define the data types that are not shared, period. Shine a bright light on these data types, and make very clear statements as to why they are not shared. If “getting to the data” is the reason for not sharing, put that to the community; you will likely find someone willing to help you get that data out for free. Examples are data that is already protected by law, or which contains personally identifiable information.
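The three buckets amount to a classification exercise an agency can write down explicitly. A minimal sketch of such a catalog, with hypothetical dataset names and bucket labels chosen for illustration only:

```python
# Hypothetical data catalog; dataset names, bucket labels, and reasons
# are illustrative assumptions, not any agency's real inventory.
DATA_CATALOG = [
    {"dataset": "historical_spending", "bucket": "static_feed",
     "reason": "referential, rarely updated"},
    {"dataset": "building_permits", "bucket": "api",
     "reason": "updated regularly; public can contribute corrections"},
    {"dataset": "juvenile_case_files", "bucket": "not_shared",
     "reason": "protected by law; contains personally identifiable information"},
]

def datasets_in(bucket):
    """Return the names of catalog entries assigned to one bucket."""
    return [entry["dataset"] for entry in DATA_CATALOG
            if entry["bucket"] == bucket]
```

Publishing the `not_shared` entries with their reasons is itself a transparency win: the public sees exactly what is withheld and why.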

3. Next, build the datafeeds, because they will help maximize the public’s information and contribution in Step #1. Push this data to the public in machine-readable formats: XML, RSS, or CSV, accessible via Web services.
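A datafeed in this sense need not be elaborate. As a sketch, the same hypothetical incident records (the field names are assumptions for illustration) can be rendered as both CSV and a minimal RSS 2.0 feed using only the Python standard library:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical incident records; a real agency would pull these from
# its records-management system.
INCIDENTS = [
    {"id": "1001", "type": "Burglary", "date": "2010-02-01", "block": "100 MAIN ST"},
    {"id": "1002", "type": "Theft", "date": "2010-02-02", "block": "200 OAK AVE"},
]

def to_csv(records):
    """Render records as a CSV feed, one machine-readable option."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "type", "date", "block"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

def to_rss(records):
    """Render the same records as a minimal RSS 2.0 feed."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Agency incident feed"
    for rec in records:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = f"{rec['type']} near {rec['block']}"
        ET.SubElement(item, "pubDate").text = rec["date"]
        ET.SubElement(item, "guid").text = rec["id"]
    return ET.tostring(rss, encoding="unicode")
```

Serving the output of either function over HTTP is all a Web service needs to do at this step; the format matters far less than the feed being stable and regularly refreshed.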

4. After the datafeeds are complete, build the API. Look at this as a social compact, where as part of the exchange, companies and members of the public are able to return value back to the agency, creating an infinite loop of ever improving data. Use it to generate mechanical turk-like assistance from the public. I’ll explain some of the key components of an effective Gov 2.0 API in a future post.
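The shape of that social compact can be sketched in a few lines: a read side anyone can query, and a write side where the public proposes corrections that agency staff review before applying. Everything here (the store, the function names, the status codes) is a hypothetical illustration, not a prescribed API:

```python
# Hypothetical in-memory store standing in for an agency database.
incidents = {
    "1001": {"type": "Burglary", "block": "100 MAIN ST", "status": "published"},
}
pending_corrections = []  # public submissions awaiting agency review

def get_incident(incident_id):
    """Read side of the API: anyone may fetch published records."""
    record = incidents.get(incident_id)
    if record is None or record["status"] != "published":
        return {"error": "not found"}, 404
    return record, 200

def submit_correction(incident_id, field, proposed_value, submitter):
    """Write side of the compact: the public returns value to the agency
    as proposed corrections, queued for staff review before applying."""
    if incident_id not in incidents:
        return {"error": "unknown incident"}, 404
    pending_corrections.append({
        "incident_id": incident_id,
        "field": field,
        "proposed_value": proposed_value,
        "submitter": submitter,
    })
    return {"status": "queued for review"}, 202
```

The review queue is the important design choice: contributions improve the data over time, but the agency keeps final say over what is published, which is what makes the exchange acceptable to a cautious agency in the first place.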

After these steps have been accomplished, look at building a regular Web site, specific applications, and services. Agencies that prioritize in this order won’t put themselves at risk of building social silos (these are social networks that end at the boundary of the town, state, or agency).

I’ll consider each of these steps individually in subsequent blog posts. If you have more ideas, please let me know here or send a note at greg [at] crimereports.com.