The U.S. States Open Data Census took a new approach to rating open data across all 50 states and Puerto Rico: instead of a closed assessment process, states could watch their scores emerge and make improvements while the census examined their open data initiatives. U.S. Open Data launched the census early last year and completed it in February 2016. Waldo Jaquith, the organization's executive director, announced the final results via a tweet.

Governments approached Jaquith with the inspiration for the census. "This whole project wasn’t my idea,” he said. “While I was on stage last year at the Socrata Customer Summit, someone asked, 'Is there a data assessment for state data?’ It took, like, five people saying 'Does this exist?' for me to do it. I kept hearing from folks in charge of open data in other states that they wanted to be measured ... to see if they were doing a good job. ... Without it, they had no way of defending their work."

Jaquith described the process of assessing states’ open data efforts as a journey. "[It started by] collaborating with Rebecca Williams and Emily Shaw at [the Sunlight Foundation]. I worked with them to define the data sets. It turned out to be really fascinating to go through and personally survey the [data] holdings. One person surveyed their entire state; [other] people submitted suggestions and improvements."

Despite the interaction and improvements, "Overwhelmingly, it was a depressing process," said Jaquith. Many states didn't make important information like budgets or legislation available in open data or machine-readable formats. Several interviewees cited "data territorialism" as a reason open data programs suffer. Jaquith wrote on U.S. Open Data's blog: "It turns out that 73 percent of data sets published by states with data repositories are not found in the repository. Those data sets neither exist in the repository nor are they linked to from within the repository. When only 27 percent of extant core data sets are found on the site where the public is directed to go to find data, something has gone terribly wrong.

“Operators of state data repositories have got to inventory key state data holdings to ensure that they’re listed within the repository, and they also have to audit the search behavior on their sites to identify what data people want but cannot find. We’ve got to do better.”

Two states’ approaches, Connecticut's and Washington's, stood out for Jaquith. Connecticut scored the highest in the census. "[The census] was interesting to me, because states are like this middle child where attention for open data is on cities and the federal level,” said Tyler Kleykamp, chief data officer for Connecticut. "There isn't a lot out there that measures states’ open data. It was mostly, 'Do they have a portal or policy?'"

Some states might bristle at this kind of benchmarking. Others find it to be a friendly nudge in the right direction. Will Saunders, open data program manager for Washington, saw the value in benchmarking and noting areas for improvement. "The rubric for the State Open Data Census is really valuable for our program,” he said. “Things like restrictive licenses are easier to apply than to remove, and when a neutral third party calls attention to them, it makes us in the government data community think about when and why they are really useful."

CIOs and CDOs are reading over their census results as more of a to-do list than a scorecard. Nine categories provided a framework for state-level open data priorities, including checkbook (budget-related information), companies, incarceration, legislation, population projections, real estate, restaurant inspections and vehicle crashes. Each category is scored on 12 factors covering the openness, accessibility, licensing, machine-readability and completeness of the data. For instance, Washington met all 12 factors and earned an A+ on its corporations data. California's corporations section earned an F, missing five of the 12 factors, such as machine-readable data.

Connecticut’s high score came from Kleykamp's persistence. U.S. Open Data's assessment style allowed for improvement. "Here are critical data sets that we [U.S. Open Data and the Sunlight Foundation] think states should be publishing to measure effectiveness,” Jaquith said. That structure and the interactive, open nature of the program allowed states to improve their scores as the census developed. And the results aren’t set in stone: As states make improvements, they can contact U.S. Open Data to update their scores.

"I was keenly interested in how we’d stack up, and I feared we’d do poorly,” Kleykamp said. “I think Waldo sets a high bar, but that’s a good thing. So, I went through [the census] and made sure all of those data sets were discoverable. I tagged them with #USOPENDATA so they were easily findable. It wasn’t an exercise to come out on top, it was a genuine effort to see where we stacked up. I thought other states that’d been doing this longer than us would come out on top. I suspect other states might grouse about it, 'We don’t collect, so how can you judge us?' But if it’s important stuff, maybe we should be doing it. That was one of approaches we took.”

A few recommendations can be gleaned from looking at the census information and top-performing states:

1. Hire an open data leader with broad institutional knowledge and relationships. Top states tend to employ chief data officers with broad governmental relationships, which inform where data can be sourced and reduce data territorialism.

2. Find opportunities to leapfrog data territorialism. Can you link to another department’s information within your open data portal? Find opportunities for cross-department benefit. If possible, limit other departments’ work to identifying resources you can link to.

3. Use the census as an open data to-do list. Identify the lowest-hanging fruit from your state's census information and zero in on the criteria for each of those categories. What, of those criteria, could you make improvements on this month? In the next quarter? As your state makes improvements, share these wins with stakeholders and open data users. And don’t forget to report improvements to the U.S. Open Data Census.

Sarah Schacht is the author of an upcoming book on open government. She is an open government and civic technology consultant and speaker with more than 10 years’ experience working with governments, civil society, civic tech and open data companies. Her writing has appeared in The Seattle Times, The Oregonian and O’Reilly Publishing’s book Open Government.
