Category: Open Government

At the OSCON session on Hacking Open Government, Secretary of State Debra Bowen talked about the mismatch between the process of certifying voting systems, the changing nature of voting requirements, and the goal of open source voting software.

Currently, voting systems must be certified before they can be used in elections. The certification process entails submitting code to a testing agency that keeps the code, tests, and results proprietary. The Secretary of State’s office has access to the data; citizens don’t. The testing process is long and cumbersome, creating a significant barrier to new entrants, including open source voting systems. When new requirements are added, the system needs to be re-certified, imposing a long delay on the adoption of modifications.

This testing process is based on a model that is older than current best practices for software design: a “waterfall” method, where software is developed, and testing is done, all in one piece after the fact.

Current best practices are different in a number of ways.
* Software is developed incrementally, and testing is done continuously, as the software is built.
* Tests are written before the software is developed. Tests serve as the detailed specification for the way the software is intended to function.
* Tests are written incrementally. New tests are added to govern new behavior.
* There are automated test suites that verify that the system continues to pass tests, for both old and new behavior.

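As a minimal sketch of the practices above – tests written first, serving as a public specification, with the whole suite run automatically – consider a toy vote-tallying function. The `tally_votes` function and its rules are hypothetical, invented purely for illustration; real voting software is far more involved.

```python
# Tests are written first and double as a detailed, publicly readable
# specification of the intended behavior. (tally_votes and its rules
# are hypothetical, invented for illustration.)

def test_counts_each_choice():
    counts = tally_votes(["A", "B", "A"])
    assert counts["A"] == 2
    assert counts["B"] == 1

def test_blank_ballots_are_undervotes():
    # A new test, added incrementally to govern new behavior.
    assert tally_votes(["A", None])["_undervotes"] == 1

# The implementation is then written, incrementally, to make the
# tests pass.
def tally_votes(ballots):
    """Count one vote per ballot; blank ballots are recorded as undervotes."""
    counts = {"_undervotes": 0}
    for choice in ballots:
        if choice is None:
            counts["_undervotes"] += 1
        else:
            counts[choice] = counts.get(choice, 0) + 1
    return counts

# An automated run of the whole suite verifies old and new behavior together.
test_counts_each_choice()
test_blank_ballots_are_undervotes()
print("all tests pass")
```

The point is not the counting logic but the workflow: each new requirement arrives as a new test, and the automated suite confirms that earlier behavior still holds.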
This suggests a different process for voting system certification.
* Tests are made publicly available. Detailed tests serve as specifications for the behavior of the voting system.
* There is an automated test suite that continually tests the behavior of voting software.
* New functionality can be added to systems and tests incrementally. Tests will verify that the system continues to function correctly, for old behavior and new.
* Results of tests are publicly available.

Using an incremental, test-driven process for voting system development and certification would improve the reliability of the process, by enabling more scrutiny. It would shorten the time needed to introduce new voting system improvements. And it would lower the barrier to new entrants, including open source systems.

This testing would cover only the functional behavior of the system – are votes counted correctly, does the administrative process work. There is still a need for security and penetration testing, which goes beyond the function of the code and includes all aspects of the system: physical security, authentication practices, data integrity, and more. And there is still a need for usability testing – which, as far as I know, is not yet part of voting system certification. Usability problems account for a larger share of day-to-day voting system failures than technical failures do, although technical failures can have disastrous results.

Still, opening up the functional testing process, and running it incrementally, seems as though it might offer significant benefits.

For practitioners of modern software development and testing – what do you think about this suggestion? Are there any big gaping holes that would make this nonsensical or infeasible? Feedback most welcome.

Last week’s brainstorming session on the use of social media for voter education got me thinking about the architecture that is needed for civic participation. The underlying concept is that the government provides basic infrastructure services and data. Citizens can participate in oversight and decision-making, and build tools for additional engagement, through access to services and data.

To facilitate participation, openness is needed in several layers.

Open code and open data. These are two related families of practices that engage the community in the development and review of technology, and that make public information available to the public. Open data includes basic availability, as well as support for standards and licenses that enable re-use and participation.

Open APIs. Application programming interfaces enable developers to build on basic government infrastructure services, creating a broader ecosystem of applications that deliver value to the public without additional government funding, and that provide services that the government can’t.
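To make the idea concrete, here is a sketch of the kind of third-party application an open API enables: a client that groups a feed of crime incidents by block, the sort of aggregation a location-based crime application needs. The JSON payload and its field names are invented for illustration; a real open-data API would define its own schema.

```python
import json

# A hypothetical JSON payload, of the kind an open crime-data API
# might return; the field names are invented for illustration.
SAMPLE_RESPONSE = """
{"incidents": [
  {"block": "1200 U St NW", "offense": "THEFT", "date": "2009-03-01"},
  {"block": "1400 P St NW", "offense": "ROBBERY", "date": "2009-03-02"}
]}
"""

def incidents_by_block(payload):
    """Group reported offenses by block - the aggregation a
    location-based application would build on top of the raw feed."""
    data = json.loads(payload)
    by_block = {}
    for incident in data["incidents"]:
        by_block.setdefault(incident["block"], []).append(incident["offense"])
    return by_block

print(incidents_by_block(SAMPLE_RESPONSE))
# → {'1200 U St NW': ['THEFT'], '1400 P St NW': ['ROBBERY']}
```

The government’s job in this architecture is only to publish the data and keep the API stable; the aggregation, mapping, and presentation can come from anyone.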

Effective practices for social participation. Several attendees noted the problems with simple comment systems that devolve into anti-social anarchy, driving away constructive citizen participation. There are many techniques, tools, and social practices to overcome these problems. Solutions are context-dependent – there is no one-size-fits-all solution.

Via Bruce Joffe at the Open Data Consortium: the California appeals court upheld the Santa Clara County Superior Court’s decision requiring Santa Clara County to provide GIS parcel basemap data under the California Public Records Act, charging no more than the cost of duplication. While 41 other counties provided basemap data for $100 or less, Santa Clara County had attempted to charge over $150,000 for the data. This is a big victory for open government data.

Transparency Camp revealed the contrast between old and new models of protecting the public’s right to know about our government.

At the same time as Transparency Camp, David Simon, an old beat reporter in Baltimore, wrote a piece in the Washington Post about the good old days of crime beat reporting. Armed with a knowledge of public information law and a relationship with a pro-First Amendment judge, and motivated by his role as the representative of the public’s right to know, Simon wouldn’t take recalcitrant cops’ excuses as an answer, and relentlessly pursued the truth about crime and police activity. In the article, Simon laments the demise of beat reporting. There just aren’t reporters on the street covering a topic and pursuing the truth. Even the current judge in the district doesn’t have an interest in enforcing public information access, as Simon found recently when he tried to find information about a police shooting.

Meanwhile, over at Transparency Camp, one of the attendees was Brian Sobel, the developer of the Are You Safe iPhone application, which shows location-based crime information for blocks in Washington, DC. Information about crime isn’t published because one intrepid reporter made the cop turn over the crime report, but because the database of crime stats is online.

Just because there is data about a crime doesn’t mean the data is accurate or that justice is being served. In Baltimore there were no journalists or bloggers investigating the police shooting of an unarmed 61-year-old man in February, until the retired journalist started making calls. What’s needed is not only mapping but community input, like the everyday activism on Uncivil Servants, which captures reports of illegal parking by New York City employees. And like the crowdsourced journalism managed by Amanda Michel, who is taking her experience with citizen journalist campaign coverage to ProPublica. Her first assignment as Editor of Distributed Reporting is to get many eyes to cover the implementation of the stimulus bill.

In David Simon’s world, a few brave reporters had the special knowledge and connections to get enforcement of open data and open records. In our world, the government policy needs to make data available as a matter of course, and crowdsourcing tools and communities need to give more people the knowledge and the courage that David Simon had to demand accurate information from the cops.

The world is different. Open data and crowdsourcing give more people the raw information and open government literacy that David Simon had. But we need the organizational structures, funding, and motivation to use them. There’s no guarantee how well the new way will work, but there are tremendous opportunities, and it’s up to us to make them work.