
Primary update: Voting issues in Los Angeles and Iowa

Last week Super Tuesday brought many of us to the polls to vote for our favorite candidate for President. And while voting went smoothly in most places, there was one major tech failure in Los Angeles, which saw the debut of new voting machines. Let’s compare what went wrong in LA with the earlier problems seen during the Iowa caucuses.

In an earlier blog, I brought you up to date on Russian interference in our 2016 and 2018 elections. But the problems witnessed in Iowa and LA are strictly our own fault, the result of a perfect storm of computing errors. For Iowa, the culprit was a poorly implemented vote-reporting smartphone app from the vendor Shadow Inc. For LA, it was a series of both tech and non-tech circumstances.

Let’s look at what happened, what mistakes were made in operations, coding, and deployment, and how to prevent these things from happening again.

Iowa voting issues

First, the Iowa app wasn’t properly tested under the kind of load it would experience on the night of the vote, including its ability to transmit results, something the vendor admitted in this Tweet. Given that there were more than 1,600 polling sites, transmissions often resulted in busy signals or delays in posting results to state election board headquarters. On the night of the caucus, reporters were left vamping on live TV because results weren’t available.
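To see why this kind of failure is so predictable, here is a minimal sketch of the load problem: a toy "results server" that can only handle a fixed number of simultaneous connections while every precinct tries to report at once. The numbers (50 connection slots, 1,600 precincts) are illustrative assumptions, not figures from the actual system.

```python
import concurrent.futures
import threading
import time

# Toy model of a results server that can handle only a limited
# number of simultaneous connections; extra callers are turned
# away -- the "busy signal" precinct workers experienced.
MAX_CONCURRENT = 50
slots = threading.Semaphore(MAX_CONCURRENT)

def submit_results(precinct_id):
    """Simulate one precinct phoning in its results."""
    if not slots.acquire(blocking=False):
        return (precinct_id, "busy")
    try:
        time.sleep(0.05)  # pretend to transmit the tallies
        return (precinct_id, "ok")
    finally:
        slots.release()

# Simulate ~1,600 precincts all reporting at roughly the same time.
with concurrent.futures.ThreadPoolExecutor(max_workers=200) as pool:
    outcomes = list(pool.map(submit_results, range(1600)))

busy = sum(1 for _, status in outcomes if status == "busy")
print(f"{busy} of 1600 precincts got a busy signal")
```

A load test that simulated even this crude election-night burst would have surfaced the capacity shortfall well before the caucuses.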

Second, the Iowa app wasn’t designed to be usable by most of the polling workers, few of whom received any substantive training. Not helping matters was a complex user authentication process, which could flummox even a digital native. And some precinct workers couldn’t download the app to their phones or hit other errors. Why these elements weren’t part of any quality control or operations testing is hard to say.

Next, the Iowa app actually tallied three separate result streams: one for state delegate equivalent numbers, one for the initial raw vote total, and one for the final vote total. The 2016 caucus reported only the equivalents, not the raw votes -- which was a bone of contention given that the vote between Sanders and Clinton was extremely close. Instead of making the process more transparent, this trio of results made it more confusing, and it doesn’t appear that the added complexity added anything to the 2020 results. The link above walks you through a sample posting process, and even after reading it carefully it is still confusing.
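To make the three streams concrete, here is a toy tally for a single precinct. The vote counts and the delegate-conversion weight are invented for illustration -- this is not the party’s actual state delegate equivalent formula.

```python
# Toy model of the three result streams one precinct reports:
# initial alignment, final alignment, and state delegate
# equivalents (SDEs). All numbers below are hypothetical.
precinct = {
    "first_alignment": {"Sanders": 120, "Buttigieg": 110, "Warren": 60},
    "final_alignment": {"Sanders": 130, "Buttigieg": 125, "Warren": 0},
    "sde_weight": 0.01,  # illustrative delegates-per-vote factor
}

def report(p):
    """Produce all three streams the app had to transmit."""
    final = p["final_alignment"]
    return {
        "raw_first": sum(p["first_alignment"].values()),
        "raw_final": sum(final.values()),
        "sde": {name: votes * p["sde_weight"] for name, votes in final.items()},
    }

print(report(precinct))
```

Even this stripped-down version shows the reporting surface tripling: three numbers per precinct where 2016 required one, and three chances for a reporting bug to slip in.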

But the three voting streams had another problem: while it appears that the app collected all voting data, its reporting code had bugs and not all votes were properly reported, according to NBC News. That seems like a pretty basic function that should have been tested more carefully.

Shadow’s app had several other security deficiencies that would be familiar to even those readers with basic cybersecurity knowledge. (You can view some screenshots of the app here.) The app’s code didn’t have appropriate encryption and signing, nor were its transmissions back to election HQ properly protected. Even with these security issues, most researchers have found no evidence of any hacker activity. Since the Iowa debacle, the Democratic National Committee has banned the app from being used elsewhere in its primaries and caucuses.
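The signing part, at least, is not exotic. Here is a minimal sketch of authenticating a results payload with HMAC-SHA256 from Python’s standard library. This is my own simplified illustration, not Shadow’s design: a real system would use per-device keys provisioned out of band, public-key signatures, and TLS on top.

```python
import hashlib
import hmac
import json

# Shared secret for the demo; in practice each precinct device
# would get its own key, distributed out of band (simplified here).
SECRET_KEY = b"demo-key-not-for-production"

def sign_payload(results):
    """Serialize results and attach an HMAC-SHA256 tag."""
    body = json.dumps(results, sort_keys=True).encode()
    tag = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return body, tag

def verify_payload(body, tag):
    """Reject any payload whose tag doesn't match (i.e., tampered in transit)."""
    expected = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

body, tag = sign_payload({"precinct": 42, "Sanders": 130, "Buttigieg": 125})
print(verify_payload(body, tag))                          # untampered payload
print(verify_payload(body.replace(b"130", b"999"), tag))  # altered vote count
```

The point is that tamper-evident transmission is a solved problem with off-the-shelf primitives; omitting it from an election app is an avoidable lapse, not a hard engineering challenge.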

Finally, Shadow has also hidden its personnel and operations details, which is distressing for a government contractor. Researchers have found that many of its employees once worked on Clinton’s 2016 campaign, adding to the conspiracy theories posted online. Shadow had a $58,000 contract to do the Nevada caucuses, but the state party switched to purely paper-based vote-counting methods after seeing the Iowa problems. (For those of you wondering, a contract that size doesn’t pay for much in the way of testing, which is perhaps one reason the app did so poorly.)

What happened in LA and elsewhere

But let’s not pick (completely) on Shadow. They aren’t the only voting tech provider. A new report from MIT found issues in the Voatz voting app, which was used during the 2018 midterms in some municipal, state and federal elections in West Virginia, Colorado, Oregon and Utah, and to support overseas military voters. The vendor disputed the findings, claiming that the MIT group used an ancient version of the app and improper testing procedures. Whomever you believe, clearly there was room for improvement in their technology.

But wait, there is yet another tech under fire: a custom calculator iPad app, built on Google Forms, that was used to tally the Nevada votes. You know things are getting bad when we can’t trust a simple calculator to do basic addition.

Let’s move on to this week’s vote in LA county, which also had a number of problems. The county decided to develop its own voting technology, using open source software and custom hardware, and paid $280M to develop the system. Voters reported that many of the machines weren’t operational on Tuesday. Those that were working had communication problems, since the machines were designed to provide real-time access to voter registration databases to authenticate voters. This mirrors the issues seen in Iowa, where bandwidth limits also went untested.
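A hard dependency on a live database lookup is the design flaw here. One common mitigation -- sketched below with entirely hypothetical names and data, not LA county’s actual system -- is to fall back to a preloaded local snapshot of the voter roll when the network is slow or down, and route unresolvable cases to provisional ballots.

```python
# Voter check-in that degrades gracefully: try the live registration
# database first, fall back to a preloaded local snapshot on failure.
# All voter IDs and names here are hypothetical.
local_snapshot = {"V1001": "Ada Lovelace", "V1002": "Grace Hopper"}

def live_lookup(voter_id):
    # Stand-in for the real-time registration query; here we
    # simulate the election-day outage.
    raise TimeoutError("registration server unreachable")

def check_in(voter_id):
    try:
        return live_lookup(voter_id), "live"
    except TimeoutError:
        name = local_snapshot.get(voter_id)
        if name is None:
            # Not in the snapshot either: hand out a provisional ballot.
            raise KeyError(f"{voter_id} unknown; use provisional ballot")
        return name, "cached"

name, source = check_in("V1001")
print(name, source)
```

With a fallback like this, a congested link slows check-in but doesn’t stop voting outright, which is the failure mode voters actually reported.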

An auditor’s report written last December cited several cybersecurity flaws with the new systems, including the ability to boot from a flash drive and a lack of full disk encryption support. From my perspective, LA county had ambitious goals with its voting tech and tried to implement too many of them at once in this week’s primary.

Some lessons learned

All of these situations raise an important issue: should voting technology have to go through the same scrutiny as other Federal government software, such as FedRAMP certification? Perhaps, although that is still no guarantee that the technology will be foolproof. Part of the problem is that the voting technology supply chain is convoluted and lengthy: funds are often granted to state elections boards after a complex Congressional process, and then subject to state contracting and bidding rules. And certainly the Iowa contracts were unrealistically small given what they were trying to accomplish.

These situations brought home my previous recommendation to use paper ballots whenever possible. Maybe we shouldn’t even attempt to automate the voting process itself: after all, there is nothing wrong with paper ballots per se, and the systems used to tabulate them have worked (for the most part) for decades. The real issue is how voting apps can be audited. Sadly, none of the several bills that would require voting systems to create a paper audit trail have gotten very far in Congress. LA County tried to cover this by having its voting machines print a paper ballot that the voter would then feed back into the machine to record the vote: a good idea, but apparently poorly implemented, as the machines experienced numerous paper jams.

Another solution would be for local election agencies to work with the Elections Infrastructure Information Sharing and Analysis Center, an industry group that supports elections staff around the country and helps to improve their cybersecurity practice. This group provides security operations monitoring, incident response services and vulnerability management, along with best practice recommendations. Taking advantage of these services can help prevent hacking of election websites, phishing of government officials’ email accounts and penetration of voter registration systems.

Still, most voters managed to record their votes without issues. Going forward, we need to tread carefully with developing new voting apps and new voting machines, and ensure that the testing budgets are adequate to cover the issues seen in Iowa and LA County, and that security audits are done early enough to resolve the issues uncovered.