California review of the ES&S AutoMARK and M100

California’s Secretary of State has been busy. It appears that ES&S (manufacturers of the Ink-a-Vote voting system, used in Los Angeles, as well as the iVotronic systems that made news in Sarasota, Florida in 2006) submitted its latest and greatest “Unity 3.0.1.1” system for California certification. ES&S systems were also considered by Ohio’s study last year, which found a variety of security problems.

California already analyzed the Ink-a-Vote. This time, ES&S submitted their AutoMARK ballot marking device, which has generated some prior fame for being more accessible than other electronic voting solutions, as well as having generated some prior infamy for having gone through various hardware changes without being resubmitted for certification. ES&S also submitted its M100 precinct-based tabulation systems, which would work in conjunction with the AutoMARK devices. (Most voters would vote with pen on a bubble sheet. The AutoMARK presents a fancy computer interface but really does nothing more than mark the bubble sheet on behalf of the voter.) ES&S apparently did not submit its iVotronic systems.

ES&S failed to submit “California Use Procedures” to address issues that they were notified about back in December as part of their conditional certification of an earlier version of the system. This can only be interpreted as vendor incompetence. Here’s a choice quote:

ES&S submitted what it stated were its revised, completed California Use Procedures on March 4th. Staff spent several days reviewing the document, which is several hundred pages in length. Staff found revisions expressly called for in the testing reports, but found that none of the changes promised two months earlier in Mr. Groh’s letter of January 11, 2008, were included.

The accessibility report is very well done and should be required reading for anybody wanting to understand accessibility issues from a broad perspective. They found:

Physical access has some limitations.

There are some personal safety hazards.

Voters with severe manual dexterity impairments may not be able to independently remove the ballot from the AutoMARK and cast it.

The keypad controls present challenges for some voters.

It takes more time to vote with the audio interface.

The audio ballot navigation can be confusing.

Write-in difficulties frustrated some voters.

The voting accuracy was limited by write-in failures.

Many of the spoken instructions and prompts are inadequate.

The system lacks support for good public hygiene.

There were some reliability concerns.

The vendor’s pollworker training and materials need improvement.

Yet still, they note that “We are not aware of any public device that has more flexibility in accommodating the wide range of physical and dexterity abilities that voters may have. The key, as always, is whether pollworkers and voters will be able to identify and implement the optimal input system without better guidance or expert support. In fact, it may be that the more flexible a system is, the more difficult it is for novices to navigate through the necessary choices for configuring the access options in order to arrive at the best solution.” One of their most striking findings was how long it took test subjects to use the system. Audio-only voters needed an average of almost 18 minutes to use the machine on a simplified ballot (minimum 10 minutes; maximum 35 minutes). Write-in votes were exceptionally difficult. And, again, this is arguably one of the best voting systems available, at least from an accessibility perspective.

Okay, you were all waiting to learn more about the security problems. Let’s go. The “red team” exercise was performed by the Freeman Craft McGregor Group. It’s a bit skimpy and superficial. Nonetheless, they say:

You can swap out the PCMCIA memory cards in the precinct-based ballot tabulator (model M100), while in the precinct. This attack would be unlikely to be detected.

There’s no cryptography of any kind protecting the data written to the PCMCIA cards. If an attacker can learn the file format (which isn’t very hard to do), and can get physical access to the card while in transit or storage, then the attacker can trivially substitute alternative vote records.

The back-end “Election Reporting Manager” has a feature to add or remove votes from the vote totals. This would be visible in the audit logs, if anybody bothered to look at them, but these sorts of logs aren’t typically produced to the public. (Hart InterCivic has a very similar “Adjust Vote Totals” feature with similar vulnerabilities.)

The high speed central ballot tabulator (the M650) writes its results to a Zip disk, again with no cryptography or anything else to protect the data in transit.

The database in which audit records are kept has a password that can be “cracked” (we’re not told how). Once you’re into the database, you can create new accounts, delete old audit records, and otherwise cause arbitrary mayhem.

Generally speaking, a few minutes of physical access is all you need to compromise any of the back-end tools.

All of the physical key locks could be picked in “five seconds to one minute.” The wire and paper-sticker tamper-evidence seals could also be easily bypassed.

And then there’s the source code analysis, prepared by atsec (who normally make a living doing Common Criteria analyses). Again, the public report is less detailed than it can and should be (and we have no idea how much more is in the private report). Where should we begin?

The developer did not provide detailed build instructions that would explain how the system is constructed from the source code. Among the missing aspects were details about versions of compilers, build environment and preconditions, and ordering requirements.

This was one of our big annoyances when working on California’s original top-to-bottom review last summer. It’s fantastically helpful to be able to compile the program. You need to do that if you want to run various automated tools that might check for bugs. Likewise, there’s no substitute for being able to add debugging print statements and use other debugging techniques when you want to really understand how something works. Vendors should be required to provide not just source code but everything necessary to produce a working build of the software.

The M100 ballot counter is designed to load and dynamically execute binary files that are stored on the PCMCIA card containing the election definition (A.12) in cleartext without effective integrity protection (A.1).

Or, in other words, election officials must never, ever believe the results they get from electronic vote tabulation without doing a suitable random sample of the paper ballots, by hand, to make sure that the paper ballots are consistent with the electronic tallies. (Similarly fundamental vulnerabilities exist with other vendors’ precinct-based optical scanners.)
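What would that "suitable random sample" look like in practice? At minimum, a publicly verifiable random draw of precincts whose paper ballots get hand-counted and compared against the electronic tallies. Here is a minimal sketch; the 2% fraction, function name, and precinct labels are all illustrative, and a real risk-limiting audit would size the sample based on the reported margin rather than a flat percentage:

```python
import random

def pick_audit_precincts(precinct_ids, sample_fraction=0.02, seed=None):
    """Randomly choose precincts whose paper ballots will be hand-counted
    and compared against the electronic tallies."""
    rng = random.Random(seed)  # seed would come from a public dice roll
    k = max(1, round(len(precinct_ids) * sample_fraction))
    return sorted(rng.sample(precinct_ids, k))

# 500 hypothetical precincts; the seed stands in for a public randomness ceremony
precincts = [f"P{n:03d}" for n in range(1, 501)]
chosen = pick_audit_precincts(precincts, sample_fraction=0.02, seed=20080317)
print(len(chosen))  # 10 precincts to hand-count
```

The essential property is that the seed is chosen publicly *after* the electronic results are announced, so an attacker cannot know in advance which precincts will be checked.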

The M100 design documentation contains a specification of the data structure layout for information stored on the PCMCIA card. The reviewer compared the actual structures as defined in the source code to the documentation, and none of the actual structures matched the specification. Each one showed significant differences to or omissions from the specification.

I require the students in my sophomore-level software engineering class to keep their specs in synchrony with their code as their code evolves. If college sophomores can do it, you’d think professional programmers could do it as well.

The user’s guide for the Election Reporting Manager describes how a password is constructed from publicly-available data. This password cannot be changed, and anyone reading the documentation can use this information to deduce the password. This is not an effective authentication mechanism.

While this report doesn’t get into the ES&S iVotronic, the iVotronic version 8 systems had three character passwords, fixed from the factory. (They apparently fixed this in the version 9 software which is now already a few years old.) You’d think they would have gone around and fixed this issue elsewhere in their software, since it’s so fundamental.

A.4 “EDM iVotronic Password Scramble Key and Algorithm”: A hardcoded key is used to obfuscate passwords before storing them in a database. The scrambling algorithm is very weak and reversible, allowing an attacker with access to the scrambled password to retrieve the actual password. The iVotronic is supported by the Unity software but is not being used for California elections.

Well, okay, maybe they didn’t fix the iVotronic passwords, then, either. Other passwords throughout the system are similarly hard-coded and/or poorly stored. And, given that, you can trivially tamper with any and all of the audit logs in the system that might otherwise contain records of what damage you might have done.
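The report doesn’t disclose the actual scrambling algorithm, but “very weak and reversible” with a hardcoded key is classically something like repeating-key XOR: anyone who extracts the key from the software (and the key ships inside every copy) can recover every stored password. A hypothetical sketch, with a made-up key and password:

```python
def scramble(data: bytes, key: bytes) -> bytes:
    # repeating-key XOR: applying it twice with the same key undoes it
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

HARDCODED_KEY = b"s3cret"  # baked into every copy of the software
stored = scramble(b"hunter2", HARDCODED_KEY)        # what lands in the database
recovered = scramble(stored, HARDCODED_KEY)         # what an attacker computes
print(recovered)  # b'hunter2'
```

Obfuscation like this is not encryption: the "key" provides zero protection against anyone who can read the software, which is exactly the attacker you care about.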

In the area of cryptography and key management, multiple potential and actual vulnerabilities were identified, including inappropriate use of symmetric cryptography for authenticity checking (A.9) and several different very weak homebrewed ciphers (A.4, A.7, A.8, A.11). In addition, the code and comments indicated that a checksum algorithm that is suitable only for detecting accidental corruption is used inappropriately with the claimed intent of detecting malicious tampering (A.1).

We’ve seen similar ill-conceived mechanisms used by other vendors, so it’s similarly unsurprising to see it here. The number one lesson these vendors should take home is thou shalt not implement thine own cryptography, particularly when the stuff they’re doing is all pretty standard and could be pulled from places like the OpenSSL library support code. And even then, you have to know what you’re doing. As Aggelos Kiayias once quipped, don’t use cryptography; use a cryptographer.
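The distinction the reviewers draw in A.1 fits in a few lines of code: a plain checksum like CRC32 catches accidental corruption, but an attacker who alters the data simply recomputes the checksum, whereas a keyed MAC (HMAC-SHA256 here, as one standard construction) cannot be forged without the secret key. The record format and key below are invented for illustration:

```python
import hashlib
import hmac
import zlib

record   = b"precinct=42;candidateA=150;candidateB=250"
tampered = b"precinct=42;candidateA=250;candidateB=150"

# A CRC only detects accidents: the attacker recomputes it over the
# tampered data and it "verifies" perfectly.
forged_crc = zlib.crc32(tampered)
print(zlib.crc32(tampered) == forged_crc)  # True: tampering goes undetected

# A keyed MAC resists deliberate tampering: without the secret key,
# an attacker cannot produce a valid tag for modified data.
key = b"per-election secret key"
tag = hmac.new(key, record, hashlib.sha256).digest()
ok = hmac.compare_digest(tag, hmac.new(key, tampered, hashlib.sha256).digest())
print(ok)  # False: the tampered record fails verification
```

Even this sketch glosses over the hard part, which is key management: if the MAC key is hardcoded or stored next to the data it protects, you are right back where you started.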

The developers generally assume that input data will be supplied in the correct expected format. Many modules that process input data do not perform data validation such as range checks for input numbers or checking validity of internal cross references in interlinked data, leading to potentially exploitable vulnerabilities when those assumptions turn out to be incorrect.

They’re talking about buffer overflow vulnerabilities. This is one of the core techniques that an attacker might use to gain leverage. If an attacker compromises one solitary memory card on its way back to Election Central, then corrupt data on it might be able to attack the tabulation system, and thus affect the outcome of the entire election. This report doesn’t contain enough information for us to conclude whether ES&S’s Unity systems are vulnerable in this fashion, but these are exactly the kinds of poor development practices that enable viral attacks.
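The systems themselves are written in C, where a missing length check can become a memory-corruption exploit; the defensive idea the reviewers found missing is the same in any language: treat everything read off a memory card as hostile, and check lengths, formats, and ranges before use. A sketch in Python (the field format and limits are hypothetical):

```python
def parse_vote_count(field: bytes, max_ballots: int) -> int:
    """Defensively parse a vote-count field from an untrusted memory card."""
    if len(field) > 6:                    # reject oversized fields outright
        raise ValueError("field too long")
    if not field.isdigit():               # reject non-numeric garbage
        raise ValueError("non-numeric vote count")
    n = int(field)
    if n > max_ballots:                   # cross-check against ballots cast
        raise ValueError("count exceeds ballots cast")
    return n

print(parse_vote_count(b"0150", max_ballots=400))  # 150
```

The cross-reference check in the last test is the kind of "validity of internal cross references in interlinked data" check the report says is absent: a vote count larger than the number of ballots cast should never be accepted silently.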

Finally, a few summary bullets jumped out at me:

The system design does not consistently use privilege separation, leading to large amounts of code being potentially security-critical due to having privileges to modify data.

Unhelpful or misleading comments in the code.

Subjectively, large amount of source code compared to the functionality implemented.

Okay, let’s get this straight. The code is bloated, the comments are garbage, and the system is broadly not engineered to restrict privileges. Put that all together, and you’re guaranteed a buggy, error-prone, insecure program that must be incredibly painful to maintain and extend. This is the kind of issue that leads smart companies to start over from scratch (while simultaneously supporting the old version until the new version gets up to speed). Is ES&S or any other voting system vendor doing a from-scratch implementation in order to get it right? They’ll never get there any other way.

[Sidebar: I live in Texas. Texas’s Secretary of State, like California’s, is responsible for certifying voting equipment for use in the state. If you visit their web page and scroll to the bottom, you’ll see links for each of the vendors. There are three vendors who are presently certified to sell election equipment here: Hart InterCivic, ES&S and Premier (née Diebold). Nothing yet published on the Texas site post-dates the California or Ohio studies, but Texas’s examiners recently considered a new submission from Hart InterCivic. It will be very interesting to see whether they take any of the staggering security flaws in Hart’s system into consideration. If they do, it would be a big chance for Texas to catch up to the rest of the country. Incidentally, I have offered my services to be on Texas’s board of election examiners on several occasions. Thus far, they haven’t responded. The offer remains open.]

Comments

It’s not THAT bad, as long as certification was denied. In my opinion, there is really only one way to “solve” this “problem” (I use quotes because, if you don’t buy the machines, there is no problem). That solution is that the buyer of the product, the government, needs to have the proper methods and incentives to buy only products that do not suck. It’s like a tiny free market: when a company provides this level of quality, its product should be rejected. To me, it’s refreshing to see that kind of common sense action at play for a change.

Also, Dan, I’m stealing the phrase “arbitrary mayhem” for my own use going forward. It’s too good to leave.

One thing I find interesting is that almost all of the discussion about security and problems with voting machines (which I totally agree with as a problem) – seem to neglect the more analog approaches to influencing election results.

For example – use the same approach as some of the various precincts in St. Louis, Missouri and other areas of the country where voters are predominantly predictable – they used various tactics to reduce the number of people who got to vote, either by closing polls or by misleading voters about polling locations. To accomplish the same result with evoting, simply have someone go into the voting booths and destroy the tamper seals.

At that point, you don’t even have to mess with the votes themselves: by raising a public complaint saying the machines were tampered with, you’ve already called those votes into question, and they may be thrown out.

Now true, this would require them to actually pay attention to the tamper seals/etc, which we already know doesn’t happen, but it still seems easier in some respects than actually altering the votes.

Michael Donnelly writes: “It’s not THAT bad, as long as certification was denied.”

It’s good that California denied certification. However, the same equipment is used elsewhere in the U.S. California’s actions will (hopefully) prompt other states to follow their lead. That’s what my sidebar was all about.

Nathan Neulinger discusses election fraud outside of the electronic voting systems. Indeed, this is a big problem, and while there’s plenty of debate about whether electronic voting systems security is a live problem or just a bunch of theoretical issues, there’s much less debate about whether unseemly manipulation occurs in other aspects of voting. Needless to say, I support free and fair elections. Accomplishing that requires addressing both the broader attacks, outside of the voting equipment, as well as making the voting equipment itself more tamper resistant.

Why can’t these voting machine makers just get on the stick and do it right? One company, audited properly and given even a “B” grade, would almost certainly attract a ton of business.

…a bit of a non-sequitur, but I went to the comp314 course description page and noted it says “To successfully complete the assignments, you will need to spend significant amounts of time outside of class.” When I was in college, I always spent significant amounts of time outside of class! 😉

There’s no cryptography of any kind protecting the data written to the PCMCIA cards. If an attacker can learn the file format (which isn’t very hard to do), and can get physical access to the card while in transit or storage, then the attacker can trivially substitute alternative vote records.

The system needs to be designed to protect the code-storage media from any alteration between the time that both parties are about to verify it before the election until the time both parties verify it after the election. Both code and data storage media need to be physically secure within the voting equipment until close of polls, and the data storage media needs to then be protected against further alteration until both parties have retrieved the contents and compared their copies against each other.

If such physical security exists, encryption is not needed; if such security does not exist, encryption won’t help much.

Encryption won’t accomplish much, but digital signatures could accomplish a lot. Assuming each voting machine was equipped with a distinct public key, and that key material was hard to get access to (e.g., because it’s burned into a chip that’s soldered to the motherboard), then an attacker who can intercept memory cards in transit would be unable to tamper with the data. All they could do is destroy the card. That would then be detected (hopefully!) by good election procedures, leading the election officials to try to track down the original machine for its internal copy of the records.
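As a toy sketch of that design — textbook RSA with tiny primes, purely to show the sign-inside-the-machine / verify-at-headquarters asymmetry; a real system would use a vetted cryptographic library, proper padding, and much larger keys:

```python
import hashlib

# Toy RSA signature, for illustration only.
p, q = 104729, 104723               # small primes; real keys use ~2048-bit n
n, phi = p * q, (p - 1) * (q - 1)
e = 65537                           # public exponent: anyone can verify
d = pow(e, -1, phi)                 # private exponent: only the machine signs

def digest(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(data: bytes) -> int:       # would run inside the voting machine
    return pow(digest(data), d, n)

def verify(data: bytes, sig: int) -> bool:  # runs back at Election Central
    return pow(sig, e, n) == digest(data)

results = b"precinct=42;candidateA=150;candidateB=250"
sig = sign(results)
print(verify(results, sig))         # True: genuine results accepted
```

An attacker who alters the results on the card cannot produce a matching signature without the private exponent, which never leaves the machine; all they can do is make verification fail, which should trigger the tracking-down procedure described above.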

Ed writes: “I require the students in my sophomore-level software engineering class to keep their specs in synchrony with their code as their code evolves. If college sophomores can do it, you’d think professional programmers could do it as well.”

As a professional Business Analyst, I can say it’s surprising how easy it is for specs to get out of sync with code. In a software shop the person who writes the design specs and other documentation (Business Analyst, Tech Writer, etc.) is generally not the person who implements them. Developers are supposed to follow the specs and ask the writers to update them, if necessary, but it just doesn’t always happen. While this is not a good thing, the fix is also not quite so simple as asking software engineering students to keep their own code in sync with their own documentation. In fact, I’d say that this reality is an argument in favor of exactly the kind of public scrutiny that these systems are receiving.

Downloading the report was a bad idea. I think my horrified expletives woke up all my housemates; I wouldn’t trust a mess like that to pick the winner of the Eurovision Song Contest, never mind a party candidate.

Jody Holder discovered, in emails he obtained by request, that the M100 has a unique executable compiled for every distinct ballot style within an election jurisdiction for each election.

The problem is more than that an arbitrary executable is invoked. With such a multiplicity of executables, it is hard to determine whether the executable found on a given card is a correct one, because there is no fixed version of the executable to certify. “Fixed version” here means one that is stable across ballot styles, stable from one election jurisdiction to another, and stable from one election to the next.

Here was my take on Mr. Holder’s discovery: a blog entry and a technical paper (PDF) at http://www.washburnresearch.org/archive/ESSFirmware/ESS-Firmware-001.pdf

Just because California refused to certify the latest submitted version does not mean we are protected from the multiple defects in the ES&S products. Most if not all of the same problems exist in the current system deployed, used, and approved in California, including Unity V 2.4.3.

ES&S was amply rewarded for refusing to submit their prior systems to the Top-to-Bottom Review. To date, their Unity V 2.4.3 and their optical scanners are still approved.

You will notice that the multiple experts who were used to conduct the Review have their “resumes, biographies or curriculum vitae” listed on the SoS website.

I would like to see the qualifications that the Freeman, Craft, McGregor Group have to be the Red Team in this or any technical review. I would assert that they are not qualified and would like it proven they are. They are a prime example of the small clique of insiders that have foisted and perpetuated these defective and insecure electronic voting systems upon our country. For a documented account of their past education, employment, and experience you can go to the Florida Fair Elections Coalition website and download a 10 page PDF titled “Freeman, Craft, McGregor Group, Inc.”.

When the Diebold TSx was first reviewed by Freeman in late August 2003 he flew to Texas, inspected it, and flew back in the same day. He then submitted a report on his review to the California SoS the next day. So much for a critical evaluation. Many of us know the fiasco that resulted.

I can guarantee if the same Red Teams that were hired to review the other systems had also been used for this review they would have found much more.

For five years many of us have been doing the investigating that should have been done by our election officials. In that time there have been multiple independent reviews of these voting systems, every one of which has shown they are defective and dangerous to our democracy. Yet they are still being used and defended by the very election officials who are charged with ensuring the accuracy of our elections.

The only method by which to give any reasonable assurance that the reported election results are accurate is a scientifically valid audit, yet that most fundamental protection is fought repeatedly by the election industry. I have no confidence, nor is there any basis for confidence, in our elections.

I will assert there is not one electronic form of voting system on the market today that we can trust in conducting our elections. I would challenge anyone to prove there is. The burden must be upon the vendors to prove they are trustworthy, not upon us to prove they are not.

Thank you for this blog post. A variation of this system was used this last Tuesday in the primary here in Minnesota. There have been reports of “discrepancies” and “unusual patterns” like only 4 votes coming out of a precinct where a man calls to comment on the great voter turnout.

Minnesota may have made a giant error just before we head into a general election. If there is ANY other information about the ES&S M100 Precinct Ballot Counter and the system of which it is a part, PLEASE let me know.

Freedom to Tinker is hosted by Princeton's Center for Information Technology Policy, a research center that studies digital technologies in public life. Here you'll find comment and analysis from the digital frontier, written by the Center's faculty, students, and friends.