During the 2006 primary elections, some verified voting advocates were troubled by reports of problems with ballot scanners.1 The famous “Hursti” hack was carried out on a scanner.2 In addition, in March 2006, reports of inaccurate grading of Scholastic Aptitude Tests caused another flurry of concern about scanners. The testing agencies quickly admitted that the problems were caused by poor quality control rather than an inherent weakness of the system,3 but this admission received little notice from those seeking to ‘prove’ the benefits of electronic touch screen voting machines (DREs) over paper.
There is a tendency among election officials to use reported problems as a way to excuse themselves from careful comparisons of paper ballot and DRE systems. Too often we hear them say that both systems have problems. While this is true, it is only a half truth, for not all problems are equal.
Paper ballots and precinct ballot scanners have been used in the US for over 20 years and are now used by nearly 50% of American voters without significant problems. Today’s increased scrutiny of all voting systems has resulted in an increase in reported errors for all types of voting systems, including electronic touch screen voting machines and paper ballot precinct scanner systems. This underscores the fact that using computers in elections can be problematic and that the ability to audit a system independently is of key importance.
A problem encountered with the scanner component of a paper ballot system need not result in lost votes. If the marked ballots are correctly managed, retained and recounted, votes can still be counted in a number of different ways. But a DRE that fails may lose those votes forever.
Voters Must Be Able to Understand How Their Vote is Recorded and Counted
Democracy requires transparent electoral procedures that can be understood and monitored by citizens without requiring specialized knowledge. But DREs record votes in invisible electronic circuits that are managed by complex software. The workings of DREs are not understood by the vast majority of poll workers and voters. The software is secret, and the ever more apparent lack of attention paid to security issues by machine vendors is of great concern to the integrity of the vote.
The addition of a visible voter verified paper audit trail (VVPAT) provides a partial corrective, if it is used consistently by the voter. Recent studies have raised serious questions about whether this will be the case.4 In order to provide confidence, the VVPAT must be reviewed by the voter and properly audited by the jurisdiction, two conditions which are seldom met. It is easy for a programmer to write code that shows a different printout on the VVPAT from the vote actually recorded by the machine. If the VVPAT is not consistently reviewed by each voter, an audit or recount of the VVPAT records will not necessarily reveal a problem.
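The argument above can be illustrated with a small sketch. This is purely hypothetical code, not any vendor’s actual software; the function and candidate names are invented for illustration. It models the scenario described: if a machine alters a vote and prints the altered choice on the VVPAT, the paper and electronic records agree, so an audit comparing them finds nothing; only a voter who actually reviews the printout can catch the substitution.

```python
# Hypothetical sketch (not real voting-machine code). A `swap` mapping
# models a hidden substitution of the voter's choice.

def cast_vote(choice, memory, paper, swap=None):
    """Record one vote electronically and print it on the VVPAT."""
    recorded = (swap or {}).get(choice, choice)
    memory.append(recorded)   # electronic record holds the altered vote
    paper.append(recorded)    # ...and the VVPAT prints the same altered vote

memory, paper = [], []
intended = "Candidate A"
cast_vote(intended, memory, paper, swap={"Candidate A": "Candidate B"})

audit_matches = (memory == paper)          # True: paper matches electronic totals
voter_caught_it = (paper[-1] == intended)  # False: only voter review detects it
```

Because the two records agree, a post-election audit of the VVPAT against the electronic tally reveals no discrepancy; the failure is detectable only at the moment the voter reviews the printout.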
In order to produce a VVPAT we insert a computer between the voter and the printed record, forcing the voter to use a technology they do not understand. But with paper ballot systems the voter marks and verifies their vote directly on the ballot, the only technology required being a pen. This ballot is retained and considered the official record of their vote. Paper is a simple technology well understood by all citizens. It is easy and familiar to use, voters understand and can confirm paper records, and if required can even assist in performing hand counts.
Problems in the 2006 elections
Several questions should be asked when reading through a list of problems in the 2006 primaries (see footnote 1):
To what extent were problems caused by the rush to meet the HAVA deadlines and consequent reduction of quality control by the vendors?
Many problems with both DREs and ballot scanners were caused by HAVA’s unprecedented demand that a great deal of equipment and services be provided by a handful of manufacturers within an extremely small time frame. In many cases vendors were late in delivering equipment and providing promised programming and other services. The rushed timetable was an important factor in shoddy quality control (e.g., flawed memory cards, batteries, ballots, etc.). Further mistakes were caused by the lack of adequate time to train poll workers, technicians, and custodians, as well as voters, in the use of new high-tech equipment.
To what extent were election problems caused by machines and to what extent were they matters of “human error”?
Vendors of electronic voting equipment and many election officials wrongly claim that election problems “are rarely caused by the voting system. They are caused by human mistakes…”6 An honest survey of the lists of problems cited in footnote #1 challenges such claims. At the same time it reveals how hard it is to separate system failures from human failures. In the year from November 16, 2004 to November 8, 2005, VotersUnite found 155 items in response to a “machine malfunctions” search.
Human error or System error?
In 2004 in Carteret County, NC, 4,532 votes were lost due to a machine malfunction that may have been caused by “a single keystroke”. If the voting machine’s design allows a single inadvertent keystroke to change an election result, is this a "human" error or a "system" error?
A good system must be designed to be easy to use, in a way that minimizes mistakes by its users, in this case poll workers and voters. The way a computer interacts with the person using it is called the “user interface”. Good user interface design is clear and understandable to the novice user, and minimizes their chances of making serious mistakes. But the poor user interface design of many DREs increases the likelihood that non-technical poll workers and voters will make mistakes. And in this case, mistakes result in lost votes.
A poor user interface is a system design failure. It is inappropriate to blame poll workers and voters for a problem caused by bad design on the part of the vendor.
John Gideon asks “where the voting machine errors end and the human errors begin” and suggests calling any problems caused directly by the vendor a “system error.” This should include poorly written code - written by vendors and sold to the counties - to produce ballot definition files as well as other programming done by vendors. The many instances of faulty memory cards delivered to jurisdictions during 2006 should also be treated as a system error.
Ballot Definition Files
John Gideon has pointed out that "every voting system includes a key component, called the ballot definition file, which tells the software how to interpret the touches on a screen or the marks on a paper ballot."7
Lacking in-house technical experts with knowledge of election law and practices to program machines for an election, jurisdictions must hire either vendors or outside programmers to do this ballot configuration. But more ballot programming errors occur when outside programmers are used, since they may be unaware of local ballot rules such as rotating which candidate appears at the top of the ballot.8
Gideon says that it should not be a surprise that serious problems have been caused in elections, as ballot definition files are subject to little testing and no real auditing. He points out examples of ballot definition errors that were detected on ballot scanners through a manual recount of the paper ballots, demonstrating how electronic errors can be corrected by reviewing the original hand-marked ballots. Gideon argues that "it is reasonable to assume that similar but undetected errors have also occurred with DREs”.
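A small sketch can show how a flawed ballot definition file silently swaps vote totals. The dictionary format, candidate names, and the rotation mistake below are all illustrative inventions, not any vendor’s actual file format: the definition simply tells the tally software which candidate each mark position belongs to, so a definition built without the locally required candidate rotation credits marks to the wrong candidates.

```python
# Hypothetical ballot definition: maps a mark position on the ballot to a
# candidate. Structure and names are illustrative only.

correct_definition = {1: "Smith", 2: "Jones"}   # positions as actually printed
# An outside programmer unaware of a local rotation rule might build the
# definition for the un-rotated ballot, swapping the two positions:
flawed_definition = {1: "Jones", 2: "Smith"}

def tally(marked_positions, definition):
    """Count votes by looking up each marked position in the definition."""
    totals = {}
    for pos in marked_positions:
        candidate = definition[pos]
        totals[candidate] = totals.get(candidate, 0) + 1
    return totals

ballots = [1, 1, 1, 2]  # positions marked by four voters
correct = tally(ballots, correct_definition)  # {'Smith': 3, 'Jones': 1}
flawed = tally(ballots, flawed_definition)    # {'Jones': 3, 'Smith': 1}
```

With hand-marked paper ballots, a manual recount of the originals exposes the swap, which is exactly how the scanner errors Gideon cites were detected; on a DRE the same flawed definition leaves no independent record to recount.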
Of course, there are also errors that can be correctly described as “human” error. The Pottawattamie County, Iowa preliminary miscount was caused not only by mistakes in the vendor’s ballot programming, but also by the failure of election officials to do pre-election testing. This is a true case of human error, one “directly caused by the jurisdiction, election officials, poll workers, or voters”. The numerous procedural problems (lost and misplaced memory cards, failure to follow prescribed election procedures, etc.) also represent “human” failure.9
What problems are common to both DRE and ballot scanner systems?
It is important to assess which problems result from inherent system flaws and which are caused by lack of proper election management and procedures.
Many problems in the 2006 elections occurred on both DREs and precinct scanner systems and resulted from quality control issues on the part of vendors, mistakes made by vendors and election officials, and inadequate pre-election testing.
Both DREs and ballot scanners require attention to such matters as battery charging, calibration, selection of the correct supplies (paper, paper rolls, pens, ink, memory cards, etc.). Errors in these matters affect both systems equally, and can be largely eliminated through proper adherence to system specifications, maintenance, and pre-election and Election Day procedures.
Which problems are unique to the paper ballot and scanner system?
Unique to this system are problems with control of the darkness of the marks the scanner must sense. But these are easily managed by using marking pens. Careful calibration and setup of the scanners is essential. On the administrative side, guidelines for interpreting ambiguous marks on paper ballots in hand counts are needed. Some states have adopted clear guidelines so that interpretation of ballots is not “subjective.”10
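The mark-darkness issue can be sketched simply. The threshold value and darkness readings below are invented for illustration and do not reflect any actual scanner’s calibration: the point is only that a scanner decides whether an oval is marked by comparing its measured darkness against a calibrated cutoff, which is why approved pens and careful calibration matter.

```python
# Hypothetical sketch of mark sensing on a precinct scanner. The threshold
# and readings are illustrative, not real calibration values.

MARK_THRESHOLD = 0.35  # fraction of the oval that must read as dark

def is_marked(darkness):
    """Return True if the measured darkness meets the calibrated cutoff."""
    return darkness >= MARK_THRESHOLD

readings = {
    "approved pen": 0.80,      # well above threshold: counted
    "faint pencil": 0.20,      # below threshold: missed by the scanner
    "stray smudge": 0.10,      # below threshold: correctly ignored
}
results = {source: is_marked(d) for source, d in readings.items()}
```

A faint pencil mark that falls below the cutoff is missed electronically, but the voter’s intent survives on the paper ballot itself and can be recovered in a hand count under the interpretation guidelines discussed above.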
Paper jams are not unique to the use of scanners and ballot markers - they have been reported frequently with the VVPATs on DREs. Ballot scanners have problems with over-sized paper ballots that do not fit in the scanner, or with ballots that do not meet the machine specifications. But as noted earlier, these are administrative failures, not an inherent problem of the system.
To what extent were problems caused by the attempt to use blended systems?
Jurisdictions that combined DREs and optical scan systems are referred to as using ‘blended systems’. Combining two types of systems does create unique problems, particularly at the point where vote totals from the different systems must be combined. Blended systems also seemed to lead to more usability errors because election workers must manage different machines and reconcile votes across multiple systems. The March 21 primary in Cook County, Illinois exemplified these problems.11
It must be noted that 50% of the country is now using paper ballots and precinct scanners. Entire states have implemented paper ballot based voting without any major problems.12
As jurisdictions prepare for the general elections in fall 2006, they are being advised to have ample paper ballots ready as the country learns to use new voting systems. The 2006 Maryland primary serves as a prime example. DREs did not start up, were missing required smartcards, and crashed repeatedly, which required extensive use of emergency paper ballots.13
Problems occur on all types of voting equipment, even mechanical lever machines. But depending on the ease of recovering voter intent in the event of failure, not all problems are equal. Because it preserves a direct record of the voter’s intent, paper ballot based voting provides us a way to count votes even in the event of system failure or human error.
As Avi Rubin, professor of computer science at Johns Hopkins University, recently said in "On the Importance of Paper Ballots,"
As we approach another election this fall, we have to consider the possibility of close races and lost votes. With so much at stake, it is a shame that we have to worry about whether or not computers will crash, memory cards will die, or election workers will make mistakes that could cause the wrong results to be tallied… Those responsible for choosing voting systems must recognize that all voting systems are not equal. Systems based on voter-marked paper ballots can recover the voter’s intent if there are problems with the machines. Providing for such recovery is necessary if voters are to have confidence in our democracy.
I don’t think enough emphasis has been placed on the problem of recovery in the discussion of e-voting. It is easier to recover from election problems if we have paper ballots to count… 14
1 Lists of problems in U.S. elections have been surveyed for this article, as well as other reports from election officials and election integrity activists. A primary resource for such lists and reports is http://www.votersunite.org. That website has lists of reported problems through 2005 at http://www.votersunite.org/electionproblems.asp. Problems through June 2006 are listed by voting machine companies at http://www.votersunite.org/info.asp. More recent reports can be found in the Daily Voting News at VoteTrustUSA News Section.
2 “ ‘Hacking’ and the Paper Ballot Optical Scan System”.
3 “Should we question optical scanners in the light of recent news about grading errors with the Scholastic Aptitude Tests?”.
4 The recently published Election Science Institute study of the Cuyahoga County, Ohio 2006 primary election demonstrates that even the addition of a voter verifiable paper trail to a DRE does not completely compensate for the lack of a voter-marked paper ballot. The study finds that “The machines’ four sources of vote totals -- VVPAT individual ballots, VVPAT summary, election archive, and memory cards -- did not agree with one another.” Gideon indicates the questions that should be asked: “While printer errors and the loss of VVPAT ballots may explain why some electronic totals were higher than the ballot totals, they do not explain why some ballot totals were higher than the electronic totals… When VVPAT totals were higher than electronic totals, did the machines print hundreds of ballots that weren’t associated with voters, did they lose hundreds of votes, or both? When the VVPAT totals were lower than electronic totals, did the machines add votes, fail to print ballots, or both?” John Gideon, “A Deeper Look at ESI’s Report of the Discrepancy-Ridden Vote Count in Diebold Touchscreen Voting Machines,” http://www.votersunite.org.
5 "Does the VVPAT Resolve Worries About DREs".
6 Dr. Brit Williams as quoted by John Gideon in “Human or Machine Error: What’s the Truth?”, July 18, 2006.
7 VotersUnite.org, “Key Component of Voting System Undergoes No Review.” July 25, 2006.
8 For example, “Too Much, Too Fast, More Than They Can Chew” by John Gideon in Election Integrity News, June 12, 2006 where the June 6, 2006 primary in Pottawattamie County, Iowa is discussed.
9 Avi Rubin’s report of his experience as an election inspector in Maryland, where the vendor-provided “technical” expert had been hired the previous day and had received all of 3 hours of training, leaving him helpless in the face of problems with the equipment.
10 E.g., in Oregon, pages 75 ff. Also see Douglas W. Jones, "Rules Governing Ballot Interpretation".
11 See also Susan Pynchon. “Diebold ‘Blended System’ Causes Widespread Problems in Florida Primary,” Sept. 11, 2006.
12 See this report on the August 8, 2006 primary in Michigan. Another report is of a “fixable” problem.
13 “Election Board Workers' Error Hinders Voting,” Washington Post, September 13, 2006.
14 Avi Rubin, “On the Importance of Paper Ballots,” Election Integrity News, Sept. 4, 2006.