Undervotes in 2006 Senate Contests in Ohio Diebold Counties

by Dale Tavris

Since the 2004 Presidential election, when Ohio's electoral votes made the difference in handing George Bush a second term in the White House, Ohio's election system has been the focus of much national concern and attention. Subsequent investigations found numerous irregularities in Ohio's election, and Ohio's Presidential electoral votes were challenged in Congress.

On December 7th of this year, Richard Hayes Phillips posted an analysis of the undervote for the Ohio Senate race entitled “Unofficial Results in Seventeen Ohio Counties Cannot Be Right”, using unofficial figures from Kenneth Blackwell’s website. An undervote for the purpose of that analysis was defined as a ballot that was cast, but for which there was no vote registered for Senator (or else more than one vote for Senator, which would have disqualified the ballot).

Since that time, Kenneth Blackwell has posted the official election results on his website. In connection with my role as a volunteer for the data analysis group of Election Defense Alliance, I recently used those results to conduct an analysis of the Ohio Senate undervote, using methods similar to those of Richard Hayes Phillips, in which I assessed the undervote by machine type.

Findings – excessive undervote rates in six Diebold counties

The state-wide undervote rate for Ohio Senator was 3.94%. But undervote rates varied considerably by machine type, with the Diebold counties averaging a significantly higher undervote rate (p = 0.015) than counties using the other machines, as follows:

Diebold DREs (47 counties) – 4.79%

Votronic DREs (10 counties) – 3.22%

Op-scan (31 counties) – 3.29%
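
The article does not say which statistical test produced the p = 0.015 figure. As a hedged illustration only, the kind of comparison involved can be sketched with a permutation test on county-level undervote rates; the rates below are made-up placeholder values, not the official Ohio numbers.

```python
import random
from statistics import mean

def undervote_rate(ballots_cast, senate_votes):
    """Undervote rate = fraction of cast ballots with no valid Senate vote."""
    return (ballots_cast - senate_votes) / ballots_cast

# Hypothetical county-level undervote rates, grouped by machine type.
# These values are illustrative only, not the official figures.
diebold_rates = [0.048, 0.052, 0.031, 0.138, 0.112, 0.046, 0.163, 0.035]
other_rates   = [0.031, 0.028, 0.034, 0.029, 0.040, 0.025, 0.033, 0.036]

def permutation_pvalue(a, b, n_iter=10_000, seed=0):
    """Two-sided permutation test on the difference in group means:
    how often does a random relabeling of counties produce a gap
    at least as large as the one observed?"""
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        if abs(mean(pooled[:len(a)]) - mean(pooled[len(a):])) >= observed:
            hits += 1
    return hits / n_iter

print(f"Diebold mean undervote rate: {mean(diebold_rates):.2%}")
print(f"Other mean undervote rate:   {mean(other_rates):.2%}")
print(f"Permutation p-value:         {permutation_pvalue(diebold_rates, other_rates):.3f}")
```

A permutation test is used here only because it requires no distributional assumptions; the original analysis may well have used a t-test or a similar standard test.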

Furthermore, six counties, all using Diebold machines, were definite and extreme outliers compared to the rest of the state. Those six counties (Mercer, Darke, Highland, Montgomery, Adams, Perry) had undervote rates ranging from 11.2% to 16.3%, with an average of 13.8%, while the other 82 Ohio counties had undervote rates ranging from 0.62% to 6.76%, with an average of 3.37%. The undervotes in the six outlier counties amounted to almost a quarter (24.9%) of the undervotes in the whole state, even though those six counties cast only 7.1% of the total votes in the state. Without those six counties, the average undervote rate for the remaining 41 Diebold counties (3.47%) was quite similar to the average undervote rate for the other types of machines.
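
The kind of outlier screen described above can be sketched with a simple robust-statistics check. The county names and rates below are placeholders, and the median/MAD rule is my own illustrative choice rather than the method actually used in the analysis.

```python
from statistics import median

def flag_outliers(rates, k=5.0):
    """Flag counties whose undervote rate sits far above the typical county.
    Uses the median and median absolute deviation (MAD), which, unlike the
    mean and standard deviation, are not dragged upward by the outliers
    themselves."""
    med = median(rates.values())
    mad = median(abs(r - med) for r in rates.values())
    return {county: r for county, r in rates.items() if r > med + k * mad}

# Illustrative rates only (not the official county figures):
rates = {
    "CountyA": 0.031, "CountyB": 0.028, "CountyC": 0.035, "CountyD": 0.029,
    "CountyE": 0.040, "CountyF": 0.033, "CountyG": 0.025, "CountyH": 0.034,
    "CountyI": 0.112, "CountyJ": 0.138,
}
print(flag_outliers(rates))  # only the two extreme counties are flagged
```

The attraction of a rule like this is that a handful of extreme counties cannot mask themselves by inflating the mean and standard deviation used to define "normal."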

The vote distribution in the six high outlier counties leaned slightly towards the Democratic candidate, Sherrod Brown (50.7%), which was less than Brown’s share of the vote state-wide (56.2%).

Discussion of the meaning of this analysis

This analysis shows that in the 2006 Ohio Senate race, counties that used Diebold machines had a substantially higher undervote rate than counties that used other voting machines, and that almost all of the excess was accounted for by six counties, whose undervote rate was roughly four times that of the rest of Ohio.

Why did this occur? It seems highly likely that something was wrong with the Diebold machines, or at least with a good portion of those in the six outlier counties, causing the relatively high undervote rates. That could have been due to voters having difficulty finding the Senate race on those machines, or to the machines failing to record the votes that voters intended to cast. Alternatively, 11 to 16% of voters in those six counties may simply have decided not to vote for Senator – but that seems quite unlikely.

Whether or not the problem with the Diebold machines was purposeful, whether or not similar undervote rates applied to some Ohio House or other races as well, and whether or not high undervote rates may have affected some close House or other races, is not known at this time. Nor do we know whether individual precincts or machines in counties other than the six outlier counties may have had similar problems. I have not had access to the precinct-level data that could possibly answer some of these questions.

Another finding of note is that Richard Hayes Phillips' analysis of the unofficial Ohio data produced results substantially different from those of my analysis of the official Ohio Senate data, as can be seen by following the link I supplied above (though our general conclusion of high undervote rates in several Diebold counties was similar).

This means that in some respects there were major changes in the data from the time of the first unofficial postings to the time that the official results were posted. Identification of the reasons for those changes could provide clues to some of the problems.

Comparison with the 2006 House Race in Florida Congressional District 13

The findings here have many similarities to the House race in Florida Congressional District 13, the only 2006 House race still being contested. In that race, one county, Sarasota, exhibited a very high undervote rate for the House race, 15%, compared to undervote rates of only 2.2% to 5.3% in neighboring counties. Almost 18,000 ballots in Sarasota County did not register a vote for Congressperson, in a race tentatively won by the Republican candidate by only 369 votes.

Unlike the situation in Ohio, a good deal of illuminating additional information is available for the Florida CD 13 race. First, as explained by Paul, an analysis of individual ballots found that voters who failed to cast a vote in the House race strongly favored the Democratic candidate in other races. And second, an interview of voters in Sarasota County by the Sarasota Herald-Tribune identified a likely reason for the high undervote rate: one third of voters couldn't find the House race on their ballot, and 60% said that they did vote for a House candidate but that their vote didn't show up on their summary page.

As with the above noted findings in Ohio, Sarasota County used DRE machines for voting in 2006. But unlike the Ohio findings, the DRE machines used in Sarasota County were manufactured by ES & S rather than by Diebold.

Some final thoughts

Voting machines that produce electronic results that cannot be verified have no place in a democracy. Currently, 23 states have no requirement for a voter-verified paper trail that could potentially verify the results produced by electronic voting machines. Ohio does have paper trails potentially available for that purpose. But even when paper trails are available, they cannot ensure accurate election results if those in charge of elections refuse to perform a recount, or if a recount is performed in a sloppy and illegal manner.

Election protection organizations such as Election Defense Alliance continue to analyze election results in order to shed as much light as possible on the myriad problems with electronic voting. The fact that this issue has broken through into the national news media is one measure of their success. Another is the 27 states that now require voter-verified paper trails to be used in conjunction with electronic voting machines (though how successful these paper trails will prove in actual practice remains to be seen). With their continued work, and with a Democratic Congress ready to be sworn in next month, I am cautiously hopeful that real, substantial progress will be made in the next couple of years towards redeeming our country's election system.