
William & Mary Bill of Rights Journal

Authors

Dan S. Wallach

Abstract

Any voting system must be designed to resist a variety of failures, ranging from inadvertent misconfiguration to intentional tampering. The problem with conducting analyses of these issues, particularly across widely divergent technologies, is that it is very difficult to make apples-to-apples comparisons. This paper considers a standard technique from the analysis of algorithms, namely complexity analysis with its "big-O" notation, which provides a high-level abstraction that allows direct comparisons across voting systems. We avoid the need to make unreliable estimates of the probability that a system might be hacked or of the cost of bribing key players in the election process to assist in an attack. Instead, we consider attacks from the perspective of how they scale with the size of an election. We distinguish attacks by whether they require effort proportional to the number of voters, effort proportional to the number of poll workers, or a constant amount of effort in order to influence every vote in a county. Attacks requiring proportionately less effort are correspondingly more powerful and thus require more attention to countermeasures and mitigation strategies. We perform this analysis on a variety of voting systems in their full procedural context, including optically scanned paper ballots, electronic voting systems both with and without paper trails, Internet-based voting schemes, and future cryptographic techniques.
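
As a rough illustration of the scaling classes described above (the symbols N_v and N_p are ours, not the article's), the three effort classes can be written in big-O form, where N_v is the number of voters and N_p the number of poll workers in a county:

    retail attacks:       effort = O(N_v)   — proportional to the number of voters
    poll-worker attacks:  effort = O(N_p)   — proportional to the number of poll workers, where N_p is far smaller than N_v
    wholesale attacks:    effort = O(1)     — a constant amount of effort, independent of county size

On this view, an attack in a smaller class can influence every vote in a county with far less work, which is why such attacks warrant the most attention to countermeasures.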

Included in

Election Law Commons
