The Government-Could-Not-Work Doctrine

The Supreme Court has recently declared that it is presumptively unconstitutional for the government to compel individuals to do or pay for things to which they have religious or political objections. Last Term, the Court applied this declaration to uphold the First Amendment arguments made by public-sector employees, and it appears poised to vindicate similar claims by religious objectors to antidiscrimination laws in the future. But this declaration is wrong. Indeed, throughout American history—from the Articles of Confederation through Lochner v. New York and Employment Division v. Smith—the Court itself has repeatedly rejected the notion that compulsory laws, in and of themselves, are presumptively unconstitutional.

This Article offers a novel examination of the history of challenges to compulsory laws inside and outside the context of the First Amendment. For centuries, the Supreme Court has faced hundreds of challenges to objectionable taxes, objectionable drafts, objectionable regulations, and objectionable funding conditions. With few exceptions, the Court has responded that the “government could not work” if it lacked the power to compel people to do things to which they objected. Although the Constitution prescribes many specific limits on the powers of the federal and state governments, the Constitution’s very purpose was to create a union that had the power to compel political minorities to accept the will of a political majority. Such a union would be incompatible with a governing document that prohibited officials from compelling people to take any action to which they religiously or politically objected—even when those objections were sincerely held.

Borrowing the Supreme Court’s own language, this Article calls the Court’s typical response the “government-could-not-work” doctrine, and concludes that objectionable compulsion, in and of itself, should not trigger the heightened First Amendment scrutiny of Janus v. AFSCME. Rather, compulsory laws should be treated the same as any other law, and analyzed for whether they are arbitrary, are discriminatory, or otherwise violate specific constitutional limits.

Hacking the Right to Vote

Most Americans believe the right to vote is one of the most important constitutional rights.[1] Moreover, eight out of ten Americans are concerned the country’s voting system is vulnerable to hackers.[2] Although new voting technology has been implemented across the country, it largely enables, rather than prevents, hacking, creating “frightening vulnerabilities” for election administration.[3] It seems that “America’s most ancient civilian office, the local election clerk, has become saddled with new and alien responsibilities tantamount to a military contractor.”[4] Hacking presents a novel threat to elections and may have far-reaching implications for the right to vote.

Part I describes the current state of election technology and the hurdles preventing improvements. Part II addresses Russia’s cyberattacks in the 2016 elections. It highlights the unprecedented risk hacking poses to the right to vote and suggests that courts must intervene. Part III reviews recent litigation to suggest that vulnerable voting machines violate the right to have one’s vote counted accurately, which reimagines traditional right-to-vote jurisprudence in the context of hacking. Finally, Part IV posits that hacks that burden voter access, increase voter frustration, and foil voter participation are more likely, just as dangerous, yet less responsive to right-to-vote jurisprudence than hacks manipulating vote tabulations.

I. The Problem with Voting Technology: Federalism, Funding, and Industry

Voting technology matters because elections are often extraordinarily close. Accurate machines ensure that the electoral process both selects true winners and convinces losers to accept unfavorable results.[5] The constitutional right to vote accordingly guarantees that each voter has about the same opportunity to have his or her vote counted, by requiring that counting methods (e.g., voting machines) distribute counting errors roughly equally. This is the promise and peril of Bush v. Gore.[6] Problems with voting technology, in which some legally valid votes are not counted properly and residual vote rates climb, risk undermining the fundamental right to vote and the public’s confidence that the will of the people has been freely and fairly expressed.[7]

Despite the stakes, there are three roadblocks to better voting machines.[8] First, federalism. According to the Constitution’s text and the gloss of history and tradition, states have wide discretion in election administration.[9] They run federal elections subject only to Congress’s authority, exercised occasionally,[10] to “at any time by Law make or alter such Regulations.”[11] States also have plenary power over the time, place, and manner of local elections, subject to the restriction that they not overburden the right to vote.[12] Accordingly, federal legislators fiercely resist anything resembling federal interference with state autonomy.[13]

Second, funding. Although many election officials say that modernizing voting technology is an important concern,[14] there are scant resources available to them to address it.[15] Modernization, to be sure, is not cheap. South Carolina estimates that it will cost $40 million to replace its voting machines—$39 million more than its legislature allocated in 2017.[16]

Finally, industry. The roughly $300 million market[17] for voting technology is problematic. The industry is small but politically well connected, with especially strong ties to the Republican Party.[18] It is mostly regulated at the state level.[19] Customers are often locked into long-term contracts and face high switching costs, destroying industry incentives to innovate.[20] Certifying new technology takes years.[21] Equipment designs, hardware, and software are usually proprietary.[22] Companies thus fight in court to keep out prying eyes when challenged. John Kerry lost a battle in 2004 to access the source code behind voting machines in Ohio.[23] So too did a 2006 candidate for Florida’s 13th Congressional District, who alleged that machines in one county erroneously registered 18,000 no-votes in her race.[24] Moreover, the industry is composed of only three hardware companies that manufacture over eighty percent of machines and, in contrast, a large number of tiny third-party software vendors.[25] And of the few industry-wide changes made after Bush in 2000, some actually undermined opportunities to innovate and improve the voting experience.[26]

Congress passed the Help America Vote Act (“HAVA”) in 2002 in response to Bush.[27] HAVA authorized $3.65 billion in payments to states to improve voting technology, and appropriated $3.28 billion of that amount between 2003 and 2010.[28] States used the funds to purchase new machines, often direct-recording electronic (“DRE”) or optical-scan machines.[29] DREs read digital ballots. Optical scanners read paper ballots. Both machines store votes on memory cards. Optical scanners keep digital images of the paper ballots they read, which can provide an audit trail. DREs can, but do not always, print paper records that voters can review, although their scrolls could conceivably be hacked to print voters’ choices correctly while recording different choices on the memory card.[30] Whereas in 2000 just nine percent of voting precincts used DREs, after HAVA was passed the number of precincts using DREs increased to sixty-seven percent, despite the risk of hacking.[31]
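
To make the scroll hack concrete, consider a minimal sketch (in Python, with invented candidate names and a toy data model, not drawn from any vendor’s actual firmware): the machine prints each voter’s true choice while writing a flipped choice to the memory card, so the electronic tally diverges from the paper trail in a way only a genuine paper audit would reveal.

```python
# Toy model of a compromised DRE: the paper scroll shows the voter's
# true choice, but the memory card records a flipped choice.

FLIP_RULE = {"Candidate A": "Candidate B"}  # hypothetical malware behavior

def cast_vote(choice):
    printed = choice                          # shown to the voter on the scroll
    recorded = FLIP_RULE.get(choice, choice)  # silently stored on the card
    return printed, recorded

paper_trail, memory_card = [], []
for ballot in ["Candidate A", "Candidate A", "Candidate B"]:
    printed, recorded = cast_vote(ballot)
    paper_trail.append(printed)
    memory_card.append(recorded)

print(memory_card.count("Candidate A"))  # 0 -- the reported electronic tally
print(paper_trail.count("Candidate A"))  # 2 -- what voters actually verified
```

The discrepancy surfaces only if the paper records are actually compared against the electronic count, which is why paper trails without routine audits offer limited protection.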

Whatever gains were realized in the early 2000s have been all but lost. Forty-one states still use machines that are at least ten years old,[32] which creates a higher risk of failure and predictable vulnerabilities. Thirteen states still use machines that do not provide paper trails.[33] Some states report scavenging for new parts on eBay.[34] Forty-three states and the District of Columbia use voting machines that are no longer manufactured.[35] In 2018, Congress provided $380 million more in grants to states to improve federal election administration.[36] Yet these appropriations are entirely insufficient to replace voting machines, which are “reaching the end of their natural life cycle.”[37] It would cost $2 per voter per year,[38] or over $270 million annually based on recent presidential-election turnout rates,[39] to upgrade and properly maintain voting machines across more than 10,000 “hyperdecentralized” election jurisdictions.[40]

The rapid shift to new voting technology in the wake of Bush, although well-intended, was poorly implemented. Coupled with inadequate maintenance and industry standstill, it created the conditions in which the hacks that now imperil the right to vote could occur. Indeed, software vendors, in at least one instance, let known security issues persist for eleven years.[41]

The strings attached to HAVA’s grants[42] arguably hurt election security more than they helped. First, states had to consolidate voter registration databases previously maintained at the county level.[43] That created a one-stop shop for breaches. Second, the Act’s strict (albeit necessary) voting standards limited the kinds of voting machines states could buy with HAVA funds.[44] That led to widespread adoption of electronic voting technology,[45] which in turn gave private companies incentives to rush untested machines to market to capture the windfall of cash and to sell states products they did not need, such as e-pollbooks, which election officials often use to check in voters on Election Day. Finally, states had to implement changes before the 2004 federal election,[46] leaving no time for risk assessment, debugging, or testing. The speedy move to technology, without a plan or the funds to upgrade software and hardware regularly, was a solution in search of a problem, and it created one: hackable voting machines.[47]

II. Hacking and the Right to Vote

The last presidential election put election hacking on the map, although election officials have been aware of the risk of hacking for decades.[48] Russia’s attacks practically compel the conclusion that problems with election technology are not just “political questions” for the “political branches,”[49] but rights-based threats that demand the attention of courts. Where the political system fails to adequately protect election integrity and the right to vote, courts must fill the vacuum.

Russia’s attacks were, indeed, unparalleled in nature and scope.[50] Russian hackers targeted election infrastructure in twenty-one states with sophisticated cyberattacks.[51] They successfully breached voter registration rolls in Illinois,[52] stole the username and password of an election official in Arizona,[53] and infiltrated an unnamed private company.[54] Russian hackers also sent emails containing malicious code to 122 email addresses associated with named local governmental organizations and election officials[55] and accessed county election websites in Georgia, Iowa, and Florida.[56] The era of local administrative control over voting technology is over. Russia’s hacks changed the narrative.

The right to vote, which is implicated by voting technology in ways unforeseeable even a decade ago, is a fundamental constitutional right.[57] At bottom, the idea is that “[t]he conception of political equality from the Declaration of Independence, to Lincoln’s Gettysburg Address, to the Fifteenth, Seventeenth, and Nineteenth Amendments can mean only one thing—one person, one vote.”[58] The right adapts to the times, precluding first-generation infringements (restrictions on an individual’s ability to cast a ballot) as well as second-generation infringements (efforts to dilute the effectiveness of one’s vote).[59] This jurisprudence culminated in Bush v. Gore, which applied the right to vote to election administration specifically, holding that counting votes by methods or means with similar levels of accuracy, or probabilities of inaccuracy, is part and parcel of the right to vote.[60] As a result, when states rapidly modernized their voting systems, a number of technology-related challenges ensued, because of what Bush said and did not say and what states did and did not do.

The first round of voting technology challenges sought to enforce uniform adoption of electronic voting technology under the Equal Protection and Due Process Clauses.[61] Studies showed that paper-based punch cards and optically scanned ballots caused a greater number of votes to be invalidated in predominantly African-American precincts than elsewhere—a “racial gap” in the residual vote rate, or probability that votes would be counted inaccurately.[62] States rendered such challenges moot by implementing electronic voting systems statewide, reducing the number of residual votes by one million between 2000 and 2004.[63]

Challenges also arose in states whose counties purchased different types of technology. For example, in Weber v. Shelley, the plaintiff argued that although voting equipment reduced under- and over-votes in the aggregate, it still did not distribute the residual vote rate equally across all groups and thus violated the Equal Protection and Due Process Clauses.[64] Because machines have varying levels of accuracy, a state that uses one machine in some counties but not everywhere subjects voters to different probabilities that their votes will be counted accurately. The challenge failed. The court found that the electronic system in use did not restrict the right to vote severely enough to justify relief.[65] Courts facing these sorts of challenges cite Justice Souter’s dissent in Bush, which justified the use of different technologies across election jurisdictions based on “concerns about cost, the potential value of innovation, and so on.”[66]

Beyond the challenges presented by the holding in Bush, voting-technology challenges continued to fail because of the standard of scrutiny established by Anderson v. Celebrezze[67] and Burdick v. Takushi[68] (the Anderson-Burdick sliding-scale test). Under that test, courts apply strict scrutiny to an election administration practice, such as what voting technology to buy or maintain, only if it is unreasonable and discriminatory or if it imposes a “severe” burden on voters.[69] If the burden is “reasonable” and “nondiscriminatory,” or is not severe, then the practice is constitutional if the state demonstrates an “important regulatory interest[]”[70] or even “legitimate and valid” concerns.[71] Anderson-Burdick is the workhorse of election administration law, even though it is arguably in deep tension with the central holding of Harper v. Virginia Board of Elections: any practice that burdens the right to vote and is unrelated to voter qualifications, not just an outright proscription of the franchise, should receive strict scrutiny.[72] Indeed, Harper said that “[t]he degree of the discrimination is irrelevant”[73] precisely because the voter regulation at issue there (a poll tax as a condition for obtaining a ballot) was unrelated to voter qualifications. Presumably, then, something far less severe than a poll tax would trigger strict scrutiny if it were unrelated to voter qualifications. Yet, under Anderson-Burdick, the degree of a burden, even one that has nothing to do with voter qualifications, such as voting technology that counts votes with varying degrees of accuracy, seems to be a threshold question as well as a dispositive one.

The Court applies Anderson-Burdick to election administration because of the basic difficulties of administering elections.[74] Voters cannot expect perfection across jurisdictions because it is impracticable to ever fully equalize burdens. Some voters will always live farther from polling places. It will always be harder for some voters to obtain photo identification. Lines will always be longer and ballots more confusing for some voters. Some jurisdictions will always have fewer dollars or less political capital to update voting equipment and will thus use older machines with greater residual vote rates. That is the inescapable reality of election administration, or so it seems. To require otherwise, in the Court’s view, would hamstring local officials seeking to impose order on a chaotic democratic process.[75] Thus, at least in the context of voting technology, states can treat similarly situated voters differently without running afoul of the Equal Protection Clause.

Essentially, then, Bush and its progeny suggest that unequal residual vote rates are symptomatic of inevitably imperfect technologies. They also suggest those rates are innocuous, in that they are beyond the reach of the Constitution’s right to vote, because they are reasonable, nondiscriminatory, and do not severely burden that right. Election hacking, however, has forced at least one court to revisit that calculus.

III. Hacking the Right to Vote

Hacking sits squarely at the intersection of the Court’s right-to-vote jurisprudence and issues surrounding voting technology. For example, in Curling v. Kemp, the United States District Court for the Northern District of Georgia found that challengers to Georgia’s statewide voting technology provided sufficient evidence to show, on the basis of a factual record that was yet to be fully developed, “that their votes cast by DREs may be altered, diluted, or effectively not counted.”[76]

First, the court did not discuss whether a particular residual vote rate must be shown in order to find a right-to-vote violation.[77] In fact, because the suit was a pre-election challenge, no such finding was possible.

Second, the challengers actually showed “serious security flaws and vulnerabilities,” as opposed to pointing to merely theoretical or hypothetical flaws, including “outdated software susceptible to malware and viruses.”[78] This showing established a “concrete,” nonspeculative risk that ballots could be altered in a way that undermines the opportunity to cast an effective vote.[79]

Finally, the court dismissed Georgia’s argument, at the motion-to-dismiss stage, that the injury to challengers’ right to vote was caused by hackers rather than the state.[80] States typically have no duty to protect citizens from privately inflicted harms, but the court found that, at least for purposes of the motion to dismiss, there was a plausible causal connection, even if an indirect one, between the state’s use of unsecure DREs and the injury to challengers’ constitutional rights.[81]

The nature of hacking is the chief reason why Curling stands apart from Bush and its progeny. When machines have been hacked, there will be no way at the end of an election to determine the accuracy of the vote count. Post-Bush courts did not foresee this possibility. In one case, for example, the Eleventh Circuit rejected an equal protection challenge based on differing methods of manual recounting to determine whether machines registered the correct number of no-votes, because the mere possibility of an “allegedly inferior type of review” in the event of a manual recount was not so substantial a burden as to warrant strict scrutiny.[82]

Hacking, on the other hand, conceals the very evidence needed to detect it. Malicious code that modifies vote counts hides evidence of its existence by also modifying the audit logs, vote records, and protective counters that machines store as countermeasures.[83] Even electronic ballot “images are themselves subject to manipulation by hackers.”[84] Given the archaic nature of election machines, a post-election investigation will not find evidence that anything went awry. In the hacking era, courts cannot treat the absence of evidence of tampering or malfunction as evidence of the absence of accuracy issues, or as evidence of user error.
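
The self-concealment problem can be sketched in a few lines. Assuming, hypothetically, that a machine’s audit log and protective counter are writable by the same code that writes its vote records, malware that rewrites all three leaves the machine internally consistent, so a post-election review that can consult only the machine’s own records finds nothing amiss.

```python
# Hypothetical machine state: vote records plus the countermeasures
# described above (audit log and protective counter).
machine = {
    "votes": ["A", "A", "B"],
    "audit_log": ["cast:A", "cast:A", "cast:B"],
    "protective_counter": 3,
}

def tamper(state):
    # Malware flips votes AND rewrites every countermeasure to match.
    state["votes"] = ["B" if v == "A" else v for v in state["votes"]]
    state["audit_log"] = ["cast:" + v for v in state["votes"]]
    state["protective_counter"] = len(state["votes"])

def post_election_review(state):
    # The review can only test the machine's records against each other.
    return (state["protective_counter"] == len(state["votes"])
            and state["audit_log"] == ["cast:" + v for v in state["votes"]])

tamper(machine)
print(post_election_review(machine))  # True -- the tampering is invisible
```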

Hacking is no longer a far-off risk, either, but rather a near certainty. It is easy to manipulate vote tabulations even if voting machines are disconnected from the internet, or “air-gapped.” Hackers can access machines through the modems that transmit vote totals on election night.[85] Hackers can “compromise voting equipment at many points along the supply chain, from the factory assembler to the election software programmer to the technician who makes a repair or installs a software upgrade.”[86] Hackers could also commandeer remote access software that allows contractors to make updates from home, or infect installable memory cards that are carried to central-counting facilities to upload votes.[87] Hackers can even compromise computers in election offices, then spread malicious code to voting machines when election officials program ballots.[88]

Admittedly, it is harder to manipulate vote tabulations in a way that picks winners and losers—but this is because of an information gap, not a technology gap. To effectively do so, hackers “would have to know which districts could affect the outcome. Then they’d have to change just enough votes to ensure victory without switching so many that it would draw attention.”[89] All the same, Curling suggests that antiquated voting systems are hackable voting systems and hackable voting systems violate the right to vote. This is not to suggest that the right to vote requires something that is not theoretically possible, i.e., unhackable voting machines. It is only to suggest that states must not sit idly by while vulnerabilities create arbitrary disparities in whether votes will be counted accurately.
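
The information gap can be made concrete with a back-of-the-envelope sketch using invented precinct totals: computing the minimum number of flipped votes that reverses an outcome requires knowing the margin in advance, and flipping many more than that minimum is exactly what would draw attention.

```python
# Invented precinct totals: (votes for A, votes for B).
precincts = [(5200, 4800), (3100, 3300), (2500, 2450)]

a_total = sum(a for a, b in precincts)   # 10800
b_total = sum(b for a, b in precincts)   # 10550
margin = a_total - b_total               # A leads by 250

# Each A-vote flipped to B swings the margin by 2, so the smallest
# manipulation that reverses the outcome is:
min_flips = margin // 2 + 1
print(min_flips)  # 126 -- knowable only if the hacker knows the margin
```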

IV. Access Hacks: Third-Generation Infringements on the Right to Vote

Manipulating vote tabulation is not the only way to hack an election. “Access hacks” have the effect of placing obstacles before voters that frustrate their ability to effectively participate in the voting process. The problem is that voting operations seem to be designed to perform the simple task of casting a ballot in an overcomplicated way, like a Rube Goldberg machine. Vulnerabilities include not just machinery, but websites, registration databases, e-pollbooks, and recording and reporting systems—systems that hackers could exploit to aggregate countless low-value burdens on voters. This is the third generation, or perhaps the final frontier, of voting infringements.[90] Although harder to address in court, given existing right-to-vote doctrine, these risks can be mitigated with system updates.

Legacy systems contain known vulnerabilities that can disrupt election infrastructure. Hackers can take election systems offline through a Distributed Denial of Service (“DDoS”) attack. In North Carolina in 2016, an alleged software glitch demonstrated the chaos that an attack on infrastructure could cause: machine crashes, long lines, extended hours, and resort to back-up paper ballots (if counties have them, which is by no means a guarantee).[91] Long lines destroy voter confidence “even when individuals do not experience the long lines themselves” because voters could decide that voting simply is not worth the trouble or wait.[92] Hackers can also crash e-pollbooks, which election officials often use to check in voters on Election Day. In 2006 in Denver, for example, an e-pollbook malfunction caused about 20,000 people to leave polling places without voting.[93] In 2008 in Georgia, a similar malfunction caused two-hour-plus lines.[94]

Similarly, legacy databases are vulnerable to information exploitation, in which hackers manipulate voter records to increase frustration and foil participation. Hackers could access databases and change precinct assignments to send voters to the wrong location, wasting time and costing votes.[95] In 2016, when a Russian agent logged into a single election jurisdiction’s database in Illinois, he opened a backdoor to the files on the state’s voters in all 109 jurisdictions, dating back to 2006.[96] He then gained access to 15 million voter registrations, stole 90,000 files, and attempted, albeit unsuccessfully, to change voter information, including names and addresses.[97] Likewise, in California’s 2016 presidential primary, hackers used private voter information, including Social Security numbers, to change voter registrations in the state’s database, preventing a number of voters from casting ballots.[98]

Hackers can even take advantage of state voter restrictions to disrupt elections and sow division. To illustrate, consider Georgia. Just before Election Day in 2018, officials stalled over 50,000 voter registrations containing information that was inconsistent, they argued, with driver’s-license records, such as mismatched signatures, omitted middle initials, misspelled names, and missing hyphens.[99] They rejected a number of absentee ballots for similar reasons.[100] A disproportionate number of voters facing stalled registrations and rejected absentee ballots were black.[101] Georgia’s secretary of state, who is now governor, used the state’s exact-match voter registration law as a justification for the mass suspension.[102] Hackers could exploit Georgia’s oppressive law and others like it to precisely the same effect. By altering voter registrations to make them inconsistent with driver’s-license records, hackers could depress turnout, suppress or functionally deny the vote, or change the outcome of an election, as the sketch below illustrates. To be sure, thirty-one states introduced ninety-nine bills impeding access to registration and voting in 2017,[103] so the target market is a mile wide and the firewalls an inch deep.
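
A hedged illustration of that mechanism (the records, field names, and matching rule below are invented; real systems compare more fields): under a naive exact-match rule, altering a single character in a registration record, such as deleting a hyphen, is enough to stall the registration.

```python
# Invented records for illustration only.
registration = {"name": "Mary Smith-Jones", "dob": "1970-04-02"}
license_rec  = {"name": "Mary Smith-Jones", "dob": "1970-04-02"}

def exact_match(reg, lic):
    # No normalization: any one-character difference fails the match.
    return all(reg[field] == lic[field] for field in reg)

print(exact_match(registration, license_rec))  # True -- registration stands

registration["name"] = "Mary Smith Jones"      # one deleted hyphen
print(exact_match(registration, license_rec))  # False -- registration stalled
```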

V. Conclusion

It is difficult to square the extent to which we value the right to vote with the state of voting technology. Federalism, funding, and industry get in the way. Courts must then act as the forum of last resort. However, in the wake of Bush, technology became a solution in search of a problem, enabling the hacks that now imperil the right to vote. Bush’s progeny provided little recourse until Curling, where the unique nature, unparalleled scope, and concrete threat of hacking brought the vulnerability of voting machines into sharp relief. Curling offers promise in an area of the law where there is mostly peril. Moreover, although right-to-vote jurisprudence, even Curling, has little to say about what happens when hackers target information databases in order to increase frustration and thwart participation, sensible system upgrades and security protocols may reduce the likelihood of such threats. In short, judges have a role to play in holding states accountable, states must support local officials across more than 10,000 election jurisdictions, and voters must demand change through their exercise of the franchise—by resort to the very polls that hackers endanger—and by keeping the faith otherwise.


[1] Brian Pinaire et al., Barred from the Vote: Public Attitudes Toward the Disenfranchisement of Felons, 30 Fordham Urb. L.J. 1519, 1533–34 (2002) (finding that 93.2% of survey respondents believe that the right to vote is either the most important or one of the most important rights in a democracy).

[2] Billy Morgan, New Survey Reveals Concerns About the Security of the Nation’s Voting System Ahead of the Midterm Election, U. of Chi. Harris Sch. of Pub. Pol’y (Oct. 10, 2018), [https://perma.cc/N4CG-MXQ2].    

[3] Benjamin Wofford, The Hacking Threat to the Midterms Is Huge. And Technology Won’t Protect Us., Vox (Oct. 25, 2018, 5:00 AM), [https://perma.cc/3XX4-VD2G].

[4] Id.; see also Alejandro de la Garza, Should You Be Afraid of Election Hacking? Here’s What Experts Say, Time (Oct. 25, 2018), [https://perma.cc/E7HM-A76Y] (explaining the vulnerability of elections in view of the unprecedented nature of the threat, including equipment hacks and misinformation campaigns).

[5] Richard L. Hasen, The Voting Wars 8–10 (2012) (emphasizing the importance of public confidence in election results, and of widespread election reform in securing that confidence, in the wake of Bush v. Gore).

[6] Bush v. Gore, 531 U.S. 98, 104–05 (2000) (“The right to vote is protected in more than the initial allocation of the franchise. Equal protection applies as well to the manner of its exercise. Having once granted the right to vote on equal terms, the State may not, by later arbitrary and disparate treatment, value one person’s vote over that of another.”).

[7] The nation lost between 4 million and 6 million votes in the 2000 presidential election. The Caltech/MIT Voting Tech. Project, Voting: What Is, What Could Be 8–9 (2001) [hereinafter Voting Technology Project]. Based on residual votes and lost votes from the past four presidential elections, 1.5 million presidential votes and 3.5 million votes for governor and senator are lost each election because of problems with voting equipment. Id.

[8] Despite opportunities to improve election technology, hardware and software products have barely advanced in the last decade. See Penn Wharton Pub. Pol’y Initiative, The Business of Voting 19 (2017) [hereinafter Business of Voting]. See generally The Presidential Commission on Election Admin., The American Voting Experience: Report and Recommendations of the Presidential Commission on Election Administration (2014) [hereinafter Election Administration Commission] (explaining the problems with existing voting technology and recommending updates).

[9] U.S. Const. art. I, § 4, cl. 1.

[10] Congress did not pass a law regulating federal election administration until 1842. Ex parte Yarbrough, 110 U.S. 651, 660 (1884); see also An Act for the Apportionment of Representatives Among the Several States According to the Sixth Census, ch. 47, 5 Stat. 491 (1842). Congress passed comprehensive statutes in 1870 and 1871 in order to enforce the Fifteenth Amendment. See Force Act of 1870, ch. 114, 16 Stat. 140 (1870); Force Act of 1871, ch. 99, 16 Stat. 433 (1871) (amending the Force Act of 1870); Ku Klux Klan Act, ch. 22, 17 Stat. 13 (1871). Between 1957 and 1982, Congress passed several laws protecting the right to vote free of intimidation and arbitrary or capricious factors. See, e.g., 42 U.S.C. §§ 1971 et seq. (2012).

[11] U.S. Const. art. I, § 4, cl. 1.

[12] See U.S. Const. art. I, § 4, cl. 1; Wesberry v. Sanders, 376 U.S. 1, 6–7 (1964); Tashjian v. Republican Party of Conn., 479 U.S. 208, 217 (1986) (“The power to regulate the time, place, and manner of elections does not justify, without more, the abridgement of fundamental rights, such as the right to vote.” (citation to Wesberry omitted)).

[13] Wofford, supra note 3.

[14] See Election Administration Commission, supra note 8, at 11 & n.10 (finding that, in a nationwide survey of election officials, twenty-four percent of respondents said that “voting technology and voting machine capacity” need improvement or update—the highest percentage of any category in the survey).

[15] Id. at 10 (explaining that the most common complaint of election administrators is a lack of resources and that election administrators characterize themselves as “the least powerful lobby in the state legislatures”).

[16] Michael Wines, Wary of Hackers, States Move to Upgrade Voting Systems, N.Y. Times (Oct. 14, 2017), [https://perma.cc/4A96-YC9H].

[17] See Business of Voting, supra note 8, at 23.  

[18] Kim Zetter, The Crisis of Election Security, N.Y. Times (Sept. 26, 2018), [https://perma.cc/Z6DW-JH2Q].

[19] Business of Voting, supra note 8, at 30.

[20] Id. at 32–36.

[21] Id. at 38.

[22] Id. at 42.

[23] Zetter, supra note 18.

[24] See H.R. Rep. No. 110-528, at 2–3 (2008).

[25] Business of Voting, supra note 8, at 14–15, 18–19, 54.

[26] See generally Stephen Ansolabehere & Ronald Rivest, Voting Equipment and Ballots (2013), [https://perma.cc/PX57-ZSU9].

[27] Help America Vote Act of 2002, Pub. L. No. 107-252, 116 Stat. 1666 (2002) (prior to 2010, 2018 amendments).

[28] Arthur L. Burris & Eric A. Fischer, Cong. Res. Serv., The Help America Vote Act and Election Administration: Overview and Selected Issues for the 2016 Election, at Summary (2016).

[29] See Business of Voting, supra note 8, at 11, 13, 19, 55.

[30] Zetter, supra note 18.

[31] Id.

[32] Lawrence Norden & Wilfred U. Codrington III, America’s Voting Machines at Risk—An Update, Brennan Ctr. for Just. (Mar. 8, 2018), [https://perma.cc/Z3AH-YJZW].

[33] Id.

[34] Id.

[35] Lawrence Norden & Wilfred U. Codrington III, Brennan Ctr. for Just., America’s Voting Machines at Risk 15–16 (2015), [https://perma.cc/7XZL-9UK4].

[36] Consolidated Appropriations Act, 2018, Pub. L. No. 115-141, Div. E, Tit. V (2018); see also U.S. Election Assistance Commission, 2018 HAVA Election Security Funds, [https://perma.cc/75VV-G6NW] (last visited Jan. 14, 2019).

[37] Election Administration Commission, supra note 8, at 63.

[38] See Voting Technology Project, supra note 7, at 53.

[39] Federal Election Commission, Official 2016 Presidential General Election Results 7 (Jan. 30, 2017), [https://perma.cc/MJ3V-VZ3H] (showing that 136,669,237 votes were cast in 2016 for president).

[40] Hasen, supra note 5, at 8; Election Administration at State and Local Levels, Nat’l Conf. of St. Legislatures (June 15, 2016), [https://perma.cc/R5P9-QNVN].

[41] Wofford, supra note 3; see also Sue Halpern, Election-Hacking Lessons From the 2018 Def Con Hackers Conference, New Yorker (Aug. 23, 2018), [https://perma.cc/9JXB-JQJ5] (explaining that, despite extensively documented vulnerability to hacks, the AccuVote-TSX is still in use in eighteen states).

[42] Help America Vote Act of 2002, Pub. L. No. 107-252, §§ 101–02, 253, 301, 303–04, 116 Stat. 1666 (2002) (prior to 2010, 2018 amendments).

[43] Id. at § 303.

[44] Id. at §§ 102, 301; see also Burris, supra note 28, at 5 (“Under HAVA, systems used in federal elections must provide for error correction by voters, accessibility for persons with disabilities, manual auditing, alternative languages, and error-rate standards. Systems must also maintain voter privacy and ballot confidentiality, and states must adopt uniform standards for what constitutes a vote on each system.”).

[45] Election Assistance Commission, The 2014 EAC Election Administration and Voting Survey Comprehensive Report 14, 264–65 tbl. 42 (June 30, 2015), [https://perma.cc/AQG5-8JMQ] (finding that in 2014 the DRE without a voter audit trail was the most widely deployed technology across the states and that DREs overall made up nearly seventy percent of all voting machines).

[46] Help America Vote Act of 2002, Pub. L. No. 107-252, § 102(a)(3), 116 Stat. 1666 (2002) (prior to 2010, 2018 amendments).

[47] It also ignored one of the central lessons of Bush: Volusia County. There, partly due to a faulty memory card and computer glitch, Al Gore lost 16,000 votes in a matter of minutes while the Socialist candidate gained 10,000. See Dana Milbank, Tragicomedy of Errors Fuels Volusia Recount, Wash. Post (Nov. 12, 2000), [https://perma.cc/3QYN-3XLM]; but see Zetter, supra note 18 (questioning whether the faulty memory card caused the mishap).

[48] Paul Krugman, Hack the Vote, N.Y. Times (Dec. 2, 2003), [https://perma.cc/T7XJ-MR2H]. There was also a 1969 front-page article in the Los Angeles Times describing a “war games” exercise to determine whether computerized punch-card readers could be rigged; the “offensive” team, tasked with finding ways to rig the election machines, won all six trials by infiltrating the machines without being detected by its opponents’ countermeasures, providing “a chilling look at the state of computer art and the implications it holds for future elections.” See Richard Bergholz, How Elections Can Be Rigged Via Computers, L.A. Times, July 8, 1969, at 1, 24.

[49] See Nixon v. United States, 506 U.S. 224, 228 (1993) (“A controversy is nonjusticiable––i.e., involves a political question––where there is a ‘textually demonstrable constitutional commitment of the issue to a coordinate political department; or a lack of judicially discoverable and manageable standards for resolving it . . . .’”); see also Baker v. Carr, 369 U.S. 186, 210–11 (1962) (discussing the nature of a “political question”).

Although the intelligence community insists no results were altered, there has not been a full examination of all the evidence. “Intelligence assessments are based on signals intelligence—spying on Russian communications and computers for chatter or data indicating that they altered votes—not on a forensic examination of voting machines and election networks.” Zetter, supra note 18.

[51] Russian Interference in the 2016 U.S. Elections: Hearing Before the S. Select Comm. on Intelligence, 115th Cong. 5 (2017) (statement of Samuel Liles, Acting Dir. of the Cyber Div., Office of Intelligence and Analysis, Dep’t of Homeland Sec.); see also Nat’l Intelligence Council, Office of the Dir. of Nat’l Intelligence, Intelligence Community Assessment: Assessing Russian Activities and Intentions in Recent US Elections 3 (2017), [https://perma.cc/S3BQ-UUCE].

[52] Nicole Perlroth et al., Russian Election Hacking Efforts, Wider Than Previously Known, Draw Little Scrutiny, N.Y. Times (Sept. 1, 2017), [https://perma.cc/VP4R-E3MJ]; see also Matthew Cole et al., Top-Secret NSA Report Details Russian Hacking Effort Days Before 2016 Election, Intercept (June 5, 2017, 3:44 PM), [https://perma.cc/9ZMA-GV7R] (reporting on a leaked NSA document detailing Russian hacking).

[53] Miles Parks, Will Your Vote Be Vulnerable on Election Day?, NPR (May 8, 2018, 5:00 AM), [https://perma.cc/RS7H-58PE].

[54] Cole et al., supra note 52.

[55] Id.

[56] Indictment at 26, United States v. Netyksho, No. 18-cr-00215 (D.D.C. July 13, 2018).

[57] The Supreme Court has pointed to a number of constitutional provisions to establish the fundamental right to vote. See, e.g., Bush v. Gore, 531 U.S. 98, 104–05 (2000) (once the state legislature vests the right to vote in its people, equal protection applies to the manner of its exercise); Anderson v. Celebrezze, 460 U.S. 780, 787–88 (1983) (the right to vote is protected by the Due Process Clause of the Fourteenth Amendment, which embraces the First Amendment); Reynolds v. Sims, 377 U.S. 533, 560–61 (1964) (the right to vote in state elections is protected by the Equal Protection Clause of the Fourteenth Amendment); Gray v. Sanders, 372 U.S. 368, 379 (1963) (same); United States v. Classic, 313 U.S. 299, 314 (1941) (the right to vote for Congressmen, and by extension to participate in congressional primaries, is found in Article I, Section 2 of the Constitution).

[58] Gray, 372 U.S. at 381.

[59] See Williams v. Rhodes, 393 U.S. 23, 30 (1968) (noting that restrictions are impermissible when they burden “the right of qualified voters . . . to cast their votes effectively”); Reynolds, 377 U.S. at 555 (noting that “the right of suffrage can be denied by a debasement or dilution of the weight of a citizen’s vote just as effectively as by wholly prohibiting the free exercise of the franchise”).

[60] Bush, 531 U.S. at 109 (“[T]here must be at least some assurance that the rudimentary requirements of equal treatment and fundamental fairness are satisfied.”); see also Reynolds, 377 U.S. at 555 (citations omitted) (“The right to vote can neither be denied outright, nor destroyed by alteration of ballots, nor diluted by ballot-box stuffing.”); Gray, 372 U.S. at 380 (“Every voter’s vote is entitled to be counted once. It must be correctly counted and reported.”); South v. Peters, 339 U.S. 276, 279 (1950) (Douglas, J., dissenting) (“The right to vote includes the right to have the ballot counted.”); United States v. Saylor, 322 U.S. 385, 387–88 (1944) (noting that the right to vote includes the right to have one’s vote counted); Classic, 313 U.S. at 315 (“Obviously included within the right to choose . . . is the right of qualified voters . . . to cast their ballots and have them counted.”).

[61] See, e.g., Stewart v. Blackwell, 444 F.3d 843, 852 (6th Cir. 2006), superseded as moot by Stewart v. Blackwell, 473 F.3d 692 (6th Cir. 2007). The court noted that “[v]iolations of the Equal Protection Clause are no less deserving of protection because they are accomplished with a modern machine than with outdated prejudices.” Id. at 880.

[62] Michael Tomz & Robert P. Van Houweling, How Does Voting Equipment Affect the Racial Gap in Voided Ballots?, 47 Am. J. of Pol. Sci. 46, 58 (2003); see also Daniel P. Tokaji, The Paperless Chase: Electronic Voting and Democratic Values, 73 Fordham L. Rev. 1711, 1754–68 (2005) (arguing that electronic technology can reduce or eliminate the racial disparities resulting from punch-card systems).

[63] Charles Stewart III, Residual Vote in the 2004 Election, 5 Election L.J. 158, 158 (2006).

[64] 347 F.3d 1101, 1106 (9th Cir. 2003).

[65] Weber, 347 F.3d at 1106; see also Wexler v. Anderson, 452 F.3d 1226, 1233 (11th Cir. 2006) (holding that different voting methods have different trade-offs, and the state’s important regulatory interests justify choosing between them).

[66] See, e.g., Wexler, 452 F.3d at 1233 (citing Bush v. Gore, 531 U.S. 98, 134 (2000) (Souter, J., dissenting)); Weber, 347 F.3d at 1107 & n.2 (citing the same).

[67] 460 U.S. 780, 788 (1983).

[68] 504 U.S. 428, 434 (1992).

[69] Burdick, 504 U.S. at 434 (citations omitted) (“[T]he rigorousness of our inquiry into the propriety of a state election law depends upon the extent to which a challenged regulation burdens First and Fourteenth Amendment rights. Thus, as we have recognized when those rights are subjected to ‘severe’ restrictions, the regulation must be ‘narrowly drawn’ to advance a state interest of compelling importance. But when a state election law provision imposes only ‘reasonable, nondiscriminatory restrictions’ upon the First and Fourteenth Amendment rights of voters, ‘the State’s important regulatory interests are generally sufficient to justify’ the restrictions.”).

[70] Burdick, 504 U.S. at 434; Anderson, 460 U.S. at 788; Storer v. Brown, 415 U.S. 724, 730 (1974) (noting that “as a practical matter, there must be a substantial regulation of elections if they are to be fair and honest and if some sort of order, rather than chaos, is to accompany the democratic processes”).

[71] Rosario v. Rockefeller, 410 U.S. 752, 761–62 (1973).

[72] 383 U.S. 663, 670 (1966) (“We have long been mindful that where fundamental rights and liberties are asserted under the Equal Protection Clause, classifications which might invade or restrain them must be closely scrutinized and carefully confined . . . . Those principles apply here.”).

[73] Harper, 383 U.S. at 668.

[74] Burdick, 504 U.S. at 433 (“Election laws will invariably impose some burden upon individual voters.”); Anderson, 460 U.S. at 788 (“Each provision [of election administration], whether it governs the registration and qualifications of voters, the selection and eligibility of candidates, or the voting process itself, inevitably affects—at least to some degree—the individual’s right to vote.”).

[75] Storer, 415 U.S. at 730 (“[A]s a practical matter, there must be a substantial regulation of elections if they are to be fair and honest and if some sort of order, rather than chaos, is to accompany the democratic process.”); see also Burdick, 504 U.S. at 433 (explaining that subjecting every voting regulation to strict scrutiny would “tie the hands of States seeking to assure that elections are operated equitably and efficiently”).

[76] Curling v. Kemp, 334 F. Supp. 3d 1303, 1324–25 (N.D. Ga. 2018) (noting the court’s conclusion that the plaintiffs were likely to succeed on the merits of one or more of their constitutional claims, in the context of a motion for a preliminary injunction, was a “cautious, preliminary one, especially in light of the initial state of the record,” but that the evidence sufficiently showed that “votes cast by DRE may be altered, diluted, or effectively not counted on the same terms as someone using another voting method – or that there is a serious risk of this under the circumstances”).

[77] While at least one court in the post-Bush era expressly declined to specify a precise error rate for determining when voting technology is constitutional and when it is not, see Stewart v. Blackwell, 444 F.3d 843, 876 (6th Cir. 2006), it applied strict scrutiny based on a fully developed factual record indicating that in ten counties in Ohio the residual vote rate was over 3% in the 2000 election, id. at 872, while intentional undervoting makes up an estimated 0.23% to 0.75% of all residual votes, id. at 848, and that approximately 55,000 votes were lost in the 2000 presidential election statewide. Id. at 871. The Stewart court went so far as to say that the disparate technology at issue would fail even rational-basis review. Id. at 872; see also Black v. McGuffage, 209 F. Supp. 2d 889, 893, 899 (N.D. Ill. 2002) (finding that plaintiffs sufficiently stated an equal protection claim where jurisdictions without error notification had an average residual vote rate of 3.85%, but jurisdictions with error notification had an average residual vote rate of less than 1%).

[78] Curling, 334 F. Supp. 3d at 1308, 1322 (featuring testimony of Dr. Alex Halderman, a computer scientist at the University of Michigan, showing “how a malware virus can be introduced into the DRE machine by insertion of an infected memory card (or by other sources) and alter the votes cast without detection”).

[79] Id. at 1324.

[80] Id. at 1317.

[81] Id.

[82] Wexler v. Anderson, 452 F.3d 1226, 1232–33 (11th Cir. 2006).

[83] Curling, 334 F. Supp. 3d at 1308–09 (Dr. Halderman demonstrated that “[t]he DRE machine’s paper tape . . . confirmed the same total number of votes, including the results of the manipulated or altered votes” in spite of the fact that the machines “record individual ballot data in the order in which they are cast and they assign a unique serial number and timestamp to each ballot”).

[84] De la Garza, supra note 4.

[85] Zetter, supra note 18.

[86] Wines, supra note 16. 

[87] Zetter, supra note 18.

[88] Halpern, supra note 41. Dr. J. Alex Halderman, a computer scientist and expert witness in Curling, demonstrated this point at Def Con’s Voting Village on a machine that remains in use in eighteen states. Id.

[89] Massimo Calabresi, The Secret History of Election 2016, Time (July 31, 2017), [https://perma.cc/L8K2-FPXQ].

[90] See generally Carol Anderson, One Person, No Vote: How Voter Suppression Is Destroying Our Democracy (2018) (summarizing modern voter suppression efforts); Desmond Ang, Do 40-Year-Old Facts Still Matter? Long-Run Effects of Federal Oversight Under the Voting Rights Act 2, 39 (Harvard Kennedy Sch. Faculty Research Working Paper Series, Paper No. RWP18-033, 2018), [https://perma.cc/8M6N-UNKT] (finding suggestive early evidence that voting protections have been greatly eroded in the five years since the Court’s holding in Shelby County v. Holder, 570 U.S. 529 (2013), that the Voting Rights Act’s continued coverage based on historical, rather than current, measures of discrimination is unconstitutional).

[91] Perlroth et al., supra note 52.

[92] Charles Stewart III & Stephen Ansolabehere, Waiting in Line to Vote, Executive Summary (CalTech/MIT Voting Project, Working Paper No. 114, 2013), [https://perma.cc/T7KK-AH9N]; Voting Technology Project, supra note 7, at 32 (explaining that in the 2000 election, approximately one million voters said that they did not vote because the line was too long or the hours were too short).

[93] Zetter, supra note 18.

[94] Id.

[95] See id.

[96] See Calabresi, supra note 89, at 34.

[97] Id. at 34–35.

[98] Id. at 32.

[99] See Astead W. Herndon, Georgia Voting Begins Amid Accusations of Voter Suppression, N.Y. Times (Oct. 19, 2018), [https://perma.cc/A9N7-RHA7].

[100] Id.

[101] Id.

[102] Id.

[103] Voting Laws Roundup 2017, Brennan Ctr. for Just. (May 10, 2017), [https://perma.cc/G6UF-SKVX].

Law Enforcement’s Pairing of Facial Recognition Technology with Body-Worn Cameras Escalates Privacy Concerns

Half of American adults are currently in a law enforcement facial-recognition network.[1] As the use of body-worn camera (“BWC”) technology by law enforcement increases, the demand for facial-recognition technology likewise accelerates.[2] Through grants under its Smart Policing Initiative, the U.S. Department of Justice has dedicated over $20 million to provide BWCs for law enforcement across the nation.[3] Companies are racing to integrate BWCs with facial-recognition technology, hoping to eventually use artificial intelligence to recognize faces captured in real time, despite privacy concerns.[4] Once equipped with facial-recognition technology, BWCs could dramatically increase the number of individuals logged in law enforcement facial-recognition networks, enabling police officers to act as sophisticated surveillance mechanisms.[5]

Anyone passing a police officer equipped with this technology may be scanned, identified, and cataloged in a facial-recognition database without being suspected of any crime or even communicating with the officer.[6] This transforms walking down a street where police are present into a police interaction.[7] In addition to the real possibility that bad actors might obtain the resulting data, facial-recognition technologies disproportionately affect people of color, and integration with BWCs risks chilling free speech in public spaces. Although technology often outpaces legislation, privacy law must rise to meet the requirements of the First and Fourth Amendments in response to the integration of facial-recognition technology and BWCs.

In Part I, this essay examines the history of BWCs, their contemporary use, and their probable future impact. Part II analyzes how their integration with facial-recognition technology (“FRT”) disproportionately impacts African Americans, chills free speech, and implicates privacy concerns.[8] Part III describes how different federal and state courts and legislatures have handled real-time data collection through new technologies.[9] This essay concludes with recommendations for lawmakers regarding retention and utilization of camera footage collected via BWCs.

I. Pairing BWCs with Facial Recognition Technologies

Increasing public attention on police shootings of unarmed black victims has ignited discussion around BWCs. But the government, the courts, and the public all lack an adequate understanding of the dangers of integrating BWCs with biometric technologies, like facial-recognition technology, and are currently ill-equipped to deal with the resulting, rapidly approaching surveillance state.

In an effort to correct unconstitutional practices and eliminate racial discrimination, a federal district court in New York ordered officers to use BWCs.[10] In Floyd v. City of New York, the court identified BWCs as an exceptional way to prevent constitutional harms.[11] First, the court found that BWCs “will provide a contemporaneous, objective record of stops and frisks.”[12] These recordings can validate whether a stop and frisk was warranted.[13] Second, the court reasoned that when citizens and police officers know that an exchange is being recorded, the recording will foster an environment of mutual respect and lawful interactions between the parties.[14] Third, according to the court, BWC recordings will serve as a legitimizing measure in response to police distrust, particularly in communities where stops and frisks are disproportionately concentrated.[15]

But law enforcement agencies like the New York Police Department are not motivated solely by protecting constitutional rights and incentivizing good behavior, as their zeal for pairing facial-recognition technology with BWCs makes apparent.[16] The use and adoption of BWCs is expected to continue to accelerate, and the FBI has stated that adopting greater facial-recognition capability is central to its mission.[17] But the FBI also realizes that this evolving technology will require clear policies and regulations.[18]

Given the push for law enforcement agencies to adopt innovative surveillance technologies as quickly as possible,[19] development of facial-recognition technology that will pair with BWCs is quickly gaining market importance.[20] A 2016 U.S. Department of Justice–funded study found that at least nine out of thirty-eight BWC manufacturers currently include some form of facial recognition in their camera technology or are planning for its possible future inclusion.[21]

In May 2018, one of the largest BWC marketers, Axon,[22] was granted a patent for software that can find faces and other objects in footage from body cameras in real time.[23] According to the company’s patent, “once a face is captured by a user’s body-worn camera, a hand-held device ‘provides the name of the person to the user of the capture system.’”[24] The development of such a system raises questions about misuse and the potential for arbitrary and reckless application of the technology. In response to those concerns, Axon’s CEO said:

there are police forces around the world that use batons and guns in very abusive ways . . . it’s too blunt to say that because there is a risk of misuse, we should just write them off. We need to dig a layer deeper and understand what are the benefits and what are the risks.[25]

But who bears responsibility for performing that calculus? Government and law enforcement may lack the inclination to rigorously examine how pairing these technologies may present hidden dangers.

II. Ramifications of Integration

Many government agencies encourage the use of facial-recognition software with BWC-accrued footage.[26] The Department of Justice focuses on the practical benefits of receiving identification in real time and the cost savings that agencies will realize by not having to hire and train personnel to review video footage later.[27] But notwithstanding the positive aspects of melding BWCs with facial-recognition technology, numerous negative effects require the law’s attention before the technology runs rampant. Those effects include, but are not limited to, disparities in how the technology treats African Americans, chilling free speech, and vulnerability to third-party hacking and misuse of data.

It is important to understand whom these paired technologies are most likely to impact, and how their technological shortcomings might exacerbate that differential treatment. For example, FRT has far higher error rates when used to identify African American faces.[28] Algorithms used in new technology may appear unbiased at first, but according to researchers,

[t]he deeper we dig, the more remnants of bias we will find in our technology. We cannot afford to look away this time, because the stakes are simply too high. We risk losing the gains made with the civil rights movement and women’s movement under the false assumption of machine neutrality.[29]

These automated systems reflect the priorities, preferences, and prejudices of their coders, and this “coded gaze” leads to tangible negative effects for African Americans.[30] Technology so prone to error should not constitute reliable or admissible evidence.

Pervasive government surveillance can also have a chilling effect on freedom of speech. This monitoring demonstrably lessens Americans’ “willingness to engage in public debate and to associate with others whose values, religion, or political views may be considered different from their own,” leading to a “spiral of silence.”[31] Anonymous free speech is protected by the First Amendment,[32] but real-time face recognition will redefine public spaces by destroying anonymity. Anonymous speech allows for the proliferation and protection of views that might be critical of law enforcement. Dissenters might be subjected to negative repercussions if they can be easily identified through the use of facial-recognition technology. And, based on current technology, over time these burdens would disproportionately fall on minorities.

Moreover, state and federal lawmakers have yet to address hard questions about security and privacy as they relate to footage accrued via BWCs, leaving a regulatory void. BWC data’s off-site aggregation increases the risk that bad actors can hijack facial-recognition feeds, and moving data off-site makes it more difficult to ensure that best technical practices are followed.[33] New regulations must protect the staggering amount of third-party biometric data, the collection of which creates tremendous security risk, in addition to profound privacy and civil-liberties problems.[34]

III. Contemporary Cases and Legislation Set a Legal Framework

Select jurisdictions do regulate facial-recognition technology used in conjunction with BWCs, but there is no federal legislative consensus on the matter. In 2015, Oregon passed a law barring facial-recognition searches of recordings from BWCs; that law reaches only recordings, however, and does not govern the use of real-time footage.[35] Recently, New Hampshire passed a similar law.[36] At the local level, the City of Cincinnati and six police departments have adopted similar regulations.[37] Despite this piecemeal progress, there should be a federal consensus on how best to balance technology adoption with privacy, free speech, and security.[38]

Justice Brennan once observed that “innocent citizens should not suffer the shock, fright or embarrassment attendant upon an unannounced police intrusion.”[39] In 1968, following Katz v. United States[40] and Berger v. New York,[41] Congress enacted the Wiretap Act.[42] Since then, law enforcement’s ability to wiretap a suspect’s phone or electronic device has been constrained primarily by statute rather than by constitutional case law.

Case law inevitably has blind spots. Carpenter v. United States created a legal loophole through which law enforcement can hold personally identifiable information until it becomes historical, at which point it is usable without a warrant.[43] How long that takes remains, as of now, undecided. Under a reform framework modeled on the Wiretap Act and the Katz concurrence, an individual might enjoy a reasonable expectation of privacy in his image as captured by facial-recognition technology. But once law enforcement’s use of facial-recognition technology becomes ubiquitous, surveillance subjects will have more difficulty arguing that the Fourth Amendment protects their image. Arguments arising out of privacy concerns are thus time-bound.

Regulating law enforcement surveillance via statute is the best way to create a holistic scheme. Legislatures are better positioned than courts to research the complex effects of new technology and to draft legislation accordingly; while drafting, they can benefit from model legislation and existing biometrics laws governing commercial entities. In the meantime, the public relies on courts to protect civil liberties, yet judges tasked with governing technological innovation are often ill-suited to identify future risks. And jurisdiction-specific case law cannot generate a unified solution to the emerging privacy issues raised by law enforcement’s use of real-time facial-recognition technology on accrued BWC footage. Without federal laws or decisions on the books, the practice will remain largely unregulated, aside from any best practices that agencies adopt in what may be an ad hoc manner.

In addition to the Wiretap Act, legislators may also examine the Video Privacy Protection Act (“VPPA”) and the Family Educational Rights and Privacy Act (“FERPA”) to help formulate model legislation.[44] The VPPA and FERPA, although old and limited in scope, provide research-backed definitions of personally identifiable information and regulate how such information should be kept, aggregated, and disseminated.[45] For example, FERPA requires that personal information be shared only under specified circumstances.[46] For biometric information, this could mean compartmentalizing data into two or more separate sets, with strict limits on who holds the keys connecting them. For facial-recognition technology, this would disaggregate identifiers such as faces, names, booking numbers, and Social Security numbers, which most individuals consider private when combined. Although compartmentalization is only a small step toward protecting data, it constitutes a significant obstacle for bad actors.
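The following minimal Python sketch illustrates the compartmentalization idea described above: biometric templates and civil identifiers live in separate stores, linked only by a random token. The store structures and function names are hypothetical; a real system would use separate, independently access-controlled databases with audited key management.

```python
# Illustrative sketch of compartmentalizing biometric data, assuming
# two hypothetical stores joined only by a random token.
import secrets

face_store = {}      # token -> facial-feature template (no names attached)
identity_store = {}  # token -> civil identifiers (no biometrics attached)

def enroll(template, name, booking_no, ssn):
    """Store biometric and civil data separately, linked only by a token."""
    token = secrets.token_hex(16)  # random link; one store alone reveals little
    face_store[token] = template
    identity_store[token] = {"name": name, "booking_no": booking_no, "ssn": ssn}
    return token

def resolve_identity(token, requester_is_authorized):
    """Re-joining the two data sets requires explicit authorization."""
    if not requester_is_authorized:
        raise PermissionError("linking biometric and civil data requires authorization")
    return identity_store[token]

# Placeholder enrollment; all values are illustrative.
token = enroll(template=b"\x01\x02\x03", name="Jane Doe",
               booking_no="B-0001", ssn="000-00-0000")
print(resolve_identity(token, requester_is_authorized=True)["name"])  # Jane Doe
```

Under such a design, a breach of either store alone yields data that is far harder to tie to a specific person, which is the hindrance to bad actors that the text describes.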

Rather than reinventing the wheel, model legislation on facial-recognition technology recently penned by the Center on Privacy & Technology at Georgetown Law may also be broadened to include provisions directly related to the wearing of body cameras by law enforcement.[47] The model legislation includes recommendations at both the state and federal levels, and it addresses many of the concerns raised in this Article as to who has access to FRT data, how individuals can have their data technically forgotten, and proper means of training law enforcement officers. However, the legislation does not contemplate the true depth of information that will be gleaned via BWCs, and it entirely discounts nonconsensual facial recognition. If BWCs run facial recognition in real time, facial-feature data will be collected and retained without consent. And while there is little question that facial-recognition technology will be used in situations where felonies are occurring, BWC manufacturers will push law enforcement to engage facial-recognition capabilities at most, if not all, times. Regulations concerning retention and data aggregation are therefore key. In addition, facial-recognition technology’s current margin of error when identifying persons of color could produce disproportionate effects when the technology is deployed on BWCs. Proposed legislation should account for this technical shortcoming while also pressing for improvements to the technology and imposing efficacy requirements on law enforcement.

Conclusion

In order to best limit privacy concerns, the chilling of speech in public arenas, and current technology’s discriminatory effects, lawmakers should keep in mind the following five principles: (1) limit the facial-recognition data collected from BWCs; (2) provide notice to communities subject to law enforcement facial-recognition data collection; (3) limit the retention of footage gathered via BWCs; (4) strictly limit with whom, and for what purposes, the data may be shared; and (5) establish independent oversight ensuring police accountability and mitigation of facial-recognition misidentification errors likely to have a racially disparate impact. A short sketch below illustrates how principles (3) and (4) might be operationalized.
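By way of illustration only, principles (3) and (4) lend themselves to machine-enforceable policy. The following Python sketch assumes a hypothetical 180-day retention window and an allow-list of sharing recipients and purposes; neither value is drawn from any enacted statute or agency policy.

```python
# Illustrative-only sketch of encoding retention (principle 3) and
# sharing limits (principle 4) as enforceable policy. The window,
# recipients, and purposes are hypothetical placeholders.
from datetime import datetime, timedelta, timezone

RETENTION_WINDOW = timedelta(days=180)  # assumed, not statutory
PERMITTED_SHARING = {("courts", "prosecution"), ("oversight_board", "audit")}

def purge_expired(footage):
    """Keep only clips captured within the retention window."""
    cutoff = datetime.now(timezone.utc) - RETENTION_WINDOW
    return [clip for clip in footage if clip["captured_at"] >= cutoff]

def may_share(recipient, purpose):
    """Allow sharing only to enumerated recipients for enumerated purposes."""
    return (recipient, purpose) in PERMITTED_SHARING

footage = [
    {"id": "clip_1", "captured_at": datetime.now(timezone.utc) - timedelta(days=10)},
    {"id": "clip_2", "captured_at": datetime.now(timezone.utc) - timedelta(days=400)},
]
print([c["id"] for c in purge_expired(footage)])   # ['clip_1']
print(may_share("vendor", "product_improvement"))  # False
```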

It is time for the law to address the critical gaps in democratic and constitutional protections that BWCs and facial-recognition technology create. There needs to be a national consensus on the retention and utilization of real-time camera footage accrued by BWCs. At the very least, cities and states should begin regulating law enforcement’s use of facial-recognition software as BWCs become more ubiquitous. More generally, lawmakers must address the various dangers technological integration presents before we unwittingly become a surveillance state.

 


[1] Clare Garvie et al., Geo. L. Ctr. on Privacy & Tech., The Perpetual Line-Up: Unregulated Police Face Recognition in America 1 (2016), https://www.perpetuallineup.org/sites/default/files/2016-12/The%20Perpetual%20Line-Up%20-%20Center%20on%20Privacy%20and%20Technology%20at%20Georgetown%20Law%20-%20121616.pdf [http://perma.cc/G9FK-ACCM].

[2] Id. at 29.

[3] Press Release, U.S. Dep’t of Justice, Department of Justice Awards Over $20 Million to Law Enforcement Body-Worn Camera Programs (Sept. 26, 2016), https://www.justice.gov/opa/pr/department-justice-awards-over-20-million-law-enforcement-body-worn-camera-programs [http://perma.cc/P7V5-6WG3]. There have been legal arguments both for and against the widespread use of body cameras. See generally Michael D. White, Police Officer Body-Worn Cameras: Assessing the Evidence (2014), http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=492D2B3F28A31AFEDFB411749436AB7F?doi=10.1.1.683.3623&rep=rep1&type=pdf. Although they were implemented following a nationwide push against the shooting of unarmed black men by police and have been widely regarded as a positive adoption when it comes to civilian–police altercations, recent studies have shown that the use of BWCs has not had any dampening effect on police violence. David Yokum, Anita Ravishankar & Alexander Coppock, Evaluating the Effects of Police Body-Worn Cameras (The Lab @ DC, Working Paper, 2017), https://bwc.thelab.dc.gov/TheLabDC_MPD_BWC_Working_Paper_10.20.17.pdf [http://perma.cc/GN3P-QT8F].

[4] Ava Kofman, Real-time Face Recognition Threatens to Turn Cops’ Body Cameras into Surveillance Machines, The Intercept (Mar. 22, 2017, 2:23 PM), https://theintercept.com/2017/03/22/real-time-face-recognition-threatens-to-turn-cops-body-cameras-into-surveillance-machines/ [http://perma.cc/6Z62-ACCM].

[5] Patrick Tucker, Facial Recognition Coming to Police Body Cameras, Defense One (July 17, 2017), https://www.defenseone.com/technology/2017/07/facial-recognition-coming-police-body-cameras/139472/ [http://perma.cc/QF35-ALKU].

[6] Tom Simonite, Few Rules Govern Police Use of Facial-Recognition Technology, Wired (May 22, 2018, 9:35 PM), https://www.wired.com/story/few-rules-govern-police-use-of-facial-recognition-technology/ [http://perma.cc/8BHJ-4XY3].

[7] Letter from Civil Rights Groups to the Axon AI Ethics Board 1–2 (Apr. 26, 2018), http://civilrightsdocs.info/pdf/policy/letters/2018/Axon%20AI%20Ethics%20Board%20Letter%20FINAL.pdf [http://perma.cc/6YJF-36EC]. It has recently been found that several cities used body cameras to gather information on Black Lives Matter protesters in order to create a “watch list.” Aris Foley, Memphis Police Store Secret Surveillance of Black Lives Matter Protesters for ‘Watch List,’ AOL.com (Feb. 21, 2017, 12:30 PM), https://www.aol.com/article/news/2017/02/21/memphis-police-store-secret-surveillance-black-lives-matter-protesters/21718619/ [http://perma.cc/GW9F-28J2]. In addition to the First Amendment concerns raised by the Black Lives Matter allegations, it is an open question whether law enforcement’s ability to image and identify an innocent civilian presents the potential for a Fourth Amendment search.

[8] Mariko Hirose, Privacy in Public Spaces: The Reasonable Expectation of Privacy Against the Dragnet Use of Facial Recognition Technology, 49 Conn. L. Rev. 1591, 1618–19 (2017).

[9] Carpenter v. United States, 138 S. Ct. 2206, 2217 (2018) (holding that the use of cell site location information by law enforcement constitutes a search in some circumstances).

[10] Floyd v. City of New York, 959 F. Supp. 2d 668, 685 (S.D.N.Y. 2013).

[11] Id.

[12] Id.

[13] Id. (footnote omitted).

[14] Id.

[15] Id. The court also noted the benefit to officers who would be required to wear the camera. Id. (“Video recordings will be equally helpful to members of the NYPD who are wrongly accused of inappropriate behavior.”).

[16] See generally Fanny Coudert et al., Body-worn Cameras for Police Accountability: Opportunities and Risks, 31 Computer L. & Sec. Rev. 749 (2015) (providing an overview of the goals of BWCs and the risks they may present going forward).

[17] Statement Before the House Committee on Oversight and Government Reform, Kimberly J. Del Greco, Deputy Assistant Director, Criminal Justice Information Services Division of the Federal Bureau of Investigation, Law Enforcement’s Use of Facial Recognition Technology (Mar. 22, 2017), https://www.fbi.gov/news/testimony/law-enforcements-use-of-facial-recognition-technology [http://perma.cc/6JRD-AYXE] (“[W]e at the FBI cannot fail to meet our assigned mission. We must continue to exceed expectations and never rest on past successes. Hence, we must embrace new technologies such as automated FR and optimize allocated resources to achieve mission objectives.”).

[18] Vivian Hung et al., The Johns Hopkins University Applied Physics Laboratory, A Market Survey on Body Worn Camera Technologies 404 (2016), https://www.ncjrs.gov/pdffiles1/nij/grants/250381.pdf [http://perma.cc/5Y7F-K8X4].

[19] Jennifer Lynch, Electronic Frontier Foundation, Face Off: Law Enforcement Use of Face Recognition Technology 1 (2018), https://www.eff.org/files/2018/02/15/face-off-report-1b.pdf [http://perma.cc/6S86-G3BW].

[20] Felix Juefei-Xu et al., A Preliminary Investigation on the Sensitivity of COTS Face Recognition Systems to Forensic Analyst-style Face Processing for Occlusions 25 (Conference on Computer Vision and Pattern Recognition Workshop Paper, 2015), http://xujuefei.com/felix_cvpr15_cots.pdf [http://perma.cc/WCF9-HES3].

[21] Lynch, supra note 19, at 21.

[22] Taylor Soper, Police Body Cam Maker Axon Buys Vievu, Ending Competition Between Rivals, GeekWire (May 4, 2018, 10:21 AM), https://www.geekwire.com/2018/police-body-cam-maker-axon-buys-vievu-ending-competition-rivals [http://perma.cc/BK35-R62W] (citing Joshua Brustein, The Biggest Police Cam Company Is Buying Its Main Competitor, Bloomberg (May 4, 2018, 10:00 AM), https://www.bloomberg.com/news/articles/2018-05-04/the-biggest-police-body-cam-company-is-buying-its-main-competitor [http://perma.cc/C2TL-JXPE]).

[23] Alex Pasternack, Cop Cameras Can Track You in Real-Time and There’s No Stopping Them, Fast Company (July 31, 2018), https://www.fastcompany.com/40564084/cop-cameras-can-track-you-in-real-time-and-theres-no-stopping-them [http://perma.cc/BZW2-S7ZS].

[24] Id.

[25] Ian Wren & Scott Simon, Body Camera Maker Weighs Adding Facial Recognition Technology, NPR (May 12, 2018, 8:07 AM), https://www.npr.org/2018/05/12/61032088/what-artificial-intelligence-can-do-for-local-cops [http://perma.cc/5EWD-AEVQ].

[26] Kelly Blount, Body Worn Cameras With Facial Recognition Technology: When it Constitutes a Search, 3 Crim. L. Prac. 61, 63 (2017).

[27] Hung et al., supra note 18, at 403.

[28] Joy Buolamwini & Timnit Gebru, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, 81 Proc. of Machine Learning Res. 1 (2018), http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf [http://perma.cc/HRR9-69HX] (scrutinizing algorithmic bias in FRT from Microsoft, IBM, and Face++ Cognitive Services and showing significant differences in average error rates between light-skinned men and dark-skinned women).

[29] Overview of Gender Shades Project, MIT Media Lab, Massachusetts Institute of Technology School of Architecture + Planning, https://www.media.mit.edu/projects/gender-shades/overview/ [http://perma.cc/AFB3-GHVF] (last visited Nov. 18, 2018).

[30] Id.

[31] Lynch, supra note 19, at 9 (describing the spiral as “the significant chilling effect on an individual’s willingness to publicly disclose political views when they believe their views differ from the majority”). The EFF points to a 2016 social-media experiment documenting the silencing effect of perceived government surveillance on dissenting opinions: participants were much less likely to express negative views of government surveillance on Facebook when they perceived that those views were “outside the norm.” Id.

[32] McIntyre v. Ohio Elections Comm’n, 514 U.S. 334, 357 (1995) (“Anonymity is a shield from the tyranny of the majority. It thus exemplifies the purpose behind the Bill of Rights, and of the First Amendment in particular: to protect unpopular individuals from retaliation—and their ideas from suppression—at the hand of an intolerant society.” (citation omitted)).

[33] Lynch, supra note 19, at 21.

[34] Garvie et al., supra note 1, at 1.

[35] Or. Rev. Stat. § 133.741(1)(b)(D) (2015).

[36] N.H. Rev. Stat. Ann. § 105-D:2(XII) (2017).

[37] Cincinnati Police Dep’t, Procedure 12.540, Body Worn Camera System (2016), https://www.cincinnati-oh.gov/police/assets/File/Procedures/12540.pdf [http://perma.cc/4N­NA-8LAX]; see Garvie et al., supra note 1 (providing background data on state and city policies related to BWCs and facial-recognition technology).

[38] Cf. Rachel Levinson-Waldman, Hiding in Plain Sight: A Fourth Amendment Framework for Analyzing Government Surveillance in Public, 66 Emory L.J. 526, 530 (2016) (advocating for a judicial, rather than legislative, consensus by articulating a six-part framework to guide Fourth Amendment analysis).

[39] Ker v. California, 374 U.S. 23, 57 (1963) (Brennan, J., concurring in part and dissenting in part) (footnote omitted).

[40] 389 U.S. 347 (1967).

[41] 388 U.S. 41 (1967).

[42] 18 U.S.C. § 2511 (2012). The Wiretap Act, officially Title III of the Omnibus Crime Control and Safe Streets Act, attempted to codify the Fourth Amendment principles set forth by Katz v. United States, 389 U.S. 347 (1967). Current model legislation regarding facial-recognition technology seeks to impose annual reporting of facial-recognition technology used by law enforcement agencies, similar to analogous requirements under the Wiretap Act. Garvie et al., supra note 1, at 102–15.

[43] Jake Laperruque, Privacy After Carpenter: We Need Warrants for Real-Time Tracking and “Electronic Exhaustion,” POGO (July 2, 2018), https://www.pogo.org/analysis/2018/07/privacy-after-carpenter-we-need-warrants-for-real-time-tracking-and-electronic-exhaustion/ [http://perma.cc/VSSF-6U2L].

[44] Video Privacy Protection Act, 18 U.S.C. § 2710 (2012); Family Educational Rights and Privacy Act, 20 U.S.C. § 1232g (2012).

[45] See 18 U.S.C. § 2710(a)(3), (d); 20 U.S.C. § 1232g(b)(1)(K)(i)–(ii).

[46] Joel Reidenberg et al., Fordham L. Sch. Ctr. on Law & Info. Pol’y, Privacy and Cloud Computing in Public Schools 4–6 (2013), https://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1001&context=clip [http://perma.cc/7HBG-H9S6].

[47] Garvie et al., supra note 1, at 102–15.