Justifying ‘responsible encryption’ laws
One key feature of the rhetoric used to justify ‘responsible encryption’ laws is a focus on empirical claims rather than normative issues. In this sense, the rhetoric of going dark ‘brackets out’ moral discourse from historical and contemporary policy debates about E2EE. The first contemporary attempt within the FVEY to regulate E2EE by invoking a precursor to the rhetoric of going dark came within the US Comprehensive Counter-Terrorism Bill (1991) and associated efforts at implementing backdoors into telecommunications (i.e., key escrow). For example, Section 2201 of the Bill included a non-binding resolution expressing the view that:
[P]roviders of electronic communications services and manufacturers of electronic communications service equipment shall ensure that communications systems permit the government to obtain the plain text contents of voice, data, and other communications when appropriately authorized by law.
While the Bill was ultimately withdrawn, it was soon followed by the US Government’s Clipper Chip initiative, which involved implementing escrowed keys into American telephones via a chip running a symmetric key cipher called Skipjack. Copies of the cryptographic keys, stored by the government, would be released to law enforcement upon issue of a judicial warrant (Pednekar-Magal & Shields, 2003, p. 443). Once again, the US Government’s argument was a precursor to the rhetoric of going dark. For example, the White House (1994, para. 2) explained the rationale for key escrow in the following terms:
Advanced encryption technology offers individuals and businesses an inexpensive and easy way to encode data and telephone conversations. Unfortunately, the same encryption technology that can help Americans protect business secrets and personal privacy can also be used by terrorists, drug dealers, and other criminals.
The implementation of key escrow would therefore “provide Americans with secure telecommunications without compromising the ability of law enforcement agencies to carry out legally authorized wiretaps” (The White House, 1994, para. 4). This framing focuses narrowly on the empirical utility of key escrow as a policy solution. Ultimately, through a coalition of digital rights advocacy and corporate interests in secure e-commerce (Levy, 2001, pp. 305–311), the various efforts at weakening E2EE during the 1990s failed.
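To make the escrow mechanism concrete, the following is a minimal, illustrative sketch (not the actual Clipper/Skipjack design; the function and variable names are hypothetical) of how a session key might be split between two escrow agents so that neither alone can recover it, while both together, for example upon issue of a warrant, can reconstruct it:

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two XOR shares; neither share alone reveals the key."""
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(a ^ k for a, k in zip(share_a, key))
    return share_a, share_b

def reconstruct_key(share_a: bytes, share_b: bytes) -> bytes:
    """Recombine the two escrowed shares, e.g. once a warrant issues."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

# A toy 80-bit session key (Skipjack used 80-bit keys).
session_key = secrets.token_bytes(10)
agent_1_share, agent_2_share = split_key(session_key)  # hypothetical escrow agents
assert reconstruct_key(agent_1_share, agent_2_share) == session_key
```

The two-share design reflects the reported structure of the Clipper scheme, in which each device’s unit key was divided between two separate US government escrow agents, so that disclosure required cooperation from both.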
Following the September 11, 2001 attacks, the US Government significantly expanded the digital surveillance capabilities of law enforcement and intelligence agencies. Consequently, debates about encryption law and policy mostly disappeared from the public sphere as the FVEY focused on covert methods of digital surveillance. For example, Project BULLRUN was an NSA operation involving multiple strategies for cracking encryption, including the intentional weakening of cryptographic standards, pursuing developments in cryptanalysis without public disclosure, and forming covert agreements with technology companies to enable access to data (Yoo, 2014, p. 34). It has been reported that, as part of BULLRUN, the NSA pushed for the standardisation of a flawed pseudorandom number generator for use in cryptosystems, known as Dual_EC_DRBG (Bernstein et al., 2016; Menn, 2013, para. 2). During this period the ‘arms race’ was more covert, taking place behind the scenes. This was aided by assertions that digital surveillance capabilities needed to remain secret to protect national security (Masco, 2010, p. 448).
Yet the rhetoric of going dark re-emerged within public debates about encryption law and policy in the post-Snowden era, reflecting a renewed commitment by the FVEY to justify restrictions on E2EE. Specifically, the phrase ‘going dark’ was introduced into the encryption policy lexicon during an appearance before the US House Judiciary Committee by FBI General Counsel Caproni (2011, para. 4), where she described a “capabilities gap” between the state’s lawful authority and its technical abilities:
We call this capabilities gap the “Going Dark” problem. As the gap between authority and capability widens, the government is increasingly unable to collect valuable evidence in cases ranging from child exploitation and pornography to organized crime and drug trafficking to terrorism and espionage—evidence that a court has authorized the government to collect. This gap poses a growing threat to public safety.
By focusing narrowly on the ‘lawfulness’ of the authority to intercept communications, Caproni (2011) sidesteps questions about the ethics of digital surveillance; instead, she presumes the state ought to have such authority. Similarly, in the aftermath of the 2015 San Bernardino attack, the FBI argued Apple needed to assist with providing access to the contents of a perpetrator’s mobile phone due to the mere existence of a warrant (Bay, 2017). The FBI thereby employed a rhetorical device known as a metonym to subsume moral concerns about data access into the fact that a legal ‘warrant’ had been issued (Lauer & Lauer, 2018, p. 54).
Appeals to the problem of going dark have thus become the dominant rhetorical strategy deployed by the FVEY for justifying ‘responsible encryption’ laws over the past decade. For example, in an address to the Brookings Institution, FBI Director Comey (2014, paras. 11, 32 and 59) made the following remarks about access to encrypted communications:
Those charged with protecting our people aren’t always able to access the evidence we need to prosecute crime and prevent terrorism even with lawful authority. We have the legal authority to intercept and access communications and information pursuant to court order, but we often lack the technical ability to do so… We aren’t seeking a back-door approach. We want to use the front door, with clarity and transparency, and with clear guidance provided by law. We are completely comfortable with court orders and legal process—front doors that provide the evidence and information we need to investigate crime and prevent terrorist attacks… We need assistance and cooperation from companies to comply with lawful court orders, so that criminals around the world cannot seek safe haven for lawless conduct.
Again, this rhetoric avoids addressing underlying questions about whether citizens ought to be able to render their communications inaccessible to the state. A presumption that the state has moral authority to access data is articulated using metonyms of ‘lawfulness’, ‘warrants’, and ‘front doors’. Indeed, public officials have argued they are simply introducing the ‘rule of law’ into otherwise ‘lawless spaces’, ‘law-free zones’, and ‘hiding places’ (Hewson & Harrison, 2021, pp. 11, 12). Furthermore, the audience is directed to focus on the ‘needs’ of law enforcement and the reasonableness of ‘assistance and cooperation’ from service providers. As such, fundamental ethical questions are ‘bracketed out’ and replaced by technocratic claims about the necessity and proportionality of ‘responsible encryption’ laws mandating access to plaintext copies of encrypted communications (see also Rodenstein, 2017, paras. 50–52). This strategy of narrowly focusing on empirical questions is particularly evident in the argumentative discourse between privacy advocates and law enforcement officials.
Narrowing the right to encrypt
There are persuasive reasons for recognising an instrumental ‘right to encrypt’. However, rights are never absolute, and conflicts must be resolved through the application of moral principles (Waldron, 1989, 2003). It may be useful to first consider this problem in abstract terms. Within a community, Alice might be considered ‘free’ insofar as she is not interfered with by Bob. To protect Alice’s right to non-interference, Bob must be prevented from interfering; some degree of interference with Bob’s freedom is thus required to protect Alice’s freedom. In this sense, interferences with freedoms are built into the basic logic of rights. Furthermore, such logics justify pre-emptive interferences with freedoms (e.g., see Ashworth & Zedner, 2014). For example, to prevent Bob interfering with Alice, access to the ‘instruments of interference’ (such as E2EE) might need to be regulated by the state. Such regulation can, however, also limit the right of a third party, Carol, to access those instruments. Resolving conflicts of rights, and thereby defining their scope, therefore requires the elucidation of moral principles.
The rhetoric of going dark excels at narrowing the ‘right to encrypt’ using a façade of technocratic precision, applying principles of necessity, proportionality, and accountability (see Mann et al., 2018, p. 378). Indeed, such principles are used to ‘trade off’ interests in ways that favour the advocates of digital surveillance programs (Bronitt & Stellios, 2005, p. 887; Barnard-Wills, 2011, p. 555; Suzor et al., 2017, p. 3). Competing claims about the (un)necessity and (dis)proportionality of digital surveillance go to questions about their capacity to achieve desired outcomes, the reasonableness of an interference, and the viability of less intrusive policy options (Macnish, 2018, pp. 145, 151), while claims about (un)accountability concern which social and legal structures provide sufficiently independent oversight of decisions about, and the exercise of, digital surveillance powers (Mann et al., 2018, pp. 378, 379). Yet it is the malleability of such principles that renders them vulnerable to distortion in rhetorical justifications for ‘responsible encryption’ laws.
The first requirement for satisfying the necessity principle is that there is, indeed, a policy problem requiring state intervention. Few scholars contest that cryptography is misused by criminals. Indeed, offenders have been observed to consciously use encryption to evade detection by law enforcement and intelligence agencies (van der Bruggen & Blokland, 2021, p. 960; Jardine, 2021, p. 13; Kowalski et al., 2019, p. 248; Hutchings & Holt, 2015, p. 600). While the specific offences vary in severity, this includes harms such as the distribution of child exploitation material (O’Brien, 2014, pp. 247, 248; Maras, 2014, p. 22) and the sale of illegal drugs and weapons (Phelps & Watt, 2014, pp. 266, 267; Martin, 2014, p. 358). For example, a semantic analysis of indexed webpages on the dark web (n = 1171) suggests at least 18% of darknet sites are used for distributing child exploitation material (Guitton, 2013, p. 2809). Similar research using an automated web crawler suggests 10–15% of darknet pages distribute such material (Spitters et al., 2014, p. 223). Further, an analysis of the Silk Road 2.0 marketplace suggests illicit substances constituted 19% of all advertised products (Dolliver, 2015, p. 1119). Thus, the first requirement of the necessity principle is easily satisfied.
Yet despite these criminal misuses of cryptography, critics of ‘responsible encryption’ laws argue they are still not necessary. That is, the necessity principle is not satisfied if alternative solutions are available (e.g., Swire & Ahmad, 2012, p. 420). This claim suggests law enforcement already have (more than) enough access to information for the purposes of criminal investigations. For example, the American Civil Liberties Union (2015, para. 9) has argued that “encryption is not a problem to be solved” on the basis that:
[L]aw-enforcement authorities are now operating in a “golden age of surveillance.” While technology promises to secure the content of our communications, it has at the same time made our lives more transparent to law enforcement than ever before. With little effort, police forces can now determine a suspect’s exact location over a period of months, his every confederate, and every other digital fingerprint he leaves when interacting with technology.
As such, it is claimed the investigative capabilities of law enforcement have not been unduly disrupted by cryptography (e.g., Walden, 2018; Koops & Kosta, 2018). Alternative investigatory options include undercover operations, the use of cryptanalysis to target vulnerabilities in the implementation of cryptosystems, metadata surveillance capabilities, and open-source intelligence gathering (Walden, 2018, pp. 905, 906). Indeed, one analysis of Dutch court cases between 2015 and 2019 (n = 3214) suggests that, of those cases that proceed to trial, “law enforcement appears to be as successful in prosecuting offenders who rely on encrypted communication as those who do not” (Hartel & van Wegberg, 2021, p. 1).
However, the necessity principle remains malleable to distortion by the rhetoricians of going dark. These advocates of ‘responsible encryption’ laws simply assert such powers are necessary by focusing on the impacts on investigations rather than prosecutions. For example, FBI Director Wray (2017, para. 36) noted that “in the first 11 months of this fiscal year alone, we were unable to access the content of more than 6900 mobile devices using appropriate and available tools, even though we had the legal authority to do so”. Similarly, the Australian Department of Home Affairs (2018, p. 5) has argued that over 90% of the Australian Security Intelligence Organisation’s priority investigations are disrupted by encryption. Such data are drawn from law enforcement agencies themselves, rendering them difficult for digital rights advocates to contest. The claim is reiterated by the rhetoricians of going dark, who deploy the necessity principle to narrow the scope of any ‘right to encrypt’.
The proportionality principle is similarly deployed both to justify and to criticise the reasonableness of ‘responsible encryption’ laws. For example, cybersecurity experts have warned that weakening cryptosystems will “open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seek to defend” (Abelson et al., 2015, pp. 24, 25). Thus, they argue ‘responsible encryption’ (and exceptional access) laws will lead to greater costs than their purported benefits: even if such powers are necessary to prevent the criminal misuses of cryptography, the detrimental consequences for cybersecurity interests undermine the state’s claims about proportionality. Yet the rhetoricians of going dark simply focus on different categories of harm to satisfy the proportionality principle and justify ‘responsible encryption’ laws. For example, the top law enforcement officials in the FVEY (Office of Public Affairs, 2020, paras. 2, 3) have argued:
Particular implementations of encryption technology, however, pose significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children… We call on technology companies to work with governments to… [e]nable law enforcement access to content in a readable and usable format where an authorisation is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight.
The proportionality principle is here deployed to assert that harms to “sexually exploited children” are severe enough to warrant “law enforcement access to content in a readable and usable format”. This strategy of framing the issue around selective types of harm is well documented as an appeal to the “horsemen of the infopocalypse” (Jordan, 2015, pp. 104, 105; Carey & Burkell, 2007). As such, the rhetoricians of going dark focus on selective harms while ignoring concerns about the cybersecurity vulnerabilities created if access to E2EE is restricted.
This rhetorical strategy is possible because of the fluidity of ‘harm’ as a moral signifier. Indeed, counterarguments that digital surveillance programs are disproportionate rely upon competing claims that the ‘harms’ occurring in cyberspace are being exaggerated (Yar & Steinmetz, 2019, pp. 97, 210–213). For example, technologist Schneier (2019, paras. 2, 7) has argued regulators are “scaring people into supporting backdoors”:
Since the terrorist attacks of 9/11, the US government has been pushing the terrorist scare story. Recently, it seems to have switched to pedophiles and child exploitation… None of us who favor strong encryption is saying that child exploitation isn’t a serious crime, or a worldwide problem. We’re not saying that about kidnapping, international drug cartels, money laundering, or terrorism. We are saying three things. One, that strong encryption is necessary for personal and national security. Two, that weakening encryption does more harm than good. And three, law enforcement has other avenues for criminal investigation than eavesdropping on communications and stored devices.
Claims that “weakening encryption does more harm than good” similarly assume there are objective measures of ‘harm’ that can inform assessments of proportionality. Yet the severity of a ‘harm’ is not only a matter of the quantity of events; it is also linked to intersubjective judgements about the moral gravity of an action. Indeed, Cohen (2002, p. xxxiv), the theorist of moral panics, has observed that “we have neither the quantitative, objective criteria to claim that R (the reaction) is ‘disproportionate’ to A (the action) nor the universal moral criteria to judge that R is an ‘inappropriate’ response to the moral gravity of A”. As such, given how cryptography can be misused by criminals, the proportionality principle is readily distorted by rhetoricians to narrow the scope of a ‘right to encrypt’.
The proportionality principle is also malleable due to the connections rhetoricians draw between ‘harm’ and the ‘risk’ of harm. As Ashworth and Zedner (2014, p. 95) have argued, “[i]f a certain form of wrongdoing is judged serious enough to criminalize, it ought surely to follow that the state… should assume responsibility for taking steps to protect people from such wrongdoing and harm”. This is the logic of preventive justice, where “the possibility of forestalling risks competes with and even takes precedence over responding to wrongs done” (Zedner, 2007, p. 262). Indeed, interviews with cybersecurity experts highlight how concerns about ‘harm’ are predominantly based upon hypothetical, rather than actual, scenarios (Carroll & Windle, 2018, p. 285). For example, former FBI Director Comey (2014, paras. 11, 32) made the following observation about the protective and preventive functions of law enforcement access to encrypted communications:
We call it “Going Dark,” and what it means is this: Those charged with protecting our people aren’t always able to access the evidence we need to prosecute crime and prevent terrorism even with lawful authority… We are completely comfortable with court orders and legal process—front doors that provide the evidence and information we need to investigate crime and prevent terrorist attacks.
These protective and preventive functions of digital surveillance programs are thereby invoked within the rhetoric of going dark to further shift the ‘balance’ in assessing proportionality. Consequently, competing claims about the disproportionality of ‘responsible encryption’ laws are readily neutralised by invoking the risks of cybercrime and terrorism.
One final aspect of the rhetoric of going dark used to narrow the scope of the ‘right to encrypt’ is the principle of accountability. Such rhetoric invokes the ‘objective’ and ‘neutral’ role of the judiciary or subject-matter experts as arbiters of the right ‘balance’ between competing interests. However, such ‘safeguards’ are also readily deployed in pursuit of digital surveillance programs (Mann et al., 2018, p. 378). For example, one Five Country Ministerial Communiqué (2018, paras. 25–26) included the following comments:
Each of the Five Eyes jurisdictions will consider how best to implement the principles of this statement, including with the voluntary cooperation of industry partners. Any response, be it legislative or otherwise, will adhere to requirements for proper authorization and oversight, and to the traditional requirements that access to information is underpinned by warrant or other legal process.
The rhetoricians of going dark thus employ the accountability principle (‘authorization and oversight’) as a strategy for negating concerns about the scope of ‘responsible encryption’ laws. In this sense, the requirement to obtain a ‘warrant’ is framed as sufficient for satisfying the principle (e.g., Lauer & Lauer, 2018). There is no consideration that such powers might require democratic or individual models of consent; instead, faith is placed in judicial and technical experts as sources of non-arbitrary power. This highlights the limitations of existing arguments for a ‘right to encrypt’ insofar as they rely upon the same technocratic principles as the advocates of ‘responsible encryption’ laws. Consequently, rhetoricians are able to narrow the scope of such a right via the malleability of necessity, proportionality, and accountability.