
11.03.2021 | Original Article

Algorithmic augmentation of democracy: considering whether technology can enhance the concepts of democracy and the rule of law through four hypotheticals

Author: Paul Burgess

Published in: AI & SOCIETY | Issue 1/2022


Abstract

The potential use, relevance, and application of AI and other technologies in the democratic process may be obvious to some. However, technological innovation and, even, its consideration may face an intuitive push-back in the form of algorithm aversion (Dietvorst et al. J Exp Psychol 144(1):114–126, 2015). In this paper, I confront this intuition and suggest that a more ‘extreme’ form of technological change in the democratic process does not necessarily result in a worse outcome in terms of the fundamental concepts of democracy and the Rule of Law. To provoke further consideration and illustrate that initial intuitions regarding democratic innovation may not always be accurate, I pose and explore four ways that AI and other forms of technology could be used to augment the representative democratic process. The augmentations range from voting online to the wholesale replacement of the legislature’s human representatives with algorithms. After first noting the intuition that less invasive forms of augmented democracy may be less objectionable than more extreme forms, I go on to critically assess whether the augmentation of existing systems satisfies or enhances ideas associated with democracy and the Rule of Law (provided by Dahl and Fuller). By imagining a (not too far-fetched) future in a (not too far-removed) democratic society, my conclusion is that, when it comes to democracy and the Rule of Law, intuitions regarding technology may lead us astray.


Footnotes
1
In 2020 alone, a number of articles were published on which the argument in this article is based and which that argument, in turn, extends (Araujo et al. 2020; Boyles and Joaquin 2020; Cristianini and Scantamburlo 2020; de Fine Licht and de Fine Licht 2020; Hagendorff and Wezel 2020; Leyva and Beckett 2020; Naudé and Dimitri 2020).
 
2
Algorithm aversion relates to the preference for human judgement over machine judgement (Dietvorst et al. 2015). This common intuition has also been expressed in business (Frick 2015). Whilst the counter-point to this intuition is algorithm appreciation (Logg et al. 2019), this does not dispel the intuition.
 
3
For a detailed and fascinating explanation of the Australian Ballot and its history, see (Brent 2006).
 
4
There can be a bi-directional effect between science fiction and design scenarios—so the ‘not (completely) science fiction’ is not intended to be a negative; instead, the—sometimes considerable—extensions of existing technologies can be useful. See, for example, (Thibault et al. 2020). (I am grateful to the anonymous reviewer for making this point clear.) There are, however, some ideas that would fall far outside what AI currently can do: (Hagendorff and Wezel 2020).
 
5
For a contemporary assessment of the use of AI and government decision making that takes a less hypothetical approach, see (Zalnieriute et al. 2019).
 
6
There are many suggestions that AIs will not be friendly, even if there is a compelling argument that this is what is needed (Boyles and Joaquin 2020; Muehlhauser and Bostrom 2014). I do not suggest that these outcomes are unlikely, nor that their consideration is not useful, but my consideration is focussed on the non-dystopian idea that the technologies I explore will be friendly (and useful). What is clear is that the promotion of friendly or beneficial AI is a vital research goal (Baum 2017).
 
7
Blockchain technology facilitates a form of cryptographically secured ledger that is held across a number of non-centralised locations.
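To make the mechanics concrete, the tamper-evident property of such a ledger can be illustrated with a minimal sketch (Python; purely illustrative and entirely my own construction rather than anything described in this article or the works it cites). Each block commits to a hash of its predecessor, so any copy of the ledger held at a non-centralised location can independently detect alteration of earlier entries; real systems add distributed consensus, signatures, and privacy protections that are omitted here.

```python
import hashlib
import json
import time


def hash_block(block: dict) -> str:
    # Deterministic SHA-256 digest of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


class ToyLedger:
    """Append-only chain of blocks; each block stores its predecessor's hash."""

    def __init__(self):
        self.chain = [{"index": 0, "timestamp": 0, "entries": [], "prev_hash": "0" * 64}]

    def append(self, entries: list) -> dict:
        block = {
            "index": len(self.chain),
            "timestamp": time.time(),
            "entries": entries,  # e.g. anonymised ballot records in a voting use-case
            "prev_hash": hash_block(self.chain[-1]),
        }
        self.chain.append(block)
        return block

    def is_valid(self) -> bool:
        # Any holder of a copy can re-verify the whole chain without a central authority.
        return all(
            self.chain[i]["prev_hash"] == hash_block(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )


ledger = ToyLedger()
ledger.append([{"ballot_id": "a1", "choice": "option-2"}])
print(ledger.is_valid())  # True; editing any earlier block makes this False
```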
 
8
For suggestions of blockchain being used to facilitate online voting, see (Pilkington 2016). See also (Kshetri and Voas 2018; Pawade et al. 2020). There are, however, various other uses that also go beyond its crypto-currency origins—including alternative methods of redistribution (Potts et al. 2017). (Thanks to the anonymous reviewer for pointing this out.)
 
9
As with all of the ideas raised here, there are relative pros and cons that could be raised. Online voting can have implications related to the legitimacy of democratic procedures. Kersting and Baldersheim provide a collection relating to many of these issues (Kersting and Baldersheim 2004).
 
10
This would not remove the problems associated with potential coercion when casting votes. I do not address this in the hypothetical; although coercion remains a threat, it seems unlikely to be scalable to mass voter fraud.
 
11
The hypothetical sidesteps broader concerns regarding direct democracy. In this respect, see (Raible and Trueblood 2017).
 
12
A similar idea has been suggested as being a ‘technocracy’ (Khanna 2017).
 
13
A more extreme use of blockchain involves issuing coins that can then be 'spent' as a way of voting and of expressing preferences (Chandra 2018). Similar ideas relate to concepts like liquid democracy. See, for example (Blum and Zuber 2016). For a computer-science-focussed treatment of liquid democracy, see (Kahng et al. 2018).
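As a rough illustration of how such delegated preferences might be counted, the following sketch (Python; the tally function and its handling of delegation cycles are my own simplifying assumptions, not a scheme taken from Blum and Zuber or Kahng et al.) resolves transitive delegations into a vote tally.

```python
def tally_liquid(direct_votes: dict, delegations: dict) -> dict:
    """Count votes when each voter either votes directly or delegates to another voter.

    direct_votes: voter -> chosen option (for voters who vote themselves)
    delegations:  voter -> voter they delegate to (for everyone else)
    """
    tally: dict = {}
    for voter in set(direct_votes) | set(delegations):
        current, seen = voter, set()
        # Follow the delegation chain until it reaches a direct voter or loops.
        while current in delegations and current not in seen:
            seen.add(current)
            current = delegations[current]
        if current in direct_votes:
            choice = direct_votes[current]
            tally[choice] = tally.get(choice, 0) + 1
        # Otherwise the chain ended in a cycle or a non-voter: the ballot is dropped.
    return tally


print(tally_liquid({"a": "yes", "b": "no"}, {"c": "a", "d": "c", "e": "e"}))
# counts: yes=3, no=1 (e delegates in a cycle, so e's ballot is not counted)
```

Ballots caught in a delegation cycle are simply discarded in this toy version; actual liquid-democracy proposals treat such cases with considerably more care.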
 
14
This would be captured under the 'anxiety of control' idea presented by Leszczynski (2015). Crawford (2014) has raised some of the fears of surveillance of this sort. This is connected to the technological sovereignty idea explored recently by Mann et al. (2020). Mann, in particular, has also addressed similar issues concerning, inter alia, state-based consideration of privacy (Mann et al. 2018; Mann and Daly 2019; Mann and Matzner 2019).
 
15
A data-driven government may be more effective and more efficient through being more responsive (Goldsmith and Crawford 2014). There may be significant efficiency gains—but with existing technology, they may not necessarily result in benefit without human input (Mehr 2017).
 
16
Beyond the hypothetical outlined here, this desire is—generally speaking—one that is applied to the way in which AI is currently being deployed in the public sphere (Hanania and Thieullent 2020).
 
17
The intelligence that is imagined behind the algorithm is clearly far beyond that which currently exists. What is envisioned is something that could replicate human thought processes of decision-making and assessment of information—yet in a way that is transparent and methodical.
 
18
This argument has recently been applied in the closely related sphere of administrative decision-making (Hermstrüwer 2020).
 
19
Khanna provides a compelling argument for a similar position (Khanna 2017). In some senses, people may prefer algorithmic judgement (Logg et al. 2019).
 
20
This statement applies to, and only holds in, the context of the hypothetical. As was helpfully, and accurately, raised by the anonymous reviewer, the relative assessment of extremeness may not hold in relation to a change of governmental control within authoritarian dictatorships.
 
21
As will be apparent from the volume of work cited above, there is considerable work on the impact and operation of AI in public decision-making. There is also specific research on the process of legal reasoning and the operation of legislative norms: (CSIRO, n.d.). (Thanks to the anonymous reviewer for flagging this research).
 
22
The other five relate to: democracy guaranteeing fundamental rights; democracy fostering human development; the facilitation of a high level of political equality; the notion that democratic nations do not fight wars with one another; and that democratic countries tend to be more prosperous. These are of little relevance to the augmented democracy ideas. (Dahl and Shapiro 2015, pp. 44–61).
 
23
There would be financial savings. Other benefits include the relative ease of going to the polls (easing disruption to the working day). A useful summary of pros and cons of internet voting more generally is provided by Rachel Gibson (2001).
 
24
As noted above, a summary of some of the issues can be found at (Raible and Trueblood 2017).
 
25
This claim is based on a purely practical reality and on some of the very real contemporary problems noted by Raible and Trueblood (ibid) and by Walton (2007); there are, however, some who have taken a more positive view of human potential (McCullagh 2003; Morris 2000).
 
26
Discussions regarding positive and negative freedoms, and the recent distinctions made between liberty and liberalism by people like Quentin Skinner, come to mind here. See (Skinner 2012).
 
27
This—as was helpfully pointed out by the anonymous reviewer—is not merely the ambit of futuristic hypotheticals. Substantial fears regarding the use of data for surveillance, monitoring, and control exist (Jia 2020; Mann et al. 2020; Qiang 2019).
 
28
I.e. one that is not tainted by social or peer pressures. This, however, creates—even in relation to technology that 'disappears'—the observer's paradox: the aim must be to find out how people talk when they are not being systematically observed, yet this can only be obtained by systematic observation (Labov 1972, p. 209). The observed people may amend their behaviour—but the 'invisible' nature of the technology would decrease this effect as much as possible. (Outside of the hypothetical in which anonymity is possible and maintained, in a real state—especially a totalitarian or oppressive regime—even the 'invisible' nature may mean individuals never communicate truthfully).
 
29
Which requires also putting aside discussions associated with the identification of the Rousseauian general will or cognate ideas.
 
30
Within the confines of this hypothetical, the importing of such a function does not appear to be too much of a stretch.
 
31
For a useful overview of liability issues, see (Webb 2016).
 
32
Beyond this, there is also the potential for the creation of a technocratic elite of programmers or organisations that create the algorithms. I raise some of these issues in "Retaining the humans".
 
33
What is clear, however, is that there can be effective use of digital platforms and the algorithms that operate in digital/social media to power a narrative that may run counter to either the mass media or the more widely-held view (Leyva and Beckett 2020; Rydgren and van der Meiden 2019; Schroeder 2019). (I am grateful to the anonymous reviewer for pressing for clarity on this point.)
 
34
For an overview of the issues around contestation and the various accounts that are seen as canonical, see (Burgess 2017).
 
35
Three illustrations can be found in (Dicey 1979, pp. 202–203; Hayek 2007, p. 112; Raz 2009, p. 210).
 
36
For example, see (Bingham 2010). See also (UN Security Council 2004, para. 6).
 
37
This formed one of Dicey’s desiderata. It was also apparent in Aristotle’s ideas and Locke’s. (Aristotle et al. 1981, para. 1287aI; Dicey 1979, pp. 188–198 and 202–203; Locke 1988, sec. 135).
 
38
A useful discussion of these points in the way that Fuller describes them is provided by Waldron (2016).
 
39
This puts to one side the potential for relative changes in the outcome of elections as a result of a lazy yet tech-savvy populace voting.
 
40
The latter aspect is also addressed below in terms of different desiderata: that of clarity or certainty.
 
41
The idea that the law may need to be interpreted by experts is touched upon by Waldron in his discussion of Fuller. (Waldron 2016).
 
42
This could be said in the general sense that the issue has been more widely promulgated, or in the sense that the process reflects a greater acknowledgement of individuals' dignity.
 
43
Here, I mean something more than the intentional replacement of one act with another due to an overarching shift in a wider policy.
 
44
At this stage, it is relevant to note that there is a difference between Algorithm and Algorithm+ in terms of the relative positivity of the outcome of assessing the desiderata, and that Algorithm+ also reflects a relative procedural change.
 
45
For example, there would be no reason why groups that hold any view—even a morally repugnant one like a neo-Nazi party—would be precluded from creating an algorithm that reflects their ideology. There seems here to be a risk that the building of the algorithms would create an autocratic class that would, in effect, control the system by virtue of the fact that they could control the 'parties' that could stand. Whilst this is true, it seems eminently possible that there could be a simple template way to create an algorithm. This may, though, just shift the question one step further. In any event, the same argument exists in any party formation process. (The argument that the 'simple' creation of an algorithm may decrease the transaction costs associated with the creation of a political party or movement, thus making morally repugnant parties more prevalent, is one that must await another discussion).
 
46
This is relatively unsurprising as, in effect, Blockchain+ is to direct democracy what Blockchain is to representative democracy.
 
47
Aristotle decries the making of hasty and emotion-fuelled decisions and refers to the inclusion of (what may now be called) legislative due process (Aristotle 2004, bk. 1, Ch. 1).
 
48
I am grateful to the anonymous reviewer for drawing my attention to this potential.
 
49
The very nature of those options would be strongly contested. Extreme differences between individuals with different political opinions—for example, individualists and collectivists—would radically affect what these options may be.
 
50
Other research explores—in more tangible ways than the hypotheticals considered here—the design and development of systems that relate either to identifying issues (Chen et al. 2020; Monteiro 2019) or to identifying a way forward (Foth et al. 2015; Komninos and Zuber 2019). (Thanks, again, to the reviewer for pointing to the importance of these.)
 
References
Aristotle (2004) Rhetoric (W. R. Roberts, Trans.). Dover
Aristotle ST, Saunders TJ (1981) The politics. Penguin, UK
Bingham T (2010) The Rule of Law. Allen Lane
Burgess P (2017) The Rule of Law: beyond contestedness. Jurisprudence 8(3):480–500
Citron DK, Pasquale FA (2014) The scored society: due process for automated predictions. Wash Law Rev 89:1–34
Craig P (1997) Formal and substantive conceptions of the rule of law: an analytical framework. Public Law, Autumn, pp 467–487
Dahl RA, Shapiro I (2015) On democracy, 2nd revised edn. Yale University Press
Dicey AV (1979) Introduction to the study of the law of the constitution, 10th edn. Palgrave Macmillan, UK
Dietvorst BJ, Simmons JP, Massey C (2015) Algorithm aversion: people erroneously avoid algorithms after seeing them err. J Exp Psychol 144(1):114–126
Floridi L, Cowls J, Beltrametti M, Chatila R, Chazerand P, Dignum V, Luetge C, Madelin R, Pagallo U, Rossi F, Schafer B, Valcke P, Vayena E (2018) AI4People—an ethical framework for a good AI society: opportunities, risks, principles, and recommendations. Mind Mach 28(4):689–707. https://doi.org/10.1007/s11023-018-9482-5
Foth M, Tomitsch M, Satchell C, Haeusler MH (2015) From users to citizens: some thoughts on designing for polity and civics. In: Proceedings of the Annual Meeting of the Australian Special Interest Group for Computer Human Interaction, pp 623–633. https://doi.org/10.1145/2838739.2838769
Goldsmith S, Crawford S (2014) The responsive city: engaging communities through data-smart governance, 1st edn. Jossey-Bass
Hermstrüwer Y (2020) Artificial intelligence and administrative decisions under uncertainty. In: Wischmeyer T, Rademacher T (eds) Regulating artificial intelligence. Springer International Publishing, pp 199–223. https://doi.org/10.1007/978-3-030-32361-5_9
Khanna P (2017) Technocracy in America: rise of the info-state. CreateSpace Independent Publishing Platform
Labov W (1972) Sociolinguistic patterns. University of Pennsylvania Press
Locke J (1988) Two treatises of government. Laslett P (ed). Cambridge University Press
Mehr H (2017) Artificial intelligence for citizen services and government. Harvard Kennedy School, p 19
Monteiro M (2019) Ruined by design: how designers destroyed the world, and what we can do to fix it. Mule Books
Morris D (2000) Direct democracy and the internet symposium: internet voting and democracy. Loy LA Law Rev 34(3):1033–1054
Kshetri N, Voas J (2018) Blockchain-enabled e-voting. IEEE Softw 35(4):95–99
Pasquale F (2016) The black box society: the secret algorithms that control money and information, reprint edn. Harvard University Press
Pawade D, Sakhapara A, Badgujar A, Adepu D, Andrade M (2020) Secure online voting system using biometric and blockchain. In: Sharma N, Chakrabarti A, Balas VE (eds) Data management, analytics and innovation. Springer
Pilkington M (2016) Blockchain technology: principles and applications. In: Olleros FX, Zhegu M (eds) Research handbook on digital transformations. Edward Elgar Publishing Ltd
Raz J (2009) The authority of law: essays on law and morality, 2nd edn. Oxford University Press
Sandvig C, Hamilton K, Karahalios K, Langbort C (2016) When the algorithm itself is a racist: diagnosing ethical harm in the basic components of software. Int J Commun 10:4972–4990
Skinner Q (2012) Liberty before Liberalism. Cambridge University Press
Thibault M, Buruk “Oz” O, Buruk SS, Hamari J (2020) Transurbanism: smart cities for transhumans. In: Proceedings of the 2020 ACM Designing Interactive Systems Conference. Association for Computing Machinery, pp 1915–1928. https://doi.org/10.1145/3357236.3395523
von Hayek FA (2007) The road to serfdom. Caldwell B (ed). University of Chicago Press
Webb KC (2016) Products liability and autonomous vehicles: who’s driving whom. Richmond J Law Technol 23:1–52
Wischmeyer T (2020) Artificial intelligence and transparency: opening the black box. In: Wischmeyer T, Rademacher T (eds) Regulating artificial intelligence. Springer International Publishing, pp 75–101. https://doi.org/10.1007/978-3-030-32361-5_4
Metadata
Title
Algorithmic augmentation of democracy: considering whether technology can enhance the concepts of democracy and the rule of law through four hypotheticals
Author
Paul Burgess
Publication date
11.03.2021
Publisher
Springer London
Published in
AI & SOCIETY / Issue 1/2022
Print ISSN: 0951-5666
Electronic ISSN: 1435-5655
DOI
https://doi.org/10.1007/s00146-021-01170-8
