
2024 | Original Paper | Book chapter

Machine learning, political participation and the transformations of democratic self-determination

Authors: Jeanette Hofmann, Clara Iglesias Keller

Published in: Künstliche Intelligenz, Mensch und Gesellschaft

Publisher: Springer Fachmedien Wiesbaden

Abstract

This contribution addresses the links between machine learning technologies and democracy, with a focus on political participation. Democracy research often regards machine learning technologies as a threat, arguing that they could violate fundamental rights or replace democratic decision-making. While raising important concerns, these approaches underestimate the malleability of digital technologies and of their relationship to democracy. Our argument is that democratic practice inherently involves a constant (re)negotiation of rights and institutions, in this case driven not least by the fact that machine learning technologies themselves are far from mature. The openness and negotiability of the relationship between AI and democracy is illustrated by three critical perspectives of special importance for political participation: algorithmic bias, automated decision-making and AI's epistemic dimension. By reflecting the changing conditions of political organisation, current research can be productive and even performative, in the sense of co-defining a shared understanding of new technologies and helping to set standards for their legitimate use.


Footnotes
1
As we demonstrate with examples throughout this text.
 
2
Although, or precisely because, artificial intelligence (AI) is currently a much-discussed topic, it is difficult to define. For this reason, many experts abandon the term altogether and switch to abstractions such as “predictive technologies”, “agentic machines”, or “algorithmic systems” (Joyce et al. 2021, p. 2; see also Dignum 2022). While this leads some observers to question the very existence of AI, others prefer to reserve the term for “whatever we are doing next in computing” (Recker et al. 2021, p. 1435). Nevertheless, the vague terminology proves to be a problem if one wants to investigate the interactions of AI with society and democracy. To avoid ambiguity, we refer instead to algorithmic systems, learning algorithms, or machine learning.
 
3
Legal philosophy and behavioural theories have explored an array of factors that judges weigh in this process of materializing what are, ultimately, ethical standards elected by the rule of law (Pereira 2016, p. 347), including hermeneutic choices, political implications, and institutional culture, beyond personal ideologies and bias (Campos Mello 2018; Pereira 2016; Freitas 2018).
 
4
For instance, a study of the US Supreme Court by Epstein et al. (2018, p. 239) found that justices who subscribe to a liberal ideology were more supportive of free-speech claims than conservative justices. Showing that bias also infiltrates collegiate deliberation, Cesário Gomes Alvim et al. (2018) documented that justices of the Brazilian Supreme Court were more likely to disagree with rulings reported by female justices than with those reported by their male peers (Cesário Gomes Alvim, Werneck Arguelhes, and Nogueira 2018, p. 866).
 
5
For hermeneutic techniques, see Freitas 2018.
 
6
There are various options to choose from when selecting learning algorithms, defining target variables, compiling training and test data, and optimizing during training (cf. Domingos 2012, pp. 79–80); a minimal, hypothetical sketch at the end of these footnotes illustrates such choice points.
 
7
In recent years, the increasing use of automated decision-making systems has not only brought existing network policy organizations onto the scene but has also led to a number of start-ups and NGOs, such as the Ada Lovelace Institute (2018, UK), AlgorithmWatch (2017, Germany), AI Now (2017, USA), or Data and Society (2014, USA).
 
8
COM (2021) 206 final.
 
9
Hartmut Rosa (2020, p. 21) describes subjecting the world to control along four dimensions: recognizability, accessibility, controllability and usability.
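
The following short sketch is purely illustrative of the choice points named in footnote 6: compiling data, defining the target variable, splitting training and test data, selecting a learning algorithm and its optimization settings, and choosing an evaluation metric. The dataset, column names and model are hypothetical and not drawn from the chapter; the sketch only marks where developer discretion enters a standard machine-learning workflow.

# Hypothetical illustration of the discretionary choices described in footnote 6
# (cf. Domingos 2012). Dataset, column names and model settings are invented for
# this sketch; each commented step marks a decision made by developers, not by the data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# 1. Compiling the data: which records exist and who is represented is itself a choice.
data = pd.read_csv("applications.csv")  # hypothetical file with numeric features

# 2. Defining the target variable: operationalising "success" as one proxy among many.
y = data["repaid_within_12_months"]
X = data.drop(columns=["repaid_within_12_months"])

# 3. Splitting training and test data: the split ratio and sampling strategy
#    decide which cases the model is later judged against.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# 4. Selecting the learning algorithm and its optimisation settings: regularisation
#    strength and class weights trade off different kinds of error.
model = LogisticRegression(C=1.0, class_weight="balanced", max_iter=1000)
model.fit(X_train, y_train)

# 5. Choosing the evaluation metric: optimising plain accuracy rather than, say,
#    error parity across groups is a further normative decision.
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

Each numbered comment corresponds to one of the choice points in footnote 6; in a deployed system, every one of them can introduce or amplify the kinds of bias discussed in the chapter.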
 
References
Amoore, L. 2020. Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Duke University Press.
Bucher, T. 2018. If … Then: Algorithmic Power and Politics. Oxford University Press.
Cesário Gomes Alvim, J., Werneck Arguelhes, D., and Nogueira, R. 2018. Gênero e comportamento judicial no supremo tribunal federal: Os ministros confiam menos em relatoras mulheres? Revista Brasileira de Políticas Públicas 8(2). https://doi.org/10.5102/rbpp.v8i2.5326
Epstein, L., Parker, C. M., and Segal, J. A. 2018. Do Justices Defend the Speech They Hate? An Analysis of In-Group Bias on the US Supreme Court. Journal of Law and Courts 6(2): 237–262. https://doi.org/10.1086/697118
Eubanks, V. 2018. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St Martin’s Press.
Freitas, J. 2018. Interpretação Judicial: Exame Crítico Dos Vieses. Revista Da AJUFERGS 10(1a): 57–84.
Habermas, J., Sperber, G. B., and Soethe, P. 2007. A inclusão do outro: estudos de teoria política (3rd ed., Humanística 3). Loyola.
Habermas, J. 1996. Between Facts and Norms: Contributions to a Discourse Theory of Law and Democracy. Studies in Contemporary German Social Thought. Cambridge, Mass.: MIT Press.
Heintz, B. 2021. Big Observation – Ein Vergleich moderner Beobachtungsformate am Beispiel von amtlicher Statistik und Recommendersystemen. KZfSS Kölner Zeitschrift für Soziologie und Sozialpsychologie 73(1): 137–167. https://doi.org/10.1007/s11577-021-00744-0
Kahneman, D. 2003. Maps of Bounded Rationality: Psychology for Behavioral Economics. The American Economic Review 93(5): 1449–1475.
O’Neil, C. 2016. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Allen Lane.
Rosa, H. 2020. Unverfügbarkeit. Wien: Residenzverlag.
Rostalski, F., and Thiel, T. 2021. Künstliche Intelligenz als Herausforderung für demokratische Partizipation. In Interdisziplinäre Arbeitsgruppe „Verantwortung: Maschinelles Lernen und Künstliche Intelligenz“ der Berlin-Brandenburgischen Akademie der Wissenschaften (Ed.), Verantwortungsvoller Einsatz von KI? Mit menschlicher Kompetenz! (pp. 56–63). Berlin-Brandenburgische Akademie der Wissenschaften. http://hdl.handle.net/10419/235149
de Sousa Santos, B. 2002. Reinventar a democracia (2nd ed.). Ed. Gradiva.
Scheuerman, M. K., Denton, E., and Hanna, A. 2021. Do Datasets Have Politics? Disciplinary Values in Computer Vision Dataset Development. Proceedings of the ACM on Human-Computer Interaction 5(CSCW2, 317): 1–37. https://doi.org/10.1145/3476058
Viljoen, S. 2021. A Relational Theory of Data Governance. The Yale Law Journal, 82.
Metadata
Title
Machine learning, political participation and the transformations of democratic self-determination
Authors
Jeanette Hofmann
Clara Iglesias Keller
Copyright year
2024
DOI
https://doi.org/10.1007/978-3-658-43521-9_13
