
Open Access 30-09-2023

New ethnographic perspective on relational ethics in the field of Artificial intelligence

Authors: Pavle Pavlović, Mitja Hafner Fink

Published in: Quality & Quantity | Issue 3/2024


Abstract

This article was created on the wave of the ubiquitous and already-saturated topic of ethics in the field of artificial intelligence. We were motivated by the proliferation of rules within this field and by the posthumanist critique of the topic. We attempt to nurture a new research platform for a social science analysis of the “how of ethics” issue by providing an argument for studying algorithms and ethical issues through an expanded use of the concept of niche construction and of environmental perspectives in ethnographic studies. From a design perspective, this means expanding the inquiry into ethical matters to a design that includes not just the design process but also the more comprehensive environment around it. Inspired by current trends in evolutionary anthropology, science studies, and the philosophy of science, we align with approaches that reaffirm ethical issues from standpoint theory in the current scientific debate about trust in science. Our historical perspective on the issue of value neutrality points out that the position where the tool is not neutral does not mean that it is biased, but that it is deeply involved in the network of relationships that influence it to be biased, and that this threatens its autonomy. By providing argumentation based on the issue of ethics, we have nurtured the so-called ecology of practice and the connoisseurs as a new practice and perspective that ethnography can take on the issue of accountable, ethical, and trustworthy science.

1 Introduction

In 2020, AlgorithmWatch compiled and analysed a collection of documents intended to capture current regulation in the field of Artificial intelligence (AI) (AlgorithmWatch 2020). By 2020 their collection included 167 documents, mainly related to ethical development principles and the implementation of automated decision-making (ADM). They noticed that the documents lacked effective enforcement and were mainly based on voluntary commitments or general recommendations. Without a clear view of how the guidelines are to be applied in practice, they concluded that many of them are unsuitable as a tool against potentially harmful uses of AI-based technology and will probably fail to prevent harm (Haas et al. 2020). Numerous authors have noted similar shortcomings, pointing out that abstract principles, a lack of technical explanation, and regulatory ethics without insight into everyday practice are some of the problems facing the field of AI (Floridi 2021; Miller 2013; Morley et al. 2020).
Inspired by those critiques, in this article we offer a solution for practical access to those issues by expanding the inquiry to a wider environment, including humans as more than human.1 First, however, drawing upon a lesson from the development of German sociology, especially the value judgment dispute (Werturteilsstreit), we draw a parallel with today’s problem and try to examine “what went wrong” and how current scientific issues can be emancipated from that knowledge. The value judgment dispute provides an argument for relational thinking between the different actors involved in the process, while development in the ethnographic direction serves as a method for gaining insight into the actual situation on the ground. On that ground there are different actors in interrelations, where the boundaries between humans and non-humans (everything that comes from technology and changes the process mainly on the ontological level, be it algorithms, data, or various devices) are porous. This significantly affects the approach to objectivity, which must change from a static, fixed object to an open and responsible one in which values are not separated from facts but responsibly incorporated into the care of the outcomes of science as a process. Namely, based on the debate in evolutionary anthropology, science studies, and the philosophy of science on the issues of situatedness, ethics, and interdependence with the environment (Barad 2007; Geismar and Knox 2021; Haraway 2016; Holbraad and Pedersen 2017; Ingold 2011a, b, 2015; Odling-Smee 2010, 2011; Pink and Lanzeni 2018; Stengers 2005, 2018), we advocate the necessity of different possibilities and solutions, which implies moving away from the technological-determinism-based questions “What are we doing to technology?” and “How can technology harm us?” toward the question “How are we coevolving together in a more-than-human world?”2 We consider that the human-centred approach starts from the wrong assumption of human domination in a hybrid (or any other) space, because it assumes that we are nothing but human. At the same time, to neglect the influence of technology itself would constitute a denial of the influence and importance of technology in our everyday life, which would render the claims stemming from technological determinism equally futile.3 Adopting such a standpoint “will prevent humans from turning the robot into their new symbolic other and from falling into the dualistic paradigm that has historically characterized Western hegemonic accounts” (Ferrando 2019, p. 113).
Thus, in this article we foster an approach that provokes the necessity of a better understanding of the “full circle” of the ethical side of knowledge production “beyond its creator” (Jaton 2021) and treats value neutrality as an open issue, not as a methodological basis. Our understanding of ethics is relational and based on the current view that we are entering an era of distributed morality (Simon 2015), which calls for a different approach to ethics, established away from the assumption of singular identity and selfhood (Suchman 1987; Turkle 1997) and toward relational “conceptions that stress a sense of identity, which is inextricably interwoven with various relationships (familial, social, natural, and so on) that define us as relational selves” (Ess 2015, p. 49). This relational, and more specifically environmental, insight was influenced by evolutionary theory—the idea of niche construction. Fostering methodologies based on the question concerning the “how of ethics” (Bennett 2001; Haraway 1988; Mol 2002), we advocate an approach that aims to understand the ethical role of every actor in the process of knowledge production and its further sharing, distribution, and reassessment.
This article has three goals. Firstly, it is an attempt at a direct answer to the programme of developing research ethics based on ethnography (Pink and Lanzeni 2018; Pink et al. 2022; Kitchin 2021) that engages with the mundane, emergent contexts where data are made, interpreted, analysed, and mobilised in everyday practices. Secondly, through this we develop a research platform different from the “utilitarian perspective”, one based on understanding, coevolution, and emerging trends. Research on young people’s connection to technology shows that they already recognise and develop feelings toward technical objects, which is leading to machines being granted existential dignity; as different studies have shown, “a significant proportion of children ascribe cognitive, behavioural, and especially affective, characteristics to robots” (Beran et al. 2011). Thirdly, it is an attempt to escape from the value neutrality “trap” by restating our position and research framework through a posthumanist research perspective on the issue of artificial intelligence.
The article is organised into five sections and a conclusion. In the second section, we briefly describe the circumstances of the so-called value judgment dispute (Werturteilsstreit) in Germany at the beginning of the 20th century, within which, among other things, the concept of value neutrality was “coined”. Subsequently, these discussions (including the idea of value neutrality) led to the further development of modern sociology in Germany (and wider). In the third section, we analyse and identify the need for translating the field of inquiry regarding the ethics of AI from the question of what to the question of how. After that, we will use an analogy from evolutionary theory to rearrange social and technical relations by nurturing a niche construction approach. In the end, we will use the ecology of practice developed by Stengers (2005) as a possible new perspective regarding an ethnographic ethical inquiry.4
Our ecological, posthuman, and evolutionary perspective on the problem of AI ethics is an attempt to supplement what is lacking in practice by combining ethnography, as a method of examining needs “from the field”, with a theoretical framework and conceptual apparatus offered by the ideas of the ecology of practice and niche construction. It is a relational approach to this process, and the historical review of the value judgment dispute is only an argument in that direction. The main conclusion of this article is that discovering networks of relations within that, or similar, assemblages could help us understand the exact position and needs of science within the wider context of its purpose, practical or societal.

2 Werturteilsstreit and the legacy of the past

The story of value neutrality (the role of values in science) is an instructive one about how science tried to defend society and itself from the consequences of societal transformation. We believe that the debates among German social scientists at the beginning of the 20th century, which took place under the label of Werturteilsstreit (value judgment dispute), were crucial for understanding the academic justification of the concept of value neutrality. German society at the turn of the 19th and 20th centuries, where the need for the mentioned “defending” role of (social) science occurred, was characterised by delayed industrialisation compared to the rest of industrialised Europe. This development created the conditions for, and increased the need for, inquiry based on new circumstances—accelerated industrialisation (followed by technological development) and the social transformations that accompanied it (migrations, demands for better living conditions, expanding proletarianisation), eugenics as a product of Darwin’s evolutionary theory, and the emergence of the women’s rights movement, which reflected a societal need for more gender equality (Moebius 2021). The events that marked a turning point in the development of German sociology were the cleavages between the experienced and the younger generation of German social scientists. Namely, guided by Max Weber (together with Ferdinand Tönnies, Werner Sombart, and Georg Simmel), the newly established German Society for Sociology (Deutsche Gesellschaft für Soziologie) distanced itself from its experienced colleagues (the Social Policy Association) on the issue of objectivity and value neutrality (Proctor 1991). The idea of the young scholars was to build an inquiry of society on purely scientific and methodologically guided goals and procedures while leaving political affiliation aside. Within the so-called Werturteilsstreit discussion, the young scholars attacked the older generation of political economists for mixing facts and values, science and politics. Moreover, they proclaimed that the problem of values was the problem of objectivity and that social science was objective insofar as it avoided making value judgments. Weber (1949) expressed a clear position that scientific reasoning (as the basis for scientific objectivity) must necessarily be value-free and independent of social (cultural) context. He illustrated this standpoint with the thesis “that a systematically correct scientific proof in the social sciences, if it is to achieve its purpose, must be acknowledged as correct even by a Chinese” (Weber 1949, 58). At the same time, it is necessary to mention that Max Weber tried to bring the two extreme views of the role of values in science closer together. This was also reflected in the fact that he rejected the simplistic thesis that science and scientific objectivity had nothing to do with values. Thus, among other things, he warned that “an attitude of moral indifference has no connection with scientific objectivity” (Weber 1949, 60).
The development of the cleavage around the value judgment dispute involved several factors, which led to this crusade for neutrality and objectivity. First, there was dissatisfaction with the Social Policy Association’s lack of independence: it was accused of acting solely as an advisory body to the government in state affairs and policy. The 1909 meeting in Vienna on the issue of value neutrality in political economy led to the transition from a historical or ethical school of political economy to a value-free scientific sociology (Proctor 1991, p. 86). The discussion about the relationship between the state and science led to a dispute between the two generations’ representatives. It brought the topic of value neutrality to the epicentre and pushed other topics to the side. The older scientists defended their position on the importance of bringing science to bear on problems of public policy, and for the younger scholars this was unacceptable.
Apart from being intellectual, this division was also a distinctly political manifestation in reaction to emerging challenges. Technical development provoked the need for social reforms for the lower social classes. A growing industry, spreading colonial power, and fresh national unification created a specific kind of socialism, which would lead to social nationalism in the coming years. As a defence, the young sociologists did not use Enlightenment ideas like individuality but promoted value neutrality as an instrument not against nationalism, which was considered legitimate, but against scientific socialism. Value neutrality in that sense would increase the independence of science from government and make science serve exclusively as a means of science for science, not a means for politics or ideology. It was a way to break with the previous practice of the older sociologists, in which political interests were reflected in the field of science.
On the other side, the scientific influence of social Darwinism played a significant role in spreading the idea of racial hygiene in Germany. By the end of the nineteenth century, social Darwinism and racial hygiene constituted major research agendas for the emerging science of sociology. Sociologists tried to construct a science based on a synthesis of the organic metaphors of Comte and Spencer with Charles Darwin’s idea of natural selection, in which natural forces, not artificial or mechanical ones, rule societal organisation. Francis Galton, Ferdinand Tönnies, and Rudolf Goldscheid, for example, were important proponents of this approach (Proctor 1991). Max Weber, on the contrary, was against it, accusing those scientists of “racial mysticism” and insisting that a social phenomenon must be considered sui generis and not as a passive reflection of instincts or biological drives. Value neutrality and objectivity established principles different from those based on evolutionary science and thus tried to bypass the path of racial discrimination.
The third movement that influenced the proliferation and use of value neutrality was the emancipation of women. By the First World War, almost a million women in Germany were part of this movement. Almost all the influential social scientists, Sombart, Tönnies, and Simmel, were against this emancipation; Max Weber did not belong to this group. He was a proponent of women’s rights, but he never wrote anything on this issue. The prevailing line of reasoning was that women were insufficiently intellectually and emotionally mature compared to men, that man is to culture as woman is to nature, that science is masculine, that objectivity and neutrality are male spirits, and that women tend to identify themselves with their surroundings (Proctor 1991). Value neutrality was again used to avoid taking a stand on women’s rights or defending their emancipation and involvement in science; the exclusion of values from science worked to reaffirm the exclusion of women from science.
Today, in the new era of technological development, the idea of value neutrality has been largely abandoned (Davis 2012), and objectivity is not approached in a unified way. What is interesting is that today “spears are breaking” over the same issues, as the influence of feminist epistemologies, technological development, and evolutionary science again “threatens” to reform science.

3 The “how” of ethics in AI

After the COVID-19 pandemic, the crisis of trust in science continues, with scientific solutions that society does not fully accept or that it doubts. This crisis seems specific not only to the disciplines directly related to the pandemic, like medicine or pharmacy, but also to those that developed drastically in that period, like big data analytics or artificial intelligence. Namely, from the second decade of the twenty-first century, a grand narrative posits knowledge derived from data analytics as true (Markham et al. 2018). Similarly to the value judgment dispute described in the previous section, big data has been called a revolution that will raise social science to paradigmatic status by advocating value-free science and the further needlessness of theory, established this time on algorithms and their properties (Anderson 2008; Davis 2012; Dwork and Milligan 2013; Grey 2009; Porter 1995; Prensky 2009; Steadman 2013).5 But this was soon set aside, and new views became established in the field: compared to the “traditional” social sciences, algorithms and their construction are even more subjective and value-laden, with desired outcomes designed by developers, users, and machines (Mittelstadt et al. 2016), and with deep moral consequences and the reinforcement or undercutting of ethical principles (Kirsten 2019). Also, significant social, political, and aesthetic dimensions, framed and shaped by different kinds of enrolments and different kinds of actors in nature (Callon 1986; Kitchin 2017; Latour 2007), have become some of the main preoccupations of the current social science field. The radical influence and transformation of this field pointed to the need for ethical AI in order to avoid another AI winter (Floridi 2020), out of the fear that this enthusiasm would not provide adequate solutions to current challenges. This recommendation comes after an evaluation of the current situation in the field, where the ethical turn is obvious: within only a few years, industry, academia, and international organisations have created more than 70 documents (norms, principles, regulations, directives, guidelines) for an ethical AI (Floridi et al. 2021; Morley et al. 2020). The current situation is additionally fostered by the growing complexity of work in the field of artificial intelligence, recognisable not only in recommendation systems but also in the much greater autonomy that shapes the agency of the machine (Steward 2021); this reputed smart agency is reshaping the field, moving AI toward game-theoretical models built around a game between an agent and an adversary (Wang et al. 2016). If we add to this equation the proliferation of big data analytics and deep learning models, which enable high-level abstractions in data, it is obvious that the new move toward so-called general AI will lead to greater comprehension and more performative learning and tasks.
So, the main idea of ethical AI is to position AI in a manner that can foster human nature and its potentialities through smart agency while, on the other side, not being overused and misused in ways that create risks or costs for humans and societies (Floridi 2018). Disruptive trends can foster needs in such a way that measures become fashions rather than solutions, as in the case of informed consent. For example, regulations like the European Data Protection Regulation drastically restrict information-based medical research utilising aggregated databases in order to uphold the ethical ideals of data protection and informed consent (Mittelstadt and Floridi 2016). Informed consent based on this regulation assumes that every subsequent use of data must have individual consent from the persons of origin. As Mittelstadt and Floridi (2016) suggested, consent cannot be informed because data subjects cannot be told about future users and the consequences of their data, which are unknowable when the data are collected or aggregated. These fashions later become a strict and inflexible ethical whiplash, which does not care about change and process but acts automatically on the basis of standards that are largely inapplicable today and acts as a brake on the progress that could be achieved through the knowledge such an approach would provide. This example calls for a processual solution, which would include much better monitoring of the dynamics and daily changes brought by technological development. Organisations also seem to be exposed to a particular type of ethical marketisation. In that process, they create principles that further contribute to ethical inconsistency, resulting in adopting, reaffirming, or shopping for the ethics best retrofitted to justify their current behaviour (Floridi 2021, pp. 82–93). Working toward an environment that fosters ethically responsible decisions seems futile as long as the disruptive risk of this kind of ethics shopping is present (Morley et al. 2020, p. 159).
One of the big problems is that the guidance provided by ethical guidelines suggests that a technical solution exists, but very few guidelines provide a technical explanation (Morley et al. 2020). Developers are often frustrated by how little help is offered to them in the process; instead, they are supported by highly abstract principles when it comes to the day job (Floridi 2021), where regulatory ethics based on ethical universalism does not map neatly onto everyday practices in the field (Miller 2013).
Recently, findings in the field of AI ethics have shown that principles and technical tools created mainly through official strategies have little impact on current algorithmic practice and its functioning (Borg 2021; Rakova et al. 2021; Vakkuri et al. 2020). They are too theoretically and normatively oriented and often miss the orientation of everyday people. According to Pink et al. (2022), ethics associated with automated systems and technologies must engage with ethics in everyday situations, not outside of the everyday.
This prioritisation would move from ethos to engaging everyday ethics, from rules to process, and from a normative to an ethical perspective. A broader approach to this issue with more comprehensive and inclusive solutions, like an environmental or relational approach, would situate principles, practices, values, humans, and non-humans in one big assemblage and investigate their relations. In essence, this means questioning the separation of technology from its application, namely the view of technology as a mere tool at our disposal, and investigating morality as something that is not assumed to be neutral. Ethicists must be aware of a broader environment, as the creative process in AI is not merely technical but sociotechnical, where “data do not pre-exist their generation but are created through the process of conducting science” (Kitchin 2021, p. 24). The ontological security established through ethical universalism must be debunked through investigation, and relational and situational perspectives could be a good starting point for that purpose.

3.1 Understanding complex sociotechnical assemblages

The focus must thus be not only on production or consumption, on the technical or the social, but on the creation process through the coevolution of different elements, from technical to sociological, normative, political, cultural, and others. In this context, we understand algorithms as finite, abstract, effective, compound control structures, imperatively given and accomplishing a given purpose under given provisions (Hill 2016), where, through encoded procedures, input data are transformed into a usable, and therefore desired, output (Gillespie 2012). They perform a crucial function: by following a logical and mathematical sequence, they can structure and find additional meaning in a big data environment. Here we rely on the definition of Boyd and Crawford (2012), who refer to big data as a cultural, technological, and scholarly phenomenon in which technology is used to aggregate, analyse, link, and compare large data sets to identify patterns and make economic, social, technical, and legal claims that produce not only a technical outcome but also a cultural one. This entire process is organised in a political and social environment that is in a symbiotic relationship with artificial intelligence or, as Borg (2021) has explained, “Artificial Intelligence needs to be trained on Big Data to be accurate, and Big Data’s value is largely realized through its use by Artificial Intelligence” (Borg 2021, p. 1).6
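To make the point about value-laden encoded procedures more concrete, a minimal sketch may help. The following toy Python fragment is our own hypothetical illustration, not an example drawn from the cited literature: it shows how even a few lines of scoring code transform input data into a “desired” output, and how the developer’s choices of features, weights, and thresholds quietly carry values into that output (all names and numbers are invented for illustration).

```python
# A minimal, hypothetical illustration (not from the cited literature) of how an
# "encoded procedure" turns input data into a desired output, and of how the
# developer's choices of features, weights, and thresholds carry values.

applicants = [
    {"name": "A", "income": 48_000, "years_employed": 1, "postcode_score": 0.3},
    {"name": "B", "income": 52_000, "years_employed": 7, "postcode_score": 0.9},
]

# Value-laden design decisions: which attributes count, how much, and where the bar sits.
WEIGHTS = {"income": 0.5, "years_employed": 0.3, "postcode_score": 0.2}
THRESHOLD = 0.6

def score(applicant: dict) -> float:
    """Combine normalised features into a single number, as the weights dictate."""
    normalised = {
        "income": min(applicant["income"] / 100_000, 1.0),
        "years_employed": min(applicant["years_employed"] / 10, 1.0),
        "postcode_score": applicant["postcode_score"],  # a proxy that may encode bias
    }
    return sum(WEIGHTS[k] * normalised[k] for k in WEIGHTS)

for a in applicants:
    decision = "approve" if score(a) >= THRESHOLD else "reject"
    print(a["name"], round(score(a), 2), decision)
```

Nothing in such a fragment is technically wrong, yet every constant in it is a social decision: which proxy variables are admitted, how they are weighted, and where the cut-off lies are exactly the kind of choices that make the output cultural as well as technical.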
Thus, understanding algorithms as complex sociotechnical assemblages that involve long chains of actors, technologies, and meaning (Seaver 2017) is not a question of (external) moral and political consequences (Fukuyama 2017) or a question of research ethics (e.g. Barocas and Nissenbaum 2014; Zimmer 2010), which could be solved with a normative solution and recommendation. It is a profoundly Weberian value-oriented question (Weber 1949), focused on discovering webs of significance (Geertz 2000) and relations among the more comprehensive assemblages of actors involved in their establishment, work, and further life.
With this approach, we consider that the issue of the ethically responsible use of technologies can be illuminated. This ethical role has been largely neglected, especially on the issues of moral traceability and distributed responsibility (Mittelstadt et al. 2016; Simon 2015). For example, Markham et al. (2018) pointed out that general assumptions around big data and data science are facing an accountability crisis because of the diminishment of the qualitative aspects of behaviour and experience in the data. Wang (2013) had already spoken about the need for thick data in the case of big data. Recently, a respected call has also been issued in the field of anthropology to understand “ethics situated in the ongoingness of life” (Pink and Lanzeni 2018, p. 1). Such a standpoint offers a revised approach to temporality and attends to the ethics of intervening and engaging with the uncertainty of what is yet unknown rather than simply with an ethics of the past. Because it is not produced and used in a technical vacuum, technological progress is not only technological. The life of algorithms is saturated within dense social networks, which bring different individual interactions, group representations and norms, organisational dynamics and cultures, and field-level structures (Christin 2020). By focusing on the “waves and ripples” between algorithms and social actors, we can examine the refractions created between them and, in the process, analyse the chains of representations and practices that travel across algorithmic systems, shaping their impact on the process. As Christin (2020) observed, this lens has not been systematically implemented in the study of algorithms.
Understanding and creating an assessment of the impact of an ethical algorithmic culture on the larger society must be the first step in this process. According to Vedder and Naudts (2017), algorithms involve a broader scope of settings that transcend the mechanisms now inherent in regulation. As the effects of algorithms transcend the realm of data protection, so must the approach in the field of ethics. Thus, the main issue is a question of full sociotechnical assemblages, including the researcher, algorithms, big data, AI, and the general environment they create together.
A solution to this problem can be found in Bruno Latour’s Actor-Network Theory, with its focus on substitutions and associations within assemblages of humans and non-humans (Latour 1986). Such a fluid approach refuses to take technological black boxes for granted. It shifts existing sites of study by inscribing scientific and technical objects within the longer chains of human and non-human actants who participate in the creation, diffusion, and institutionalisation of scientific and technical knowledge. Critical in this framework, and helpful as a tool for our purpose, is the concept of “enrolment”, most explicitly theorised by Callon (1986), which involves analysing the dynamics of association, translation, and entanglement that take place whenever humans and non-humans interact. We believe that this call should be related not to the study of the black box, which is often the case, but to the study of the cross-sections between the different elements involved in the processes. Thus, the focus moves from technical to relational and social connections, toward a structural explanation. Actor-Network Theory served as an inspiration for the model proposed by Ingold, which we discuss more deeply in the next section.7

4 Lessons from evolutionary anthropology – niche construction

In this part, we develop a useful conceptual apparatus based on the notion of the human niche, which has been one of the cornerstones of evolutionary theory for the last two decades. By using concepts like coevolution and codevelopment between human and non-human worlds in one all-pervasive ecosystem, the human niche is evolutionary theory in posthuman clothing. In this approach, understanding a phenomenon, a process, an event, or the future is no longer a matter of humans and society alone; it is primarily a matter of human and societal responsibility towards the wider environment they inhabit. And that position requires a somewhat broader view of things.
This evolutionary analogy can be helpful because a similar synergy can be found in the relation between the technical and the practical, especially in the construction and further life of algorithms. We are mainly inspired by the developers of this theory, John Odling-Smee and Kevin Laland (2011), who, for example, suggested that cultural processes are part of our niche. Evolutionary theory was long burdened by the influence of the Darwinist or neo-Darwinist approach, which pointed in one direction: the environment shapes the organism’s development, and we adapt ourselves to the conditions it demands of us in order to survive. In the meantime, our relationship with nature and, more than that, with the non-living world is re-establishing our positions regarding human relationships, which have become more than human. So, for example, using a tool means not just the tool’s usability but can also indicate a change in the form and capacities created by that tool (Fuentes 2017). A Heideggerian example is evident here: the tool with which a product is crafted also tempers the skill with which that same product is created. And this is a property of developmental systems, where agents are present (co-present) and work together in a wider environment (Ingold 2011b).
Motives for the above model are grounded in the learning process: learning is cognitive but also embodied and embedded, rooted in culture, situated in context, shaped by ideology, power, and social practices (Cole and Engestrom 1995), and shaped by the social environment in which it takes place (Smagorinsky 1995). The result is a learning dynamic wherein “culture and cognition cocreate one another” (Cole 1985). Thus, learning is embedded in the coevolving process. Whether we work with the most physical of tools, such as a shovel, or engage in the more intellectual creation of algorithms, the mutual exchange of information and the monitoring of the tool’s properties and of our own abilities is the most essential part of that process. Our efficiency thereby increases, which reflects on the intellectual and productive side of our work and also acts as a stimulus because of the fewer iterations we need to complete the task. If something is a tool, whether a shovel, a car, or an algorithm, the question is not only the construction of that object but also its improvement and upgrading. Every access to the object is, at the least, a process of initiation and relearning, and is often an improvement. Such action means that learning is “imperceptibly” embedded in the process of coevolution, and the action of a tool could therefore possess a “hush” agency because of its “silent” influence on the human ability to craft. Human trust and sovereignty, topics often raised in writings on the Anthropocene, face the consequences, which results in more transformative and relational bonds with different entanglements (Wakkary 2021) and means sharing agency and sovereignty with the surroundings. Approaching society, development, and evolution through meaning-making means starting from the point where, through creativity, humans and tools co-construct each other. This is not about defending unrealistic hybrid positions where the tool has full agentic capability beyond its creator; we are simply cautious about a unidimensional understanding of processes and the universalisation that follows it.

4.1 Crossings between “species” in a more-than-human world

Crossings between “species”, human and non-human, or what is often called the more-than-human world (Bennett 2001), were long neglected by the field of social science, until the moment the unpredictable Anthropocene era arrived to collect its “payment”. We are witnessing a “vengeful” environmental backlash by nature after a long period of human abuse. Social science inquiry was, for a long time, a refusal to look at things outside the social (human) framework. Even today, popular models such as Bourdieu’s habitus or Foucault’s discourse, which are still in use, are designed to see only through social “eyes”, neglecting the possibility of a wider inspection of phenomena. The world we live in and the problems that surround us imply dimensionalities that go beyond the present moment and beyond the social environment and surroundings; they require the rise of hybrid locations in which nature–culture or social–technical no longer have such clear boundaries.
In the evolutionary (and biological) explanation of the idea of niche construction proposed by Laland et al. (2000), evolution supposes a relation between nature and culture inside itself. The process of origination inside evolution leaves behind a trail through culturally constructed legacies, which is mainly the result of two variables: the co-construction of organisms and the feedback that flows from these developmental, ecological, evolutionary, and sociocultural processes (Odling-Smee and Laland 2011). That means that further development is highly affected by adaptation to the environment and by the further ability of niche (re-)construction within which humans develop their relationships. The whole process bequeaths to future generations a so-called ecological heritage, through which the changing environment is transmitted by constructing a niche (ability) in the environment in which it is created (Odling-Smee 2010; Odling-Smee and Laland 2011). Information transfer is a vital element of this process and highly affects it.
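The feedback logic described above can be made more tangible with a small sketch. The following toy simulation is our own hypothetical illustration, not a model taken from the cited evolutionary literature: a population adapts to its niche, but its activity also modifies that niche, and the modified niche is inherited by the next generation as an “ecological heritage” that shifts what counts as adaptive.

```python
import random

# Toy sketch (our own illustration, not a model from the cited literature) of the
# niche construction feedback loop: organisms adapt to the environment, but they
# also modify it, and the modified environment is inherited by the next generation
# as an "ecological heritage" that changes selection pressures.

random.seed(1)

environment = 0.0  # state of the niche, inherited across generations
population = [random.uniform(-1, 1) for _ in range(100)]  # trait values

def fitness(trait: float, env: float) -> float:
    """An organism is fitter the closer its trait is to the current environment."""
    return -abs(trait - env)

for generation in range(10):
    # Selection: keep the half of the population best matched to the inherited niche.
    population.sort(key=lambda t: fitness(t, environment), reverse=True)
    survivors = population[:50]

    # Reproduction with small random variation.
    population = [t + random.gauss(0, 0.05) for t in survivors for _ in range(2)]

    # Niche construction: the survivors' activity nudges the environment toward
    # (and slightly beyond) their mean trait; this altered niche is what the
    # next generation inherits.
    mean_trait = sum(survivors) / len(survivors)
    environment = 0.8 * environment + 0.2 * (mean_trait + 0.3)

    print(f"gen {generation}: environment={environment:.2f}, mean trait={mean_trait:.2f}")
```

The point of the sketch is only that selection and environment are coupled in a loop: remove the niche-construction step and the environment becomes a fixed, external given again, which is precisely the one-directional picture the niche construction approach rejects.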
Therefore, situating current technical problems, and all the others that surround them, points toward two issues. The first is too-fast technological development, and the second relates to adjustment to new environmental changes based on culture–nature evolution. For the inhabitants of those environments, adaptation becomes a problem, and new information transfer channels lose their potential because the content of information is changed and saturated by its sheer amount. Rules and principles do not adequately capture the change, so the whole process is shifted from its current evolution because all its elements (humans, environments, non-humans, techniques) are moving in one direction at different speeds. The essence of a scientific process is to discover these movements, their dynamics and interrelationships, or the network of relationships within an assemblage.
To discover alternative approaches that align with this change, it is essential to engage with the process from within. Ingold (2004) has proposed an intriguing concept called relational thinking with change. This means that the most concrete way to treat the organism is not as a discrete entity predetermined by its position and the path it should find itself on, but as an entity that occupies a place within a continuous field of relations. Those relations would be much more topological than static (Ingold 2004). Hence, the development and capacities of the organism are the result of the entire developmental system, which consists of the organism’s presence in a particular environment; they are not predefined in any specification but emerge from developmental systems. That kind of biosocial development (Ingold and Palsson 2013) should be at the heart of the social sciences, where primacy would lie not in a form of anthropomorphism but in a condition for anthropogenesis (Palsson 2013). This new epistemological space diminishes the emphasis on individual autonomy and on certain cooperative or conflicting practices, emphasising instead the interrelationships of the actors and the context in which they occur. As Tim Ingold said: “…perception and action are not alternating phases in the circuit of stimulation and response, they are rather intrinsically coupled in the same movement; to perceive is to act” (Ingold 2022, p. 342). These entanglements assume that an organism is not entangled in relations but that every living thing is itself an entanglement (Ingold 2022). This approach would move the study of technology from technology as a means to eco-technology, because in the era of the Anthropocene, “technology should be rethought not in separation from the environment, but as part of the environment” (Ferrando 2019).
Why is the niche construction approach essential? Because it is a story that explains how to override the approach that considers technology as something excluded from society and externalised by the human, when it is instead part of one environment. In the case of evolutionary theory, which can serve as an analogy to social relations, the environment can represent a vault in which all things happen and different parts of the processes are entangled and cross-sectioned. In the case of ethics, that means a certain kind of mutualism between humans, tools, machines, algorithms, and non-humans, thus avoiding the establishment of control mechanisms through imposed rules and guidelines. Agency or structure (speaking in a more sociological sense) is a process within the broader environment in which it occurs and must be approached from that perspective.
This will also demand that we change the main agenda question from “What” and “Why” to “How”: “What” (does it mean to be human) or “What” (is happening) will become less relevant than the question “How” (can we coevolve in a more-than-human world). Furthermore, coevolving is a primary change in the approaches because, as Bruno Latour pointed out, after the Gaia hypothesis, in which all living things are interlinked, folded, and entangled in each other, nothing was the same: the human was neither slave to nor lord over nature, and all things and species are part of the same mesh (Latour 2017). All methodological dichotomies based on humanistic principles, us and them (human on one side and non-human on the other, for example), are slowly melting under the influence of the conceptual frameworks offered by posthumanism, and under this framework it is becoming increasingly challenging to defend socially defined boundaries. Performativity as a main property of concepts and ideas, their flows and flux, and processes of evolution and coevolution have directed the contemporary humanities toward knowledge beyond “royal” humanistic inquiry. This means that, above all, this kind of inquiry is shifting from a predetermined research design to an open or even emergent one (Campbell and Lassiter 2014). In short, “how a thing does” what it does is more interesting than “what it means” (Nealon 2021). In the next section, we will speak more about these issues and propose a research platform based on the ecology of practice in a more understandable and embodied way.

5 Ecology of practice as a new ethnographic perspective

The rapidity of technological progress makes the attempt of science (within which the social sciences occupy a significant place) to keep up with these fast changes and respond to the accompanying issues seem an impossible task. Markham (2020) showed that in the last ten years there has been a fast shift in social science topics related to the digital field. For example, the discussion of the paradigmatic change that big data brings (Kitchin 2014a) and of its revolutionary character has not even “begun”, and other topics have already come into focus. Problems seem to remain problems without the possibility of a meaningful response to them. In such an environment, mistrust appears as a systemic characteristic and a result of too-fast scientific development and change.
To address the abovementioned problem, Isabelle Stengers proposes a manifesto for slowing down science (Stengers 2018) by opening up the scientific process as much as possible, making it more transparent and more accountable. That means stepping away from the comfort zone of matters of fact toward matters of concern (Latour 2004). We can see the reason for this shift in Stengers’ explanation:
“…there are situations that concern us before they become objects of preoccupation or choice, situations which, in order to be appropriately characterized, demand that ‘we feel concerned’ … The essential thing with ‘matters of concern is to get rid of the idea that there is a single right answer’ and instead to put what are often difficult choices on” (Stengers 2018, p. 3).
It is a type of knowledge that is not based on the criterion of scientific objectivity derived from factuality, but rather a type of inquiry that shows significant consideration for the community and the differences and perspectives that come from it.
Haraway (1990) had already written about the problem of the factuality of knowledge, and today science is ripe for discovering the distribution of a network of knowledge beyond the scientific community to the more general population, mainly its users. This is the essence of the shift from matters of fact to matters of concern. The role of ethnography as a method for discovering this web would be to trace the distribution pathways of created scientific knowledge, the processes by which the broader community that adopts that knowledge receives and modifies it, and the different ways in which the scientific community in turn receives and processes what comes back from the wider user community. According to Stengers (2011), such a position would mark the end of the debate between relativism and universalism by reviving a somewhat lost faith in the objectivity of science by redefining it. Bruno Latour is on a similar epistemological track, further elaborating on the agency of multiple agents in the Anthropocene era, where “all agents share the same shape-changing destiny. A destiny that cannot be followed, documented, told, and represented by using any of the older traits associated with subjectivity or objectivity” (Latour 2004, p. 17). That means, according to Latour, that the earth is no longer objective because it is hard to create distance between a human and its surroundings, so the onto-epistemological consequences radically undermine the subjective/objective trajectory. Only if we change the meaning of the notion of objectivity, not as a “name for a method but for an achievement, for the creation of a rapport authorizing the definition of an object” (Stengers 2011, p. 50), would a radical shift be introduced: the shift from data being treated as closed and fixed objective evidence to their treatment as an ongoing process of analysis that can provide open and ongoing ways of knowing. As Sarah Pink pointed out, through the processual character of data it becomes “possible to apply an ethical approach to the methods of their analysis and modes of understanding” (Pink and Lanzeni 2018, p. 2). In that case, ethics would become part of the analysis rather than just something applied to it from the outside.
If such an approach advocates the primacy of matters of concern, which indicates the concern actors must show towards each other, then it is essential to set the research strategy to understand who is affected and how. Stengers (2005) proposes the so-called “ecology of practice”, which brings to the fore the question of effects and responsibility. This “art of situating knowledge” in relation to its consequences carries a profound ethical responsibility, because “practices should be characterized in terms that do not dissociate the Ethos of practice from its Oikos – the way it defines its environment” (Stengers 2005). This answer to the Mertonian ethos problem, reflected through the environmental approach, would mean that neutrality is incompatible with ecology because all persons have particular preferences, but this does not affect the idea of objectivity, which goes beyond the authorisation of the objective definition of science. A position where the tool is not neutral does not mean that it is biased, but that it is deeply involved in a network of relationships that can influence it to be biased (as in the case of the value judgment dispute), and that threatens its autonomy. This changes the perspective of the human–technology relationship because we are all technologically mediated, depending on the assemblages and arrangements of which we are a part (Verbeek 2011).
In response, science should accept and slightly modify the technical and social environment rather than authoritatively create it. This would imply “approaching it as it diverges, feeling its borders, experimenting what practitioners may accept as relevant even if they are not their questions, rather than posing an insulting question that would lead them to mobilize and transform the border into a defence against outside” (Stengers 2005, p. 184).
This ecology of practice would not describe the sciences through their deconstruction “as they are” but would instead explore their possibilities to connect and discover “as they may become” (Stengers 2005, p. 184). As we previously pointed out, rapid scientific development creates the greatest public resistance to scientific achievements because science cannot be accessible if it hides from its citizens. Opening up this process would mean advancing autonomy ahead of authority, and plurality instead of unity; but only a plurality with a robust ethical position would be the way out of both relativism and universalism.
For this purpose of the ecology of practice, Stengers (2018) proposed the position of so-called connoisseurs. They might become “agents of resistance against a scientific knowledge that pretends it has general authority; they partake in the production of what Donna Haraway calls ‘situated knowledges’” (Stengers 2018, p. 9). According to Haraway, situated knowledge refers to the idea that our understanding of any given topic is shaped by our unique perspectives and positions (Haraway 1988). Essentially, what we can know about a particular object is limited by our individual experiences and biases. Understanding the situations in the paths of knowledge and information requires fieldwork and is the key to the ethnographic process inspired by the position of connoisseurs. That means positioning ethnographers as observers of relations and examiners of the richness of perspectives, as witnesses of the process (Stengers 2018). This would contribute to the abundance of perspectives but also to the obligations arising from it.
Isabelle Stengers cites the example of GMOs (genetically modified organisms), where the public condemnation of this approach was the result of the public’s misunderstanding of the scientific process behind the product itself (Stengers 2018). She indicates that the public needs to understand that the GMO process is nothing different from what farmers did long before, only that the process is now much faster and more effective. In that case, the public deceived itself by mixing some of its values, often manipulated through the media, with scientific facts. Therefore, scientific work must be redirected from matters of fact to matters of concern, where understanding (primarily scientific) would be placed within the framework of a “public intelligence” of science (Stengers 2018). Using the logic of Actor-Network Theory (Latour 1993), connoisseurs would be a bridge through which the links of translation are discovered, bringing science closer to citizens. Ethnography as a field method can serve that purpose, and Bruno Latour’s works are a good example of how this path can be established.
So, cultivated science must produce not only specialists but also connoisseurs, i.e. domains where producers know that they have to consider the existence of people who can evaluate their products. Their job would be to assess the kind of information they are given, discuss its relevance, and differentiate between mere propaganda and calculated risk. It is not a matter of asking the general question, “Does the public have the capacity?”, but of asserting that it does not have the means to be capable. This attempt is in line with positioning science away from laboratory isolation, with full awareness of the world around it, permeated by the economic, cultural, and political forces operating outside the laboratory. The research platform that we are nurturing would imply a radical redefinition of the ethnographer’s role as a nexus in the encounters of worlds with often different views. That bond would function to promote science to the wider community and to bring that community closer to science, with the main role of reducing the cleavages between the two often-opposing sides.

6 Conclusion

Through several perspectives on seemingly different topics, this article offers a possible way of approaching the problem of AI ethics. This problem has two main features. Firstly, it is a paradoxical situation in which an increasing number of normative solutions in the field of AI ethics is accompanied by an inevitable decline in trust in their scope. Secondly, the programming community generally does not consult these solutions, which were mainly created by non-programmers. Here, we have tried to build a path established in a descriptive manner, one that does not aim to cover norms or rules but rather the distribution lines of ethics in one area. There, different positions, from the epistemological and ontological to the technological and even ethical, are mixed in one environment that serves as the umbrella for all these systems. All parts of that environment contribute equally to maintaining its flow, whether human or non-human, social or technical. So, dealing with the topics of AI ethics in the way we have presented means dealing with ethics as an “umbrella” process within which various systems operate: big data, algorithms, multiple policies, values, guidelines, and so on.
Debates on value neutrality showed us the importance and effectiveness of relations between science and society, at the beginning of the 20th century and today. They have taught us that a position where the tool is not neutral does not mean that it is biased but that it is deeply involved in the network of relationships that influence it to be biased. This changes the perspective of the human–technology relationship because we are all technologically mediated, depending on the assemblages and arrangements of which we are a part (Verbeek 2011). In response, science should accept and slightly modify the technical and social environment rather than authoritatively create it. This would imply “approaching it as it diverges, feeling its borders, experimenting what practitioners may accept as relevant even if they are not their questions, rather than posing an insulting question that would lead them to mobilize and transform the border into a defence against outside” (Stengers 2005, p. 184). The borders of the surroundings are the ones we have described with the niche construction approach. As an evolutionary approach, a research platform based on niche construction can help us examine the possibilities for coevolution by opening the scientific process in its design, or as it may become, rather than by the universalisation and closure of its foundations. The ecology of practice and the connoisseurs as its bearers can serve to examine the connections between the different parts of an assemblage and to unravel them. Ethnography, as the most field-based approach, can contribute to that goal; indeed, the primary goal of this article is precisely to reaffirm ethnographic practice through the mentioned ideas of the ecology of practice and the positioning of ethnographers within and between different systems. The translations that the ethnographer could discover can be crucial for understanding the relations between science and society. Ethnographic inquiry and the necessity of connoisseurs as mediators in those relations and interests place a heavy demand on ethnography, especially considering the possibility of embodiment and of following the process from within, which makes ethnography probably one of the most challenging yet suitable approaches at our disposal. There remains the open question of the meaning of borders in an approach so open that those borders are not visible. Also, inquiries established on a posthumanist theoretical framework are sometimes uncritical and too optimistic, distancing themselves from scientific and engineering practice in a narrative that can be quite exotic (Coeckelbergh 2021, p. 51).
To summarize, the examination of practice and the research platform we propose do not separate Ethos from Oikos. That calls for an ecological, posthumanist, and evolutionary perspective on the practice by which human, and more-than-human, knowledge is created. We were inspired by early field feedback related to the normative proliferation around the issue of ethics and trust in AI. The obsolescence and inadequacy of rules, guidelines, and standards pointed to the necessity of a different approach to this issue. Our perspective is in line with the posthumanist “wind” that melts human and non-human barriers, and with Haraway’s syntagma “staying with the trouble”, which presumes that we would not propose a contrary, unrealistic alternative “out of context” but an approach deeply embedded in the situational perspectives of the actors involved in the process (Haraway 1988, 2016). We propose a research platform based on the ecology of practice and connoisseurs as a new practice and perspective that ethnography can take on the issue of accountable, ethical, and trustworthy science. This practice will induce a new position in examining the ethical issue, with better connections between science and society. Such an approach allows us to “revive” the process and examine the ethical outcomes throughout their duration rather than forming pre-ethical principles before the process starts.

Competing interests

The authors have no relevant financial or non-financial interests to disclose.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Footnotes
1. The basic idea of this article was presented at the 20th Annual STS Conference Graz 2022, “Critical Issues in Science, Technology and Society Studies”, under the title “Setting agenda for ethical AI through understanding practice in a more than human world: Accountability issue and its future ethnographic perspective” (Getzinger et al. 2022).
2. As early as the beginning of the 1990s, Orlikowski (1992) pointed out the fragility of this dichotomy by applying Giddens’ structuration model.
3. This analogy draws on a similar idea of Jonathan Marks about evolution and human nature: “To imagine that we are nothing but apes, and to find human nature there constitutes a denial of evolution” (Fuentes et al. 2010).
4. This article’s background is based on research and literature reviews concerning big data and the ethical use of algorithms. The idea started, in a broader sense, with an investigation of the paradigm shift from positivism to eScience (Gray 2009; Kitchin 2014b) and then moved to algorithm studies and the possibilities of a more objective science (Anderson 2008; Dwork and Milligan 2013; Gray 2009).
5. Kitchin (2014a, b) wrote extensively on this issue.
6. This porosity of the field is nicely explained by Hayles (1999) using an analogy with Norbert Wiener’s famous 1948 book Cybernetics: “When system boundaries are defined by information flow in feedback loops rather than the epidermal surface, the subject becomes a system to be assembled and disassembled rather than an entity whose organic wholeness can be assumed” (p. 160).
7. Ingold developed his “line” strategy on the basis of Latour’s Actor-Network Theory (Ingold 2011a, 2015).
Literature
Barad, K.: Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Duke University Press, Durham and London (2007)
Barocas, S., Nissenbaum, H.: Big data’s end run around anonymity and consent. In: Lane, J., Stodden, V., Bender, S., Nissenbaum, H. (eds.) Privacy, Big Data, and the Public Good: Frameworks for Engagement, pp. 44–75. Cambridge University Press, Cambridge (2014)
Bennett, J.: The Enchantment of Modern Life: Attachments, Crossings, and Ethics. Princeton University Press, Princeton and Oxford (2001)
Beran, T.N.: Understanding how children understand robots: Perceived animism in child–robot interaction. Int. J. Hum. Comput. Stud. 69(7–8), 539–550 (2011)
Boyd, D., Crawford, K.: Critical questions for big data. Inform. Communication Soc. 15(5), 662–679 (2012)
Callon, M.: Some elements of a sociology of translation: Domestication of the scallops and the fishermen of St Brieuc Bay. In: Law, J. (ed.) Power, Action and Belief: A New Sociology of Knowledge, pp. 196–233. Routledge & Kegan Paul, London (1986)
Campbell, E., Lassiter, L.E.: Doing Ethnography Today: Theories, Methods, Exercises. Wiley-Blackwell (2014)
Christin, A.: The ethnographer and the algorithm: Beyond the black box. Theory Soc. 49, 897–918 (2020)
Coeckelbergh, M.: AI Ethics. MIT Press, Cambridge, MA (2021)
Cole, M.: The zone of proximal development: Where culture and cognition create each other. In: Wertsch, J.V. (ed.) Culture, Communication and Cognition: Vygotskian Perspectives, pp. 146–161. Cambridge University Press, Cambridge (1985)
Cole, M., Engeström, Y.: Mind, culture, person: Elements in a cultural psychology: Comment. Hum. Dev. 38(1), 19–24 (1995)
Davis, K.: Ethics of Big Data. O’Reilly Media, Sebastopol (2012)
Dwork, C., Milligan, D.: It’s not privacy, and it’s not fair. Stanf. Law Rev. Online 66, 35 (2013)
Ess, C.: New selves, new research ethics. In: Fossheim, H., Ingierd, H. (eds.) Internet Research Ethics. Cappelen Damm Akademisk (2015)
Ferrando, F.: Philosophical Posthumanism. Bloomsbury Academic, London and New York (2019)
Floridi, L.: Translating principles into practices of digital ethics: Five risks of being unethical. In: Floridi, L. (ed.) Ethics, Governance, and Policies in Artificial Intelligence, pp. 81–90. Springer, Cham (2021)
Floridi, L., Cowls, J., Beltrametti, M., Chatila, R., Chazerand, P., Dignum, V., Luetge, C., Madelin, R., Pagallo, U., Rossi, F., Schafer, B., Valcke, P., Vayena, E.: An ethical framework for a good AI society: Opportunities, risks, principles, and recommendations. In: Floridi, L. (ed.) Ethics, Governance, and Policies in Artificial Intelligence, pp. 19–41. Springer, Cham (2021)
Fuentes, A.: The Creative Spark: How Imagination Made Humans Exceptional. Dutton, New York (2017)
Fuentes, A., Marks, J., Ingold, T., Sussman, R., Kirch, P.V., Brumfiel, E.M., Rapp, R., Ginsburg, F., Nader, L., Kottak, C.P.: On nature and the human. Am. Anthropol. 112(4), 512–521 (2010)
Fukuyama, F.: Our Posthuman Future: Consequences of the Biotechnology Revolution. Profile Books, London (2017)
Geertz, C.: The Interpretation of Cultures. Basic Books, New York (2000)
Geismar, H., Knox, H.: Digital Anthropology, 2nd edn. Routledge, London and New York (2021)
Gray, J.: Jim Gray on eScience: A transformed scientific method. In: Hey, T., Tansley, S., Tolle, K. (eds.) The Fourth Paradigm: Data-Intensive Scientific Discovery, pp. xvii–xxxi. Microsoft Research, Redmond, Washington (2009)
Haraway, D.: Situated knowledges: The science question in feminism and the privilege of partial perspective. Feminist Stud. 14(3), 575–599 (1988)
Haraway, D.: Primate Visions: Gender, Race, and Nature in the World of Modern Science. Routledge, New York and London (1990)
Haraway, D.J.: Staying with the Trouble: Making Kin in the Chthulucene. Duke University Press, Durham and London (2016)
Hayles, N.K.: How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. University of Chicago Press, Chicago (1999)
Holbraad, M., Pedersen, M.A.: The Ontological Turn: An Anthropological Exposition. Cambridge University Press, Cambridge (2017)
Ingold, T.: Beyond biology and culture. The meaning of evolution in a relational world. Social Anthropol. 9(2), 209–221 (2004)
Ingold, T.: Being Alive: Essays on Movement, Knowledge and Description. Routledge, London and New York (2011a)
Ingold, T.: The Perception of the Environment: Essays on Livelihood, Dwelling and Skill. Routledge, London and New York (2011b)
Ingold, T.: Imagining for Real: Essays on Creation, Attention and Correspondence. Routledge, London and New York (2022)
Ingold, T., Palsson, G. (eds.): Biosocial Becomings: Integrating Social and Biological Anthropology. Cambridge University Press, Cambridge (2013)
Kirsten, M.: Ethical implications and accountability of algorithms. J. Bus. Ethics 160, 835–850 (2019)
Kitchin, R.: Big data, new epistemologies and paradigm shifts. Big Data & Society 1(1), 1–12 (2014a)
Kitchin, R.: The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. SAGE Publications, London (2014b)
Kitchin, R.: Thinking critically about and researching algorithms. Inform. Communication Soc. 20(1), 14–29 (2017)
Kitchin, R.: Data Lives: How Data Are Made and Shape Our World. Bristol University Press, Bristol (2021)
Latour, B.: The Pasteurization of France. Harvard University Press, Cambridge, MA and London (1993)
Latour, B.: Politics of Nature: How to Bring the Sciences into Democracy. Harvard University Press, Cambridge, MA (2004)
Latour, B.: Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford University Press, Oxford (2007)
Latour, B.: Facing Gaia: Eight Lectures on the New Climatic Regime. Polity, Cambridge (2017)
Latour, B., Woolgar, S.: Laboratory Life: The Construction of Scientific Facts. Princeton University Press, Princeton, NJ (1986)
Markham, A.: Doing digital ethnography in the digital age. In: Leavy, P. (ed.) The Field of Qualitative Research. Oxford University Press, Oxford (2020)
Miller, T.: Messy ethics: Negotiating the terrain between ethics approval and ethical practice. In: MacClancy, J., Fuentes, A. (eds.) Ethics in the Field: Contemporary Challenges, pp. 140–155. Berghahn Books (2013)
Mittelstadt, D.B., Allo, P., Taddeo, M., Wachter, S., Floridi, L.: The ethics of algorithms: Mapping the debate. Big Data & Society 3(2), 1–21 (2016)
Moebius, S.: Sociology in Germany: A History. Palgrave Macmillan, Cham (2021)
Mol, A.: The Body Multiple: Ontology in Medical Practice. Duke University Press, Durham and London (2002)
Nealon, J.T.: Fates of the Performative: From the Linguistic Turn to the New Materialism. University of Minnesota Press, Minneapolis and London (2021)
Odling-Smee, J.: Niche inheritance. In: Pigliucci, M., Müller, G.B. (eds.) Evolution: The Extended Synthesis, pp. 175–208. The MIT Press, Cambridge, MA (2010)
Orlikowski, W.: The duality of technology: Rethinking the concept of technology in organizations. Organ. Sci. 3(3), 398–427 (1992)
Palsson, G.: Ensembles of biosocial relations. In: Ingold, T., Palsson, G. (eds.) Biosocial Becomings: Integrating Social and Biological Anthropology. Cambridge University Press, Cambridge (2013)
Pink, S., Ruckenstein, M., Berg, M., Lupton, D.: Everyday automation: Setting a research agenda. In: Pink, S., Berg, M., Lupton, D., Ruckenstein, M. (eds.) Everyday Automation: Experiencing and Anticipating Emergent Technologies, pp. 1–20. Routledge, London and New York (2022)
Porter, T.: Trust in Numbers: The Pursuit of Objectivity in Science and Public Life. Princeton University Press, Princeton, NJ (1995)
Proctor, R.N.: Value-Free Science? Purity and Power in Modern Knowledge. Harvard University Press, Cambridge, MA (1991)
Schiff, D., Rakova, B., Ayesh, A., Fanti, A., Lennon, M.: Explaining the principles to practices gap in AI. IEEE Technol. Soc. Mag. 40(2), 81–94 (2021)
Seaver, N.: Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society 4(2) (2017)
Simon, J.: Distributed epistemic responsibility in a hyperconnected era. In: Floridi, L. (ed.) The Onlife Manifesto, pp. 145–159. Springer International Publishing (2015)
Smagorinsky, P.: The social construction of data: Methodological problems of investigating learning in the zone of proximal development. Rev. Educ. Res. 65(3), 191–212 (1995)
Stengers, I.: Introductory notes on an ecology of practice. Cult. Stud. Rev. 11(1), 183–196 (2005)
Stengers, I.: Another Science is Possible: A Manifesto for Slow Science. Polity, Cambridge and Medford (2018)
Suchman, L.: Plans and Situated Actions: The Problem of Human-Machine Communication. Cambridge University Press, Cambridge (1987)
Turkle, S.: Life on the Screen. Simon & Schuster, New York (1997)
Vakkuri, V., Kemell, K.-K., Kultanen, J.: The current state of industrial practice in artificial intelligence ethics. IEEE Softw. 37(4), 50–57 (2020)
Verbeek, P.-P.: Moralizing Technology: Understanding and Designing the Morality of Things. University of Chicago Press, Chicago (2011)
Wakkary, R.: Things We Could Design: For More Than Human-Centered Worlds. The MIT Press, Cambridge, MA and London (2021)
Wang, S., Wan, J., Zhang, D., Zhang, C.: Towards smart factory for industry 4.0: A self-organized multi-agent system with big data-based feedback and coordination. Comput. Netw. 101, 158–168 (2016)
Weber, M.: The Methodology of the Social Sciences (translated and edited by Edward A. Shils and Henry A. Finch; with a foreword by Edward A. Shils). The Free Press of Glencoe, Illinois (1949)
Zimmer, M.: But the data is already public: On the ethics of research in Facebook. Ethics Inf. Technol. 12, 313–325 (2010)