
2022 | OriginalPaper | Chapter

3. Data-Driven Surveillance


Abstract

This chapter describes current data-driven marketing practices involving AI. More or less covertly, marketers collect, aggregate, and analyse consumers’ data from different offline and online sources and use intelligent algorithmic systems to derive new knowledge about their preferences and behaviours. Consumers know little about how marketers operate with their data, what knowledge is extracted, and with whom it is shared. A new, profound asymmetry emerges between consumers and marketers, where power relates not only to the market and contracts but also to the possibility of knowing consumers’ lives. Existing privacy and data-protection law succeeds only to a limited extent in curbing surveillance practices and reducing these asymmetries. The main reasons for this lie in the limited effectiveness of the dominant protection paradigm of informed consent and in the conceptualisation of information as personal data.


Footnotes
1
On big data and related transformations in social and economic relationships, see McAfee and Brynjolfsson (2012); Mayer-Schönberger and Cukier (2013); Davenport (2014); Morrison (2015).
 
2
McAfee and Brynjolfsson (2012) describe five new management challenges for a company to succeed in big data management. Companies need to have leadership teams that set clear goals and ask the right questions, and they need to employ IT workers with expertise in computer science and data management. Companies need to develop excellent software tools to manage the volume, velocity, and variety of big data, and need to use the knowledge so extracted to maximise cross-functional cooperation between business departments and customer relationships. Also needed, crucially, is a cultural shift in mentality: the right question for managers is no longer “what do we think?” but “what do we know?” This change requires moving away from acting solely on instinct and instead using data to build strategies and inform decision-making.
 
3
In particular, large quantities of data require “big data consumer analytics”, defined as “the extraction of hidden insight about consumer behavior from Big Data and the exploitation of that insight through advantageous interpretation” (Erevelles et al. 2016).
 
4
According to Davenport (2018), companies today are transitioning from analytics 3.0—which he describes as data-economy analytics, in which companies in traditional industries also embrace big data and analytics by resorting to machine learning models—to analytics 4.0, where we have the highest degree of sophistication in analytics thanks to AI and cognitive technologies (NLP, speech recognition, computer vision). Analytics 4.0 enhances the previous stages and forms a backdrop that merges analytics and automation. On the stages of this transition see, more generally, Davenport and Harris (2017).
 
5
Balducci and Marinova (2018) define unstructured data as “information that either does not have a pre-defined data model or is not organised in a pre-defined manner.” This kind of data can be either verbal (spoken and written data) or nonverbal, which in turn can be either human (e.g., facial cues) or inanimate (e.g., radio-frequency identification, location data).
 
6
On the impact of big data on strategy formation, see Constantiou and Kallinikos (2015, p. 52), who look at big data as a new organisational context. While traditional business strategies would respond to the need to collect data intentionally and purposefully to inform specific theoretical models and provide predefined input to decision-making processes, big data can be made business-relevant subsequent to its collection, but its relevance may not straightforwardly derive from the original data records. Much of what comes under the heading of big data is not collected intentionally; rather, they say, “it is haphazard, hugely heterogeneous, and, not infrequently, trivial, messy and agnostic, as it happens with much user-generated content and data logs of various kinds”.
 
7
Beyond business, this new setting has been summarised in the motto “First data then search for any possible uses of what is already available as data.” The quote is from Anderson (2008), which elaborates on the obsolescence of traditional scientific methods in research.
 
8
In providing a framework for different data-collection methods, I am following Clarke (2019).
 
9
On “volunteered information,” see Mitchell (2010). Before Web-based commerce, marketing had only had two sets of information to deal with: market-aggregated information, which is primarily qualitative and statistical, and transaction-related data, which are granular and personally identifiable.
 
10
According to Yong Jin Park (2013), privacy literacy online consists of the following elements: familiarity with technical aspects of the Internet, an awareness of common institutional practices, and an understanding of current privacy policies. Many empirical studies exist already on the problem of online consumers’ privacy literacy. See, among others, Bartsch and Dienlin (2016) and more recently Prince et al. (2021).
 
11
Ridley-Siegert (2015); Walker (2016).
 
12
This kind of data acquisition is traditionally referred to as “clickstream data.” See Montgomery et al. (2004).
 
13
Sipior et al. (2011).
 
14
Christl and Spiekermann (2016), p. 40: “The marketing data industry is arguably the main driving force behind ubiquitous consumer surveillance. It consists of a wide range of different types of companies, including marketing and advertising agencies, list brokers, database management providers, online and mobile advertisers, and firms engaged in direct mail, telephone sales services, and data-driven commerce, as well as companies offering loyalty programs. Marketing data companies, which are often called “data brokers”, mostly offer a smaller or bigger selection of these services.”
 
15
Schaub et al. (2016).
 
16
Privacy Choice (2010).
 
17
For an account of methods of users’ data collection and disclosure, see Lomborg and Bechmann (2014).
 
18
H. Schneider et al. (2017).
 
19
Forbrukerrådet (2020).
 
20
Brynjolfsson et al. (2013). Omni-channel is a strategy based on merging consumers’ physical and online presence to provide a seamless customer journey across e-commerce websites and brick-and-mortar stores.
 
21
The term profile derives from the Italian profilo, from profilare, originally “to draw a line,” especially the contour of an object. That is precisely the idea behind profiling through data processing, in which the data that is available about individuals or groups is expanded so as to describe their traits and propensities.
 
22
Van Otterlo (2013).
 
23
Diaz Ruiz and Kjellberg (2020).
 
24
Google Ad Manager Help, Create first-party audience segments, 2020, https://support.google.com/admanager/answer/2423498?hl=en. See, more generally, Procter et al. (2015).
 
25
Facebook For Business, Audience Targeting Options, 2020, https://www.facebook.com/business/help/633474486707199.
 
26
Jenkinson (1994). In the early days of marketing, buyer personas were generally constructed by conducting interviews and surveys with real consumers. Using their expressions and thoughts, and maybe even the photos they took or the activities they did, marketers would “step into the shoes” of such real individuals and rely on imagination and creativity to build profiles describing their hypothetical target consumer.
 
27
Burgess and Burgess (2020) explain that a buyer persona “is a semifictional representation of your ideal customer based on market research and real data about your existing customers.” The authors describe this process of creating a buyer persona through data in seven steps: gathering buyers’ data, assembling the team, getting to know the buyer, evaluating pain points, crafting the persona, continuously revisiting and revising the persona, and validating the persona. Based on the validated persona, the marketing team will be able to find a lookalike audience in new incoming data.
 
28
Popov and Iakovleva (2018).
 
29
On predictive analytics see, generally, Siegel (2013).
 
30
This is an example of regression. However, consumer scoring can also be modelled as a classification task. In such a case, consumers would be classed as either “potential buyer” or “non-buyer.”
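The two framings in this footnote can be sketched in a few lines of Python. The features, weights, and threshold below are invented for illustration and are not drawn from the chapter; in practice the weights would be learned from historical data.

```python
# Hypothetical illustration of consumer scoring framed two ways: as regression
# (a continuous purchase-propensity score) and as classification (a discrete
# "potential buyer" / "non-buyer" label). All names and numbers are invented.

def propensity_score(visits_last_month, basket_value_eur):
    """Regression framing: return a continuous purchase-propensity score.

    The weights are placeholders, not learned from real data.
    """
    return 0.05 * visits_last_month + 0.002 * basket_value_eur

def classify(consumer, threshold=0.5):
    """Classification framing: threshold the same score into two classes."""
    score = propensity_score(consumer["visits_last_month"],
                             consumer["basket_value_eur"])
    return "potential buyer" if score >= threshold else "non-buyer"

consumer = {"visits_last_month": 8, "basket_value_eur": 120.0}
print(propensity_score(8, 120.0))  # continuous score
print(classify(consumer))          # discrete label
```

The difference is only in the output the model is asked to produce: the regression keeps the full score, while the classification collapses it into the two categories mentioned in the footnote.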
 
31
Barnhard (2020).
 
32
Chen (2018).
 
33
Camilleri (2020).
 
34
In the Foucauldian tradition, Lyon (2007) defines surveillance (from French surveiller, “to watch over”) as “the focused, systematic and routine attention to personal details for purposes of influence, management, protection or direction” (p. 14). The author cautions that, contrary to what is commonly assumed, the term surveillance does not carry any evaluative overtones but is merely descriptive. It includes everything from face-to-face encounters to mediated arrangements dependent on a comprehensive and ever-growing range of information technologies.
 
35
Together with governmental surveillance, the corporate use of information technologies has been among the recurrent research interests of surveillance scholars. For example, in one of the earliest studies on corporate surveillance, Gandy Jr. (1993, p. 1) relied on Bentham’s idea of the panopticon to describe technology-mediated corporate practices as “a kind of high-tech, cybernetic triage through which individuals and groups of people are being sorted according to their presumed economic value.” Similarly, Clarke (1988) was concerned about the increasing use of information technologies in commercial organisations and coined the term dataveillance to stress how longstanding forms of visual and communication surveillance had been progressively replaced by more economically viable and technically efficient computerised means. Others who have spoken of corporate surveillance and technology include Lyon (1994, 2010); Haggerty and Ericson (2000); Zwick and Dholakia (2004). For an historical overview of this research strand, see Pridmore and Zwick (2011).
 
36
Ball et al. (2016), p. 61: “thus we are forced to ask what the ‘big’ in ‘big data’ refers to. While the etymology of the term encourages a focus on the volume of data, it refers in fact instead more to the ubiquity of data, the completeness of coverage over contemporary lives. It is this ubiquity, the knowledge of a near-complete record of individual lives, which removes the need for a priori decisions on commencing surveillance.”
 
37
Andrejevic and Gates (2014).
 
38
Mayer-Schönberger and Cukier (2013).
 
39
Hildebrandt (2019), for example, describes machine learning as agnostic in the sense that the algorithms are oblivious to human bias or independent of the design choices that determine their accuracy.
 
40
Fisher and Mehozay (2019). See also Krasmann (2020), who argues for a specific epistemological approach to algorithmic systems. Since these systems have no access to the “world of human sense-making,” they reduce reality and paint a “behaviourist picture” of human modes of existence.
 
41
Fuchs (2011).
 
42
The concept of a prosumer was introduced by Toffler (1980) to welcome the uptake of a new political, economic, and democratic order characterised by “progressive blurring of the line that separates producer from consumers.” However, surveillance scholars have argued that Toffler overlooked that “prosumption” would also affect power relations in decentralised working environments, reconfiguring production and the autonomy of labour. In particular, Fuchs (2011, p. 295) criticises the practice of digital platforms outsourcing work to users and consumers, who now work for free. That is the case, he argues, with social networks, which exploit user-generated content to make a profit through advertising.
 
43
Fuchs (2011, p. 299), who, taking a Marxist approach, argues that big data companies exploit prosumers much as industrial capitalists exploit factory workers.
 
44
See Lyon (2010), claiming that there exists a new type of surveillance he calls “mobi-veillance.”
 
45
The digitally mediated contextualisation of consumers has been analysed by surveillance scholars under the label “brand-scape,” defined by Wood and Ball (2013) as “a new mode of ordering that seeks to simultaneously construct space and subjectivity, a mode of ordering represented by a new apparatus.”
 
46
In making the predictive analytics of machine learning understandable and appealing to marketers, Siegel (2013, p. 20) argues that “[w]e usually don’t know about causation, and we often don’t necessarily care. […] the objective is more to predict than it is to understand the world. […] It just needs to work; prediction trumps explanation.”
 
47
Van Dijck (2014, p. 200), who deconstructs the ideological grounds of data-driven culture as rooted in problematic ontological and epistemological claims (so-called dataism). As part of this logic, predictive analytics reveals “a slippery slope between analysis and projection, between deduction and prediction.”
 
48
See Cohen (2016), who mocks the rhetoric of participation and openness in today’s networked data flows (the so-called “participatory turn”), which conceals a design that positions surveillance beyond legal and social control.
 
49
Zuboff (2019).
 
50
Polanyi (1944).
 
51
This is only one of eight definitions given to “surveillance capitalism.” For a flavour of the range of ideas addressed in the book, consider the other seven: (i) “a parasitic economic logic in which the production of goods and services is subordinated to a new global architecture of behavioral modification”; (ii) “a rogue mutation of capitalism marked by concentrations of wealth, knowledge, and power unprecedented in human history”; (iii) “the foundational framework of a surveillance economy”; (iv) “as significant a threat to human nature in the twenty-first century as industrial capitalism was to the natural world in the nineteenth and twentieth”; (v) “the origin of a new instrumentarian power that asserts dominance over society and presents startling challenges to market democracy”; (vi) “a movement that aims to impose a new collective order based on total certainty”; and (vii) “an expropriation of critical human rights that is best understood as a coup from above: an overthrow of the people’s sovereignty.”
 
52
See on this also, Sadowski (2019).
 
53
Zuboff (2019), p. 199.
 
54
Zuboff (2019), p. 38.
 
55
Negroponte et al. (1997).
 
56
Zwick and Dholakia (2004).
 
57
The concept shows a clear link to data-driven buyer personas as marketing creations.
 
58
Haggerty and Ericson (2000, p. 606): “this assemblage operates by abstracting human bodies from their territorial settings and separating them into a series of discrete flows. These flows are then reassembled into distinct ‘data doubles,’ which can be scrutinised and targeted for intervention. In the process, we are witnessing a rhizomatic levelling of the hierarchy of surveillance, such that groups which were previously exempt from routine surveillance are now increasingly being monitored.” Data doubles are often referred to by other labels, such as “dividuals” (Deleuze 1992) and “data shadows” (Simon 2005).
 
59
Lanier (2010).
 
60
See Boyd and Crawford (2012), who deconstruct the widespread mythology that “large data sets offer a higher form of intelligence and knowledge that can generate insights that were previously impossible, with the aura of truth, objectivity, and accuracy.”
 
61
Barocas et al. (2013).
 
62
Pridmore and Zwick (2011) argue that the surveillance character of the algorithmic assemblage does not manifest itself in the individualisation of identities. Instead, it is concerned with “the collection of personal information to discriminate individuals into previously categorised consumer lifestyle groups or profiles”.
 
63
The method of consumer protection and the way big data is valued reflect what Zuboff (2019, p. 535) calls a “formal indifference” of surveillance capitalism to its population of customers and users.
 
64
The phrase was already in use before the Internet boom of the late 90s. For example, nearly that exact phrase (or at least certainly the exact principle) is described in Serra and Weyergraf (1980) on TV advertising. The interview references a short film titled Television Delivers People, made by the American video artist Richard Serra in 1973, who claimed that “The Product of Television, Commercial Television, is the Audience.”
 
65
Zwick and Dholakia (2013).
 
66
Pridmore and Zwick (2011, p. 122) speak of “consumer brands.”
 
67
Holvast (2007).
 
68
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) OJ L 119, pp. 1–88 (GDPR for short).
 
69
For example, on the differences between the EU and US approaches to granting individuals rights to personal information, see Paul M. Schwartz (2003).
 
70
Warren and Brandeis (1890). The paper is widely considered the seminal essay in the history of privacy.
 
71
Among the many theories of privacy, Tavani (2008) singles out three. The first is Warren and Brandeis’s traditional notion of privacy, ascribed to the category of “physical privacy,” which is the freedom from physical intrusion. The second and third are “decisional privacy” and “psychological privacy,” concerned with protection from interference in important life decisions and the protection of one’s intimate thoughts.
 
72
Council of Europe, Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data, Jan. 28, 1981, ETS. No. 108.
 
73
Ibid., Article 1.
 
74
Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L 281, pp. 31–50 (the Data Protection Directive).
 
75
Charter of Fundamental Rights of the European Union, OJ C 326, 26.10.2012, pp. 391–407.
 
76
Ibid, Article 8.
 
77
Rouvroy and Poullet (2009).
 
78
This is clearly stated in the preamble of the GDPR, Recital 7 (“Natural persons should have control of their own personal data”) and in Recital 39 (“Personal data should be processed in a manner that ensures appropriate security and confidentiality of the personal data, including for preventing unauthorised access to or use of personal data and the equipment used for the processing”).
 
79
Article 6 GDPR.
 
80
Recital 32 GDPR: “Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject’s agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement.”
 
81
Indeed, the simpler the law makes the consent procedure, the less likely users are to understand what they are being asked to consent to. Conversely, the more meaningful the consent procedure is made, by providing sufficient information about the data processing, the more time-consuming and demanding it becomes for users to go through all that information and form an informed judgment about giving their consent to such processing.
 
82
Barocas and Nissenbaum (2009).
 
83
According to a study carried out in Sherman (2008), a proper understanding of the meaning of privacy documents would require the IQ of an average PhD.
 
84
Legally speaking, such practices would run contrary to Article 12 GDPR, which requires that the information given in privacy policies be provided in a “concise, transparent, intelligible and easily accessible form” and be stated in “clear and plain language.”
 
85
Reidenberg et al. (2016).
 
86
Koops (2014), p. 251: “often, there is little to choose: if you want to use a service, you have to comply with the conditions—if you do not tick the consent box, access will be denied.”
 
87
This social fact could naively be regarded as a privacy paradox. On the one hand, individuals seem to be concerned about their informational privacy and to want to protect it. On the other hand, they voluntarily disclose information online and rarely exercise their data-protection rights. However, an alternative explanation is that people may want privacy and understand its importance, and they are even aware of the rights they have, but in the current data-driven marketplace, using and sharing data has become such an integral part of their life that their “choice” is not a real one. Such an explanation calls into question whether the decisions made by individuals when sharing data are indeed paradoxical or instead simply reflect their needs in a highly “datafied” society.
 
88
See Mantelero (2014), who calls for a general acknowledgement of the “transformative approach of Big data.”
 
89
On the difficulty of reconciling privacy principles with big data, see also Tal Z. Zarsky (2016). Among other things, big data would be incompatible with the principle of data minimisation, which under Article 5 GDPR, requires that “personal data shall be […] adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.” Apparent conflicts emerge with the principle of purpose limitation, requiring that “personal data shall be […] collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes” (Article 5(1)(b) GDPR). On the possibility of reconnecting AI and big-data applications with the principles of EU data protection through extensive interpretation, see Sartor (2020b).
 
90
See Baruh and Popescu (2017), who argue that the predominant privacy-protection regimes underplay privacy as a collective good. In particular, they show how the two common individual privacy strategies—i.e., not consenting to privacy policies and completely relying on market-provided privacy protections—may result in fewer privacy options available to society at large. On a similar note, Mantelero (2016) explains that the atomistic approach to data protection no longer holds in mass-predictive data analysis. In such a new context, privacy interests have an additional layer consisting in “the collective dimension of data protection, which protects groups of persons from the potential harms of discriminatory and invasive forms of data processing.”
 
91
The assumption that consent is adequately informed based on acceptance of online service contracts turns out to be “the biggest lie on the Internet.”
 
92
Hull (2015) argues that the failure of notice and consent to protect privacy, while at the same time making it easier to exploit personal data, may be the fruit of specific ideological choices. The notice-and-consent model consecrates the belief that, despite the scope of fundamental rights, privacy can only be treated in terms of an individual’s economic choice to disclose information.
 
93
Article 4(1) GDPR.
 
94
Recital 26 GDPR.
 
95
On this point, see Paul M. Schwartz and Solove (2011), who caution that the restricted notion of personal information risks leaving the behavioural data markets untouched, even if they probably rank among the most privacy-invasive spaces.
 
96
Andrew and Baker (2019).
 
97
On whether behavioural data in targeted advertising can be covered by data-protection legislation, see also Zuiderveen Borgesius (2016).
 
98
Article 2(5) GDPR.
 
99
Article 6(4)(e): “Where the processing for a purpose other than that for which the personal data have been collected is not based on the data subject’s consent or on a Union or Member State law which constitutes a necessary and proportionate measure in a democratic society to safeguard the objectives referred to in Article 23(1), the controller shall, in order to ascertain whether processing for another purpose is compatible with the purpose for which the personal data are initially collected, take into account, inter alia: […]. (e) the existence of appropriate safeguards, which may include encryption or pseudonymisation.” This is complemented by Recital 29, providing that “in order to create incentives to apply pseudonymisation when processing personal data, measures of pseudonymisation should, whilst allowing general analysis, be possible within the same controller when that controller has taken technical and organisational measures necessary to ensure, for the processing concerned, that this Regulation is implemented, and that additional information for attributing the personal data to a specific data subject is kept separately.”
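As a rough technical illustration of what the quoted provisions mean by pseudonymisation, the sketch below replaces a direct identifier with a keyed hash, while the key, standing in for the “additional information” of Recital 29, is assumed to be stored separately by the controller. This is a minimal, assumption-laden sketch, not a compliance recipe, and all names in it are invented.

```python
# Minimal sketch of pseudonymisation in the sense of Recital 29 GDPR:
# records carry a pseudonym instead of the direct identifier, while the
# information needed to re-attribute them (here, the secret key) is held
# separately. Key and field names are illustrative placeholders.
import hashlib
import hmac

SECRET_KEY = b"kept-separately-by-the-controller"  # illustrative only

def pseudonymise(email: str) -> str:
    """Derive a stable pseudonym via a keyed hash (HMAC-SHA256).

    The same input always yields the same pseudonym, so general analysis
    (aggregation, frequency counts) remains possible, as the Recital
    intends, but linking a pseudonym back to the person requires the
    separately held key.
    """
    digest = hmac.new(SECRET_KEY, email.strip().lower().encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# A stored record references only the pseudonym, not the email address.
record = {"customer": pseudonymise("alice@example.com"), "basket_value_eur": 120.0}
```

Note that such data typically remains personal data under the GDPR, since re-identification is possible for whoever holds the key; the safeguard lies in the organisational separation of the key from the records.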
 
100
Andrew and Baker (2019). The authors conclude that “its reach will be limited if law makers fail to understand the broader behavioral data ecosystems they seek to regulate […] the law creates the space for a behavioral data market in which commercial self-interest is likely to flourish. In this way, whether deliberately or otherwise, the GDPR will make it possible for the behavioural data market to continue to function unencumbered.”
 
101
Articles 13(2)(f), 14(2)(g), and 15(1)(h) GDPR.
 
102
Among others, see Wachter et al. (2017); Malgieri and Comandé (2017); Veale and Edwards (2018); Kaminski and Malgieri (2019).
 
103
Hildebrandt and Rouvroy (2011), p. 23.
 
104
On the possibility of considering inference as personal data under the GDPR, see extensively Sartor (2020a), p. 40ff.
 
105
Article 29 Data Protection Working Party, Opinion 4/2007 on the Concept of Personal Data, adopted on 20 June 2007. In particular, the opinion follows the idea that, in order for data to count as “related” to an individual, there needs to be a “content,” “purpose,” or “result” element. The first criterion means that information qualifies as personal data when it relates to a person, i.e., it is about that person, as when “the information contained in a company’s folder under the name of a certain client clearly relates to him.” Under the second criterion (“purpose”), an item of information counts as personal insofar as it is used to evaluate, treat in a certain way, or influence the status or behaviour of an individual. Under the “result” criterion, even without a “content” or “purpose” element, data can be considered personal whenever their use is likely to have an impact on a person’s rights and interests.
 
106
Article 29 Data Protection Working Party, Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679, adopted on 3 October 2017 as last revised and adopted on 6 February 2018: “In addition to general information about the processing, pursuant to Article 15(3), the controller has a duty to make available the data used as input to create the profile as well as access to information on the profile and details of which segments the data subject has been placed into” (p. 17).
 
107
Wachter and Mittelstadt (2019).
 
108
Article 29 Working Party, Guidelines on the Right to Data Portability (2016), WP 242.
 
Literature
Andrejevic M, Gates K (2014) Big data surveillance: introduction. Surveill Soc 12(2):185–196
Andrew J, Baker M (2019) The general data protection regulation in the age of surveillance capitalism. J Bus Ethics 168(3):565–578
Balducci B, Marinova D (2018) Unstructured data in marketing. J Acad Mark Sci 46(4):557–590
Ball K, Di Domenico ML, Nunan D (2016) Big data surveillance and the body-subject. Body Soc 22(2):58–81
Barocas S, Nissenbaum H (2009) On notice: the trouble with notice and consent. In: Proceedings of the engaging data forum: the first international forum on the application and management of personal electronic information
Barocas S, Hood S, Ziewitz M (2013) Governing algorithms: a provocation piece. “Governing Algorithms” conference (paper presentation), May 16–17, 2013
Bartsch M, Dienlin T (2016) Control your Facebook: an analysis of online privacy literacy. Comput Hum Behav 56:147–154
Baruh L, Popescu M (2017) Big data analytics and the limits of privacy self-management. New Media Soc 19(4):579–596
Boyd D, Crawford K (2012) Critical questions for big data: provocations for a cultural, technological, and scholarly phenomenon. Inf Commun Soc 15(5):662–679
Brynjolfsson E, Hu YJ, Rahman MS (2013) Competing in the age of omnichannel retailing. MIT, Cambridge
Burgess C, Burgess M (2020) The new marketing: how to win in the digital age. SAGE
Camilleri MA (2020) The use of data-driven technologies for customer-centric marketing. Int J Big Data Manage 1(1):50–63
Chen S (2018) Estimating customer lifetime value using machine learning techniques. In: Thomas C (ed) Data mining. IntechOpen, pp 17–34
Clarke R (1988) Information technology and dataveillance. Commun ACM 31(5):498–512
Clarke R (2019) Risks inherent in the digital surveillance economy: a research agenda. J Inf Technol 34(1):59–80
Cohen JE (2016) The surveillance-innovation complex: the irony of the participatory turn. In: Barney D et al (eds) The participatory condition in the digital age. University of Minnesota Press, pp 207–226
Constantiou ID, Kallinikos J (2015) New games, new rules: big data and the changing context of strategy. J Inf Technol 30(1):44–57
Davenport T (2014) Big data at work: dispelling the myths, uncovering the opportunities. Harvard Business Review Press
Davenport T (2018) From analytics to artificial intelligence. J Bus Anal 1(2):73–80
Davenport T, Harris J (2017) Competing on analytics: updated, with a new introduction: the new science of winning. Harvard Business Press
Deleuze G (1992) Postscript on the societies of control. Routledge
Diaz Ruiz CA, Kjellberg H (2020) Feral segmentation: how cultural intermediaries perform market segmentation in the wild. Mark Theory 20(4):429–457
go back to reference Erevelles S, Fukawa N, Swayne L (2016) Big data consumer analytics and the transformation of marketing. J Bus Res 69(2):897–904CrossRef Erevelles S, Fukawa N, Swayne L (2016) Big data consumer analytics and the transformation of marketing. J Bus Res 69(2):897–904CrossRef
go back to reference Fisher E, Mehozay Y (2019) How algorithms see their audience: media epistemes and the changing conception of the individual. Media Cult Soc 41(8):1176–1191CrossRef Fisher E, Mehozay Y (2019) How algorithms see their audience: media epistemes and the changing conception of the individual. Media Cult Soc 41(8):1176–1191CrossRef
go back to reference Fuchs C (2011) Web 2.0, prosumption, and surveillance. Surveill Soc 8(3):288–309CrossRef Fuchs C (2011) Web 2.0, prosumption, and surveillance. Surveill Soc 8(3):288–309CrossRef
go back to reference Gandy Jr OH (1993) The panoptic sort: a political economy of personal information. Westview Gandy Jr OH (1993) The panoptic sort: a political economy of personal information. Westview
go back to reference Haggerty KD, Ericson RV (2000) The surveillant assemblage. Br J Sociol 51(4):605–622CrossRef Haggerty KD, Ericson RV (2000) The surveillant assemblage. Br J Sociol 51(4):605–622CrossRef
go back to reference Hildebrandt M (2019) Privacy as protection of the incomputable self: from agnostic to agonistic machine learning. Theoretical Inquiries in Law 20(1):83–121CrossRef Hildebrandt M (2019) Privacy as protection of the incomputable self: from agnostic to agonistic machine learning. Theoretical Inquiries in Law 20(1):83–121CrossRef
go back to reference Hildebrandt M, Rouvroy A (2011) Law, human agency and autonomic computing: the philosophy of law meets the philosophy of technology. RoutledgeCrossRef Hildebrandt M, Rouvroy A (2011) Law, human agency and autonomic computing: the philosophy of law meets the philosophy of technology. RoutledgeCrossRef
go back to reference Holvast J (2007) History of privacy. In: de Leeuw KMM, Bergstra J (eds) The history of information security: a comprehensive handbook. Elsevier, pp 737–769CrossRef Holvast J (2007) History of privacy. In: de Leeuw KMM, Bergstra J (eds) The history of information security: a comprehensive handbook. Elsevier, pp 737–769CrossRef
go back to reference Hull G (2015) Successful failure: what Foucault can teach us about privacy self-management in a world of Facebook and big data. Ethics Inf Technol 17(2):89–101CrossRef Hull G (2015) Successful failure: what Foucault can teach us about privacy self-management in a world of Facebook and big data. Ethics Inf Technol 17(2):89–101CrossRef
go back to reference Jenkinson A (1994) Beyond segmentation. J Target Meas Anal Mark 3(1):60–72 Jenkinson A (1994) Beyond segmentation. J Target Meas Anal Mark 3(1):60–72
go back to reference Kaminski ME, Malgieri G (2019) Algorithmic impact assessments under the GDPR: producing multilayered explanations. Available at SSRN 3456224 Kaminski ME, Malgieri G (2019) Algorithmic impact assessments under the GDPR: producing multilayered explanations. Available at SSRN 3456224
go back to reference Koops B-J (2014) The trouble with European data protection law. International Data Privacy Law 4(4):250–261CrossRef Koops B-J (2014) The trouble with European data protection law. International Data Privacy Law 4(4):250–261CrossRef
go back to reference Krasmann S (2020) The logic of the surface: on the epistemology of algorithms in times of big data. Inf Commun Soc:1–14 Krasmann S (2020) The logic of the surface: on the epistemology of algorithms in times of big data. Inf Commun Soc:1–14
go back to reference Lanier J (2010) You are not a gadget: a manifesto. Vintage Lanier J (2010) You are not a gadget: a manifesto. Vintage
go back to reference Lomborg S, Bechmann A (2014) Using APIs for data collection on social media. Inf Soc 30(4):256–265CrossRef Lomborg S, Bechmann A (2014) Using APIs for data collection on social media. Inf Soc 30(4):256–265CrossRef
go back to reference Lyon D (1994) From big brother to electronic panopticon. The Electronic Eye: The Rise of Surveillance Society, pp 57–80 Lyon D (1994) From big brother to electronic panopticon. The Electronic Eye: The Rise of Surveillance Society, pp 57–80
go back to reference Lyon D (2007) Surveillance studies: an overview. Polity Lyon D (2007) Surveillance studies: an overview. Polity
go back to reference Lyon D (2010) Surveillance, power and everyday life. In: Kalantzis-Cope P, Gherab-Martin K (eds) Emerging digital spaces in contemporary society. Palgrave Macmillan UK, pp 107–120CrossRef Lyon D (2010) Surveillance, power and everyday life. In: Kalantzis-Cope P, Gherab-Martin K (eds) Emerging digital spaces in contemporary society. Palgrave Macmillan UK, pp 107–120CrossRef
go back to reference Malgieri G, Comandé G (2017) Why a right to legibility of automated decision-making exists in the general data protection regulation. Int Data Priv Law 7:243CrossRef Malgieri G, Comandé G (2017) Why a right to legibility of automated decision-making exists in the general data protection regulation. Int Data Priv Law 7:243CrossRef
go back to reference Mantelero A (2014) The future of consumer data protection in the EU Re-thinking the “notice and consent” paradigm in the new era of predictive analytics. Comput Law Secur Rev 30(6):643–660CrossRef Mantelero A (2014) The future of consumer data protection in the EU Re-thinking the “notice and consent” paradigm in the new era of predictive analytics. Comput Law Secur Rev 30(6):643–660CrossRef
go back to reference Mantelero A (2016) Personal data for decisional purposes in the age of analytics: from an individual to a collectivedimension of data protection. Comput Law Secur Rev 32(2):238–255CrossRef Mantelero A (2016) Personal data for decisional purposes in the age of analytics: from an individual to a collectivedimension of data protection. Comput Law Secur Rev 32(2):238–255CrossRef
go back to reference Mayer-Schönberger V, Cukier K (2013) Big data: a revolution that will transform how we live, work, and think. Houghton Mifflin Harcourt Mayer-Schönberger V, Cukier K (2013) Big data: a revolution that will transform how we live, work, and think. Houghton Mifflin Harcourt
go back to reference Mitchell A (2010) The rise of volunteered personal information. J Direct Data Digit Mark Pract 12(2):154–164CrossRef Mitchell A (2010) The rise of volunteered personal information. J Direct Data Digit Mark Pract 12(2):154–164CrossRef
go back to reference Montgomery AL et al (2004) Modeling online browsing and path analysis using clickstream data. Mark Sci 23(4):579–595CrossRef Montgomery AL et al (2004) Modeling online browsing and path analysis using clickstream data. Mark Sci 23(4):579–595CrossRef
go back to reference Morrison R (2015) Data-driven organization design: sustaining the competitive edge through organizational analytics. Kogan Page Publishers Morrison R (2015) Data-driven organization design: sustaining the competitive edge through organizational analytics. Kogan Page Publishers
go back to reference Park YJ (2013) Digital literacy and privacy behavior online. Commun Res 40(2):215–236CrossRef Park YJ (2013) Digital literacy and privacy behavior online. Commun Res 40(2):215–236CrossRef
go back to reference Polanyi K (1944) The great transformation. Beacon Press Polanyi K (1944) The great transformation. Beacon Press
go back to reference Popov A, Iakovleva D (2018) Adaptive look-alike targeting in social networks advertising. Proc Comput Sci 136:255–264CrossRef Popov A, Iakovleva D (2018) Adaptive look-alike targeting in social networks advertising. Proc Comput Sci 136:255–264CrossRef
go back to reference Pridmore J, Zwick D (2011) Marketing and the rise of commercial consumer surveillance. Surveill Soc 8(3):269–277CrossRef Pridmore J, Zwick D (2011) Marketing and the rise of commercial consumer surveillance. Surveill Soc 8(3):269–277CrossRef
go back to reference Prince C et al (2021) Are we living in surveillance societies and is privacy an illusion? An empirical study on privacy literacy and privacy concerns. IEEE Trans Eng Manag:1–18 Prince C et al (2021) Are we living in surveillance societies and is privacy an illusion? An empirical study on privacy literacy and privacy concerns. IEEE Trans Eng Manag:1–18
go back to reference Procter R, Voss A, Lvov I (2015) Audience research and social media data: opportunities and challenges. Participat J Audience Recept Stud 12(1):470–493 Procter R, Voss A, Lvov I (2015) Audience research and social media data: opportunities and challenges. Participat J Audience Recept Stud 12(1):470–493
go back to reference Reidenberg JR et al (2016) Ambiguity in privacy policies and the impact of regulation. J Leg Stud 45(2):163–190CrossRef Reidenberg JR et al (2016) Ambiguity in privacy policies and the impact of regulation. J Leg Stud 45(2):163–190CrossRef
go back to reference Ridley-Siegert T (2015) Data privacy: what the consumer really thinks. J Direct Data Digit Mark Pract 17(1):30–35CrossRef Ridley-Siegert T (2015) Data privacy: what the consumer really thinks. J Direct Data Digit Mark Pract 17(1):30–35CrossRef
go back to reference Rouvroy A, Poullet Y (2009) The right to informational self-determination and the value of self-development: reassessing the importance of privacy for democracy. In: Gutwirth S et al (eds) Reinventing data protection? Springer, pp 45–76CrossRef Rouvroy A, Poullet Y (2009) The right to informational self-determination and the value of self-development: reassessing the importance of privacy for democracy. In: Gutwirth S et al (eds) Reinventing data protection? Springer, pp 45–76CrossRef
go back to reference Sadowski J (2019) When data is capital: datafication, accumulation, and extraction. Big Data Soc 6(1) Sadowski J (2019) When data is capital: datafication, accumulation, and extraction. Big Data Soc 6(1)
go back to reference Sartor G (Apr. 2020a) New aspects and challenges in consumer protection. Study PE 648.790 European Parliamentary Research Service, Policy Department for Economic, Scientific and Quality of Life Policies Sartor G (Apr. 2020a) New aspects and challenges in consumer protection. Study PE 648.790 European Parliamentary Research Service, Policy Department for Economic, Scientific and Quality of Life Policies
go back to reference Sartor G (Sept. 2020b) The impact of the General Data Protection Regulation (GDPR) on artificial intelligence. Study No. PE 641.530 European Parliamentary Research Service, Scientific Foresight Unit (STOA) Sartor G (Sept. 2020b) The impact of the General Data Protection Regulation (GDPR) on artificial intelligence. Study No. PE 641.530 European Parliamentary Research Service, Scientific Foresight Unit (STOA)
go back to reference Schaub F et al (2016) Watching them watching me: browser extensions impact on user privacy awareness and concern. In: NDSS workshop on usable security, pp 1–10 Schaub F et al (2016) Watching them watching me: browser extensions impact on user privacy awareness and concern. In: NDSS workshop on usable security, pp 1–10
go back to reference Schneider H et al (2017) Your data, your vis: personalizing personal data visualizations. In: IFIP Conference on Human-Computer Interaction. Springer, pp 374–392 Schneider H et al (2017) Your data, your vis: personalizing personal data visualizations. In: IFIP Conference on Human-Computer Interaction. Springer, pp 374–392
go back to reference Schwartz PM (2003) Property, privacy, and personal data. Harv Law Rev 117:2056CrossRef Schwartz PM (2003) Property, privacy, and personal data. Harv Law Rev 117:2056CrossRef
go back to reference Schwartz PM, Solove DJ (2011) The PII problem: privacy and a new concept of personally identifiable information. N Y Univ Law Rev 86:1814 Schwartz PM, Solove DJ (2011) The PII problem: privacy and a new concept of personally identifiable information. N Y Univ Law Rev 86:1814
go back to reference Serra R, Weyergraf C (1980) Richard Serra: Interviews, Etc., 1970-1980. Hudson River Museum Serra R, Weyergraf C (1980) Richard Serra: Interviews, Etc., 1970-1980. Hudson River Museum
go back to reference Siegel E (2013) Predictive analytics: the power to predict who will click, buy, lie, or die. John Wiley & Sons Siegel E (2013) Predictive analytics: the power to predict who will click, buy, lie, or die. John Wiley & Sons
go back to reference Simon B (2005) The return of panopticism: supervision, subjection and the new surveillance. Surveill Soc 3(1) Simon B (2005) The return of panopticism: supervision, subjection and the new surveillance. Surveill Soc 3(1)
go back to reference Sipior JC, Ward BT, Mendoza RA (2011) Online privacy concerns associated with cookies, flash cookies, and web beacons. J Internet Commer 10(1):1–16CrossRef Sipior JC, Ward BT, Mendoza RA (2011) Online privacy concerns associated with cookies, flash cookies, and web beacons. J Internet Commer 10(1):1–16CrossRef
go back to reference Tavani HT (2008) Informational privacy: concepts, theories, and controversies. In: Himma KE, Tavani HT (eds) The handbook of Information and computer ethics. John Wiley & Sons, pp 131–164CrossRef Tavani HT (2008) Informational privacy: concepts, theories, and controversies. In: Himma KE, Tavani HT (eds) The handbook of Information and computer ethics. John Wiley & Sons, pp 131–164CrossRef
go back to reference Toffler A (1980) The third wave. William Morrow (US) Toffler A (1980) The third wave. William Morrow (US)
go back to reference Van Dijck J (2014) Datafication, dataism and dataveillance: big data between scientific paradigm and ideology. Surveill Soc 12(2):197–208CrossRef Van Dijck J (2014) Datafication, dataism and dataveillance: big data between scientific paradigm and ideology. Surveill Soc 12(2):197–208CrossRef
go back to reference Van Otterlo M (2013) A machine learning view on profiling. Privacy, due process and the computational turn - philosophers of law meet philosophers of technology. Routledge, Abingdon, pp 41–64 Van Otterlo M (2013) A machine learning view on profiling. Privacy, due process and the computational turn - philosophers of law meet philosophers of technology. Routledge, Abingdon, pp 41–64
go back to reference Wachter S, Mittelstadt B (2019) A right to reasonable inferences: re-thinking data protection law in the age of big data and AI. en Columbia Bus Law Rev, pp 494–619. https://osf.io/mu2kf (visited on 11/29/2021) Wachter S, Mittelstadt B (2019) A right to reasonable inferences: re-thinking data protection law in the age of big data and AI. en Columbia Bus Law Rev, pp 494–619. https://​osf.​io/​mu2kf (visited on 11/29/2021)
go back to reference Wachter S, Mittelstadt B, Floridi L (2017) Why a right to explanation of automated decision-making does not exist in the general data protection regulation. Int Data Priv Law 7(2):76–99CrossRef Wachter S, Mittelstadt B, Floridi L (2017) Why a right to explanation of automated decision-making does not exist in the general data protection regulation. Int Data Priv Law 7(2):76–99CrossRef
go back to reference Walker KL (2016) Surrendering information through the looking glass: transparency, trust, and protection. J Public Policy Mark 35(1):144–158CrossRef Walker KL (2016) Surrendering information through the looking glass: transparency, trust, and protection. J Public Policy Mark 35(1):144–158CrossRef
go back to reference Warren SD, Brandeis LD (1890) The right to privacy. Harv Law Rev pp. 193–220 Warren SD, Brandeis LD (1890) The right to privacy. Harv Law Rev pp. 193–220
go back to reference Wood DM, Ball K (2013) Brandscapes of control? Surveillance, marketing and the co-construction of subjectivity and space in neo-liberal capitalism. Mark Theory 13(1):47–67CrossRef Wood DM, Ball K (2013) Brandscapes of control? Surveillance, marketing and the co-construction of subjectivity and space in neo-liberal capitalism. Mark Theory 13(1):47–67CrossRef
go back to reference Zarsky TZ (2016) Incompatible: the GDPR in the age of big data. Seton Hall Law Rev 47(4):995–1020 Zarsky TZ (2016) Incompatible: the GDPR in the age of big data. Seton Hall Law Rev 47(4):995–1020
go back to reference Zuboff S (2019) The age of surveillance capitalism: the fight for the future at the new frontier of power. Profile Books Zuboff S (2019) The age of surveillance capitalism: the fight for the future at the new frontier of power. Profile Books
go back to reference Zuiderveen Borgesius FJ (2016) Singling out people without knowing their names – behavioural targeting, pseudonymous data, and the new data protection regulation. Comput Law Secur Rev 32(2):256–271CrossRef Zuiderveen Borgesius FJ (2016) Singling out people without knowing their names – behavioural targeting, pseudonymous data, and the new data protection regulation. Comput Law Secur Rev 32(2):256–271CrossRef
go back to reference Zwick D, Dholakia N (2004) Consumer subjectivity in the age of internet: the radical concept of marketing control through customer relationship management. Inf Organ 14(3):211–236CrossRef Zwick D, Dholakia N (2004) Consumer subjectivity in the age of internet: the radical concept of marketing control through customer relationship management. Inf Organ 14(3):211–236CrossRef
go back to reference Zwick D, Dholakia N (2013) Strategic database marketing: customer profiling as new product development. In: Marketing management. Routledge, pp 481–496 Zwick D, Dholakia N (2013) Strategic database marketing: customer profiling as new product development. In: Marketing management. Routledge, pp 481–496
Metadata
Title: Data-Driven Surveillance
Author: Federico Galli
Copyright Year: 2022
DOI: https://doi.org/10.1007/978-3-031-13603-0_3