
2020 | OriginalPaper | Chapter

Private Ordering of Online Platforms in Smart Urban Mobility: The Case of Uber’s Rating System

Author: Rossana Ducato

Published in: Smart Urban Mobility

Publisher: Springer Berlin Heidelberg


Rating and review systems are a self-regulatory mechanism widely used by online platforms, especially in the smart mobility sector. Such systems have already been analysed in empirical studies and legal contributions, in particular in the fields of consumer law and labour law.
This chapter aims to make an original contribution to the current debate from a relatively underinvestigated perspective: how rating and review systems interact with the European data protection framework. As a case study, the chapter will focus on the rating system adopted by Uber, one of the largest shared mobility platforms worldwide.


On the notion of multi-sided platforms, see Kevin J Boudreau and Andrei Hagiu, ‘Platform Rules: Multi-sided Platforms as Regulators’ (2009) 1 Platforms, Markets and Innovation 163; David S Evans, ‘Governing Bad Behavior by Users of Multi-sided Platforms’ (2012) 27 Berkeley Technology Law Journal 1201.
On the different types and models of business in the shared mobility sector, see Boyd Cohen and Jan Kietzmann, ‘Ride on! Mobility Business Models for the Sharing Economy’ (2014) 27 Organization & Environment 279.
On the role of R&R systems in the platform economy, see Christoph Busch, ‘Crowdsourcing Consumer Confidence: How to Regulate Online Rating and Review Systems in the Collaborative Economy’ in Alberto De Franceschi (ed), European Contract Law and the Digital Single Market: The Implication of the Digital Revolution (Intersentia 2016); Guido Smorto, ‘Reputazione, Fiducia e Mercati’ (2016) 1 Europa e Diritto Privato 199; Lene G Braathen Pettersen, ‘Rating Mechanisms Among Participants in Sharing Economy Platforms’ (2017) 22 First Monday <https://firstmonday.org/ojs/index.php/fm/article/view/7908/6586> accessed 30 December 2019; Sofia Ranchordás, ‘Online Reputation and the Regulation of Information Asymmetries in the Platform Economy’ (2018) 5 Critical Analysis of Law 127 <https://cal.library.utoronto.ca/index.php/cal/article/view/29508/21993> accessed 30 December 2019.
Chrysanthos Dellarocas, ‘The Digitization of Word-of-Mouth: Promise and Challenges of Online Feedback Mechanisms’ (March 2003) MIT Sloan Working Paper No. 4296-03 <https://ssrn.com/abstract=393042> or <https://doi.org/10.2139/ssrn.393042> accessed 30 December 2019.
Busch (n 3).
ibid 12.
However, the reciprocity of the evaluation is also one of the reasons why users tend to give high scores. This distorting effect has been found on platforms such as eBay, where the evaluation is not anonymous and takes place at a delayed point in time. A user who has had a mediocre or negative experience is inclined to leave positive feedback, fearing that the other party might ‘retaliate’ by leaving a low score. Chrysanthos Dellarocas and Charles A Wood, ‘The Sound of Silence in Online Feedback: Estimating Trading Risks in the Presence of Reporting Bias’ (2008) 54 Management Science 460; Abbey Stemler, ‘Feedback Loop Failure: Implications for the Self-regulation of the Sharing Economy’ (2017) 18 Minnesota Journal of Law, Science & Technology 673, 691.
For an overview, see Cristiano Codagnone and Bertin Martens, ‘Scoping the Sharing Economy: Origins, Definitions, Impact and Regulatory Issues’ (2016) Institute for Prospective Technological Studies Digital Economy Working Paper 22.
See, in particular, Omri Ben-Shahar, ‘One-Way Contracts: Consumer Protection without Law’ (2010) 6 European Review of Contract Law 221; Christoph Busch and others, ‘The Rise of the Platform Economy: a New Challenge for EU Consumer Law?’ (2016) 5 Journal of European Consumer and Market Law 3; Marta Cantero Gamito, ‘Regulation.com. Self-regulation and Contract Governance in the Platform Economy: a Research Agenda’ (2016) 9 European Journal of Legal Studies 53.
Valerio De Stefano, ‘The Rise of the Just-in-Time Workforce: On-Demand Work, Crowdwork, and Labor Protection in the Gig-Economy’ (2015) 37 Comparative Labor Law and Policy Journal 471; Jeremias Prassl, Humans as a Service: The Promise and Perils of Work in the Gig Economy (Oxford University Press 2018); Adrián Todolí-Signes, ‘Algorithms, Artificial Intelligence and Automated Decisions Concerning Workers and the Risks of Discrimination: the Necessary Collective Governance of Data Protection’ (2019) 25 Transfer: European Review of Labour and Research 465.
Parliament and Council Regulation (EU) 2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1 (GDPR).
For an overview of the legal issues raised by this business model, see Kristofer Erickson and Inge Sorensen, ‘Regulating the Sharing Economy’ (2016) 5 Internet Policy Review <https://policyreview.info/articles/analysis/regulating-sharing-economy> accessed 30 December 2019; Nayeem Syed, ‘Regulating Uberification’ (2016) 22 Computer and Telecommunications Law Review 14; Guido Noto La Diega, ‘Uber Law and Awareness by Design. An Empirical Study on Online Platforms and Dehumanised Negotiations’ (2016) 2 Revue européenne de droit de la consommation/European Journal of Consumer Law 383; Diane M Ring and Shu-Yi Oei, ‘The Tax Lives of Uber Drivers: Evidence from Online Forums’ (2016) 8 Columbia Journal of Tax Law 56; Alex Rosenblat and Luke Stark, ‘Algorithmic Labor and Information Asymmetries: A Case Study of Uber’s Drivers’ (2016) 10 International Journal of Communication 3758; Ryan Calo and Alex Rosenblat, ‘The Taking Economy: Uber, Information, and Power’ (2017) 117 Columbia Law Review 1623; Alex Rosenblat and others, ‘Discriminating Tastes: Uber’s Customer Ratings as Vehicles for Workplace Discrimination’ (2017) 9 Policy & Internet 256; Giorgio Resta, ‘Digital Platforms and the Law: Contested Issues’ (2018) 1 Media Laws 231.
See Cathy Bussewitz, ‘Uber Expands Helicopter Service to All Users with iPhones in New York City’ (USA Today, 15 December 2019) <https://eu.usatoday.com/story/tech/2019/10/03/uber-helicopter-service-expands-all-iphone-users-new-york-city/3855008002/> accessed 30 December 2019.
Case C-434/15 Asociación Profesional Elite Taxi v. Uber Systems Spain SL [2017] EU:C:2017:981, para 48. For a comment on the judgment, see Philipp Hacker, ‘UberPop, UberBlack, and the Regulation of Digital Platforms after the Asociación Profesional Elite Taxi Judgment of the CJEU’ (2018) 14 European Review of Contract Law 80; Michèle Finck, ‘Distinguishing Internet Platforms from Transport Services: Elite Taxi v. Uber Spain’ (2018) 55 CML Rev 1619; Alberto De Franceschi, ‘Uber Spain and the “Identity Crisis” of Online Platforms’ (2018) 7 Journal of European Consumer and Market Law 1.
Asociación Profesional Elite Taxi (n 15) para 39.
However, this conclusion is supported by the Opinion of Advocate General Szpunar in Case C-434/15 Asociación Profesional Elite Taxi v. Uber Systems Spain SL [2017] EU:C:2017:981; see in particular paras 51-52.
On the contractual ‘flora’ that characterises the platform, see the critical comment by Guido Noto La Diega and Luce Jacovella, ‘UBERTRUST: How Uber Represents Itself to Its Customers Through its Legal and Non-Legal Documents’ (2016) 5 Journal of Civil and Legal Sciences 199.
Section IV.C ‘Consultation of Evaluations’, Uber Privacy Policy (Uber) <www.uber.com/global/it/privacy/notice/> accessed 30 December 2019.
The app matches users based on criteria such as proximity. Therefore, the user cannot really choose the counterparty on the basis of her rating.
Section III.B.9, Uber Privacy Policy (Uber) <www.uber.com/global/it/privacy/notice/> accessed 30 December 2019. A decision by the California Labor Commission confirms that the minimum average rating to remain on the platform is the same both for drivers and passengers. cf Uber Techs., Inc. v. Berwick, No. CGC-15-546378, Cal. App. LEXIS 9488 (Cal. Super. 2015); see also O’Connor v. Uber Technologies, Inc., 82 F. Supp. 3d 1133, 1135 (N.D. Cal. 2015).
As reported in Uber Techs., Inc. v. Berwick (n 21).
This information was obtained after sending a direct request to the Uber assistance service on 4 December 2018.
It is interesting to note that the previous version of the Community Guidelines (online at least until 19 November 2019) adopted a ‘preventive’ approach. They stated that: ‘We will alert you over time if your rating is approaching this limit. However, if your average rating still falls below the minimum after multiple notifications, you will lose access to your account as your rating will no longer meet the overall quality standards that riders reasonably expect from drivers when using the Uber app in the relevant city’ ((Uber) <www.uber.com/legal/community-guidelines/be-en/> accessed 30 December 2019; this link has since expired. However, the same provision remains in Puerto Rico’s deactivation policy: (Uber) <www.uber.com/legal/deactivation-policy/pr-en/> accessed 30 December 2019).
By clicking on her rating, the passenger can access a page that provides general information about the rating. Such information is framed in positive terms. Uber reassures passengers who do not have an immaculate rating: ‘very few people have a perfect rating, so don’t despair if your average isn’t 5.0. Things that seem small can matter to your driver—it’s easy to accidentally slam a door if you are not thinking about it.’ The platform then offers a series of tips to get a high score, such as not keeping the driver waiting, being polite, wearing a seat belt, etc. The information page concludes with a paragraph on the importance of ratings: ‘ratings foster mutual respect between riders and drivers […] a high rating is about more than bragging rights among your friends; it’s a sign that people enjoyed their time with you. Keep up the good work!’ Nothing, however, is mentioned about the other important consequences, should the score not be ‘stellar’. The sanction of deactivation is mentioned in a section of the Privacy Policy (Section III.B.9, Privacy Policy Uber (n 21)).
For an overview of the different types of rating systems, see Pettersen (n 3).
The system does allow one to give feedback, but these qualitative inputs are limited and, in any case, do not appear in the final rating.
Chrysanthos Dellarocas and Ritu Narayan, ‘A Statistical Measure of a Population’s Propensity to Engage in Post-purchase Online Word-of-mouth’ (2006) 21 Statistical Science 277.
Rosenblat and others, ‘Discriminating Tastes’ (n 12).
Unfortunately, the digital environment has replicated some discriminatory practices of the ‘analogue age’. See for example Ray Fisman and Michael Luca, ‘Fixing Discrimination in Online Marketplaces’ (2016) Harvard Business Review <https://hbr.org/2016/12/fixing-discrimination-in-online-marketplaces> accessed 30 December 2019; Benjamin Edelman, Michael Luca and Dan Svirsky, ‘Racial Discrimination in the Sharing Economy: Evidence from a Field Experiment’ (2017) 9 American Economic Journal: Applied Economics 1.
As Stemler (n 7) 694 points out: ‘For example, if a passenger has a low rating, an Uber driver may interpret innocuous behaviour, like failing to make small talk, as unfriendly and give the passenger a negative review.’
Bénédicte Dambrine, Joseph Jerome and Ben Ambrose, ‘User Reputation: Building Trust and Addressing Privacy Issues in the Sharing Economy’ (2015) 9ff <https://fpf.org/wp-content/uploads/FPF_SharingEconomySurvey_06_08_15.pdf> accessed 30 December 2019.
ibid 11. In the United States, by contrast, a specific procedure has been established in cooperation with the Independent Drivers Guild to deal with account deactivations due to low ratings ((IDG) <https://drivingguild.org/uberdeactivated/> accessed 30 December 2019; (Uber, 29 June 2019) <www.uber.com/drive/new-york/resources/appeals/> accessed 30 December 2019). However, the initiative seems somewhat limited. First, it is only available to drivers based in New York City (see (Uber) <www.uber.com/drive/new-york/resources/improve-your-rating/> accessed 30 December 2019). Moreover, rather than a system to contest the rating, what the platform provides is a system of ‘reputation[al] rehabilitation’: drivers can reactivate their account by following an online course on how to improve customer satisfaction.
GDPR, art 4(1) defines personal data as ‘any information relating to an identified or identifiable natural person (“data subject”); an identifiable person is one who can be identified, directly or indirectly, by reference in particular to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors characteristic of his physical, physiological, genetic, mental, economic, cultural or social identity’.
Article 29 Data Protection Working Party (WP29), ‘Opinion 4/2007 on the concept of personal data’ (2007) WP136, 6.
Case C-434/16 Peter Nowak v. Data Protection Commissioner [2017] EU:C:2017:994, para 42. cf Karolina Podstawa, ‘Peter Nowak v Data Protection Commissioner: You Can Access Your Exam Script, Because It Is Personal Data’ (2018) 4 European Data Protection Law Review 252; Nadezhda Purtova, ‘The Law of Everything. Broad Concept of Personal Data and Future of EU Data Protection Law’ (2018) 10 Law, Innovation and Technology 40; Sandra Wachter and Brent Mittelstadt, ‘A Right to Reasonable Inferences: Re-thinking Data Protection Law in the Age of Big Data and AI’ (2019) Columbia Business Law Review 494.
To determine whether a piece of information refers to an identified or identifiable person, the WP29 suggests taking the following three elements into account: content (the information concerns the person), purpose (the information is or is likely to be used to evaluate the person or to influence his or her behaviour) and outcome (the use of the information may have an impact—even if not necessarily significant—on the rights and interests of the person). These elements are alternatives: any one of them suffices. See WP29, ‘Opinion 4/2007 on the concept of personal data’ (2007).
This ruling goes beyond the principles of law expressed in the earlier Joined Cases C-141/12 and C-372/12 YS v. Minister voor Immigratie, Integratie en Asiel and Minister voor Immigratie, Integratie en Asiel [2014] EU:C:2014:2081 (hereinafter YS and others). In that case, concerning access to a decision about a residence permit, the Court considered that the personal data contained in the so-called ‘minute’ (the draft decision containing the assessment of the applicant’s request) were exclusively the information related to the applicant (name, surname, etc.) and not the information concerning the assessment and application of the law to the situation of the applicant (the so-called ‘legal analysis’). For a critical comment, see Evelien Brouwer and Frederik Zuiderveen Borgesius, ‘Access to Personal Data and the Right to Good Governance during Asylum Procedures after the CJEU’s YS. and M. and S. judgment (C-141/12 and C-372/12)’ (2015) 17 European Journal of Migration and Law 259; Wachter and Mittelstadt (n 39).
Nowak (n 39) para 43.
Nowak (n 39) para 45.
With specific reference to processing in the employment sector, the GDPR leaves a certain margin of discretionary power to the Member States (GDPR, art 88(1)). In Italy, for example, some specific provisions in this context have been introduced by art 9 of Legislative Decree of 10 August 2018, no. 101 (see Maria Cristina Degoli, ‘I Trattamenti in Ambito Lavorativo’ in Simone Scagliarini (ed), Il ‘Nuovo’ Codice in Materia di Protezione dei Dati Personali. La Normativa Italiana Dopo il d.lgs. n. 101/2018 (Giappichelli 2019); Vincenzo Turco, ‘Il Trattamento dei Dati Personali Nell’ambito del Rapporto di Lavoro’ in Vincenzo Cuffaro, Roberto D’Orazio and Vincenzo Ricciuto (eds), I Dati Personali nel Diritto Europeo (Giappichelli 2019)). However, despite the control exercised by the platform, the qualification of Uber drivers as employees is still contested in many Member States. Consequently, the following analysis will not focus on the special rules applicable in the employment context.
The data controller is defined as ‘the natural or legal person, public authority, service or other body which, individually or jointly with others, determines the purposes and means of the processing of personal data’ (GDPR, art 4(7)).
See Section III.F ‘Reasons for Processing’, Uber Privacy Policy (Uber) <www.uber.com/global/it/privacy/notice/#choice> accessed 30 December 2019.
GDPR, art 6(1)(a).
GDPR, art 6(1)(b).
GDPR, art 6(1)(f).
GDPR, art 6(1)(c).
See GDPR, arts 13(1)(c), 14(1)(c). On transparency duties, WP29, ‘Guidelines on transparency under Regulation 2016/679’ (2018) WP260 rev.01.
GDPR, art 4(11). On the requirements for valid consent, see European Data Protection Board (EDPB), ‘Guidelines 5/2020 on consent under Regulation 2016/679’ (2020) Version 1.1. In the literature, see in particular, Giorgio Resta, ‘Revoca del Consenso ed Interesse al Trattamento nella Legge sulla Protezione dei Dati Personali’ (2000) Rivista Critica del Diritto Privato 299; Roger Brownsword, ‘Il Consenso Informato nella Società dell’Informazione’ (2012) XI(3) Salute e Società 161; Fausto Caggia, ‘Libertà ed Espressione del Consenso’ in Vincenzo Cuffaro, Roberto D’Orazio and Vincenzo Ricciuto (eds), I Dati Personali nel Diritto Europeo (Giappichelli 2019); Ingrida Milkaite and Eva Lievens, ‘Counting Down to 25 May 2018: Mapping the GDPR Age of Consent Across the EU’ (2018) <https://biblio.ugent.be/publication/8561253> accessed 30 December 2019.
GDPR, recital 32.
On the prohibition of opt-out mechanisms for consent, see the recent Case C-673/17 Bundesverband der Verbraucherzentralen und Verbraucherverbände—Verbraucherzentrale Bundesverband e.V. v Planet49 GmbH [2019] EU:C:2019:801.
Even if the user agrees to the Terms and Conditions of Service, such consent shall not be confused with the consent required under GDPR, art 6(1)(a). See EDPB, ‘Guidelines 2/2019 on the processing of personal data within the meaning of Article 6(1)(b) of the General Data Protection Regulation in the context of the provision of online services to data subjects’ (2019) point 20.
The only consent that is ‘requested’ is for marketing purposes, but its modalities raise serious doubts about its validity.
According to GDPR, art 7(4), ‘when assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract’.
WP29, ‘Opinion 15/2011 on the definition of consent’ (2011) WP187, 12. See WP29, ‘Guidelines on Consent under Regulation 2016/679’ (2018) WP259 rev.01.
WP29, ‘Opinion 8/2001 on the processing of personal data in the employment context’ (2001) WP48. See WP29, ‘Opinion 2/2017 on the processing of data at work’ (2017) WP249, 6-7.
On the requirements that the data controller may take into account when assessing the necessity of the processing for the purpose of performing the contract: EDPB, ‘Guidelines 2/2019’. See in particular point 33.
Uber has been qualified as a service in the transportation sector by the CJEU. Even before the European judgment, Italian courts qualified the contract between the platform and its users as a transport contract according to arts 1678 and 1681, Italian Civil Code (see, for instance, Court of Turin, commercial section, 22 March 2017, no. 1553, in Foro it. 2017, 6, I, 2082, with comment by Caputi. See also, Giovanni Basini, ‘Innovazione Disruptive e Limiti dell’Azione di Concorrenza Sleale per Violazione di Norme Pubblicistiche, Dopo il Caso Uber-II Parte’ (2018) 83 Responsabilità Civile e Previdenza 1316; Giorgio Resta, ‘Uber di Fronte alle Corti Europee’ (2017) 2 Diritto dell’informazione e dell’informatica 330).
As clarified by the EDPB, just because the ratings data are mentioned in the T&C does not mean that the processing is necessary for the purposes of the contract. See EDPB, ‘Guidelines 2/2019’, point 28, referring to WP29, ‘Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC’ (2014) WP217, 20.
GDPR, art 6(1)(f).
WP29, ‘Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC’ (2014) 36.
See Stefano Rodotà, ‘Data Protection as a Fundamental Right’ in Serge Gutwirth and others (eds), Reinventing Data Protection? (Springer 2009); Gloria González Fuster, The Emergence of Personal Data Protection as a Fundamental Right of the EU (Springer Science & Business 2014); Orla Lynskey, The Foundations of EU Data Protection Law (Oxford University Press 2015); Federico Fabbrini, ‘The EU Charter of Fundamental Rights and the Rights to Data Privacy: The EU Court of Justice as a Human Rights Court’ (2015) iCourts Working Paper Series, No. 19.
On the balancing of the right to privacy and data protection with other interests, see Federica Giovanella, Copyright and Information Privacy: Conflicting Rights in Balance (Edward Elgar Publishing 2017).
Nowak (n 39) para 53.
Nowak (n 39) para 52.
Nowak (n 39) para 55.
Nowak (n 39) para 54.
Wachter and Mittelstadt (n 39) 42.
ibid 45.
Nowak (n 39) para 54.
Case C-28/08 P European Commission v The Bavarian Lager Co. Ltd [2010] EU:C:2010:378.
YS and others (n 41) para 47.
Wachter and Mittelstadt (n 39) 49. The authors are open to the possibility that the right to rectify assessments might be recognised to data subjects in the future, considering that data protection law has to be interpreted teleologically.
As proposed in Rossana Ducato, Miriam Kullmann and Marco Rocca, ‘European Legal Perspectives on Customer Ratings and Discrimination’ in Tindara Addabbo and others (eds), Performance Appraisal in Modern Employment Relations (Springer 2020).
In the literature, see Isak Mendoza and Lee A Bygrave, ‘The Right not to Be Subject to Automated Decisions Based on Profiling’ in Tatiana-Eleni Synodinou and others (eds), EU Internet Law: Regulation and Enforcement (Springer 2017); Lee A Bygrave, ‘Minding the Machine v2.0: The EU General Data Protection Regulation and Automated Decision Making’ in Karen Yeung and Martin Lodge (eds), Algorithmic Regulation (Oxford University Press 2019); Elena Gil González and Paul de Hert, Understanding the Legal Provisions that Allow Processing and Profiling of Personal Data—an Analysis of GDPR Provisions and Principles (Springer 2019).
The same could be said for platforms using reputational ratings for ‘pure’ informative purposes: if a user’s score is low, this could have an impact on her dignity or on her future chances of entering into contracts. In a case concerning the lawfulness of a reputational system set up by a company, the Italian Data Protection Authority (DPA) stressed the link between data protection and dignity, underlining the need to evaluate the impact on the latter caused by the ‘social projection’ of a data subject determined by ratings. In that case, the Italian DPA considered the reputational rating unlawful (Italian DPA, decision no. 488 of 24 November 2016 (Garante per la protezione dei dati personali 2016) <www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/5796783> accessed 27 August 2020). The decision was partially reversed by the Tribunal of Rome (decision no. 5715 of 4 April 2018), which upheld the legitimacy of the reputational rating as a voluntary system and an expression of individuals’ private autonomy. The decision has been heavily criticised in the literature. See, for instance, Giorgio Giannone Codiglione, ‘Algoritmi reputazionali e confini dell’autonomia dei privati’ (2019) 2 Diritto dell’Informazione e dell’Informatica 520.
GDPR, art 4(4).
GDPR, art 22(2).
However, such human involvement must be meaningful. The WP29 has been clear in affirming that the controller cannot circumvent art 22 by introducing a merely fictitious form of human control: ‘for example, if someone routinely applies automatically generated profiles to individuals without any actual influence on the result, this would still be a decision based solely on automated processing’ (WP29, ‘Guidelines for automated decision making and profiling’ (2018) WP251 rev.01, 23). In other words, art 22 will still apply if the controller does not provide for a supervisory mechanism that allows for effective oversight and intervention in the decision-making process.
GDPR, article 5(1)(d).
WP29, ‘Guidelines for automated decision making and profiling’ (2018) 13. See also the recent contribution by Dara Hallinan and Frederik Zuiderveen Borgesius, ‘Opinions Can Be Incorrect (in Our Opinion)! On Data Protection Law’s Accuracy Principle’ (2020) 10 International Data Privacy Law 1, where the authors argue that the principle of accuracy does and should apply to opinions (whether human-generated, algorithm-generated or generated by a combination of the two). In particular, Hallinan and Borgesius maintain that the principle of accuracy applies not only to the facts on which an opinion is based but also to the interpretative framework used in generating that opinion.
WP29, ‘Guidelines for automated decision making and profiling’ (2018) 13.
Hallinan and Borgesius (n 85).
For a discussion of each example, see above Sect. 2.
GDPR, art 7(4) and recital 43. On this point, see also WP29, ‘Guidelines on Consent under Regulation 2016/679’ (2018) 8-9.
EDPB, ‘Guidelines 2/2019’, point 30.
Interestingly, a recent law from the Region of Lazio (Italy) introduced a specific provision to protect ‘digital workers’. Art. 7 of Regional Law (Lazio) no. 4 of 12 April 2019 establishes that the Region shall encourage any activity aimed at ensuring: (1) the transparency of the evaluation incorporated in the reputational rating, and (2) an impartial review mechanism in case the digital worker contests the result of the rating. Such provisions are in line with the Italian data protection framework and can be indirectly derived from it. However, the initiative is interesting because it envisages some form of public engagement in the governance of a specific aspect of the gig-economy.
For example, the controller could demonstrate that it minimises disproportionate consequences for data subjects by complying with technical standards, where they exist (e.g. ISO 20488:2018, Online consumer reviews—Principles and requirements for their collection, moderation and publication). That standard was modelled on the French standard NF Z74-501.