This chapter maps out the irreducible ambivalence inherent in algorithms, and particularly in Big Data techniques, which becomes evident when they are confronted by the law. This ambivalence, and its legal consequences, are explained in four steps. The first part of the chapter deals with the darker sides of the digital economy: the use of behavioral algorithms by private companies in market settings not only drives one of the most innovative parts of the economy, but has also given rise to what may be called “digital market failures”. Remedial strategies are complicated by widespread actor heterogeneity. Hence, traditional regulation risks not only lagging behind technological progress, but also being overly restrictive for some actors and overly permissive for others.
Against this backdrop, the second part of the chapter explores how personalized law can be used to mitigate digital market failure while simultaneously accommodating actor heterogeneity. Using Big Data techniques, personalized law promises to tailor legal norms, from disclosures to mandates, to the individual characteristics of their addressees. Unlike one-size-fits-all regulation, personalization respects actor heterogeneity and actively harnesses the potential of digital technology for social good. However, the regulator’s use of individualized, potentially privacy-sensitive information (such as personality traits or degrees of rationality) raises a host of concerns of its own.
Therefore, the third part of the chapter develops an account of the challenges to the legitimacy of personalized law stemming from both positive law and legal theory. While most of these objections can be accommodated by specific design features of personalized law, the chapter nonetheless argues that it must be used with care and only after rigorous, case-by-case scrutiny. Indeed, its use may prove most valuable precisely in the instances of digital market failure discussed in the first part of the chapter.
Accordingly, the fourth part suggests a normative approach to personalized law, under which due weight is given to the interests of all parties concerned. A key result of the analysis is that, if used prudently, personalized law will, counterintuitively, strengthen legal equality: by making the differential impact of legal norms on different actors legally relevant, it avoids the pitfall of treating fundamentally different situations in the same way. However, such a démarche must be accompanied by enhanced public scrutiny of algorithmic lawmaking. As more and more economic and regulatory processes become subject to the power of algorithms, the crucial challenge is to develop a robust democratic discourse so that algorithmic decision-making is not treated as a hermetic black box, but is infused with, and guided by, the societal values on which a legally constituted market economy rests.