Open Access 2022 | Original Paper | Book Chapter

4. EU Digital Services Act: The White Hope of Intermediary Regulation

Author: Amélie P. Heldt

Published in: Digital Platform Regulation

Publisher: Springer International Publishing


Abstract

Most experts and journalists agree on the huge importance of the European Union’s Digital Services Act (hereinafter DSA). But can the first proposal published by the EU Commission in December 2020 live up to the expectations expressed in advance? The business model of social media platforms has been criticized for years, but since the Cambridge Analytica scandal, this criticism has expanded beyond a relatively small circle of experts. Lawmakers around the world have tried to push platforms to enforce applicable law and take responsibility. This contribution looks at the key obligations under the DSA and how compliance with them is monitored. It also presents the provisions for greater transparency and accountability. Finally, it looks at the possible consequences if the DSA were to come into force in its current form.

Online Harms as the Linchpin of Intermediary Regulation

Intermediaries, mostly social media platforms, were at first perceived as enablers of free speech online and as facilitators of a certain democratization of public discourse (Tucker et al. 2017). Behind this appearance, their architecture and their algorithmic recommender systems soon led to problems with illegal and harmful content (Gillespie 2014, p. 175; O’Callaghan et al. 2015). Indeed, critics soon identified that many intermediaries did not act against the dissemination of criminal hate speech or of non-criminal but harmful hate speech (Citron 2014). Nor did they prevent the spread of mis- and disinformation (Schulz 2019). Instead, their business model allegedly facilitates political micro-targeting and dark ads and amplifies conspiracy ideologies (Zarouali et al. 2020).
Until now, the primary law for intermediary regulation in the EU has been the E-Commerce-Directive. Under Art. 14 and 15 E-Commerce-Directive, intermediaries have no obligation to monitor user-generated content. They benefit from a liability exemption as long as they have no knowledge of illegal activities and act promptly upon notification. So far, this safe harbor regime has protected intermediaries from regulation specifically targeting content moderation, and it has substantially shaped the EU’s digital market. All the more so because it unleashed synergy effects with a similar law in the U.S., Sec. 230 of the Communications Decency Act, and created a de facto transatlantic market for platforms with user-generated content.
Over the past four years, however, amending this directive became an obvious priority due to a sequence of events. Since the first reports on the alleged voter manipulation via Facebook during the UK Brexit campaign, EU Member States have each experienced the adverse effects of online speech harms (e.g., Germany with hate speech against refugees; France with disinformation during the 2017 elections). Moreover, lawmakers considered the platforms’ self-regulatory efforts against online harms neither efficient nor satisfactory. Consequently, single Member States pressed ahead and adopted laws targeting specific online harms. As probably the most discussed example, Germany passed the Network Enforcement Act (NetzDG), which forces platforms to provide users with a complaint procedure for unlawful content (under German criminal law) and to remove ‘manifestly unlawful content’ within 24 hours. Adopted in summer 2017, the NetzDG was an (explicit) reaction to the lack of efficiency of self-regulatory initiatives.1 Although this law and its implementation have been heavily criticized (Citron 2017; Kaye 2018), the call for more effective regulation against harmful online communication, and subsequently for limiting the platforms’ power over free speech, has become louder. France passed a law against information manipulation during election campaigns and introduced a new form of interim injunction. Furthermore, France also adopted a law against hate crime (Loi Avia), but the Constitutional Council overturned it for failing the proportionality test. Similarly, Austria adopted a Communication Platform Act (KoPlG). If more Member States followed this lead, the risk of fragmentation of intermediary regulation within the EU would increase, which partially explains the EU’s eagerness to develop a common proposal (Cornils 2020, p. 77). The E-Commerce-Directive’s provisions on intermediary liability were no longer considered sufficient and adequate (De Streel et al. 2020, p. 57).

Genesis of the DSA

In October 2019, the then-candidate for President of the EU Commission, Ursula von der Leyen, mentioned the Digital Services Act as a means to ‘upgrade liability and safety rules for digital platforms’ in her political agenda.2 She also underlined the need to ‘tackle issues such as disinformation and online hate messages’ to protect democracies.
A leaked note in December 2019 revealed that the Commission considered the E-Commerce-Directive ‘outdated’ and in need of replacement by a more comprehensive set of rules for digital services (Fanta and Rudl 2019). Regarding content moderation, the leaked note proposed to make uniform rules for the removal of illegal content binding across the EU and possibly to include harmful (not necessarily unlawful) content. On a more technical side, the authors suggested maintaining the ban on general content monitoring in Art. 15 E-Com-Dir but reconsidering special provisions for filter technologies.
The lawmaking process started in 2020 and is still ongoing. So far, it can be described as relatively speedy and as ‘the biggest update of digital regulations for around two decades’ (Lomas 2020). Several committees within the EU Parliament produced meaningful reports and developed recommendations.3 Finally, the Commission presented its first ‘Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC’ on December 15, 2020 (hereinafter DSA). This first proposal will serve as the basis for further deliberation and is, therefore, at the center of this paper. According to the Commission’s proposal, the DSA ought to counteract the risks and problems that the use of information intermediaries has created for individuals and society as a whole, the dependence of the economy and society on single providers, and those providers’ power over public discourse. Its goal is not to ‘break’ platforms but rather to constitute a common European rulebook that increases legal certainty for companies in the Digital Single Market and, subsequently, better protects fundamental freedoms.

The EU Commission’s Proposal

The DSA’s scope of application expands from mere hosting services (based on Art. 14 E-Com-Dir) to a more nuanced definition of addressees. According to Art. 2 (f) DSA, intermediary services include mere conduits, caching services, and hosting services. Art. 2 (h) defines an online platform as ‘a provider of hosting service which, at the request of a recipient of the service, stores and disseminates to the public information’. According to Art. 25 (1), ‘platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million’ are considered Very Large Online Platforms (VLOPs). Under Art. 16 DSA, micro and small enterprises are excluded from the scope of application. By doing so, the Commission keeps its initial classification of intermediaries as neutral infrastructure providers laid out in Art. 14 and 15 E-Com-Dir but, at the same time, follows a graduated approach.

Enforcing National Laws

The DSA does not include an obligation for platforms to proactively review user content. Instead, Art. 7 DSA maintains the rule that Member States shall not impose a general monitoring obligation on providers (formerly Art. 15 E-Com-Dir). The liability privilege remains as long as the platforms have no knowledge of illegal content. The decision to maintain this regime is probably due to the high risk of negative consequences for both the companies and the users’ fundamental rights. The DSA proposal stipulates more exceptions, such as the obligation to act ‘upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities’ ‘without undue delay’ (Art. 8 (1) DSA). Intermediaries are expected to respond promptly, but the DSA does not spell out a concrete timeframe. However, it does include an obligation to act against users who regularly upload illegal content (Art. 20 DSA) and to report ‘serious’ crimes involving a threat to the life or safety of persons (Art. 21 DSA).
Most importantly, the DSA provides rules solely for the moderation of illegal content (Art. 2 (p) DSA), not for content that does not violate a legal prohibition. It leaves it at the service’s discretion whether to apply these measures to the enforcement of its own content rules (community guidelines/standards). According to Art. 2 (g) DSA, ‘illegal content’ means ‘any information, which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law’. Under Art. 14 DSA, providers of hosting services have to ‘put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content.’ If providers choose to remove or block content, they have to inform the user who posted the content and state the reasons for their decision (Art. 15 DSA). Moreover, according to Art. 17 DSA, providers of online platforms need to provide users with an internal complaint-handling system.

Oversight and Enforcement

To monitor the addressees’ compliance with the new rules and possibly enforce them, the DSA introduces two new oversight institutions: Digital Services Coordinators at the national level, and the Board for Digital Services at the EU level. These new public agencies would have specific supervisory rights with regard to the DSA—something the committee reports by the EU Parliament have been strongly advocating for.
Under Art. 38 (2) DSA, each Member State shall designate a Digital Services Coordinator (hereinafter DSC) responsible for ‘all matters relating to application and enforcement’ of the DSA. For supervision, investigation, and enforcement, the DSCs shall have special powers conferred by the DSA and common to all Member States. Moreover, they will have the authority to impose fines, to impose measures against a service’s management, and, as ultima ratio, to decide on the interruption of a service if they identify repeated infringements (Art. 41 DSA). To allow for a harmonized approach within the EU, the DSCs shall cooperate with each other and with other competent authorities. The DSA lays the cornerstone for this new authority (Art. 39 DSA) but leaves any further development of the task at the Member States’ discretion. States that have already adopted a similar law could, for instance, merge the existing national competent authority with the DSC.
The DSCs will cooperate within an independent group and form the European Board for Digital Services (hereinafter the Board). The Board shall serve as an advisory body to the DSCs and the EU Commission (Art. 47 DSA) and form a superordinate structure intended to serve the purpose of better consultation and more effective application of the new rules. It will essentially facilitate better coordination of the DSCs’ supervision activities. In addition, the Board will receive its own special supervisory rights regarding VLOPs. Under Art. 50 DSA, this enhanced supervision aims at averting systemic risks originating from the size of VLOPs and their resulting influence on the public sphere. Altogether, the EU Commission, the Board, and the DSCs have a wide range of measures at their disposal to enforce the rules set in the DSA. Provisions for additional interventions by the EU Commission in Art. 51, 58, and 59 DSA also stipulate an active role for the DSC and the Board. This leads to a distribution of supervisory rights among different institutions in proceedings against VLOPs. Thereby, the imposition of the most severe sanctions is not left to a single competent authority alone.

Tools to Enhance Transparency and Accountability

Beyond concrete rules against the spread of hate speech and illegal content, European lawmakers also considered the need for more transparency about the intermediaries’ activities and ways to possibly hold them accountable. Both aspects are essential for a better understanding of how intermediaries generally function and how they apply the new rules. At the individual level, users are the first beneficiaries of procedural guarantees regarding content moderation. According to the current proposal, their right to complain about illegal content and to better understand corporate content moderation procedures is at the core of this regulation (Art. 12 (1) DSA).4 This transparency right for users affected by content restrictions is accompanied by operational requirements. According to Art. 12 (2) DSA, providers of intermediary services ‘shall act in a diligent, objective and proportionate manner’ when applying content restrictions on users. This includes a duty to respect the ‘applicable fundamental rights of the recipients of the service as enshrined in the Charter.’ Previously, the E-Com-Dir mentioned the importance of freedom of expression only in its preamble: intermediaries were expected to provide their ‘information society services’ in light of Art. 10 ECHR, but the provisions themselves did not explicitly mention the ECHR.5 This explicit obligation for intermediaries to take the fundamental rights enshrined in the EU Charter of Fundamental Rights into account is quite a novelty. It illustrates that lawmakers recognize the responsibility that should come with the potential influence of intermediaries.
This leads us to the question of transparency at the corporate level: According to Art. 13 DSA, intermediaries will have to publish transparency reports at least once a year. Required information includes the number of orders received from Member States, the number of notices submitted per Art. 14 DSA, content moderation activities undertaken at the provider’s own initiative, and the number of complaints received in compliance with Art. 17 DSA. The reporting obligation increases with the type and the size of the service. Under Art. 23 and 33 DSA, online platforms and VLOPs have to disclose more information than mere intermediary services. VLOPs additionally have to provide an annual risk assessment (Art. 26 DSA), focusing on the use of their services to disseminate illegal content, negative effects on fundamental rights arising out of their services, and the ‘intentional manipulation of their service.’ The latter is another novelty in terms of platform regulation: VLOPs are asked to assess their negative effect on the protection of public goods and, among others, their ‘foreseeable impact related to electoral processes and public safety.’ These obligations are paired with an annual independent audit at the VLOPs’ own expense (Art. 28 DSA).
All in all, such reports can inform the public about the policies and practices of services that are heavily used all over the world but have remained quite opaque to most users so far. The effects would, therefore, not be limited to the EU but could potentially inform stakeholders worldwide. Both types of instruments, at the individual and at the corporate level, can serve as information sources for complaints (Art. 43 DSA) and therefore serve not only transparency but also accountability.

Interim Conclusion

At first look, the DSA proposal submitted by the EU Commission is more than a mere update of the E-Com-Dir. Under the new provisions, information and data would no longer be perceived only from the perspective of goods and markets. Instead, the DSA could become a human-rights-infused regulation (Llansó 2020) because not only does it explicitly mention the Charter of Fundamental Rights, it also builds the values of the Charter into the provisions themselves. One fear (preceding the proposal) was that it would change the liability rules and force platforms to introduce “pro-active” measures against illegal content (Fanta and Rudl 2019), but following the heated discussion on upload filters in the DSM Directive (Heldt 2019), the Commission refrained from including such an obligation in the DSA.
It is also worth noting that the DSA is part of a larger package including the Digital Markets Act and the Democracy Action Plan. The latter is of particular interest for questions regarding content moderation and fundamental rights in democratic societies. The EU Democracy Action Plan is meant ‘to ensure that citizens are able to participate in the democratic system through informed decision-making free from unlawful interference and manipulation.’ With regard to the role of online platforms, the DAP includes six objectives (section 4.2):
1. monitoring the impact of disinformation and the effectiveness of platforms’ policies;
2. supporting adequate visibility of reliable information of public interest and maintaining a plurality of views;
3. reducing the monetization of disinformation linked to sponsored content;
4. stepping up fact-checking;
5. developing appropriate measures to limit the artificial amplification of disinformation campaigns; and
6. ensuring effective data disclosure for research on disinformation.
Ideally, the rules proposed in the DSA would serve as means to achieve the objectives set in the DAP. This interplay should be kept in mind when evaluating the individual measures.

Potential Frictions

Formal Matters

As mentioned in the history of the DSA, the setting is complicated due to pre-existing regulations by the Member States and matters of competence at the EU level. Generally, the EU is competent for the realization of the single market (Art. 26 TFEU). According to Art. 114 TFEU, the EU Parliament and the Council adopt legislation to harmonize the rules necessary to build and ensure the functioning of the single market. The EU does not, however, have legislative competence for criminal law. Hence, the definition of illegal content is left at the Member States’ discretion. In the course of the DAP, the EU Commission plans to propose an amendment to Art. 83 TFEU ‘to cover hate crime and hate speech, including online hate speech’ in 2021 (On the European Democracy Action Plan 2020, p. 10). According to Art. 83 TFEU, the EU legislators can set ‘minimum rules concerning the definition of criminal offences and sanctions in the areas of particularly serious crime with a cross-border dimension.’ Such offences are thereby considered criminal and punishable in all Member States. The Commission’s goal is to substantially enhance the protection of citizens and journalists and, thereby, to counter further coarsening and polarisation of the public debate.
One should also carefully examine the necessity of new measures in light of potential risks at the individual and collective levels. Given the developments of recent years, the legislator clearly needed to address contemporary issues of the digital sphere. One can measure the estimated need for harmonization at the EU level by the legislative form chosen. Indeed, the DSA proposal comes as a regulation, which means that it will be applicable in all jurisdictions within the EU without any transposition legislation by the Member States (as opposed to its predecessor, the E-Commerce-Directive). This form limits the ability of Member States to deviate and to potentially dilute certain rules. The Commission seems to follow the GDPR’s path (Wagner and Janssen 2021), based on the idea that a regulation is more suitable for a cross-border topic such as the Digital Single Market than a directive.

Collision Risk

The DSA might collide with already existing laws. It remains unclear whether the DSA would replace the Member States’ existing laws, such as the German NetzDG, or whether those laws would be considered supplementary to it. The question is actually twofold: the DSA and pre-existing laws could either collide or diverge, or they could each address aspects not covered by the other. In the latter case, I believe that the DSA will serve as a minimum standard, allowing the Member States to adopt additional laws as long as these do not hollow out the DSA. This has to do with the principle of subsidiarity within the EU regarding the Member States’ sovereignty, all the more so because the DSA would not just contain new rules for the Digital Single Market but would also overlap with criminal law and media law. In cases where the DSA and national regulation contain contrary or very different rules for the same issue, the DSA would prevail. According to the precedence principle established by the CJEU in Costa v ENEL (1964), if a national rule is contrary to a European provision, Member States’ authorities must apply the European provision.6 National law is neither rescinded nor repealed, but its binding force is suspended. It is undisputed that this principle is indispensable for the functioning of European integration as a community based on the rule of law (Haltern 2020, p. 818). At the same time, Member States try to preserve as much relevant influence over legislation as possible. The DSA could become yet another example of a tug of war between Brussels and national regulators.

Countering the Consolidation of Power Structures

The DSA’s primary goal is to rebalance the power structures in the digital economy, hence to even out the dominant position of certain intermediaries over their users and their competitors (note that it is not an antitrust law). Of course, the “big players,” large companies from Silicon Valley, are in the “first line” because they have developed a significant influence over the market by gathering data. Since the rise of the social web in the early 2000s, social media platforms have become a relevant communicative infrastructure. For the most part, they were the only arbiters of permissible expression within their networks (Suzor 2019) and have subsequently gained considerable power over their users’ media diet. The modular setting of social media platforms (Schulz and Dreyer 2020, p. 31) makes it difficult to regulate them, that is, to apply already existing frameworks or categories. Lawmakers are therefore compelled to conceive new regulatory approaches. After a long period of self-regulation under Art. 14 E-Com-Dir (Buiten et al. 2020, p. 145), the EU decided to focus on stricter rules (Scott et al. 2020). That is why the DSA aims to strengthen the users’ rights to be better informed, to appeal certain decisions, and to lodge complaints—regardless of the country and the laws restricting speech.
Does this approach actually strengthen the platforms’ power over online speech? One could argue that they will still be the ones deciding on the removal of content, its distribution, and algorithmic recommendation systems. The new rules could perhaps consolidate their dominant position over public discourse because they give them more legitimacy. Two things can be said against this hypothesis. First, selection, prioritization, and recommendation are inherent to the service users look for: intermediaries provide this exact service, and users see “only” a selection of content. Second, the safeguards provided by the DSA on different levels will challenge the platforms in an unprecedented way. The DSA might not be exhaustive in all aspects, yet it will constitute a tipping point.

Avoiding Collateral Censorship

One pressing question is whether the DSA could potentially become a means of collateral censorship (Balkin 2014). Indeed, this type of regulation can be considered a way to impose content-related rules, although, from a constitutional law perspective, speech-restricting laws should be kept to a strict minimum. Some argue that the exception to Art. 14 E-Com-Dir, that is, Art. 6 DSA, could be a threat to freedom of expression because it incentivizes intermediaries to act against illegal content and, potentially, to enforce their own content rules (Kuczerawy 2021). This viewpoint builds on the over-removal phenomenon, whereby platforms remove more content than necessary in order to avoid liability and the additional costs of nuanced content moderation practices (Keller 2019). While the risk of extensive enforcement of the DSA is a point to be taken seriously, the current draft clearly builds on balancing intermediary liability and fundamental rights under Art. 12 (2) DSA. This provision does not introduce a horizontal effect of freedom of expression between platforms and users, but it makes it mandatory to enforce only clear and unambiguous content rules (Kuczerawy 2021). A stricter liability regime would most certainly lead to the unwanted effect of collateral censorship (Buiten et al. 2020, p. 161). Ultimately, intermediary liability for illegal content is a constant dilemma (Helberger et al. 2018, p. 2; Heldt 2020).

Conclusion

More duties, more oversight, more transparency, and a systemic approach—the current proposal of the DSA provides answers on several levels. It addresses a wide range of issues and builds in safeguards at the individual and collective levels. Will it become the rulebook of reference for the digital sphere? It remains to be seen to what extent the final version of the DSA will retain its crucial provisions or whether the upcoming negotiations will dilute them. One thing, however, is clear: the times of self-regulation are over—at least in the EU.
The DSA and other upcoming EU regulations could herald a new period for digital platforms and, indirectly, for users worldwide due to another perpetuation of the “Brussels effect” (Bradford 2012, 2020). According to Bradford, the EU has developed a strong regulatory power on a global scale through its legal institutions and standards. Indeed, the EU aims for high standards in the Digital Single Market and could potentially develop what lawmakers consider a gold standard for platform regulation—beyond the EU’s borders. This, however, presents them with another challenge: are they regulating for the EU or for the world (Heldt and Hennemann 2021)? In any case, one also needs to be aware of developments across the Atlantic. On May 14, 2021, US President Joe Biden revoked an Executive Order by former President Trump that targeted Sec. 230 CDA (Lyons 2021). Nonetheless, experts still expect the Biden administration to amend the current liability regime, which provides intermediaries with broad immunity (Edelman 2021).
Meanwhile, the EU’s main responsibility is to protect the European Union’s values and rights (Art. 2 and 3 TEU), and, regarding “exporting” the DSA, there are two possible approaches. Either lawmakers interpret this as an opportunity to extend the EU’s power as a regulator beyond its borders, or, instead of generic rules that could potentially be adopted outside the EU, the DSA would be tailor-made for the EU and rely on the rule-of-law guarantees provided by the Treaties. Hence, European lawmakers might now have to gauge their choices carefully, keeping in mind that a regulation like the DSA can be replicated by other countries.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Footnotes
1. Speech by the then Federal Minister of Justice and Consumer Protection, Heiko Maas, on the bill to improve law enforcement in social networks (Network Enforcement Act) before the German Bundestag in Berlin on June 30, 2017, retrieved from https://www.bundesregierung.de/breg-de/service/bulletin/rede-des-bundesministers-der-justiz-und-fuer-verbraucherschutz-heiko-maas--793138.
2. Von der Leyen, U. (2019). A Union that strives for more—My agenda for Europe, retrieved from https://ec.europa.eu/info/sites/info/files/political-guidelines-next-commission_en_0.pdf.
3. The JURI committee proposed standards and procedures for content moderation and guaranteed access to remedies, as well as the establishment of a European Agency tasked with monitoring and enforcing compliance. The IMCO report called on the Commission to propose concrete legislative measures, including notice-and-action mechanisms, a central regulatory authority for oversight and compliance, and transparency requirements for advertising, nudging, etc. The LIBE report also proposed the creation of an independent EU body to exercise effective oversight.
4. The right to access personal data under the GDPR will not be affected; the rights can be cumulative.
5. The EU Charter of Fundamental Rights did not exist when the E-Com-Dir was adopted.
6. Judgment of the Court of Justice of the European Communities of 15 July 1964. Flaminio Costa v E.N.E.L. Case 6/64.
References
Balkin, J. M. (2014). Old School/New School Speech Regulation. Harvard Law Review, 127(8), 2296–2342.
Bradford, A. (2020). The Brussels Effect: How the European Union Rules the World. Oxford University Press.
Citron, D. K. (2017). Extremist Speech, Compelled Conformity, and Censorship Creep. Notre Dame Law Review, 93(3), 1035.
Gillespie, T. (2014). The Relevance of Algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media Technologies: Essays on Communication, Materiality, and Society (pp. 167–194). The MIT Press.
Haltern, U. (2020). Ultra-vires-Kontrolle im Dienst europäischer Demokratie. Neue Zeitschrift für Verwaltungsrecht, 12, 817–823.
Schulz, W., & Dreyer, S. (2020). Governance von Informations-Intermediären—Herausforderungen und Lösungsansätze [Bericht an das BAKOM].
Metadata
Title: EU Digital Services Act: The White Hope of Intermediary Regulation
Author: Amélie P. Heldt
Copyright year: 2022
DOI: https://doi.org/10.1007/978-3-030-95220-4_4