Europe’s Digital Services Package: What It Means for Online Services and Big Tech

August 23, 2022 | Paul Meosky, EPIC Law Clerk

The EU recently passed comprehensive legislation on platform monitoring, digital free speech, and antitrust, largely directed at Big Tech. On July 5, 2022, the European Parliament adopted the Digital Services Package, comprising the Digital Markets Act (“DMA”) and the Digital Services Act (“DSA”), first proposed by the European Commission in December 2020. The European Council of Ministers will sign the bills into law this September, and they will take effect in early 2024 (though Big Tech will have to comply within months of entry into force). The Digital Services Package is touted as a “global first,” promising to “safeguard[] freedom of expression and opportunities for digital businesses.” After years of growing tech reliance and tech consolidation, “Democracy is back.”

Or is it? Critics have pointed out possible pitfalls in the legislation that may affect free speech, civil society, information access, and security. For example, many civil rights advocates are particularly concerned about a “crisis response mechanism” (“CRM”) added at the last minute and with little public input. The provision is meant to protect citizens from misinformation when lives are on the line, such as during the COVID-19 pandemic or the war in Ukraine. Yet it also concentrates power in the EU executive and could be abused to cripple free speech on the continent. Thus, the CRM, like the rest of the package, is a potential double-edged sword, and its net impact on democracy depends on how it is enforced—if at all. Over-enforcement could lead providers and platforms to over-censor, endangering free expression across the continent, and to share personal data too freely with potentially repressive governments. EFF and Apple have both raised concerns that the DMA’s interoperability requirement could make personal information less secure. Under-enforcement, on the other hand, would erode public trust in the EU and maintain the harmful status quo.

The impact also depends on where it is enforced. Human rights organizations have urged lawmakers to consider the precedent the EU will set for the rest of the world, and how powers that promote civil rights in France or Denmark could erode those same rights in Turkey or India. The legislation could even spark a race to the bottom, with platforms over-censoring rather than risking violating the national laws the DSA incorporates. Some of the laws implemented in member states with more restrictive regimes, such as Poland and Hungary, could conscript Facebook or other large tech companies holding caches of personal data into targeting political dissidents and marginalized communities. Even in nations with less autocratic leanings, the EU model may fail. If, as some EU lawmakers have claimed, the Digital Services Package is a “European constitution for the Internet,” then it necessarily reflects European principles that other regions may or may not share.

To appreciate the “how” and the “where” of this European approach to a global problem, we must first appreciate the “what” and the “why.” The two statutes comprising the Digital Services Package (the DSA and DMA) both aim to establish a unified framework for protecting consumers and small businesses. Currently, digital service providers contend with 27 regulatory regimes across the continent, each imposing different obligations and prohibiting different practices. Large companies generally have more resources and expertise to dedicate to compliance, meaning they can often expand across multiple regulatory regimes faster and more easily than smaller companies. The DSA and DMA work together to unite the EU under a cohesive framework for internet governance, though they protect different rights, impose different obligations, regulate different companies, and are enforced by different agencies.

The DSA is primarily concerned with content moderation and consumer protection, though it is carefully structured to ensure that “small online platforms are not disproportionately affected but . . . remain accountable.” In essence, the DSA makes “what is illegal offline . . . also illegal online.” The law recognizes that online platforms are now “quasi-public spaces” where we speak, learn, trade, and play with increasing frequency. They are also spaces where we encounter hate, lies, crime, and surveillance. Before the DSA was proposed in 2020, a patchwork of laws across Europe was trying to quiet Holocaust deniers, correct COVID-19 misinformation, punish revenge porn, and regulate political micro-targeting. The DSA consolidates and supplements, without replacing, those national standards under a “notice and takedown” regime that holds platforms liable for illegal content once they know it exists. As under Section 230 of the Communications Decency Act in the U.S., platforms are not required to actively monitor third-party content, but a “Good Samaritan” clause protects “good faith” and “diligent” efforts to do so. However, platforms are required to provide an easy process for users to flag illegal content and appeal take-down decisions. They must also notify authorities of any criminal threat to life or safety and comply with government orders to remove illegal content or provide specific information about a user. Aside from these content moderation requirements, the DSA outright prohibits “dark patterns” and targeted advertising aimed at children or based on sensitive data, such as one’s sexual orientation or religion. Finally, the DSA imposes a host of transparency requirements, including user-friendly terms and conditions, annual reports, and mandatory disclosure of curation and recommendation algorithms.

The DSA also supports the EU’s mission of promoting competition, though to a much smaller extent than the DMA. First, the DSA creates a European Board for Digital Services (the “Board”) with the power to issue voluntary interoperability standards. Second, the DSA’s asymmetric regulation scheme favors smaller companies by imposing more obligations on larger ones. The DSA applies to online intermediaries, hosting services, platforms, and search engines, imposing different obligations on each. For example, while all online service providers face reporting requirements, only platforms are required to provide a complaint and redress mechanism for users. Moreover, “very large” platforms (“VLOPs”) and search engines (“VLOSEs”), i.e., those with at least 45 million monthly active users in the EU, must comply with additional requirements, such as having a code of conduct, sharing data with authorities and researchers, allowing users to opt out of profiling, and submitting to external audits. Even the controversial CRM applies only to VLOPs/VLOSEs. The DSA will be enforced by each nation’s “Digital Services Coordinator,” though the Board will run joint investigations and the Commission will supervise very large entities. Companies that fail to comply with DSA obligations face fines of up to 6% of their annual global revenue.

The DMA is more focused on antitrust concerns, with the goal of leveling the playing field for smaller internet providers and platforms. Unlike the DSA, the DMA applies only to “gatekeepers”—companies providing “core platform services” that dominate European (and global) markets. Under the DMA, gatekeepers will no longer have unlimited power to play by their own rules and outcompete everyone else on their platforms. The DMA specifically prohibits seven unfair practices, including requiring businesses to use the gatekeeper’s own payment method, pre-installing certain software on consumer devices, and sharing user data across services. Other practices that may be unfair in certain contexts, such as a search engine ranking its own products favorably, will be subject to investigation. Gatekeepers must also provide greater data access and interoperability, notify the European Commission of mergers and acquisitions, and make unsubscribing from platform services as easy as subscribing. The DMA will be enforced by the European Commission, with fines as high as 10% of a company’s annual global revenue (or 20% for repeat offenses).

Neither law, however, will have much impact if it isn’t enforced. This isn’t the EU’s first swing at Big Tech. In 2018, the General Data Protection Regulation (“GDPR”) promised to disrupt the network effects locking consumers into data monoliths like Facebook and Google by guaranteeing data portability. Yet many feel the GDPR has not delivered on its promise, largely because the portability requirement has significant caveats and has not been well enforced. The DMA and, to some extent, the DSA step into the interoperability void, but, like the GDPR, they won’t work unless consistently used. The same goes for the fundamental rights the DSA aims to protect.

The DSA and DMA are already starting on a better footing than the GDPR, which left enforcement to the implementation and resources of its member countries. The DMA is enforced directly by the Commission, while the DSA organizes enforcement under the transnational Board. Both, however, require massive investment in expertise and enforcement mechanisms to take on Big Tech. Unless the EU puts its money where its mouth is, many fear Big Tech will simply ignore the new regime. Others fear bureaucratic capture. Lobbying around the Digital Services Package has already risen to unprecedented levels, and there’s no sign of it abating as the EU starts putting the law into practice. On the other hand, if the EU learns from its past mistakes and gives the legislation teeth, the framework could be powerful yet flexible enough to protect consumers and small businesses alike. For example, the Commission could forbid services from using interoperability to interfere with end-to-end encryption. It could also interpret the “Good Samaritan” clause charitably to mitigate the risk of over-censorship.

Since the Commission can’t foresee every human rights conflict, it should be prepared to work closely with advocates and civil society to ensure it lives by the transparency and accountability principles it espouses. Members of the DSA Human Rights Alliance were involved early in the process, during the Commission’s months-long impact assessment that gave rise to the Digital Services Package, and they have remained on alert since the package was introduced in 2020. Lawmakers have addressed many of their concerns, restricting data disclosures to law enforcement and prohibiting legally mandated content moderation algorithms, but several problematic provisions remain and some necessary additions were left out. For example, alliance members mourn the loss of “essential privacy safeguards” that were debated but ultimately abandoned, such as a guaranteed right to use services anonymously and a universal opt-out mechanism for data tracking. Civil society groups might still influence implementation, perhaps through the Board’s standard-making authority, but doing so will require ongoing collaboration in the years to come.

Despite the concerns listed above, the Digital Services Package represents a significant step forward in internet regulation. Interoperability requirements and targeted-advertising restrictions give EU residents unrivaled control over how companies use their data. Prohibitions on unfair practices, along with exemptions sparing smaller companies from the harshest requirements, open markets and spur innovation. Independent audits, mandatory reporting, and algorithm transparency provide mechanisms to hold providers accountable for the damage they cause. Finally, the legislation represents a symbolic victory for democracy over Big Tech.

The impact will be global. The U.S. will certainly be affected, if only because many of the world’s biggest tech companies are based there. Several Big Tech companies and their defenders have argued that the only “gatekeepers” and “VLOPs/VLOSEs” covered by the laws will be American companies, claiming that the new regulations are designed to threaten or weaken U.S. tech dominance. However, it’s increasingly possible that these laws could prompt global innovation in how platforms interact with each other, their content, and their users. Providers will likely choose to update their algorithms and processes worldwide rather than operate substantially different platforms on different continents, benefiting consumers everywhere, including in the U.S. Researchers, regulators, and competitors around the world will gain access to the wealth of information released by the transparency requirements. Likewise, journalists will have more fodder for their stories, even as they tap into the advertising revenue freed up by the laws’ competition provisions. And policymakers can learn from the experience their EU colleagues are developing. Meanwhile, American constituents will increasingly demand the rights and respect afforded to EU consumers.

Admittedly, a sweeping content moderation proposal is not on the agenda for Congress in 2022, though various antitrust and privacy packages are being considered. A digital services bill was introduced in the House early this year but has not been taken up by the relevant committee. Nonetheless, online platforms can’t rest easy. In a country torn by identity politics, there is a growing “techlash” on both sides of the aisle. Over 75% of Americans believe tech companies have too much power, and the EU’s pro-competition, pro-privacy approach might garner enough support among Democrats and Republicans to enact change. Doubtless, an American version of the Digital Services Package would look significantly different from its European counterpart. The goal isn’t transatlantic conformity so much as transatlantic compatibility. Any “constitution for the internet” must reflect the values and traditions of its people. Yet where those values and traditions overlap, the U.S., the EU, and their allies in democracy can still coordinate policies that allow for both thriving competition and secure civil rights.
