Background

Regulating online platforms sometimes implicates First Amendment concerns for users' speech and association rights.

The First Amendment protects people’s right to free speech by requiring the government to satisfy demanding standards when it seeks to regulate speech. Traditionally, the First Amendment has protected a wide range of actors and activities, including newspapers running articles and opinion pieces, people protesting in public, museums displaying art, people burning flags, and more.

Many tech companies claim broad First Amendment protections against regulation because of their status as intermediaries for users’ speech. The companies seek to use the First Amendment to overturn laws that would give users privacy rights, prevent harmful platform design choices such as dark patterns, and offer children additional protections online.

EPIC generally argues for nuanced, case-specific First Amendment determinations when tech companies challenge regulations on First Amendment grounds. Ruling wholesale that platform companies’ activities are always protected speech would leave citizens powerless to challenge harmful corporate practices. But ruling that none of those practices are speech could harm users’ ability to spread information and engage in political debate. That is why EPIC advocates for a careful, nuanced approach to these issues.

The First Amendment and Privacy

Tech companies are challenging privacy laws, arguing that the laws violate the First Amendment. These challenges raise important questions about whether and how privacy rights and speech rights, especially companies’ speech rights, conflict.

In certain situations, the Supreme Court has recognized that the First Amendment protects the act of disclosing information. For instance, in Florida Star v. B.J.F., the Supreme Court ruled that the government could not prohibit a newspaper’s disclosure of truthful information that the newspaper acquired legally and that was not sensitive secret information such as troop movements. And in Sorrell v. IMS Health, the Supreme Court struck down a law that prohibited sharing doctors’ prescription patterns because the law’s ultimate goal was to suppress a specific viewpoint (pharmaceutical representatives’ high-pressure sales tactics urging doctors to prescribe more expensive drugs) by drying up the data source that informed that viewpoint (doctors’ prescription records). The Court held that the law was unconstitutional specifically because it restricted advertising speech in a viewpoint-based manner.

The question is whether those precedents map onto social media companies’ surveilling users and sharing their data with other companies. EPIC believes there are meaningful differences between those cases and companies’ data practices, and that generally applicable privacy laws either do not trigger First Amendment scrutiny or, alternatively, trigger but survive it.

EPIC also agrees with scholars who note that privacy and free speech are not necessarily opposed. Privacy provides people with the freedom to explore controversial ideas, have difficult conversations, and generally develop as individuals.

For more on this issue, read EPIC’s analysis of the First Amendment issues raised by the NetChoice v. Bonta case.

The First Amendment and Platform Transparency

Legislatures are beginning to pass transparency laws that would require online platforms to report high-level information about how they run their services. These laws can require platforms to report how they handle user data, moderate content, and design their platform features, among other activities.

Disclosures from social media platforms would bring many public benefits. Information about companies’ content moderation policies—and how they enforce those policies—can show whether companies are living up to their public promises, whether their platforms provide safe spaces for marginalized groups to speak and organize, and whether consumers can trust the information they find on the platform. This information can help consumers decide which platforms to use to access information and speak and can also enhance public discourse on the harms platforms may cause to democracy, public health, and human rights. Taken to an extreme, though, these laws could chill users’ speech. The key question is the right framework to distinguish between constitutional and unconstitutional versions of these laws.

Tech companies are challenging common-sense platform transparency laws, claiming that they violate the companies’ First Amendment rights. The companies claim that the underlying activities they are required to report on, such as moderating content or designing platform features, are speech akin to a newspaper company deciding which stories to run and how to display those stories. According to the companies, just as the government could not constitutionally require a newspaper to report how it selects and edits news articles, the government similarly cannot constitutionally require transparency from a tech company.

Because such disclosures concern the relationship between a service provider and a service user, many—such as EPIC—argue that the Supreme Court’s test from Zauderer v. Office of Disciplinary Counsel should be adapted to apply to platform disclosure laws. Under Zauderer, a disclosure law is constitutional as long as it is (1) reasonably related to a government interest and (2) not unjustified or so unduly burdensome that it would chill speech. This test could be adapted to require challengers to show that the disclosure law actually chills their protected speech and then shift the burden to the government to show that its interest is proportional to any burden on speech.

Tech companies already make disclosures like these as required by European Union laws such as the Digital Services Act.

The First Amendment and Platform Design Laws

Platform companies’ decisions about how they design their platforms have a direct impact on users. Platform companies invest enormous amounts of time and money into designing recommendation algorithms, user interfaces, safety features, and age-gating systems, among other features. They often do so with an eye to profits derived from ad revenue, which incentivizes some harmful decisions such as addictive design practices, engagement-maximizing recommendation algorithms, and dark patterns.

Increasingly, legislatures are interested in passing laws that impact how companies design their systems, seeking to ban some of the most egregiously harmful practices or to give users more control over their online experience.

Platform companies have challenged various design laws, arguing that they violate the companies’ First Amendment right to editorial discretion. As in many other First Amendment platform regulation cases, the companies portray themselves as, in essence, giant newspapers. Because newspapers receive First Amendment protection for how they publish pieces, the companies argue that they too should receive First Amendment protection to show users content in whichever way they wish. In other words, using addictive design practices or specific algorithmic weights represents the companies’ protected expression.

EPIC disagrees with the idea that legislatures should be unable to curb the worst examples of harmful platform design practices. There are meaningful differences between newspapers and social media companies. Regulating addictive design practices is not the same as regulating where newspapers place stories.

Recent Documents on the First Amendment

  • Amicus Briefs

    Free Speech Coalition v. Paxton

    US Supreme Court

    Whether a court should apply rational basis review or, instead, strict scrutiny in a facial challenge to a law that requires pornography websites to verify the age of users in order to block kids from accessing pornography.

  • Amicus Briefs

    X v. Bonta

    US Court of Appeals for the Ninth Circuit

    In a First Amendment challenge to a law that requires social media platforms to disclose information about their existing content moderation practices, what level of scrutiny applies? And can the law be struck down in its entirety based on a tenuous and hypothetical assertion that public or government pressure will lead to companies changing their content moderation practices?

  • Amicus Briefs

    NetChoice v. Paxton / Moody v. NetChoice

    US Supreme Court

    Whether the First Amendment prevents nearly all regulation of social media companies' content-hosting and content-arranging decisions.

  • Amicus Briefs

    NetChoice v. Bonta

    US Court of Appeals for the Ninth Circuit

    Whether California’s Age-Appropriate Design Code, a new law that requires tech companies to design their services with children’s privacy in mind, violates tech companies’ First Amendment rights.
