Platform Governance Laws & Regulations

Background

Legislators are increasingly looking to pass laws that would regulate different practices and functions of online platforms to prevent harms to users. These laws can be grouped into overlapping categories based on their goals and/or the tools they use to regulate companies. For example, privacy laws regulate how online platforms collect and use their users’ personal information. Design codes protect users from platform design choices with harmful side effects, such as dark patterns that manipulate users or features engineered to be addictive. Duties of care and loyalty seek to align companies’ goals with those of their users by imposing a legal duty to put users first when deciding how to design and run their platforms. Child protection laws seek to make platforms safe for children. And proposals to amend Section 230 seek to incentivize online platforms to act more responsibly by narrowing the situations in which they are immune from liability for harms they cause to users.

These categories often overlap. A child protection law, for example, may incorporate design codes and/or duties of care and loyalty.

Like any part of platform governance, a law’s effects on users’ safety, privacy, and speech depend greatly on the specifics of the function being regulated, how the law is drafted, and how it is enforced.

Recent Documents on Platform Governance Laws & Regulations

  • Amicus Briefs

    Free Speech Coalition v. Paxton

    US Supreme Court

    Whether a court should apply rational basis review or, instead, strict scrutiny in a facial challenge to a law that requires pornography websites to verify the age of users in order to block kids from accessing pornography.

  • Amicus Briefs

    X v. Bonta

    US Court of Appeals for the Ninth Circuit

    In a First Amendment challenge to a law that requires social media platforms to disclose information about their existing content moderation practices, what level of scrutiny applies? And can the law be struck down in its entirety based on a tenuous and hypothetical assertion that public or government pressure will lead to companies changing their content moderation practices?

  • Amicus Briefs

    NetChoice v. Paxton / Moody v. NetChoice

    US Supreme Court

    Whether the First Amendment prevents nearly all regulation of social media companies' content-hosting and content-arranging decisions.

  • Amicus Briefs

    NetChoice v. Bonta

    US Court of Appeals for the Ninth Circuit

    Whether California’s Age-Appropriate Design Code, a new law that requires tech companies to design their services with children’s privacy in mind, violates tech companies’ First Amendment rights.
