Amicus Briefs
Moody v. NetChoice / NetChoice v. Paxton
Nos. 22-277, 22-555
U.S. Supreme Court
Questions Presented
(1) Whether the First Amendment protects every choice a social media company makes as to what user-generated content it hosts and how it arranges that content.
(2) Whether, and to what extent, Texas’s House Bill 20 and Florida’s Senate Bill 7072 violate the First Amendment by regulating various social media company activities.
Background
There is deep disagreement over whether and to what extent the First Amendment applies to what social media companies do, and the stakes are high. “Non-expressive” social media company activities can be democratically regulated and overseen, while “expressive” activities are difficult to regulate. Whether specific social media company practices are labeled speech or conduct therefore matters enormously: tight governmental control and unbounded corporate power over social media infrastructure both raise free speech and other concerns.
The First Amendment generally prohibits the government from placing undue burdens on speech. What this means in practice is complicated because there is no single definition of “speech.” Some things that look like speech are not covered by the First Amendment, such as incitement to violence, obscenity, and true threats. Other things that do not look like speech are covered, such as flag burning or abstract art. Over time, the Supreme Court has sorted different actions into “speech” and “conduct” without articulating one clear test that accounts for all of its past decisions. As a result, government entities and businesses often spar over whether a given business practice is speech.
Even once the Supreme Court labels something speech rather than conduct, regulation is not automatically impossible. The Court next decides which level of scrutiny to apply, meaning which legal test the regulation must pass to avoid being invalidated. Some of these tests are very hard to pass; others are easier. Which test the Court applies depends on a variety of factors. For example, the Court applies a more demanding test to content-based regulations, which impose duties or punishments based on the content or viewpoint of what is being said; these regulations are the most dangerous because they can amount to direct censorship of viewpoints. The Court may apply a less demanding test to a regulation that is content-neutral, burdens speech only incidentally, regulates only commercial speech, or regulates an entity that controls bottlenecks in the speech environment, among other factors.
The rise of the internet has raised important questions about what the First Amendment covers. Almost everything online is mediated through words and information, including conduct that would clearly be non-expressive if it occurred offline. For this reason, internet companies often attack regulations by claiming that they impede the companies’ free speech rights. Some scholars have called this “Digital Lochnerism,” after an abandoned early-20th-century Supreme Court doctrine that companies used to strike down New Deal regulations and other business laws.
One of the main difficulties in applying First Amendment law to the internet is determining the correct analogy from caselaw. For instance, when a law would prevent a social media company from deleting user-generated posts that violate its content guidelines, there are multiple legally distinct doctrines the Court could draw on. It might treat the law like one requiring newspapers to publish responses to their stories, which the Court has held unconstitutional. Or the law might resemble one requiring cable television operators to carry certain channels, which the Court has upheld. The details matter in establishing which precedent should control.
The Case
In this case, social media companies represented by NetChoice argue that two state social media regulations violate their free speech rights. The Texas and Florida legislatures passed the laws out of concern that large social media companies were unfairly censoring conservative viewpoints on their platforms. To fight this perceived threat, the legislatures enacted bills regulating social media companies in a variety of ways.
The Florida bill imposes a variety of obligations on social media companies, requiring transparency in their content moderation activities and prohibiting certain types of content moderation. For example, the law limits the situations in which companies can ban or shadowban users and remove or limit the reach of posts about certain topics. It also requires them to moderate “in a consistent manner” and prohibits platforms from changing their “user rules, terms, and agreements” more than once every thirty days. The law would require companies to tell users which algorithms are used to construct their feeds and to let users opt out of those algorithms in favor of a purely reverse-chronological feed. Finally, the law would require platforms to publish general standards for their content moderation activities and to give users notice and explanations before removing, downranking, or taking other content moderation actions against their content. While it differs in some details, the Texas bill is substantially similar: it regulates how and when social media companies can moderate content and requires them to give transparent explanations when they do so.
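To make the feed-construction provision concrete, here is a minimal sketch of the difference between an algorithmically ranked feed and the reverse-chronological feed the law would let users opt into. The Post structure, field names, and scoring approach are hypothetical illustrations, not drawn from the statute or from any platform’s actual systems.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float         # seconds since epoch (hypothetical field)
    engagement_score: float  # platform-assigned relevance score (hypothetical)

def ranked_feed(posts: list[Post]) -> list[Post]:
    # An "algorithmic" feed: the platform orders posts by a relevance
    # score of its own design (here, a stand-in engagement score).
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def reverse_chronological_feed(posts: list[Post]) -> list[Post]:
    # The opt-out feed the law contemplates: newest posts first,
    # with no platform-chosen ranking applied.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)
```

The legal distinction the statute draws maps onto which sort key is used: the first function reflects the platform’s own judgment about what to surface, while the second removes that judgment entirely.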
NetChoice challenged each law in a separate case. One, NetChoice v. Paxton, came up through the Fifth Circuit; the other, NetChoice v. Moody, came up through the Eleventh Circuit. In both cases, NetChoice claimed that the laws violate social media companies’ First Amendment rights. NetChoice argues that when social media companies moderate content, they exercise editorial judgment the way a newspaper editor does, and that the laws violate the First Amendment by interfering with that activity. Just as the government cannot tell a newspaper editor or other private publisher to publish material it disagrees with, it should not be able to tell a social media company to host information it disagrees with. But NetChoice’s argument does not stop there: it contends that nearly everything a social media company does to host and arrange user-generated content is speech.
The two courts came to opposite conclusions. The Eleventh Circuit ruled that the Florida law likely violated the First Amendment, writing that social media companies’ content moderation practices are analogous to other private entities’ use of editorial judgment in choosing which third-party speech to disseminate and how to rank it. The Eleventh Circuit broadly agreed with NetChoice’s argument: when social media companies publish content guidelines and enforce them by removing or downranking violating posts and banning or shadowbanning violating users, they act like newspaper editors or parade organizers. The provisions preventing companies from moderating content as they wish were therefore akin to other unconstitutional “must-carry” laws. The Eleventh Circuit upheld many of the disclosure and transparency provisions, however.
The Fifth Circuit ruled that the Texas law likely did not violate the First Amendment. It explained that social media companies’ power to censor certain viewpoints is dangerous to free speech principles and not protected by the First Amendment. It applied an originalist analysis to conclude that the First Amendment did not protect newspapers’ exercise of editorial judgment at the Founding, disregarding much Supreme Court precedent since then. It also pointed to ways social media companies differ from newspapers, such as the fact that they allow people to post nearly anything they wish and screen content only afterward. And it offered other arguments: that Section 230 of the Communications Decency Act implies social media companies are not publishers, that the companies are common carriers that traditionally can be regulated more closely than other businesses, and that the law is content-neutral.
The Supreme Court now faces the task of crafting a rule from two starkly different circuit opinions and maximalist arguments on both sides: the states argue that nothing a social media company does is speech, while the social media companies argue that nearly everything they do is speech.
EPIC’s Brief
EPIC’s brief explains to the Court that the legally correct and societally important path is a middle one. EPIC believes the Texas and Florida laws contain some unconstitutional provisions, chiefly those that would require social media companies to carry user-generated content that violates the companies’ community guidelines. However, EPIC is gravely concerned by some of NetChoice’s arguments, especially the claim that nearly every decision a social media company makes about whether to carry user-generated speech, and how to arrange it online, receives the full protection of the First Amendment. If that were true, privacy laws, consumer protection laws, and laws against harmful practices like addictive design would all become nearly impossible to pass and enforce. EPIC urges the Court to issue a rule that recognizes what is unconstitutional about these specific laws without endangering dozens of other constitutional laws that NetChoice would like to see overturned for its own business purposes.
EPIC’s brief first demystifies some of the terms and activities being regulated. The statutes at issue, and occasionally the litigants, speak in vague terms about what is being regulated. EPIC explains that the three main categories of regulated activity here are hosting content, ranking content, and designing the platform.
EPIC’s brief then explains why it is legally wrong to automatically treat regulation of all these activities like must-carry laws for newspapers. First, must-carry laws for other media, such as television broadcasters and cable companies, do not automatically receive the same exacting scrutiny. The Court should first consider the ways social media differs from newsprint in deciding whether and how to analyze the regulations at hand. Additionally, while some provisions in the Texas and Florida laws are must-carry provisions, others, such as those affecting how companies rank content, are not. Finally, EPIC’s brief details how many current and pending laws would be impossible to enforce if all of a company’s content hosting and arrangement decisions were akin to a newspaper’s editorial judgment: privacy laws, copyright law, consumer protection laws, and laws regulating addictive design would all be difficult or impossible to pass and enforce because they regulate what user-generated content social media companies present and how it is displayed.
Legal Documents
Moody v. NetChoice (No. 22-277) Supreme Court Docket
NetChoice v. Paxton (No. 22-555) Supreme Court Docket
- 11th Circuit Opinion
- Appellee NetChoice’s Response Brief (Nov. 8, 2021)
- Appellant Moody’s Opening Brief (Sept. 7, 2021)
- 5th Circuit Opinion (Sept. 16, 2022)
- Appellant Paxton’s Reply Brief (Apr. 22, 2022)
- Appellee NetChoice’s Response Brief (Apr. 4, 2022)
- Appellant Paxton’s Opening Brief (Mar. 2, 2022)