In Free Speech Coalition v. Paxton, the Supreme Court Should Preserve Legislatures’ Power to Protect Kids Online
January 14, 2025

Tomorrow, the Supreme Court will hear oral argument in Free Speech Coalition v. Paxton, a First Amendment challenge to Texas’ ban on kids’ access to pornographic websites. The question in the case is what level of scrutiny the law should receive. Free Speech Coalition, which represents a group of pornographic websites, argues that the law is subject to strict scrutiny on its face because age verification burdens adults’ rights to access pornography. Texas argues that the law should receive only rational basis review because it does not burden adults’ access to speech.
Both sides seem to be making factual arguments about the deterrent effect of age verification, but neither presents much evidence of a deterrent effect one way or the other. Both instead rely almost exclusively on Supreme Court precedent to argue that age verification does or does not burden adults’ access to information. The lower courts similarly paid little attention to evidence of a deterrent effect: the district court sided with Free Speech Coalition because it found the group’s precedent more convincing, and the Fifth Circuit sided with Texas because it found the Attorney General’s more convincing.
Neither side is right, as my colleagues and I wrote in an amicus brief filed in the case. No Supreme Court precedent explains how to determine whether an age verification requirement places a substantial enough burden on adults’ access to information to warrant heightened First Amendment scrutiny. What the precedent does recognize is the need for an up-to-date factual record and extensive factual and legal findings in accordance with the facial challenge standard set out in Moody v. NetChoice, all of which this case lacks.
The need for a nuanced Supreme Court decision is underscored by the fact that the outcome of this case could have broad repercussions for legislatures’ ability to enact age-based protections for kids online. Many existing and proposed laws that protect kids from abusive data and design practices involve age determination, an umbrella term for age assessment tools that includes both age estimation and age verification. Industry challengers and courts often conflate different age determination requirements, even though the technologies and the statutory mandates are substantively different.
Just as it did in Moody, the Court in Free Speech Coalition should reject both sides’ push for a categorical rule about the constitutionality of age determination. Instead, the Court should preserve some leeway for legislatures to enact age-based protections for kids online by providing guidance on what factors are relevant to deciding whether an age determination provision triggers heightened First Amendment scrutiny.
Kids’ online safety laws are not created equal.
In the last few years, there has been a growing public call for legislatures to protect kids online. Some legislatures have heeded this call by providing kids with heightened protections against abusive data and design practices, while others have taken more questionable routes, directing companies to block or limit kids’ access to certain online services or content.
What all of these laws have in common is that they require companies to apply one set of rules to adults and another to kids. To do this, companies need to know which users are adults and which are kids—or, more precisely, which users to apply the adult rules to, and which users to apply the kid rules to. In the physical world, age gating is relatively common, and the need to present evidence of age is typically unquestioned. But kids online safety laws of all stripes have been met with fierce opposition from industry groups like NetChoice and a handful of civil liberties groups because of concerns about the privacy and speech risks posed by online age determination.
Some of the kids online safety laws enacted over the last few years likely do have insurmountable privacy and speech problems. Laws that ban kids from social media, for instance, are likely unconstitutional infringements on kids’ speech. Laws that impose onerous or unacceptably risky age determination requirements on adults, like forcing adults to register with a government agency for a digital age credential, are likely unconstitutional burdens on adults’ rights to access information.
But other laws could mitigate or avoid these risks through their choice of age determination tools and by providing privacy protections for age determination data. Age determination tools can significantly decrease the risks to users if they rely on existing data, do not collect or retain sensitive personal information, process or store age information on-device or in-browser, are interoperable or offer one-and-done credentialing, and use zero-knowledge proofs to limit the information transferred to third parties. Laws that include data minimization requirements for age determination data, along with purpose and retention limits, also help prevent this data from being misused.
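To make the data minimization point concrete, here is a minimal, purely illustrative sketch in Python of the on-device pattern described above (the function name and design are hypothetical, not anything a statute mandates): the user’s birthdate never leaves the device, and a website receives only a yes-or-no age assertion.

```python
# Illustrative only: a hypothetical on-device age assertion.
# The birthdate stays local; only the boolean result is transmitted.
from datetime import date

def over_18_assertion(birthdate: date, today: date | None = None) -> bool:
    """Runs on the user's device; only the True/False answer leaves it."""
    today = today or date.today()
    # Subtract one year if this year's birthday hasn't happened yet.
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age >= 18

# A site would receive something like {"over_18": true} rather than a
# name, government ID, or date of birth.
```

A zero-knowledge credential takes this a step further: the verifier learns that the over-18 claim is true without learning anything else about the underlying document.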
Laws that protect kids from abusive data and design practices also inherently avoid many of the privacy and access issues that plague other kids safety proposals. Unlike social media bans, privacy and safe design protections do not prevent kids from accessing platforms or any content on them. They do things like limit the types of personal data that companies can use in their algorithmic recommendation systems or restrict the times during which companies can send users push notifications. Nor do they necessarily burden adults’ access to information. Many of these laws give companies broad latitude to decide what age determination method to use on their platforms. They also allow companies to use more privacy-protective but less accurate methods of determining age, because mislabeling an adult as a minor under these laws does not have the same speech implications as it does under a law restricting access to content or platforms. And while laws that restrict access to platforms or content require companies to place age gates at the threshold of their services, privacy and safe design laws do not require companies to create barriers to entry. Companies can comply with many of these laws by defaulting all users to the privacy and design protections required for kids and requiring only those users who wish to change these settings to undergo age determination.
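A minimal sketch, again in Python with hypothetical names, of that default-on compliance pattern: every account starts with the protective settings required for kids, and age determination is triggered only when a user asks to relax them.

```python
# Illustrative only: a hypothetical default-on compliance pattern.
PROTECTIVE_DEFAULTS = {
    "personalized_recommendations": False,  # limits data used in feeds
    "overnight_push_notifications": False,  # limits notification timing
}

def new_account_settings() -> dict:
    """No age gate at sign-up: every user gets the protective defaults."""
    return dict(PROTECTIVE_DEFAULTS)

def relax_setting(settings: dict, key: str, passed_age_check: bool) -> None:
    """Age determination happens only here, at the point of opt-out."""
    if not passed_age_check:
        raise PermissionError("age determination required to relax this setting")
    settings[key] = True
```

Under this design, an adult who never touches the settings never encounters an age gate at all, which is part of why mislabeling errors carry lower speech stakes here than under access-restricting laws.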
Opponents of kids online safety laws act as if the internet today is the same as it was when Reno v. ACLU and Ashcroft v. ACLU were decided—a place where websites collected minimal amounts of data about users, allowing users to explore anonymously. But that is not the case. In the absence of adequate privacy protections, the internet has devolved into a surveillance machine, where users are constantly tracked and profiled, and their behaviors monetized and sold. A kids online safety law that includes strong privacy protections for age determination data provides this data with more protections than the vast majority of personal information collected online. And a law that includes overall heightened privacy protections for kids’ personal information is also far more likely to allow kids to be anonymous online than the status quo.
Lower courts need guidance on how to analyze the constitutionality of kids online safety laws.
Despite the many nuances in kids online safety laws, industry has taken a scorched-earth approach to challenging them in an attempt to secure a categorical rule against such regulations. After securing early wins against laws that age gate porn websites and ban kids from accessing social media, industry has turned around and used those decisions to justify enjoining laws that give kids heightened protections against abusive data and design practices, even though these statutes and their likely effects are very different.
Industry’s main tactics have been to overstretch precedent and speculate about the impacts of kids safety laws. In Free Speech Coalition, for instance, the challengers claim that Reno and Ashcroft established that online age verification requirements are categorically subject to strict scrutiny. But Reno did not even consider what level of scrutiny applies to an age verification requirement (the law at issue there directly regulated what adults could say online), while Ashcroft applied strict scrutiny because the district court found that the law’s age verification requirement was likely to deter adults from accessing regulated websites and because the government conceded that strict scrutiny applied. If Ashcroft stands for anything, it is the need for a factual record that “reflect[s] current technological reality”—a directive that industry challengers have steadfastly ignored.
The Supreme Court’s decision in Moody rebuked this approach to constitutional challenges to internet regulations. Given the wide variation in platforms, the services they offer, and the technology they deploy, the Moody Court recognized that broad-based challenges to internet regulations require nuanced constitutional arguments and strong factual records. Some lower courts have already applied Moody to force industry to be more exacting in its challenges to age determination. A few weeks after Moody was decided, the Ninth Circuit applied it to NetChoice’s challenge to California’s Age-Appropriate Design Code, sending the bulk of the claims, including those against the law’s age estimation provision, back to the district court for further legal and factual development. On New Year’s Eve, a district court in California refused to hold that age determination was categorically unconstitutional in NetChoice’s challenge to the state’s addictive feeds regulation, instead requiring further legal and factual development on how age determination under the statute is actually likely to affect users. (My colleagues and I filed an amicus brief in this case urging this result.)
Lower courts could use guidance from the Supreme Court on how to apply the Moody standard to claims that age determination chills adults from accessing protected speech. In our amicus brief, EPIC suggested that tech companies and their advocates should have to establish the full set of age determination tools that can be used to comply with a law, the specific privacy and access concerns of each, whether the law mitigates any of those concerns, and whether, for each covered website, implementing any of those tools is actually likely to deter access.
This last part is critical to making arguments against age determination less speculative. Age determination will not be implemented in the abstract but on specific websites, each of which has different data and design practices. While some age determination tools may deter users from accessing websites where anonymity is an important part of the experience, vanishingly few websites today offer anonymous viewing experiences. Many websites require users to create accounts, use their real names, and provide other identifying information to access their platforms. They also track their users on their platforms and across the web to deliver ads or to otherwise manipulate their behavior. Requiring users to pass through an age gate on these websites may not actually deter users from accessing them. In Free Speech Coalition, for instance, some of the porn sites covered by the law are subscription services. If users are willing to provide identifying information to pay for these services, they may very well be willing to provide such information to verify their ages.
The Court should also clarify how much of a burden is enough to trigger First Amendment scrutiny. Industry has argued that any added step for an adult to access information justifies heightened scrutiny. But offline age determination requires adults to take extra steps to access information, and society does not seem to find this burden unacceptable. The platforms themselves also use “friction” or speed bumps in design to control user behavior. The line between acceptable and unacceptable amounts of friction in accessing information is unclear but should almost certainly not be set at “any.”
***
The Supreme Court’s decision in Free Speech Coalition v. Paxton will have ramifications far beyond Texas’ ban on kids’ access to porn. The ability of legislatures to enact special protections for kids online hangs in the balance. A Moody-like decision here would preserve legislatures’ power to protect kids online while still guarding against online censorship.
