Analysis

In TikTok v. Garland, Supreme Court Sends Good Vibes for Privacy Laws, But Congress’s Targeting of TikTok Alone Won’t Do Much to Protect Privacy

January 17, 2025 | Tom McBrien, EPIC Counsel

Today, January 17, 2025, the Supreme Court ruled in TikTok v. Garland that Congress’s “divest-or-ban” law did not violate the First Amendment. It is unclear what the future of the app will be—technically, the law takes effect on Sunday, January 19, but the Biden Administration said it would leave a decision on enforcement up to the Trump Administration.

TikTok’s data collection, use, and security practices pose a threat to users’ privacy (as well as their safety, autonomy, attention, and a host of other values)—but it is far from the only company abusing Americans’ personal data. Because of Congress’s failure to pass a strong, comprehensive privacy law, millions of apps collect and abuse vast amounts of Americans’ personal data. We need protections from all of these companies, not just TikTok, and we can only achieve them through a strong, comprehensive federal privacy law, vigorous antitrust enforcement, and other regulatory measures that apply to all companies exploiting our personal data.

That being said, overexpansive First Amendment interpretations pushed by the tech industry are a dangerous obstacle to the types of laws necessary to truly protect Americans’ privacy rights. In taking a more cautious First Amendment stance, the Supreme Court signaled the general viability of privacy and online safety laws.

This is the second time that the Court has explicitly rejected Big Tech’s expansionist view of the First Amendment. Tech companies and their allies are in the midst of a years-long campaign to render almost any tech industry regulation unconstitutional by getting courts to label most privacy-invading and manipulative business practices as protected expression. Last term, in Moody v. NetChoice and NetChoice v. Paxton, the Supreme Court rejected NetChoice’s bid to have all content curation labeled as expression, clarifying that only expressive content curation is protected. The Court drew a distinction between expressive curation activities, like removing or downranking content based on a company’s content guidelines, and curation activities that may not be expressive, like using addictive, manipulative, black-box algorithms to organize content. Faced with TikTok’s renewed push for a broad rule that all curation is speech, the Court once again distinguished content moderation based on editorial judgment from curation based on user activity, and it referred only to the former as expressive. This reinforces the Court’s signal from Moody that legislatures have the power to regulate against harmful platform design practices.

The Court’s opinion was also a good sign for privacy advocates because it made clear that regulating data practices is an important and content-neutral regulatory intervention. Tech companies and their allies have long misinterpreted a Supreme Court case called Sorrell v. IMS Health to mean that all privacy laws are presumptively unconstitutional under the First Amendment because “information is speech.” But the TikTok Court explained that passing a law to protect privacy “is decidedly content agnostic” because it “neither references the content of speech . . . nor reflects disagreement with the message such speech conveys.”

In fact, the Court found the TikTok law constitutional specifically on the grounds that it was passed to regulate privacy, and it emphasized the importance of the government’s interest in protecting Americans’ privacy.

The Court also beat back several other arguments from the tech industry’s standard First Amendment playbook for defeating content-neutral privacy and safe design laws. For instance, tech companies often claim that a law is unconstitutionally content-based because it targets only some companies or industries and not others. The Court clarified that targeting certain speakers and not others is unconstitutional only when the targeting stems from the message being expressed by that company or industry. If, by contrast, a company or industry is targeted for reasonable, content-agnostic reasons, the law is not content-based and is much more likely to be found constitutional.

Tech companies also often claim that content-neutral privacy and safe design laws are unconstitutional because they do not use the least restrictive means of achieving the legislature’s goal, and some judges have struck down such laws because the legislature did not adopt the court’s preferred means of regulation. The TikTok Court specifically rejected both of these approaches to intermediate scrutiny and reiterated that legislatures have considerable leeway in choosing how to regulate tech companies’ business practices, so long as they do so in a content-neutral way.
