Online Harassment
Background
Platforms and technologies used for online connection are also avenues for online harassment and abuse, often fueled by private images and personal details.
Many aspects of day-to-day life, from social interactions to professional and leisure activities, now take place online. Online communications tools have dramatically expanded the ability of individuals to connect and interact with family, friends, colleagues, and others. But these tools have also made it much easier for people to threaten, harass, and victimize others. Social media platforms have provided an entirely new venue for harassment and abuse. Every day, victims are bombarded with abusive messages or images, and some suffer the public exposure of sensitive personal information or intimate images, which can harm their reputations, relationships, and mental health. And online harassment does not necessarily stay virtual; it can escalate into physical threats.

Beyond social media, many new technologies can be misused by stalkers, harassers, and abusers to track a victim's movements or activities, infiltrate their devices or private accounts, or even steal their identities. It is essential to protect individuals against online harassment. Strong and enforceable privacy and data protection rules are an important part of the toolkit to combat online harassment, as are robust reporting and content moderation mechanisms.
EPIC seeks to strengthen privacy and personal data protection rules in ways that will help reduce the harm from online harassment. To that end, we have focused recent efforts on researching and advocating for reasonable Section 230 reforms, promoting regulations to combat revenge porn, fighting spyware, and safeguarding data subjects' rights over their personal information.
Revenge Porn
Current U.S. federal law does not provide a remedy to victims of nonconsensual pornography (commonly referred to as "revenge porn"), who suffer from having their explicit images posted online without their consent. In the absence of federal criminal or civil liability, 48 states, the District of Columbia, and Guam have passed laws criminalizing, or creating civil liability for, distributing a sexually explicit image of another person without that person's consent when the depicted person expected the image to remain private. Revenge porn is not only deeply humiliating for many victims but can also lead to severe financial repercussions through job loss and social stigma. Although current laws provide some protection, victims of nonconsensual pornography have little recourse to limit further damage to their careers, educational opportunities, reputations, relationships, and mental health unless they can remove those images from the platforms and websites on which they are hosted and indexed.
Many of the state revenge porn laws share a common structure and common terms. They prohibit disseminating sexually explicit images or video recordings of an identifiable person without consent, and they clarify that neither voluntarily giving sexually explicit material to a particular person nor consenting to having such material recorded constitutes consent to subsequent dissemination. Critical to these laws are clear definitions of what images are covered and what conduct is prohibited, explicit recognition that sharing intimate images with one person is not a license to distribute or post them widely, and a form of relief available to victims. However, some laws contain provisions that do not help victims and could in fact make matters worse, such as overly restrictive provisions that severely limit a law's scope or vague provisions that make it difficult for victims to establish that their claims fall within it. In addition, many laws lack clear remedies for victims whose images have been posted and made accessible or searchable on web and social media platforms. Remedies must go beyond criminal liability for the perpetrator and address the ongoing and future harms caused by continued distribution. This requires remedies that facilitate removal of the material, private rights of action against subsequent distribution, and injunctive relief against platforms or providers that host the material.
Domestic Violence
Domestic violence victims have an acute need for privacy: they are already the target of an abuser who likely has, or has had, access to their most intimate and personal details. Abuse in domestic violence situations can involve privacy violations such as surveillance, monitoring, or other forms of stalking and harassment. Moreover, for a domestic violence victim, the need for privacy is often tied to the need for physical safety.
A domestic violence victim is typically targeted by an individual aggressor who is familiar with many intimate details of their life. An abuser can violate the victim's privacy by sharing those details or by using them to obtain more information about the victim. Protecting domestic violence victims can be especially complicated because their abuse often generates government records containing personal information and details of their relationship history. These records, which can include protection orders, divorce and custody filings, and records of interactions with aid agencies, may be placed into public records systems that are subject to access or disclosure requirements, and they therefore require special protection. Finally, the rise of individual surveillance technologies and services has created significant new risks for victims. These technologies can be used to track a victim's location, intercept their communications, or surveil their activities inside or outside the home, often without the victim's knowledge. Abusers can use them to exert further control over victims and to stalk, intimidate, or harass at will.
EPIC has worked to combat these risks by filing complaints that led to a federal court order halting the sale of spyware, pressing the Federal Trade Commission to further restrict the use and sale of spyware, and submitting comments seeking to protect the online court records of domestic violence survivors.
Section 230
Section 230 of the Communications Act has long been heralded as a tool for protecting freedom of expression online. The provision shields online platforms that host or republish third-party content from legal responsibility for what the creators of that content say and do, with limited exceptions for material that violates criminal law or intellectual property rights. However, this same provision has at times served as a shield for corporate irresponsibility, disincentivizing platforms from taking an active role in addressing harassment, abuse, doxing, or revenge porn perpetrated through their services. Apart from formal injunctions requiring the takedown of certain unlawful materials, Section 230 prevents courts from ordering platforms to remove harmful and privacy-violating material. And it is becoming increasingly apparent that this lack of legal accountability for abusive material has damaged free expression, particularly for communities more likely to suffer abuse: women and non-binary individuals, people of color, and members of the LGBTQ+ community.
EPIC has advocated for limited reforms to Section 230 that would both protect freedom of expression and allow meaningful redress for individuals harmed by content disseminated across online platforms. In a statement to the Senate Commerce Committee as it considered the Platform Accountability and Consumer Transparency (PACT) Act, EPIC urged the Committee to expand the bill's injunctive relief provisions so that platforms would be required to remove content deemed unlawful by a court regardless of the type of legal claim involved, rather than limiting removal to content found to violate defamation law. Other provisions of the PACT Act would require online platforms to give notice of their content moderation policies and to operate a complaint system, with a deadline by which platforms must process complaints.
Recent Documents on Online Harassment
Amicus Briefs
- Herrick v. Grindr, LLC (U.S. Court of Appeals for the Second Circuit): Whether Section 230 of the Communications Decency Act shields Grindr, a dating app, from liability for failing to remove fake profiles that used the plaintiff's name and likeness and posed a danger to his personal safety.
Resources
- Privacy Threats in Intimate Relationships, Karen Levy and Bruce Schneier (2020)
- Addressing Cyber Harassment: An Overview of Hate Crimes in Cyberspace, Danielle Keats Citron (2015)
- The Internet Will Not Break: Denying Bad Samaritans Section 230 Immunity, Danielle Keats Citron and Benjamin Wittes (2017)
- The Twenty-Six Words That Created the Internet, Jeff Kosseff (2019)
- Six Constitutional Hurdles for Platform Speech Regulation, Daphne Keller (2021)
- An Examination of Nonconsensual Pornography Websites, Carolyn A. Uhl et al. (2018)