Reproductive Privacy in the Age of Surveillance Capitalism
July 7, 2022
The recent Supreme Court decision in Dobbs v. Jackson Women’s Health Organization poses an unquestionable threat to the safety and privacy of abortion providers and patients alike. The right to make health-related decisions free from commercial or government interference is inherent to one’s dignity and autonomy. The implications are all the more harrowing in light of today’s technological realities: a vast data broker industry that sells our location data and most sensitive information to private and government purchasers alike, and that uses secret algorithms to profile nearly every person in ways that undermine their decisional and reproductive privacy. EPIC has worked for decades to defend privacy rights online and will continue its advocacy in the face of the coming challenges. See EPIC’s statement about the decision here.
The Dobbs decision overruled Roe v. Wade and Planned Parenthood v. Casey, the two cases that enshrined abortion rights in constitutional law. We are only beginning to witness the decision’s impact on individual privacy. Dobbs poses a critical threat to privacy rights when combined with today’s vast personal data collection systems, a growing and underregulated data broker industry, government consumption of people’s data for surveillance purposes, private use of data for targeted advertising, and the increased use of unreliable AI and algorithms.
Commercial and government entities collect vast amounts of personal information about individuals, including location data. Location data can reveal a person’s most sensitive characteristics, including religion, sexual orientation, sexual activity, gender identity, health conditions, union membership, and political affiliation. Phones and devices generate location data that is collected by various entities and may be sold to data brokers, advertisers, or the government. Data brokers use secret algorithms to build profiles on every consumer based on their online activities, often without the consumer’s knowledge. Using profiles to target advertisements at pregnant people is not new. For example, Target reportedly sent maternity- and pregnancy-related advertisements to a teenager before she told her family she was pregnant. As the article Target Knows You’re Pregnant explained, “all Target customers are assigned a Guest ID. Associated with this ID is information on ‘your age, whether you are married and have kids, which part of town you live in, how long it takes you to drive to the store, your estimated salary, whether you’ve moved recently, what credit cards you carry in your wallet and what Web sites you visit.’” Analyzing this data, combined with a customer’s purchase history, could produce a “pregnancy prediction” score, including an estimate of the customer’s due date. Post-Roe, these types of profiles could be weaponized against individuals who seek abortions in states where abortion is illegal.
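To make the profiling mechanism described above concrete, a prediction score of this kind can be sketched as a simple weighted sum over purchase categories. This is an illustrative toy, not Target’s actual (proprietary) model: the category names, weights, and threshold below are all invented for the example.

```python
# Illustrative toy of a "pregnancy prediction" score: a weighted sum
# over purchase categories. All weights and the threshold are invented
# for this sketch; real retail models are proprietary and far more complex.

# Hypothetical weights: how strongly each purchase category is assumed
# to correlate with pregnancy in this toy model.
CATEGORY_WEIGHTS = {
    "unscented lotion": 0.30,
    "prenatal vitamins": 0.45,
    "large tote bags": 0.10,
    "cotton balls": 0.15,
}

def pregnancy_score(purchase_history):
    """Sum the weights of distinct matching categories, capped at 1.0."""
    score = sum(CATEGORY_WEIGHTS.get(item, 0.0) for item in set(purchase_history))
    return min(score, 1.0)

purchases = ["unscented lotion", "prenatal vitamins", "cotton balls"]
score = pregnancy_score(purchases)
print(f"score = {score:.2f}")
if score > 0.5:  # arbitrary advertising threshold for this sketch
    print("flag profile for maternity advertising")
```

Even a crude heuristic like this shows why such scores are troubling: a handful of routine purchases is enough to flag a profile, and the person scored never sees the weights or the threshold.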
Data brokers play a pervasive role in the location data market. They buy, aggregate, disclose, and sell billions of data points on Americans, building profiles on virtually every American from this vast, pervasive collection, and they operate with effectively no oversight or regulation. Data brokers collect information about individuals’ purchases, where they shop, and how they pay; their home addresses, utility records, driver’s license information, and vehicle license plates; and their health information, including period-tracking data. Brokers also collect records of the sites individuals visit online and the advertisements they click on, and draw inferences from this data. And, thanks to the proliferation of smartphones and wearables, data brokers collect and sell real-time location data.
Data brokers are amassing a wealth of information about virtually everyone, but the consequences are greatest for marginalized communities. For example, ICE purchases sensitive personal data from data brokers and uses it to surveil immigrants and carry out deportations. The U.S. military has purchased location data from Muslim prayer apps and has used that data to monitor Muslim communities. Location data purchased from data brokers led to the outing of a Catholic priest who had used the app Grindr. And data brokers are making it all the more possible for abusers to track domestic violence survivors.
Police and government agencies today have unprecedented access to sophisticated and invasive surveillance tools that they can use to enforce abortion bans. States like Missouri are already enacting laws imposing felony charges on those who perform abortions. Reports have already surfaced of a data broker selling the location data of people who had visited abortion clinics, and lawmakers have expressed concern that location data could be weaponized against people seeking abortions. In the digital age, government agencies enforcing these laws may access databases of people’s identifying information, locations, and associations, as well as arsenals of public surveillance tools.
The data broker industry can use its vast stores of personal information to target online advertisements to pregnant patients, including anti-abortion advertisements meant to intercept pregnant people seeking abortions. Reportedly, pregnant people have searched for abortion clinics only to see advertisements for crisis pregnancy centers that try to prevent abortion. EPIC and Consumer Reports published a Data Minimization white paper that addresses this harmful type of surveillance advertising, suggesting that the FTC prohibit all secondary data uses with limited exceptions.
Some data privacy advocates have expressed concern that anti-abortion employers may use predictive employment algorithms to deny jobs or other opportunities to people who have had abortions, or whom an algorithm suspects of having had one. Employment algorithms are largely unregulated and opaque. People may be denied jobs, or never shown postings, by an algorithm without knowing whether the decision was accurate or fair. Applicants do not know the factors an algorithm relies on and cannot confirm that the information used about them was accurate, because they do not know the basis for an adverse decision. EPIC has worked to limit and stop the use of AI in pre-employment screening systems and actively works on a screening and scoring project to limit the use of unreliable AI in automated decision-making systems.
EPIC has persistently fought the rise of government surveillance and will continue to fight against surveillance of reproductive healthcare providers and patients. EPIC has long called attention to creeping government surveillance: it was the first privacy organization to oppose drone surveillance; it issued early alerts about networks of surveillance cameras in public places; it has highlighted the dangers of facial recognition systems; it has criticized law enforcement use of fusion centers to surveil protest movements; and it has consistently issued comments and letters to agencies calling for data privacy and opposing agency acquisitions of surveillance technologies. In recent months, EPIC has led a coalition blocking government agencies from adopting facial recognition systems, has sued ICE to obtain records of its location and social media surveillance, has called on Amazon not to host a massive new federal biometric database, and has urged the Department of Homeland Security and local law enforcement agencies to end surveillance programs. In its 2021 report, What the FTC Could Be Doing (But Isn’t) To Protect Privacy, EPIC urged the FTC to use its authorities to protect health-related data. EPIC recently joined a coalition urging Google to end collection and retention of customers’ location data to protect reproductive healthcare privacy.
What You Can Do
The onus for protecting privacy should not fall on the individual. We should have comprehensive privacy protections that prevent private companies and public entities from obtaining personal health information. Still, there are steps you can take. If you are seeking an abortion, here are a few helpful resources for protecting your privacy:
Digital Defense Fund’s Keep Your Abortion Private & Secure
Electronic Frontier Foundation’s Security and Privacy Tips for People Seeking an Abortion, by Daly Barnett
Medium’s Okay, Fine, Let’s Talk About Period Tracking: The Detailed Explainer, by Kendra Albert, Maggie Delano, and Emma Weil
Follow EPIC’s work on health privacy, location data privacy, and surveillance for more information.
Decisional privacy is a fundamental tenet of dignity and autonomy. EPIC has long advocated for the privacy of individuals in an increasingly surveilled state. EPIC’s fight for privacy continues.