Analysis

These Pregnancy Apps May Be Sources for a Location Data Broker

January 30, 2025 | Justin Sherman, EPIC Scholar in Residence

Russian hackers posted on the dark web in early January with a news-making claim: they had breached notorious location data broker Gravy Analytics. Just a few days later, the broker confirmed the hack. The hackers said that Gravy Analytics had until January 24 to pay a ransom or they would leak all of the company’s location data—and upfront, seemingly to show they meant business, they posted several files online they claimed they stole from the company.

Data brokers are companies in the business of buying and selling data, and when it comes to selling people’s location data—typically, at minimum, latitude-longitude pairings, timestamps, and mobile ad IDs used to persistently track people across devices—Gravy Analytics is a well-known player. It collects location data on hundreds of millions of people in the United States and sells some of it to law enforcement through its subsidiary Venntel. It also collects and sells location data on people around the world, seemingly billions of devices in total. But now there’s a new insight. On top of leaking a sample of phone pings in the White House, the Kremlin, and Vatican City, the hackers revealed another troubling possibility: the broker may be extracting data from a host of pregnancy and reproductive health apps.
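The kind of record described above can be sketched in a few lines. This is purely illustrative: the field names, MAID, and values below are invented for the example, not Gravy Analytics’ actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LocationPing:
    """One broker-style location record: enough to follow a device over time."""
    maid: str        # mobile advertising ID, persistent across apps on a device
    latitude: float
    longitude: float
    timestamp: datetime

# A hypothetical ping. Many records sharing one MAID form a movement trace.
ping = LocationPing(
    maid="38400000-8cf0-11bd-b23e-10b96e40000d",
    latitude=38.8977,     # four decimal places is roughly 11-meter precision
    longitude=-77.0365,
    timestamp=datetime(2025, 1, 4, 14, 30, tzinfo=timezone.utc),
)
```

Because the advertising ID persists across apps, records collected from many different apps can be joined on it into a single per-device history.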

The possibility that pregnancy apps are collecting location data that ends up in the hands of a location data broker raises concerns for pregnancy app users, for the public as a whole, and for legislators and policymakers with the power to change privacy laws and regulations. App users can take steps, described below, to limit some ad and location tracking on their devices. The app developers and app stores themselves, meanwhile, must recognize the responsibility they have to their users and society to protect people’s privacy and security—and take additional steps to lock down the data transmitted out of their ecosystems. And legislators and policymakers must see this event as a reminder of the need for comprehensive data privacy and security legislation that would prevent these harms in the first place.

What does it mean that pregnancy apps might be a data source for a data broker?

As part of their you’d-better-pay-our-ransom leak, the Russian hackers leaked some data that appears to describe at least some of the apps that are the underlying sources for Gravy Analytics’ data. Gravy Analytics, like many location data brokers, has to source its location data on millions (or billions) of people from somewhere. In the overall data broker market, that could range from building one’s own app and selling data on the users, to embedding code in others’ mobile apps to collect data on their users, to buying data from telecom companies that sell their own subscribers’ phone location data. Buried within the leaked app list are numerous pregnancy and reproductive health apps.

As I have written and testified repeatedly, location data is incredibly sensitive and particularly dangerous for at least three reasons. It allows the data holder to follow someone around in real time or near-real time or to know their typical daily movements, the equivalent of slipping a tracking beacon into a person’s shoe. It is virtually impossible to “anonymize” at the device level while preserving any degree of utility (i.e., what a company would want), as evidenced by decades of computer science and statistics research—and contrary to the ridiculous claims pushed by many data brokers and advertising technology companies. And it enables the data holder to infer large amounts of other sensitive data about the person, including insights based on visits to religious places of worship, medical facilities, cash loan offices, LGBTQIA+ bars, children’s schools, divorce attorneys, military bases, U.S. government facilities, and much more.
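The anonymization point can be made concrete with a toy sketch: even after identifiers are stripped from a set of device traces, a few externally known (place, time) observations about a person are often enough to single out their trace. The traces and matching logic below are invented for illustration.

```python
# Toy "anonymized" traces: trace label -> set of (place, hour) visits.
# Real traces contain thousands of points, making uniqueness far more likely.
traces = {
    "trace_a": {("clinic", 9), ("school", 15), ("home", 22)},
    "trace_b": {("office", 9), ("gym", 18), ("home", 22)},
    "trace_c": {("clinic", 9), ("office", 13), ("home", 23)},
}

def reidentify(observations, traces):
    """Return the traces consistent with a few known (place, hour) sightings."""
    return [label for label, visits in traces.items() if observations <= visits]

# Two sightings of a target (at a clinic at 9, at a school at 15) narrow
# these "anonymous" traces down to exactly one device.
matches = reidentify({("clinic", 9), ("school", 15)}, traces)
```

This is the intuition behind the research findings referenced above: human mobility patterns are so distinctive that a handful of spatio-temporal points typically identifies a trace uniquely, which is why device-level location data resists meaningful anonymization.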

There are considerable reasons to address the qualitatively and quantitatively different risks that location data poses to specific individuals, to vulnerable populations, and to society as a whole. Data brokers selling this location data must get it from somewhere, and knowing those sources is important for determining the first part in the chain of privacy harm—before the location data gets into the hands of stalkers, advertisers, law enforcement agencies, or even foreign nation-states.

This revelation about Gravy’s apparent data sources also matters because merely knowing someone is using a pregnancy app is an inherently sensitive data point. It reveals a private and intimate piece of medical information about that person’s body, life, and family. Knowing someone’s usage of a reproductive health app can also reveal highly sensitive information—such as when a person uses a pregnancy app for two months and then stops using it (potentially suggesting an increasingly criminalized medical procedure) or, conversely, when someone uses a period tracking app and then suddenly no longer has use for it. These apps, of course, also tend to track far more than who is using them, including a range of health data and location data.

Knowing that some of these apps may be the data source also raises another troubling question: if these apps were, knowingly or unknowingly, the location data sources for Gravy Analytics, is any other data from these apps being provided or sold to other companies?

Which apps are potentially involved?

The apps listed in the leaked sources file include 280days: Pregnancy Diary, Pregnancy Tracker & Baby Guide, Pregnancy Due Date Calculator, Momly: Pregnancy App & Tracker, Period Tracker Ovulation Cycle, Anas: Period Tracker, Period Tracker & Diary, and My Calendar – Period Tracker.

Momly: Pregnancy App & Tracker has over one million downloads on Android and says on its Google Play data safety page that it may share “device or other IDs” for “advertising or marketing” purposes. It adds the app may collect “app info and performance” data for “crash logs” and “diagnostics” purposes, “device or other IDs” for “analytics” purposes, and “app interactions” for “analytics” purposes. There is no explicit mention, at least on the Google Play page, of collection of location data.

The Google Play installation page for 280days: Pregnancy Diary, to give another example, says it may share “device or other IDs” for “advertising or marketing” purposes. It also says the app may collect “purchase history” for “app functionality,” “photos” for “app functionality,” “health info” and “fitness info” for “app functionality,” “user IDs” for “app functionality” and “account management,” and “name,” “email address,” and “other info” for “app functionality.” Interestingly, as with Momly, there is no outright mention of location data in this list either.

How could people’s location data get from a pregnancy app to a data broker?

There are several possible ways—if the leak is to be believed—that some pregnancy app users’ geolocation pings could be making their way from the aforementioned pregnancy apps to a company whose business model is selling people’s data. The apps don’t even necessarily need to know this is taking place.

Considering all options, it’s possible that the apps themselves are directly selling the data to the data broker. Mobile apps that sell location data directly to data brokers have done so in the past by inserting a software development kit (SDK), or piece of app-building code, in their mobile app and letting the broker siphon users’ data itself through the SDK. In a growing number of cases, mobile app developers are collecting data about their app users themselves—collecting and retaining the data using their own systems—and once it’s there, transferring the data server-to-server to the data broker. Both possibilities entail the app developer’s direct involvement in selling users’ data (whether directly to Gravy, or to a different broker that then sells it to Gravy). More information is needed about the above-listed pregnancy apps and their behind-the-scenes data practices to confirm whether this is the case with each one and with Gravy Analytics.

However, there is some reason for skepticism that every single one of the pregnancy and reproductive health apps listed in the Gravy Analytics leak is actively involved in selling their own users’ data. In part, this skepticism is warranted because we don’t yet have clear information pointing one way or the other; all we know, at a high level, is that Gravy Analytics collects and sells fine-grained location data on people around the world, and that the company and its hackers both speak about Gravy sourcing data from mobile apps. Skepticism is also warranted because some companies in the alleged Gravy Analytics app source list have already told reporters they do not have any relationship with Gravy Analytics—including Tinder, MuslimPro, Flightradar24, and Grindr.

“We have no relationship with Gravy Analytics and have no evidence that this data was obtained from the Tinder app,” a spokesperson told 404 Media and WIRED. “Yes, we display ads through several ad networks to support the free version of the app,” Muslim Pro said. “However, as mentioned above, we do not authorize these networks to collect location data of our users.” (Not authorizing that, of course, does not mean such collection cannot happen.)

This points to at least two other possibilities. The pregnancy apps in question could have installed an SDK whose maintainers independently decided to sell data their SDK gathers, without the app developers’ knowledge or explicit permission. SDK owners could thereby gather location data from app users and sell it to Gravy directly or indirectly. Alternatively (or additionally, since different apps could work differently), the pregnancy apps in question could have transmitted data to a real-time bidding network—the online “auction houses” for digital ads—where a company bidding on the ad was able to see the data, pull it out from behind the walled garden, and sell it to Gravy Analytics. Put simply, perhaps some of these apps are sending data to ad networks, where some other actor in the opaque digital ad system is pulling the data out for its own profit. Both possibilities would mean the apps don’t necessarily know what’s happening but that their apps are, indeed, the sources for data that Gravy sells.
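The bidstream path is visible in the real-time bidding protocol itself. Under OpenRTB, the common bid-request standard, the request an app’s ad network broadcasts to prospective bidders can carry the device’s advertising ID and precise coordinates, and every company invited to bid sees those fields whether or not it wins the auction. A heavily abbreviated sketch of such a request (the app bundle, auction ID, and values are invented; real requests carry many more fields):

```python
import json

# Abbreviated OpenRTB-style bid request.
bid_request = {
    "id": "auction-123",
    "app": {"bundle": "com.example.pregnancytracker"},  # hypothetical app
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # advertising ID
        "geo": {"lat": 38.8977, "lon": -77.0365, "type": 1},  # 1 = GPS-derived
    },
}

# Every bidder the ad network contacts receives this payload. Any one of
# them can log the (ifa, lat, lon) triple and resell it downstream.
payload = json.dumps(bid_request)
```

This is why an app developer’s statement that it does not “authorize” location collection by its ad partners offers limited protection: once the request leaves the app, the developer has no technical control over what recipients do with it.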

For its part, the Norwegian company Unacast, which merged with Gravy Analytics, says in its privacy policy that “Unacast DOES NOT collect mobile advertising bidstream data to support our Data Services” and that it “likewise ask[s] our third-party partners to refrain from supplying data sourced from the mobile advertising bidstream.” Of course, how much stock to put in this assertion depends on whether the policy is accurate in the first place.

The Federal Trade Commission documented in December 2024 that Gravy Analytics represented to its data suppliers that they had to meet certain “consent” requirements to sell data, while allegedly doing nothing to ensure that happened and continuing to sell data knowing that the suppliers had not obtained users’ express consent. The journalist Byron Tau, in a further demonstration of the dubiousness of this data broker’s statements, reported for the Wall Street Journal in 2021 that the company Mobilewalla was pulling data on 1.6 billion devices out of the mobile advertising bidstream and providing the location data to Gravy Analytics, contradicting any notion that Gravy was not relying, at least in part, on bidstream data. Without further information, it’s difficult to say for certain how the location data may be moving from the pregnancy and many other apps to Gravy Analytics.

What happens now?

The hackers demanded a ransom to be paid by January 24. It is unclear at present whether Gravy or Unacast paid and whether additional data might be leaked.

Regardless of what unfolds, this event serves as a critical reminder for all users of pregnancy and reproductive health apps—and for anyone, from the media to the military to the general public, worried about how their location data is harvested and sold without their knowledge or actual consent.

App users should exercise an incredible amount of caution when using pregnancy- and reproductive health-related mobile apps. This includes apps such as fitness apps that may not, on their face, appear to fit into these categories but in reality collect a range of biophysical and health data points. Mobile app users can also take about a minute of their time to minimize ad tracking on Android and iOS devices (here are two guides) and turn off location sharing with apps except when absolutely necessary (Google instructions here; Apple instructions here). Individuals should not be expected to fix systemic, profit-driven surveillance problems themselves, but it’s important to mitigate the risks as much as possible.

This event is also a reminder that app developers must do a far better job conducting technical, governance, and privacy harm assessments of their data supply chains. This includes investigating which SDKs are used in-app, which permissions they access, what data they gather, and what the SDK owners do with the data—and conducting a similar assessment for any advertising networks that receive users’ data, to at least attempt to understand where that data could go next. The fact that some major app companies appear surprised that Gravy Analytics may have location data originating from their apps (including apps like Grindr that know this is a problem) suggests a need for far greater due diligence, particularly for sensitive apps for pregnancy and reproductive health, as well as public accountability measures for repeat offenders. App stores, likewise, have taken important steps to increase SDK transparency but must go further to better educate users and blacklist known bad-actor SDKs from their stores, including those that may technically follow the rules but have terrible privacy practices.

Lastly, legislators and policymakers should view this incident as another reminder that swathes of information about people’s locations, in some cases 24 hours a day, are up for sale. This ecosystem places individuals’ privacy, safety, and security, as well as companies’ cybersecurity and U.S. national security, at risk. Pregnancy and reproductive health apps potentially being a data source for a known, major location data broker underscores that these privacy and security harms affect us all, and especially society’s most vulnerable. It’s time we start to change that reality by introducing and passing comprehensive federal and state data privacy and security laws that prevent harm in the first place.
